Culture and Evaluation

Instructor: Dr. Leona Ba

Description: This course focuses on examining how culture affects the effectiveness of evaluation. It begins with a definition of culture and a brief overview of major cultural theories and models. Participants will be encouraged to reflect on their own cultural sensitivity, a prerequisite for conducting culturally sensitive evaluations. The course will discuss cultural factors affecting the effectiveness of evaluation at different levels, including the evaluator, the evaluation context, and the evaluation process. Participants will explore strategies for applying cultural sensitivity to evaluation practice, using examples from first-hand experience and from reviews of various program evaluations. To make the most of this one-day course, participants will be sent reading materials to review prior to the course.

Prerequisites: Understanding of evaluation and research design


Effective Reporting Strategies for Evaluators

Instructor: Dr. Kathryn Newcomer

Description: The use and usefulness of evaluation work are highly affected by the effectiveness of reporting strategies and tools. Care in crafting both the style and substance of findings and recommendations is critical to ensure that stakeholders pay attention to the message. Skill in presenting sufficient information, yet not overwhelming the audience, is essential to raise the likelihood that potential users of the information will be convinced of both the relevance and the validity of the data. This course will provide guidance and practical tips on reporting evaluation findings. Attention will be given to the selection of appropriate reporting strategies and formats for different audiences and to the preparation of: effective executive summaries; clear analytical summaries of quantitative and qualitative data; user-friendly tables and figures; discussion of limitations to measurement validity, generalizability, causal inference, statistical conclusion validity, and data reliability; and useful recommendations. The text provided as part of the course fee is Torres et al., Evaluation Strategies for Communicating and Reporting (2nd ed., Sage, 2005).


How to Build a Successful Evaluation Consulting Practice

Instructor: Dr. Michael Quinn Patton

Description: This class offers participants the opportunity to learn from someone who has been a successful evaluation consultant for 30 years. Issues addressed include: What does it take to establish an independent consulting practice? How do you find your consulting niche? How do you attract clients, determine how much to charge, create collaborations, and generate return business? Discussion will cover such topics as marketing, pricing, bidding on contracts, managing projects, resolving conflicts, professional ethics, and client satisfaction. Participants will be invited to share their own experiences and seek advice on situations they’ve encountered. The course is highly interactive and participant-focused.

Participants will receive a copy of the instructor’s text: Essentials of Utilization-Focused Evaluation (Sage, 2012).


Implementation Analysis for Feedback on Program Progress and Results

Instructor: Dr. Arnold Love

Description: Many programs do not achieve intended outcomes because of how they are implemented. Thus, implementation analysis (IA) is very important for policy and funding decisions. IA fills the methodological gap between outcome evaluations that treat a program as a “black box” and process evaluations that present a flood of descriptive data. IA provides essential feedback on the “critical ingredients” of a program, and helps drive change through an understanding of factors affecting implementation and short-term results. Topics include: importance of IA; conceptual and theoretical foundations of IA; how IA drives change and complements other program evaluation approaches; major models of IA and their strengths and weaknesses; how to build an IA framework and select appropriate IA methods; concrete examples of how IA can keep programs on track, spot problems early, enhance outcomes, and strengthen collaborative ventures; and suggestions for employing IA in your organization. Detailed course materials and in-class exercises are provided.


Making Evaluation Data Actionable

Instructor: Dr. Ann Doucette

Description: Interventions and programs are implemented within complex environments that present challenges for collecting program performance information. A general problem for performance measurement initiatives, and a frequent reason they fall short of their intended objectives, is the failure to choose performance measures that are actionable: measures linked to practices that an organization or agency can actually do something about, where changes in those practices can be tied directly to improved outcomes and sustained impact.

This class introduces complex adaptive systems (CAS) thinking and addresses its implications for evaluating the outcomes and impact of interventions and programs. Examples used in this class are drawn from healthcare, education, transportation and safety, developing countries, and research and development environments. The class examines performance measurement strategies that support actionable data. The focus will be on data-based decision making, value-based issues, and practice-based evidence that can assist in moving performance measurement and quality monitoring activities from a process, outcome, and impact evaluation approach to continuous quality improvement. Business models such as the Toyota Production System, Six Sigma, and Balanced Scorecards, as well as knowledge management and benchmarking strategies, will be discussed in terms of how they can inform improvement strategies.

NOTE: Persons with some experience in program evaluation and those with an interest in a systems perspective will likely derive the most benefit from this course.


Presenting Data Effectively: Practical Methods for Improving Evaluation Communication

Instructor: Dr. Stephanie Evergreen

Description: Crystal-clear reports, slides, and graphs are valuable: they save an audience’s mental energy, keep a reader engaged, and make you look smart. In this workshop, attendees will learn the science behind presenting data effectively and will leave with direct, pointed changes that can be immediately applied to their own evaluation deliverables. The workshop will address principles of data visualization, slideshow, and report design that support legibility, comprehension, and retention of data in the minds of clients. Grounded in visual processing theory, the principles will enhance attendees’ ability to communicate more effectively with peers, colleagues, and clients through a focus on the proper use of color, arrangement, graphics, and text. Attendees are strongly encouraged to maximize the workshop experience by bringing printouts of graphs, slides, and reports currently under construction.

On the second day of the workshop, Dr. Stephanie Evergreen will lead attendees step-by-step through how to manipulate Excel into making impactful charts and graphs, using data sets distributed to the audience. Audience members will leave the session with more in-depth knowledge about how to craft effective data displays. The demonstration will take place in the computer lab on PCs running Office 2010. Completing the session moves one to Excel Ninja Level 10.

Attendees will learn:

1.     Graphic design best practices grounded in visual processing theory

2.     How to apply graphic design best practices and visual processing theory to enhance evaluation communications

3.     How to create high impact data visualizations in Excel

Workshop attendees will leave with helpful handouts and a copy of Effective Data Visualization (Sage, 2016).

Registrants should regularly develop graphs, slideshows, technical reports and other written communication for evaluation work and be familiar with the navigational and layout tools available in simple software programs, like Microsoft Office.


Project Management and Oversight for Evaluators

Instructor: Tessie Catsambas, MPP

Description: The purpose of this course is to provide new and experienced evaluation professionals and funders with strategies, tools and skills to: (1) develop realistic evaluation plans; (2) negotiate needed adjustments when issues arise; (3) organize and manage evaluation teams; (4) monitor evaluation activities and budgets; (5) protect evaluation independence and rigor while responding to client needs; and (6) ensure the quality of evaluation products and briefings.

Evaluation managers have a complex job: they oversee the evaluation process and are responsible for safeguarding methodological integrity while managing evaluation activities and budgets. In many cases they must also manage people, including clients, various stakeholders, and other members of the evaluation team. Evaluation managers shoulder the responsibility for the success of the evaluation, frequently dealing with unexpected challenges and making decisions that influence the quality and usefulness of the evaluation.

Against a backdrop of demanding technical requirements and a dynamic political environment, the goal of evaluation management is to develop, within available resources and time, valid and useful measurement information and findings, and to ensure the quality of the process, products, and services included in the contract. Management decisions influence methodological decisions and vice versa, as method choice has cost implications.

The course methodology will be experiential and didactic, drawing on participants’ experience and engaging them with diverse material. It will include paper and online tools for managing teams, work products and clients; an in-class simulation game with expert judges; case examples; reading; and a master checklist of processes and sample forms to organize and manage an evaluation effectively. At the end of this training, participants will be prepared to follow a systematic process with support tools for commissioning and managing evaluations, and will feel more confident to lead evaluation teams and negotiate with clients and evaluators for better evaluations.


Strategy Mapping

Instructor: Dr. John Bryson

Description: The world is often a muddled, complicated, dynamic place in which it seems as if everything connects to everything else – and that is the problem! The connections can be problematic because – while we know things are connected – sometimes we do not know how, or else there are so many connections we cannot comprehend them all. Alternatively, we may not realize how connected things are and our actions lead to unforeseen and unhappy consequences. Either way, we would benefit from an approach to strategizing, problem solving, conflict management, and evaluation that helps us understand just how connected the world is, what the effects of those connections are, and what might be done to change some of the connections and their effects.

Action-oriented strategy mapping (AOSM) is a simple and useful technique for addressing situations where thinking – as an individual or as a group – matters. An action-oriented strategy map is a word-and-arrow diagram in which ideas and actions are causally linked with one another through the use of arrows. The arrows indicate how one idea or action leads to another – and specifically how coherent sets of actions, strategies, and goals can be created, communicated, managed, and implemented. AOSM makes it possible to articulate a large number of ideas and their interconnections in such a way that people can know what to do in an area of concern, how to do it, and why, because the arrows indicate the causes and consequences of an idea or action. AOSM therefore is a technique for linking strategic thinking, acting, and learning; helping make sense of complex problems; communicating to oneself and others what might be done about them; and also managing the inevitable conflicts that arise. The technique is useful for formulating and implementing mission, goals, and strategies and for being clear about how to evaluate strategies. The bottom line is: AOSM is one of the most powerful strategic management tools in existence. AOSM is what to do when thinking matters!
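
For readers who want to see the structure concretely, an action-oriented strategy map can be thought of as a simple directed graph: ideas and actions are nodes, and each arrow is a causal link from a means to an end. The short Python sketch below is purely illustrative and is not part of the course materials; the map content and the `consequences` helper are invented for this example.

```python
# Illustrative only: a toy action-oriented strategy map as a directed graph.
# Nodes are ideas/actions; an arrow A -> B means "doing A helps achieve B".
# All map content below is invented for the sake of the example.

strategy_map = {
    "train staff in data literacy": ["improve report quality"],
    "improve report quality": ["increase stakeholder trust"],
    "hold quarterly stakeholder briefings": ["increase stakeholder trust"],
    "increase stakeholder trust": ["secure stable program funding"],
    "secure stable program funding": [],  # a goal: no outgoing arrows
}

def consequences(node, graph):
    """Follow the arrows forward: the 'why do this?' chain for an action."""
    seen, stack = [], [node]
    while stack:
        current = stack.pop()
        for nxt in graph.get(current, []):
            if nxt not in seen:
                seen.append(nxt)
                stack.append(nxt)
    return seen

# Why train staff in data literacy? Follow the arrows toward the goals.
print(consequences("train staff in data literacy", strategy_map))
# -> ['improve report quality', 'increase stakeholder trust',
#     'secure stable program funding']
```

Reading the arrows forward answers “why do this?”; reading them backward answers “how might we achieve this?”, which is how AOSM ties individual actions to strategies and goals.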

When can mapping help?  There are a number of situations that are tailor-made for mapping. Mapping is particularly useful when:

·      effective strategies need to be developed

·      persuasive arguments are needed

·      effective and logical communication is essential

·      effective understanding and management of conflict are needed

·      it is vital that a situation be understood better as a prelude to any action

·      organizational or strategic logic needs to be clarified in order to design useful evaluations

These situations are not meant to be mutually exclusive. Often they will overlap in practice. In addition, mapping is very helpful for creating business models (that link competencies and distinctive competencies to mission, goals, and critical success factors) and Balanced Scorecards (which are more useful for strategy implementation than for strategy formulation). AOSMs are related to logic models, as both are word-and-arrow diagrams, but are more tied to goals, strategies, and actions and are more careful about articulating causal connections.

Objectives (Strategy Mapping):

At the end of the course, participants will:

·      Understand the theory of mapping

·      Know the difference between action-oriented strategy maps, business model maps, and balanced scorecard maps

·      Be able to create action-oriented strategy maps for individuals – that is, either for oneself or by interviewing another person

·      Be able to create action-oriented maps for groups

·      Be able to create a business model map linking competencies and distinctive competencies to goals and critical success factors

·      Know how to design and manage change processes in which mapping is prominent

·      Have an action plan for an individual project


Using Program Evaluation in Nonprofit Environments

Instructor: Dr. Kathryn Newcomer

Description: Funders and oversight boards typically need data on the results obtained by the programs they fund. Within foundations, program officers want information about grantees, and about the lines of effort they fund, to guide planning and the future allocation of resources. Executive officers and members of the boards that oversee nonprofit service providers also want to know what works and what does not. This class provides the background that program officers and overseers need to understand how evaluation can serve their information needs, and how to assess the quality of the evidence they receive.

Drawing upon cases from foundations and nonprofits, the session will help attendees:

  • Clarify where to start in using evaluation to improve nonprofit social service programs;
  • Learn what and who drives program evaluation and performance measurement in public and nonprofit service providers;
  • Explore uses of evaluation and outcomes assessment in the nonprofit sector;
  • Understand how to frame useful scopes of work (SOWs) and requests for proposals (RFPs) for evaluations and performance measurement systems;
  • Identify and apply relevant criteria in choosing contractors and consultants to provide evaluation assistance;
  • Discuss challenges to measuring social service outcomes;
  • Understand what questions to ask of internal evaluation staff and outside consultants about the quality of their work.