April 30-May 1

Conducting Successful Evaluation Surveys

Instructor: Jolene D. Smyth, PhD

Description: The success of many evaluation projects depends on the quality of survey data collected. In the last decade, sample members have become increasingly reluctant to respond, especially in evaluation contexts. In response to these challenges and to technological innovation, methods for doing surveys are changing rapidly. This course will provide new and cutting-edge information about best practices for designing and conducting internet, mail, and mixed-mode surveys.

Students will gain an understanding of the multiple sources of survey error and how to identify and fix commonly occurring survey issues. The course will cover writing questions; visual design of questions (drawing on concepts from the vision sciences); putting individual questions together into a formatted questionnaire; designing web surveys; designing for multiple modes; and fielding surveys and encouraging response by mail, web, or in a mixed-mode design.

The course is made up of a mixture of PowerPoint presentations, discussion, and activities built around real-world survey examples and case studies. Participants will apply what they are learning in activities and will have ample opportunity to ask questions during the course (or during breaks) and to discuss the survey challenges they face with the instructor and other participants.

Recommended text: Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method by Don A. Dillman, Jolene D. Smyth, and Leah Melani Christian (4th Edition, 2014).


April 30-May 1

Evaluating Training Programs: Frameworks and Fundamentals

Instructor: Ann Doucette, PhD

Description: The evaluation of training programs typically emphasizes participants' initial acceptance of and reaction to training content; learning, knowledge, and skill acquisition; participant performance and behavioral application of training; and benefits at the organizational and societal levels that result from training participation. The evaluation of training programs, especially behavioral application of content and organizational benefits from training, continues to be an evaluation challenge. Today's training approaches are wide-ranging, including classroom-style presentations, self-directed online study courses, online tutorials and coaching components, supportive technical assistance, and so forth. Evaluation approaches must be sufficiently flexible to accommodate these training modalities and the individual and organizational outcomes that result from training efforts.

The Kirkpatrick (1959, 1976) training model has been a longstanding evaluation approach; however, it is not without criticism or suggested modification. The course provides an overview of two training program evaluation frameworks: 1) the Kirkpatrick model and its modifications, which emphasize participant reaction, learning, behavioral application, and organizational benefits, and 2) the Concerns-Based Adoption Model (CBAM), a diagnostic approach that assesses stages of participant concern about how training will affect individual job performance, describes how training will be configured and practiced within the workplace, and gauges the actual level of training use.

The course is designed to be interactive and to provide a practical approach for those planning, commissioning, implementing, conducting, or managing training evaluations. The course covers an overview of training evaluation models; pre-training assessment and training program expectations; training evaluation planning; development of key indicators, metrics, and measures; training evaluation design; data collection, including instrumentation, administration, and data quality; reporting progress, change, and results; and disseminating findings and recommendations, including knowledge management resulting from training initiatives. Case examples will be used throughout the course to illustrate course content.


May 2-3

Introduction to Cost-Benefit and Cost-Effectiveness Analysis

Instructor: Robert D. Shand, PhD

Description: The tools and techniques of cost-benefit and cost-effectiveness analysis will be presented. The goal of the course is to provide analysts with the skills to interpret cost-benefit and cost-effectiveness analyses. Content includes identification and measurement of costs using the ingredients method; how to specify effectiveness; shadow pricing for benefits using revealed preference and contingent valuation methods; discounting; and calculation of cost-effectiveness ratios, net present value, cost-benefit ratios, and internal rates of return. Sensitivity testing and uncertainty will also be addressed. Participants will work in groups to assess the costs, effects, and benefits in case studies drawn from a range of policy fields (e.g., health, education, environmental sciences).
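
To make the core arithmetic concrete, here is a minimal sketch (illustrative only, not course material) of discounting, net present value, and a benefit-cost ratio in Python; the program figures and discount rates are invented for the example.

    # Minimal sketch of core cost-benefit computations (hypothetical figures).
    # NPV = sum over t of (benefits_t - costs_t) / (1 + r)^t

    def npv(cash_flows, rate):
        """Net present value of annual (benefit - cost) cash flows."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    # Hypothetical program: $100,000 cost up front, $30,000 in benefits per year for 5 years.
    flows = [-100_000] + [30_000] * 5
    rate = 0.03  # hypothetical 3% discount rate

    print(f"NPV at {rate:.0%}: ${npv(flows, rate):,.0f}")

    # Benefit-cost ratio: discounted benefits divided by discounted costs.
    pv_benefits = npv([0] + [30_000] * 5, rate)
    print(f"Benefit-cost ratio: {pv_benefits / 100_000:.2f}")

    # Simple sensitivity test: recompute NPV across plausible discount rates.
    for r in (0.00, 0.03, 0.05, 0.07):
        print(f"  r = {r:.0%}: NPV = ${npv(flows, r):,.0f}")

A program with a positive net present value (equivalently, a benefit-cost ratio above 1) passes the cost-benefit test at the chosen discount rate, which is why sensitivity testing across rates matters.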


May 2-4

Outcome and Impact Assessment

Instructor: Melvin Mark, Ph.D.

Description: Valid assessment of the outcomes or impact of a social program is among the most challenging evaluation tasks, but also one of the most important. This course will review monitoring and tracking approaches to assessing outcomes as well as the experimental and quasi-experimental methods that are the foundation for contemporary impact evaluation. Attention will also be given to issues related to the measurement of outcomes, ensuring detection of meaningful program effects, and interpreting the magnitude of effects. Emphasis will mainly be on the logic of outcome evaluation and the conceptual and methodological nature of the approaches, including research design and associated analysis issues. Nonetheless, some familiarity with social science methods and statistical analysis is necessary to effectively engage the topics covered in this course.
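
As a small illustration of what interpreting the magnitude of effects can involve (a sketch with invented data, not course material), the following Python snippet computes a standardized mean difference, Cohen's d, from hypothetical treatment and control outcome scores.

    # Illustrative effect-size computation (Cohen's d) with invented data.
    import statistics

    treatment = [72, 75, 78, 80, 74, 77, 79, 76]  # hypothetical outcome scores
    control = [70, 71, 74, 73, 69, 72, 75, 71]

    mean_t, mean_c = statistics.mean(treatment), statistics.mean(control)
    var_t, var_c = statistics.variance(treatment), statistics.variance(control)

    # Pooled standard deviation across the two groups.
    n_t, n_c = len(treatment), len(control)
    pooled_sd = (((n_t - 1) * var_t + (n_c - 1) * var_c) / (n_t + n_c - 2)) ** 0.5

    d = (mean_t - mean_c) / pooled_sd
    print(f"Cohen's d = {d:.2f}")  # rough conventional benchmarks: 0.2 small, 0.5 medium, 0.8 large

Expressing an impact estimate in standard-deviation units makes effects comparable across outcome measures, though substantive interpretation still depends on context.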

Prerequisites: At least some background in social science methods and statistical analysis or direct experience with outcome measurement and impact assessment designs.


May 4-5

Presenting Data Effectively: Practical Methods for Improving Evaluation Communication

Instructor: Stephanie Evergreen, PhD

Description: Crystal clear charts and graphs are valuable: they save an audience's mental energy, keep a reader engaged, and make you look smart. In this workshop, attendees will learn the science behind presenting data effectively. We will go behind the scenes in Excel and discuss how each part of a visualization can be modified to best tell the story in a particular dataset. We will discuss how to choose the best chart type given audience needs, cognitive capacity, and the story that needs to be told about the data, including both quantitative and qualitative visualizations. We will walk step by step through how to create newer types of data visualizations and how to change the default settings to customize graphs so that they have a more powerful impact. Attendees will work with a prepared spreadsheet to learn the secrets of becoming an Excel dataviz ninja, getting hands-on practice with direct, practical steps that can be applied immediately after the workshop to clarify data presentation and support clearer decision-making. Full of guidelines and examples, this workshop will leave you better able to package your data so that it reflects your smart, professional work.

Note: Attendees are strongly encouraged to maximize the workshop experience by bringing a slideshow that contains graphs currently under construction. Attendees should bring their own laptops loaded with Microsoft Excel. No tablets or smartphones. PCs preferred; Macs okay.

On the second day of the workshop, Dr. Stephanie Evergreen will lead attendees step by step through how to coax impactful charts and graphs out of Excel, using provided datasets distributed to the audience. Audience members will leave the session with more in-depth knowledge about how to craft effective data displays. Completing the session moves one to Excel Ninja Level 10.

Attendees will learn:

  1. Visual processing theory and why it is relevant for evaluators
  2. How to apply graphic design best practices and visual processing theory to enhance data visualizations with simple, immediately implementable steps
  3. Which chart type to use, when
  4. How to construct data visualizations and other evaluation communication to best tell the story in the data
  5. Alternative methods for reporting

Workshop attendees will leave with helpful handouts and a copy of the instructor’s book, Effective Data Visualization.

Registrants should regularly develop graphs, slideshows, technical reports, and other written communication for evaluation work and be familiar with the navigational and layout tools available in common software programs such as Microsoft Office.

Contact Us

The Evaluators’ Institute

tei@cgu.edu