Getting Started with Program Evaluation: Chapter 2
Planning a Process Evaluation
A process evaluation answers the question: How well is our program being implemented?
A process evaluation will help your program determine whether the activities and outputs from your logic model are happening as intended. This can be especially helpful in the early years of program development, and it remains equally valuable for long-running programs.
Key concepts we’ll explore in this chapter:
- Fidelity of implementation
- Developing process evaluation questions
- Identifying sources of process evaluation data
- Using process evaluation data
Fidelity of Implementation
The key question in a process evaluation is whether your program is being consistently implemented with fidelity to its intended model. When you take a closer look at your program in practice, are the activities and outputs from your logic model happening as they should be?
Developing Process Evaluation Questions
Planning for a process evaluation starts with identifying key questions you’d like to answer about the activities, expected outputs, and assumptions you identified in your logic model.
Three key concepts are helpful to keep in mind when drafting process questions: consistency, participation, and quality.
- Consistency: To what extent are program staff members and mentors implementing program components with the intended regularity and depth? Sample Super Scholars question: What percentage of intended mentor trainings are conducted each semester?
- Participation: To what extent does each mentee or mentor take part in or receive intended activities or experiences? Sample Super Scholars question: What percentage of mentors completed their one hour of ongoing training?
- Quality: To what degree is the program being implemented to a desired standard? Sample Super Scholars question: What percentage of mentors found the ongoing training received from program staff members to be helpful or very helpful?
Identifying Sources of Process Data
A strong process evaluation generally includes both quantitative data (which can be counted or expressed numerically) and qualitative data (narrative or descriptive).
The data you collect — and how you collect those data — will be driven by your evaluation questions, as well as the availability (or ease of access) of the data and the resources (e.g., funding, staff members’ time, evaluation expertise) required to gather the data. Some possible sources of process data include:
- Program records (e.g., implementation checklists, staff member logs, mentor logs, attendance sheets, case files, program application/registration forms)
- These are often an excellent source of data for process evaluations. If your program is not gathering routine program records, then you’ve already found one area of process improvement.
- Direct observations of program activities (e.g., match interactions, group activities, match support, mentor training)
- Surveys and interviews (of mentors, mentees, parents/guardians, staff members, and/or partners)
- Focus groups (same potential participants as surveys)
No single method is likely to provide all the data needed for a process evaluation. Consider the pros and cons of different data sources using the handout “Data Sources for Process Evaluation”.
Using Process Evaluation Data
To help make use of your process evaluation data, schedule regular times for staff members to reflect on and discuss your findings. To guide these reflections, it’s helpful to have benchmarks against which to assess your process questions.
Super Scholars created the following benchmarks to assess implementation of its ongoing mentor training.
Process question: What percentage of intended mentor trainings are conducted each semester?
- Data sources: survey of mentors who attended; report from training facilitator
- Benchmark: at least 95 percent of expected trainings were conducted each quarter

Process question: What percentage of mentors completed their one hour of ongoing training?
- Data sources: sign-up sheets or RSVPs from the event; mentor survey
- Benchmark: more than 90 percent of mentors fully completed ongoing training

Process question: What percentage of mentors found the ongoing training received from program staff members to be helpful or very helpful?
- Data sources: mentor survey; focus groups with attendees; focus group with non-attendees
- Benchmarks: an average quality rating of 4.25 on a five-point scale; more than 80 percent of attendees said they found the ongoing training to be helpful or very helpful
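The benchmark checks above come down to simple percentage and average calculations. As a hypothetical sketch (every number and variable name below is invented for illustration, not real Super Scholars data), a short script could flag whether each benchmark was met:

```python
# Hypothetical process-evaluation data; all values are invented examples.
trainings_planned = 10
trainings_conducted = 10

mentors_total = 40
mentors_completed_training = 37

# Mentor survey ratings of training helpfulness on a five-point scale
# (4 = "helpful", 5 = "very helpful").
ratings = [5, 4, 4, 3, 5, 4, 5, 4, 2, 5]

# Compute each measure from the raw counts.
pct_trainings = 100 * trainings_conducted / trainings_planned
pct_completed = 100 * mentors_completed_training / mentors_total
avg_rating = sum(ratings) / len(ratings)
pct_helpful = 100 * sum(1 for r in ratings if r >= 4) / len(ratings)

# Compare each result against the benchmarks from the table above.
print(f"Trainings conducted: {pct_trainings:.0f}% "
      f"(benchmark: at least 95%) -> {'met' if pct_trainings >= 95 else 'not met'}")
print(f"Mentors completing training: {pct_completed:.1f}% "
      f"(benchmark: more than 90%) -> {'met' if pct_completed > 90 else 'not met'}")
print(f"Average quality rating: {avg_rating:.2f} "
      f"(benchmark: 4.25) -> {'met' if avg_rating >= 4.25 else 'not met'}")
print(f"Rated helpful or very helpful: {pct_helpful:.0f}% "
      f"(benchmark: more than 80%) -> {'met' if pct_helpful > 80 else 'not met'}")
```

In practice these figures would come from the data sources listed above (program records, surveys, and facilitator reports) rather than hard-coded values, but the arithmetic is the same.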
Planning for Your Own Process Evaluation (Activity)
Conducting a process evaluation requires organizational buy-in and contribution. To help discuss and outline your plans for conducting a process evaluation, consider taking the following steps with program staff members and leaders:
- Review the “Sample Plan for Gathering Process Data”
- Use the “Planning Your Process Evaluation” worksheet to:
- Identify key process questions related to implementation of your activities and outputs
- Identify sources of data to help answer those questions
- Determine how often you will collect and analyze the data