A process evaluation answers the question: How well is our program being implemented?

A process evaluation will help your program determine whether the activities and outputs from your logic model are happening as intended. This can be especially helpful in the early years of program development, but it is equally valuable for long-running programs.

Key concepts we’ll explore in this chapter:

  • Fidelity of implementation
  • Identifying sources of process evaluation data
  • fidelity of implementation

    The key question in a process evaluation is whether your program is being consistently implemented with fidelity to its intended model. When you take a closer look at your program in practice, are the activities and outputs from your logic model happening as they should be?

    Take a moment to find out what Super Scholars learned from conducting a process evaluation:

    Reading 2: Is Our Program Being Delivered as Intended?
     

  • developing process evaluation questions

    Planning for a process evaluation starts with identifying key questions you’d like to answer about the activities, expected outputs, and assumptions you identified in your logic model.

    Three key concepts are helpful to keep in mind when drafting process questions: consistency, participation, and quality.

    Process Evaluation Concepts and Sample Super Scholars Process Evaluation Questions

    • Consistency: To what extent are program staff members and mentors implementing program components with the intended regularity and depth?
      Sample question: What percentage of intended mentor trainings are conducted each semester?
    • Participation: To what extent does each mentee or mentor take part in or receive intended activities or experiences?
      Sample question: What percentage of mentors completed their one hour of ongoing training?
    • Quality: To what degree is the program being implemented to a desired standard?
      Sample question: What percentage of mentors found the ongoing training received from program staff members to be helpful or very helpful?
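    Questions like these often reduce to simple percentages computed from program records. As a minimal illustration (the mentor names and counts below are invented, not drawn from the guide), a participation rate might be computed like this:

```python
# Hypothetical sketch: computing a participation rate from program records.
# All mentor names and counts here are invented for illustration.
mentors = ["Avery", "Blake", "Casey", "Devon", "Emery"]

# Mentors recorded as having completed the one hour of ongoing training
completed_training = {"Avery", "Casey", "Devon"}

participation_rate = 100 * len(completed_training) / len(mentors)
print(f"{participation_rate:.0f}% of mentors completed ongoing training")
```

    In practice, these figures would come from attendance sheets or training logs rather than hard-coded lists.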
  • identifying sources of process data

    A strong process evaluation generally includes both quantitative (can be counted or expressed numerically) and qualitative (narrative or descriptive) data.

    The data you collect — and how you collect those data — will be driven by your evaluation questions, as well as the availability (or ease of access) of the data and the resources (e.g., funding, staff members’ time, evaluation expertise) required to gather the data. Some possible sources of process data include:

    • Program records (e.g., implementation checklists, staff member logs, mentor logs, attendance sheets, case files, program application/registration forms)
      • These are often an excellent source of data for process evaluations. If your program is not gathering routine program records, then you’ve already found one area of process improvement.
    • Direct observations of program activities (e.g., match interactions, group activities, match support, mentor training)
    • Surveys and interviews (of mentors, mentees, parents/guardians, staff members, and/or partners)
    • Focus groups (same potential participants as surveys)


    No single method is likely to provide all the data needed for a process evaluation. Consider the pros and cons of different data sources using the handout “Data Sources for Process Evaluation.”

  • using process evaluation data

    To help make use of your process evaluation data, schedule regular times for staff members to reflect on and discuss your findings. To guide these reflections, it’s helpful to have some benchmarks to assess your process questions.

    Super Scholars created the following benchmarks to assess implementation of its ongoing mentor training.

    Process question: What percentage of intended mentor trainings are conducted each semester?
    Data sources:
    • Survey of mentors who attended
    • Report from the training facilitator
    Benchmark: At least 95 percent of expected trainings were conducted each semester

    Process question: What percentage of mentors completed their one hour of ongoing training?
    Data sources:
    • Sign-up sheets or RSVPs from the event
    • Mentor survey
    Benchmark: More than 90 percent of mentors fully completed ongoing training

    Process question: What percentage of mentors found the ongoing training received from program staff members to be helpful or very helpful?
    Data sources:
    • Mentor survey
    • Focus groups with attendees
    • Focus group with non-attendees
    Benchmarks: An average quality rating of at least 4.25 on a five-point scale; more than 80 percent of attendees said they found the ongoing training to be helpful or very helpful
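    One way to make these reflection sessions routine is to tabulate each observed value next to its benchmark. The sketch below is a hypothetical illustration; the observed values are invented placeholders, not actual Super Scholars data:

```python
# Hypothetical sketch: comparing observed process data against benchmarks.
# Metric names, thresholds, and observed values are invented placeholders.
benchmarks = {
    "trainings_conducted_pct": 95.0,  # at least 95% of expected trainings conducted
    "mentors_completed_pct": 90.0,    # more than 90% of mentors completed training
    "found_helpful_pct": 80.0,        # more than 80% rated the training helpful/very helpful
}
observed = {
    "trainings_conducted_pct": 100.0,
    "mentors_completed_pct": 85.0,
    "found_helpful_pct": 88.0,
}

for metric, threshold in benchmarks.items():
    status = "met" if observed[metric] >= threshold else "not met"
    print(f"{metric}: {observed[metric]:.0f}% (benchmark {threshold:.0f}%) -> {status}")
```

    A simple readout like this gives staff members a concrete starting point for discussing which parts of implementation are on track and which need attention.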

  • planning for your own process evaluation (activity)

    Conducting a process evaluation requires organizational buy-in and contribution. To help discuss and outline your plans for conducting a process evaluation, consider taking the following steps with program staff members and leaders:

    1. Review the “Sample Plan for Gathering Process Data” handout
    2. Use the “Planning Your Process Evaluation” worksheet to:
      1. Identify key process questions related to implementation of your activities and outputs
      2. Identify sources of data to help answer those questions
      3. Determine how often you will collect and analyze the data

