Module 7 – Evaluation


Performance Practice Learning Module

Is your organization ready for external evaluation? Review past evaluations and future plans—and prepare for external evaluations that will help you ensure mission effectiveness.

This module helps executive leadership and program directors work through an evaluation plan, identify appropriate overarching evaluation questions, and understand what types of external evaluations are appropriate at what stages.

Careful consideration of this module helps you spend your organization’s resources wisely—and get meaningful evaluation results—by engaging external evaluators for the right types of evaluation at the right times.


User Guide

Introduction to the Performance Practice, acknowledgments, application, and development methodology




Worksheet

Use this worksheet to complete the self-assessment.


Reporting App

The reporting app compiles your results automatically; no manual aggregation required!

Windows Instructions
Windows Reporting App

macOS Instructions
macOS App, v10.12 (Sierra) and older
macOS App, v10.13 (High Sierra) and newer

Watch: Reporting App Guide

Evaluation Principles and Proof Points

Principle 7.1: Leaders complement internal monitoring with external evaluations conducted by highly skilled, independent experts.

7.1.1: To help drive improvements in our organization, we periodically arrange for external evaluations conducted by experts with credibility in the field.

Principle 7.2: Leaders commission external assessments to learn more about how well their programs are being run, what these programs are or are not accomplishing, who is or is not benefiting, and how the programs can be strengthened. Leaders do not use external assessments as a one-time, up-or-down verdict on the organization’s effectiveness.

7.2.1: My organization’s external evaluations are designed to assess the reliability and validity of our internal performance data; the quality of our implementation; and the overall effectiveness of our efforts.

Principle 7.3: Leaders recognize that there are many different types of external assessments, and no one type is right for every organization or for every stage of an organization’s development. Independent evaluators who understand how different methodologies fit different contexts can help leaders match the tool to the task.

7.3.1: My organization has adopted a formal external evaluation plan that spells out the different types of evaluations that will be relevant for us at different stages of our development. We update the plan periodically.

7.3.2: My organization’s plan includes formative (implementation) evaluation to help us determine:

  • the quality of our internal data and program implementation
  • whether we are delivering programs with fidelity to our model
  • how well we are doing at recruiting and enrolling the population for which our programs are designed
  • our program utilization, program completion, and participant engagement
  • which clients achieve the intended outcomes, which do not, and which exit the program prematurely

7.3.3: My organization’s evaluation plan includes summative (impact) evaluation of programs that have been running as intended for several years, to help us determine whether we’re making a difference beyond what would have happened anyway.

Principle 7.4: Leaders draw a clear distinction between outputs (e.g., meals delivered, youth tutored) and outcomes (meaningful changes in knowledge, skills, behavior, or status). Those who are working to improve outcomes commission evaluations to assess whether they are having a positive net impact. In other words, they want to know to what extent, and for whom, they’re making a meaningful difference beyond what would have happened anyway.

7.4.1: My organization’s internal performance data clearly distinguish between outputs and outcomes—and have been validated by independent experts.

7.4.2: My organization’s external evaluators use output data to help us learn about program quality and fidelity.

7.4.3: My organization’s external evaluators use outcome data to help us determine whether we’re making a difference beyond what would have happened anyway. This requires using a reliable research design to compare data from our participants with data from similar people who did not receive our services.
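As a purely illustrative sketch, and not a substitute for a professionally designed evaluation, the core logic of a comparison-group impact estimate can be shown in a few lines of Python. The function name and all data values below are hypothetical:

```python
# Illustrative only: a naive difference-in-means "net impact" estimate.
# Real summative evaluations rely on much stronger research designs
# (randomized trials, matched comparison groups, regression adjustment)
# carried out by independent evaluators.

def naive_impact_estimate(participant_outcomes, comparison_outcomes):
    """Difference between the mean outcome of program participants and
    the mean outcome of similar people who did not receive services."""
    mean_participants = sum(participant_outcomes) / len(participant_outcomes)
    mean_comparison = sum(comparison_outcomes) / len(comparison_outcomes)
    return mean_participants - mean_comparison

# Hypothetical post-program scores for participants and a comparison group
participants = [72, 80, 68, 75, 79]
comparison = [65, 70, 66, 71, 68]
print(round(naive_impact_estimate(participants, comparison), 1))
```

The point of the comparison group is to approximate "what would have happened anyway"; without it, a before-and-after change in participants alone cannot be attributed to the program.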

Principle 7.5: Leaders who plan to expand significantly any programs aimed at improving outcomes have a special obligation to commission a rigorous evaluation that can assess net impact.

7.5.1: If my organization plans to grow significantly, we are conducting (or have conducted) both rigorous formative (implementation) and summative (impact) evaluations—with enough lead time to allow us to make critical adjustments and ensure that expanded programs will have the best chance of achieving net impact for those we serve.

7.5.2: My organization has or would put growth plans on hold—and look to redesign them before resuming growth—if/when evaluation findings show that we’re having significant trouble with implementation or our clients are not benefiting in the ways we had expected.

Principle 7.6: Even those leaders who commission the most rigorous of impact evaluations do not stop there. They commission additional assessments to gauge their impact in new settings (or for new populations) and achieve greater positive impact for the money they spend.

7.6.1: My organization conducts new external evaluations (formative or summative) whenever we make significant program changes, operate programs in new contexts, and/or enroll different target populations.

7.6.2: My organization periodically conducts new summative evaluations, because the societal context in which our organization and programs operate constantly changes.

Principle 7.7: Leaders share the methodology and results of their external assessments to help others learn and avoid mistakes.

7.7.1: My organization shares our evaluation plans throughout the organization and with interested stakeholders.

7.7.2: My organization shares our evaluation findings throughout the organization as the basis for strengthening our programs and with external stakeholders who can benefit from the knowledge.

Note: These principles and proof points were updated in October 2019 to align with release 2.0 of the Performance Imperative.

Download Principles & Proof Points for Evaluation

Sample Reports

With the free reporting app, you can create a report like the sample below in minutes. To create your own report, you will first need to collect your data. Go to any of the modules to get started.

The Summary report shows the distribution of ratings for each proof point, gives a picture of the level of consensus, and opens the door to productive conversations about ways to move forward.

For a thorough understanding of individual perspectives, drill down to see each respondent’s ratings and comments per proof point.
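The reporting app performs this aggregation for you, but as an illustration of what the Summary report tallies, counting the distribution of ratings per proof point looks like this in Python. The proof-point IDs and rating labels below are hypothetical examples, not the worksheet's actual scale:

```python
from collections import Counter

# Hypothetical respondent ratings for two proof points (labels are
# examples only; use the rating scale defined in your worksheet).
ratings_by_proof_point = {
    "7.1.1": ["Agree", "Agree", "Neutral", "Disagree"],
    "7.2.1": ["Agree", "Neutral", "Neutral", "Agree"],
}

def rating_distribution(ratings):
    """Count how many respondents chose each rating for a proof point."""
    return Counter(ratings)

for proof_point, ratings in ratings_by_proof_point.items():
    print(proof_point, dict(rating_distribution(ratings)))
```

A wide spread of ratings on a proof point signals low consensus, which is exactly the kind of result the Summary report is designed to surface for discussion.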

Evaluation Module Principles & Proof Points
Principles and proof points for this module, providing overall perspective and context before you complete the worksheet

Evaluation Module Worksheet
The file used to complete the self-assessment; includes instructions. Open on a laptop or desktop rather than a mobile device.

User Guide
Introduction to the Performance Practice, acknowledgments, application, and development methodology

Reporting App Downloads
The reporting app compiles data from all the worksheets into one spreadsheet.
Reporting App: Windows
Reporting App: macOS (v10.13 and newer)
Reporting App: macOS (v10.12 and older)

Got questions or feedback? Contact us at
