On Demonstrating Impact

Consider two scenarios.*

In the first we have Georgia, a non-profit CEO whose organization works with retirees to help them navigate the healthcare system.

In the second we have Tina, another CEO whose non-profit publishes books for young adults. Tina’s organization distributes these books in the hope of enhancing mental health among young people.

Both have just started evaluations.

Georgia tells us: “We’ve been doing this work for quite a while, and we know—anecdotally—that it works, but we really need to collect some solid data that we can show to our funders to demonstrate our impact.”

Georgia hires an evaluator, agrees on an evaluation design, and waits to hear the results. She’s nervous but relatively confident. Georgia’s approach to evaluation centres around collecting solid data on a few core metrics using methods she knows their funders will find credible.

Tina, on the other hand, is less sure about what the evaluation will find. “We’ve been doing this work for quite a while and, well, I’d really like to make sure we’re having an impact. Anecdotally it seems like we might be having some good results, but I want to make sure we’re doing as much good as possible with the money we have. Ultimately I need some help getting data to guide us in that journey.”

Tina is also nervous, but hopeful the data will help her find a way forward.

Tina asks her evaluation team to focus on finding answers to a series of questions: (1) Are people even using our resources? (2) If so, what attracts them to the books? If not, why aren’t they using our resources? Finally, (3) when people do use the resources, do the resources help them in any way?

The results

Six months in, both Georgia and Tina receive findings they’re not entirely pleased about.

Georgia finds out that the retirees she works with have no better health outcomes than those she doesn’t, although the retirees her organization partners with do feel slightly less stressed about dealing with the healthcare system.

Tina, too, learns that only a small portion of potential users even read the books her organization produces. Most recipients toss them in a corner, unread and unused. The only exception to this rule seems to be when a trusted friend, adult, or teacher recommends the materials and then makes time to read them together. In a very small number of cases, readers experience substantial benefits to their mental wellbeing. However, this only seems to occur when they were able to spend time talking through the materials with the person who recommended them.

What to do?

Both of these scenarios involve CEOs committed to evaluation. Both CEOs have invested time, money, and resources in the endeavor, but one CEO is left disappointed while the other is left with a way forward.

The difference?

The (sure, fairly obvious) difference between these two evaluations lies in their leaders’ intentions: Georgia’s evaluation was an effort to demonstrate impact. Georgia saw evaluation as an assessment exercise designed to identify positive metrics she could show to her funder in pursuit of increased funding.

In contrast, Tina’s evaluation was an effort to generate impact—she saw it as a learning opportunity designed to figure out how her program was going and to understand the conditions under which it appeared to work best. Even though Tina’s evaluation didn’t uncover the most positive of results, it nonetheless offered her and her team a way forward.

Evaluation’s potential

As evaluators we’re frequently called upon to talk with leaders about why they want to do an evaluation.

With some regularity, leaders will tell us they want to do an evaluation to demonstrate their impact.

This represents a worthy goal, for sure (who doesn’t want to show they’re having a positive impact on the people they work with?). But worthy as it may be, this goal fails to capitalize on evaluation’s true potential: acting as a vehicle to create and enhance organizational impact by figuring out what works best (and what doesn’t) so that program design and implementation can be refined over time, in tandem with learning.

The core message of this post? If you’re about to embark on a new evaluation, consider what you might gain by orienting your approach to one that seeks a way forward rather than simply assessing the past. Let’s move away from the view that evaluation’s purpose is to demonstrate impact, and towards the view that evaluation’s purpose is to help you maximize your impact!

*These scenarios are entirely fictional.
