Learning from the Literature: A First Step Toward Developing a Sector Driven Evaluation Strategy
We’ve been amazed by the number of people who are eager to talk evaluation with us. We think this interest arises in part because people understand that evaluation has huge potential to make a difference in the sector. Many of us see evaluation as a way to better explain the value of the work we do. At the same time, we know that people are keen to talk about evaluation because they are frustrated. Although the investment in evaluation has been large, the return has often not lived up to expectations. Many evaluation reports sit on shelves or are sent off to government departments or other funders never to be heard from again.
Before we got too far in our work, we felt we needed to begin to unpack this big, complex, and sometimes emotional evaluation discussion. We needed to understand more clearly the expectations that people have for evaluation, the conditions under which evaluation is most likely to be useful, and the mistakes to avoid. This report has helped us to refine our thoughts and focus on some of the big systemic issues of evaluation in the nonprofit sector.
Here are a few of the things we learned.
While the sector talks a lot about evaluation, we found that most of this discussion centres on methodology, tools, and indicators. Less attention has been paid to the intended purpose and audiences for evaluation results — in other words, who is asking the evaluation questions and why. This turns out to be kind of important. You might think that an evaluation is most likely to be used when it is methodologically rigorous, with a large sample size, all the latest standardized measures, and a big, thick report with lots of graphs. However, the research on evaluation use is pretty conclusive: although these things do matter, they don't predict use all that well. What matters more are things like these:
- whether the evaluation has a clear purpose that people see as important;
- whether the people who are expected to use the evaluation are involved in planning it;
- whether the people involved have worked to develop trust; and
- how much time and energy are invested in making sure there is good communication throughout the evaluation process.
Think a bit about your own evaluation experience. Did these factors play a role in how useful the evaluation turned out to be? Frustration tends to arise in situations where people haven't had input into the evaluation's purpose, don't understand how findings will be used, and have had limited opportunity to reflect critically on what is being learned.
According to the research we reviewed, these kinds of scenarios — where the potential for frustration is high and the potential for the evaluation to lead to action is low — are most likely to occur when the evaluation has been required by a funder solely to hold the nonprofit accountable for its use of grant money. The potential for learning and action is even lower if the process is poorly explained, based on unrealistic expectations, or under-resourced.
When it comes to managing expectations, we found that it is important to make sure that the evaluation approach you use fits well with your context. We noted that the term evaluation is used to cover a wide range of social research activities, undertaken by different stakeholder groups, for differing reasons. For some, evaluation might mean a group of staff getting together at the end of a program cycle to reflect on how it went. For others, evaluation could be a complex, multi-year research project with sites all over the province and access to a large team of academic experts.
Going forward, we’ll use this research to help us develop the products and ideas that will eventually make up the Sector Driven Evaluation Strategy. At each stage of development, we’ll also continue to seek your feedback to help us make this work relevant to the sector. In the meantime, this review will give you a better idea of where we’re headed and why. Happy reading!
- Report_ONN-Evaluation-Executive-Summary_2016-01-21.pdf
- Exploring-the-Issues_Evaluation-Lit-Review-1.pdf