Sector Driven Evaluation Strategy
The work found on this page was the result of a 25 month project funded by the Ministry of Citizenship and Immigration that wrapped up in June 2017.
The purpose of this work was to promote a more enabling ecosystem for evaluation in the nonprofit sector. In other words, we focused on the systemic issues of concern that are common to nonprofits regardless of size, mission, or location. The use of evidence can make our work better, and nonprofits need to engage many stakeholders, including funders, to create an evaluation plan that meets everyone’s needs. Too often, however, nonprofits end up doing evaluation in reactive mode, spending time answering questions chosen by others.
While there is a lot of great evaluation work taking place in Ontario’s nonprofit sector, the conversation has tended to focus on the “how to,” emphasizing workshops and toolkits as ways to improve. Those are important approaches, and our work has been complementary to those efforts. Our focus, however, has been on the question of “why”: returning to the original reasons for and purposes of evaluation. Over the course of the project, we aimed to identify and address the key underlying issues that can get in the way of useful evaluation.
ONN believes the nonprofit sector needs an evaluation system that addresses questions that really matter. Fundamentally, the sector needs a system that makes it easier, more rewarding, and less stressful for nonprofits and their partners to do meaningful evaluation work. The materials found on this page are a start in helping to move the dialogue forward in advocating for a more prominent nonprofit role in setting the evaluation agenda.
Got questions? Get in touch: firstname.lastname@example.org.
Follow the links below for more information on all of our resources:
This guide is meant to help you articulate more clearly what you want to get out of an evaluation and what concerns you may have about the process. It is a conversation starter, a way to open up dialogue with your stakeholders in a subject area that can be complex and difficult. The guide provides tips on how to ask these questions in different contexts, the challenges that can come up, and what to do about them.
Don’t want the whole guide? Skip to the question(s) of interest to you:
This resource presents a vision and set of principles for evaluation. Our 2020 Vision for Nonprofit Evaluation articulates what a strong shared vision for useful evaluation can and should be, while our Principles to Help Us Get to Useful Evaluation identify some practical basics for a nonprofit audience regarding how an evaluation process should unfold.
In Ontario’s nonprofit sector, evaluation is a word that gets used a lot. Different kinds of data gathering approaches with different purposes sometimes get lumped together under the general heading of evaluation. This can lead to miscommunication and unrealistic expectations. To try to clear things up a bit, we have created this resource.
In this report, we delve into some of the systemic issues of evaluation in the nonprofit sector. It is intended to help us begin to unpack this big, complex, and sometimes emotional evaluation discussion.
Evaluation should be useful. However, evaluation research suggests that the factors most likely to cause an evaluation to lead to action have less to do with how good you are at designing a survey or developing a logic model (though both can be helpful) and more to do with how stakeholders view and participate in the process.
Check out this short video on the six factors:
This position paper is a call for systemic changes that will create an ecosystem within which it is straightforward, efficient, and rewarding for nonprofits and funders to invest in evaluation work. It is also intended to further critical conversations about building a nonprofit sector that is more responsive, accountable, and focused on the best ways to support the communities it serves.
The seven recommendations in this report are a call to promote learning and action before measurement, make more strategic use of evaluation resources, and expand what evaluation can achieve for the nonprofit sector.
This resource contains a few of our favourite slides that we’ve used in a number of presentations to nonprofits and funders on what we’re doing, why it’s important, and what leads to useful evaluation.
Too often, in the nonprofit evaluation ecosystem, information only flows one way. Nonprofits don’t always talk to one another. Funders don’t always talk to one another. Information rarely reaches the community and there is no feedback loop.
In an ideal ecosystem, information should be multi-directional, dynamic, and free. Only then do we get insight and action!
Our Evaluation Podcasts
Episode 3 (2017.05.25)
Interviewee: Jade Huguenin, Ontario Federation of Indigenous Friendship Centres (OFIFC)
Music: Podington Bear
Description: In our third evaluation podcast, Andrew and Ben chat with Jade Huguenin from the OFIFC about their work, the context of evaluation in Indigenous communities, and the importance of a community-driven approach.
:55 – Introducing Jade and the OFIFC
2:30 – How did Jade become a researcher?
5:01 – Evaluation has been a controversial process. Is it an uphill battle to re-frame evaluation or do people get it?
7:48 – The importance of empowering voices and a community driven approach.
9:53 – What’s in the OFIFC’s new evaluation framework?
12:03 – Elaborating on the “background grid” of the Evaluation Path. Example of the Tree of Peace and how to reach the good life in relation to evaluation.
15:00 – What is the USAI Research Framework and why is it important?
18:01 – An example of a local community using the USAI principles.
20:46 – Andrew plays devil’s advocate. What about “credible data?”
23:20 – Analogy to baking a cake and a reminder that evaluation should always be for the community.
23:41 – If you could change anything about the way evaluation is done, what would you change?
25:17 – Is the intent to also use the USAI principles to speak to funders about evaluation and its issues?
26:43 – Provincial networks like OFIFC and ONN are crucial players in helping to create space to talk about evaluation. “We’re stronger together.”
28:10 – What would Jade like to explore further when it comes to evaluation?
29:26 – What is the role of an evaluator?
31:13 – Where can people find more information about OFIFC and its work?
Episode 2 (2017.01.31)
Interviewee: Marie Zimmerman, Hillside Festival
Music: Podington Bear
Description: In our second evaluation podcast, Andrew and Ben sit down with Marie Zimmerman from Hillside Festival to chat evaluation and how it benefits the festival and its patrons.
1:16 – Introducing Marie and Hillside Festival
1:55 – How does Hillside build its culture of learning and evaluation?
3:02 – Why is it important for Hillside to do evaluation?
4:50 – Do you have to convince people to look at evaluation data?
5:46 – Asking the right questions
7:08 – Example of doing a survey/research that led to insight. Reference: Assessing the Intrinsic Impacts of a Live Performance by Alan Brown and Jennifer Novak
9:54 – Impact of Hillside on Andrew’s family
10:31 – How do you justify evaluation in an era when resources are limited and it is difficult just to keep your head above water?
11:59 – A story of failure
15:38 – Could funders work with you to help you do evaluation better? Or is the situation fine as is?
18:20 – It’s not only about how much we spend.
19:04 – What would you change in the evaluation system to make it work better for you?
19:53 – Where is the nonprofit sector’s Alan Brown?
20:59 – Favourite act or moment from Hillside
Episode 1 (2016.04.13)
Interviewee: Chanel Grenaway, Canadian Women’s Foundation
Music: Podington Bear
Description: In our first evaluation podcast, Andrew and Ben sit down with Chanel Grenaway from the Canadian Women’s Foundation (CWF). Tune in to hear the story of how evaluation is making a difference for the CWF and their grantees.
:41 – How Chanel got into evaluation, what the Canadian Women’s Foundation is all about, and what she loves about her job.
2:18 – When did evaluation really work well and what made it work well?
5:01 – What evaluation framework does the Canadian Women’s Foundation use? (Sustainable Livelihoods)
5:22 – Grantee input into evaluation
6:43 – Factors that led to success
7:56 – Did it make Chanel nervous having grantees and donors at the same table?
9:31 – How is evaluation used?
10:40 – How much communication happens between stakeholders?
11:29 – How are challenges navigated?
13:39 – What can others learn from how the Canadian Women’s Foundation approaches evaluation?
15:40 – What lessons can other grantmakers learn from your experience?
16:44 – How do you deal with rolling up information from different grantees?
17:54 – Misconceptions about evaluation
19:24 – What do you still want to learn?
20:59 – Chanel changes the system of evaluation
22:55 – Final thoughts
Our Blog Posts
We explore some of the key issues, challenges, and ways to move forward in our blog posts below.
- What Evaluation Can Really Do for Nonprofits (2015.08.13)
- Simple tips for communicating about impact – Part 1 (2015.09.16)
- Simple tips for communicating about impact – Part 2 (2015.09.30)
- Unpacking Nonprofit Evaluation: Who is taking the risks and who is making the decisions? (2015.12.10)
- Learning from the Literature: A First Step Toward Developing a Sector Driven Evaluation Strategy (2016.01.21)
- Treating the Cause Rather than the Symptoms: Building an Evaluation Agenda for the Nonprofit Sector – published on AEA365.org (2016.02.18)
- There’s an Art to It: Exploring Creative Evaluation (2016.03.23)
- Move over, Dilbert! Introducing The Evaluation Comic Series (2016.03.31)
- Making Evaluation Work for Nonprofits: Our Theory of Change (2016.04.19)
- Hot off the press: Our evaluation comic 2.0 (2016.05.11)
- Whaddaya mean, “evaluation?” — Mismatched expectations in nonprofit measurement (2016.05.17)
- What We Learned From Talking Evaluation to Funders (2016.07.21)
- Evaluation Comic 3.0! Oranges to Apples: Measuring What Counts (2016.07.25)
- Our 2020 Vision for Nonprofit Evaluation: Let’s Be Bold! (2016.08.09)
- Should nonprofits be reporting evaluation findings to funders? (2016.10.07)
- Who sets the evaluation agenda? Five important discussion questions to make evaluation useful (2016.10.28)
- Can We Talk? Promoting better evaluation conversations between funders and nonprofits – published on AEA365.org (2016.12.29)
- Evaluation: Expanding design learning (2017.01.30)
- Don’t let evaluation become the elephant in the room – Our 4th Comic (2017.02.07)
- Can a pizza party be evaluation, too? Updating mismatched expectations in nonprofit measurement (2017.02.07)
- More useful evaluation for nonprofits? Yes, we can! (2017.06.29)
- Building a Better Nonprofit Evaluation Ecosystem — 7 Recommendations for Cultivating Evaluations that Work – published on AEA365.org (2017.08.29)
Our Comic Series
We developed a comic series to explore key themes in our conversations with the sector about nonprofit evaluation.
Click below to view the comics at full size.
Our Webinars, Presentations, and Project Recap Timeline
Rethinking Evaluation: Developing a Strategy for the Sector, By the Sector (2016.01.27)
In this webinar, we want to hear from you! We have a few ideas for how we can start to change the way evaluation in the nonprofit sector works and we want your feedback on what we might include in a strategy (e.g. a vision and set of principles for evaluation, a negotiation guide to use with funders and other stakeholders, ways we can promote an evaluation culture, and how we can use a network approach to better share and collaborate). More specifically, we want to know what you think needs to change at a systems level and how we can change it together.
2016.01.27 Rethinking Evaluation Webinar Slides
2016.01.27 Rethinking Evaluation Webinar Recording
Evaluations that Work: What the Nonprofit Sector Can Learn from ONN and Vibrant Communities (2016.06.22)
Evaluations “work” when they lead to insight and action. We all know that the process can be resource-intensive, so it is important to maximize the probability of getting it right! In this webinar, two leading learning institutes, the Ontario Nonprofit Network (ONN) and Tamarack’s Vibrant Communities Canada, will unpack real-life stories from Cities Reducing Poverty members to identify cases where evaluation worked really well. Together we will identify how they achieved exceptional success and draw out top takeaways for the nonprofit sector.
2016.06.22 Webinar Recording & Resources
Five Important Discussion Questions to Make Evaluation Useful (2016.12.07)
Efforts to build evaluation capacity in the nonprofit sector often begin with the assumption that the problem is a lack of skill, resources, or interest on the part of nonprofits. However, based on our research, we think the problem may have more to do with the fact that the nonprofit evaluation “system” is not well designed (i.e., the ways evaluation is funded, rewarded, disseminated, and used at a societal level). Join this interactive webinar to explore our brand-new guide to help you get it right. We’ll present some common evaluation scenarios and walk through how you can put the guide into action to get the most out of an evaluation. We’ll push your critical thinking about the purposes evaluation work serves and the reasons it sometimes fails to deliver on its promise.
2016.12.07 Five Important Discussion Questions Webinar
2016.12.07 Five Important Discussion Questions Slides
Adapted Ignite Presentation from AEA Conference (2016.10.28):
A few helpful external resources
Here are a few resources we’ve come across from around the web.
- Collective Impact 3.0 (Tamarack Institute)
- Data as a Means, Not an End: A Brief Case Study (SSIR)
- Drowning in Data (SSIR)
- Evaluation and Foundations: Can We Have an Honest Conversation (NPQ)
- Evaluation issue from the Canadian Government Executive magazine
- Evidence is a journey. Should it lead to proving or improving? (AUE)
- How Evaluation Can Strengthen Communities (SSIR)
- How to Stop Blaming: Six Principles For Accountability Design
- Making Data and Evaluation Work for Foundations and Nonprofits (CEP)
- Markets for Good: Forcing Nonprofits To Lie about Data
- Measuring What Matters (SSIR)
- Putting Grantees at the Center of Philanthropy (SSIR)
- Reconsidering Evidence: What It Means and How We Use It (SSIR)
- Shortcomings of Modern Strategic Philanthropy and How to Overcome Them
- Stop (Just) Measuring Impact, Start Evaluating
- Thinking about Nonprofit Evaluation as Affected by Time (NPQ)
- We don’t all need to throw wellingtons: Too much evaluation is a waste of time and money
- What’s in a word? Finding the value in evaluation (The Mandarin)
- Who ever heard of an independent evaluation? (The Mandarin)
- 4 Reasons Why Nobody Reads (or Uses) Your Evaluation Report: Here’s How To Fix It
- 5 Mindsets that Hurt Your Small Nonprofit’s Ability to Achieve Results
- [Infographic] 10 Things to know about Evaluation — Overseas Development Institute (ODI)
- A Guide to Evaluating Place-Based Initiatives — Government of Canada
- A Practical Guide to Advocacy Evaluation — Innovation Network
- Admitting Failure
- Advocacy Evaluation Resources — Point K Learning Center
- AEA365 Blog — Daily tips by and for evaluators
- Asking Useful Evaluation Questions
- Balancing Act: A Guide to Proportionate Evaluation
- Blog on evaluation (en français) — ÉvalPop
- Community Tool Box — An evaluation toolkit
- Community Solutions — Planning & Evaluation
- Creative Monitoring & Evaluation — International Platform on Sport & Development
- Creative Strategies for Evaluation — My Peer Toolkit
- Cutting through the jargon
- Data Playbook
- Emerging Tools for Community-Driven Evaluations
- Evaluation Flash Cards: Embedding Evaluative Thinking In Organizational Culture
- Evaluation for Nonprofits — Nonprofit Answer Guide
- Evaluation Methods and Tools — Evaluation Support Scotland
- Feminist Evaluation
- Financial Literacy Outcome Evaluation Tool — Prosper Canada
- Five Principles for Achieving Impact
- Free Resources for Program Evaluation and Social Research Methods
- Good Evaluation Questions: A Checklist to Help Focus Your Evaluation
- Handbook on Participatory Action Research, Planning, and Evaluation
- How to Create an Effective Monitoring and Evaluation Framework
- INTRAC Monitoring and Evaluation Series
- Innoweave Impact and Strategic Clarity Module — Webinar
- IssueLab: Nonprofits and Philanthropy
- Kauffman Foundation Evaluation Reporting Guide — Kauffman Foundation
- Making Sense of Evaluation: A Handbook for the Social Sector
- Meaningful Evidence — Resources
- Qualitative Chart Chooser 3.0 — Evergreen Data
- Questions you should never ask if you’re in the business of making an impact
- Power of Reflection: An introduction to participatory evaluation techniques
- Practical Tools for Designing and Implementing Culturally Responsive and Inclusive Evaluations
- Project Evaluation Guide — Imagine Canada
- Theory Maker — A free and open source online tool
- Tools for Social Innovators — Spark Policy Institute
- Tools and Resources for Assessing Social Impact (TRASI)
- Volunteer Management Handbook: Evaluation and Recognition — Volunteer Canada
- What do we mean by ‘impact’?
- What is the Difference Between Research and Evaluation? — FSG
- Your RFP for Evaluation Services is Terrible—You Can Fix It! — Public Profit
- Youth Leading Community Change: An Evaluation Toolkit — Rural Youth Development Grant Program, U.S. Department of Agriculture
- An Evaluation Resource Guide for Arts Training
- Assessing the Intrinsic Impacts of a Live Performance
- Evaluation 101 from ArtReach
- Look I’m Priceless! Handbook on how to assess artistic organisations
- The Impact of Cultural Heritage: creating a common language
- Self-evaluation framework — Arts Council England
The following links point to resources and organizations that have done, or continue to do, thinking on various evaluation tools, methods, and approaches.
- Advocacy and Social Justice: Measuring Impact — Canadian HIV/AIDS Legal Network
- Developing a Culture of Evaluation — Community Literacy of Ontario
- Evaluation — Ontario Mentoring Coalition
- Evaluation Resources — Sustain Ontario
- Non-Profit Evaluation: A Summary Report of the Partnership Grant Program’s Evaluation Projects — Community Literacy of Ontario
- Package of Evaluation Resources — United Way Toronto & York Region
- Reflections on My Journey as an Evaluator (Blog) and Evaluation Plan – Youth and Philanthropy Initiative Canada
- USAI (Utility, Self-Voicing, Access, Inter-Relationality) Research Framework — Ontario Federation of Indigenous Friendship Centres
Capacity-building support and training:
- Collective Impact & Developmental Evaluation — Innoweave
- Customized Evaluation Supports — YouthREX
- EvalU — Capacity Canada
- IMPACT — Ontario Council for International Cooperation
- Evaluating Community Impact & Collective Impact — Tamarack Institute
- Professional Development — Canadian Evaluation Society-Ontario Chapter
- Program Evaluation Course (Online) — Homeless Hub
- Sharing the Stories (Youth) — The Students Commission
- Benchmarking Foundation Evaluation Practices (CEP)
- Beyond Measure? The State of Evaluation and Action in Ontario’s Youth Sector (YouthREX)
- Evaluating Ecosystem Investments (FSG)
- Measuring Performance: Evaluation Practices and Perspectives in Canada’s Voluntary Sector
- Review of Evaluation Frameworks for the Saskatchewan Ministry of Education
- Room for Improvement: Foundations’ Support of Nonprofit Performance Assessment (Center for Effective Philanthropy)
- Sharing What Matters: Foundation Transparency (Center for Effective Philanthropy)
- State of Evaluation 2016 (Innovation Network)