Sector Driven Evaluation Strategy
The nonprofit sector needs an evaluation system that addresses questions that really matter. Fundamentally, the sector needs a system that makes it easier, more rewarding, and less stressful for nonprofits and their partners to do meaningful evaluation work.
While there is a lot of great evaluation work taking place in Ontario's nonprofit sector, the conversation has tended to focus on the "how," emphasizing workshops and toolkits as ways to improve. Those approaches are important, but we must also ask "why" and return to the original reasons for and purposes of evaluation.
Follow the links below for more information on our resources:
This discussion guide is meant to help you articulate more clearly what you want to get out of an evaluation and what concerns you may have about the process. It is a conversation starter, a way to open up dialogue with your stakeholders on a subject that can be complex and difficult. It provides tips on how to ask these questions in different contexts, the challenges that can come up, and what to do about them.
Don’t want the whole guide? Skip to the question(s) of interest to you:
This resource presents a vision and set of principles for evaluation. Our 2020 Vision for Nonprofit Evaluation articulates what a strong shared vision for useful evaluation can and should be, while our Principles to Help Us Get to Useful Evaluation identifies some practical basics for a nonprofit audience on how an evaluation process should unfold.
In Ontario’s nonprofit sector, evaluation is a word that gets used a lot. Different kinds of data gathering approaches with different purposes sometimes get lumped together under the general heading of evaluation. This can lead to miscommunication and unrealistic expectations. To try to clear things up a bit, we have created this resource.
In this report, we delve into some of the systemic issues of evaluation in the nonprofit sector. It is intended to help us begin to unpack this big, complex, and sometimes emotional evaluation discussion.
Evaluation should be useful. However, evaluation research suggests that the factors most likely to cause an evaluation to lead to action have less to do with how good you are at designing a survey or developing a logic model (though both can be helpful) and more to do with how stakeholders view and participate in the process.
Check out this short video on the six factors:
This position paper is a call for systemic changes that will create an ecosystem within which it is straightforward, efficient, and rewarding for nonprofits and funders to invest in evaluation work. It is also intended to further critical conversations toward building a nonprofit sector that is more responsive, accountable, and focused on the best ways to support the communities it serves.
The seven recommendations in this report are a call to promote learning and action before measurement, make more strategic use of evaluation resources, and expand what evaluation can achieve for the nonprofit sector.
This resource contains a few of our favourite slides that we’ve used in a number of presentations to nonprofits and funders on what we’re doing, why it’s important, and what leads to useful evaluation.
Too often, in the nonprofit evaluation ecosystem, information only flows one way. Nonprofits don’t always talk to one another. Funders don’t always talk to one another. Information rarely reaches the community and there is no feedback loop.
In an ideal ecosystem, information should be multi-directional, dynamic, and free. Only then do we get insight and action!
Our Evaluation Podcasts
Episode 3 (2017.05.25)
Interviewee: Jade Huguenin, Ontario Federation of Indigenous Friendship Centres (OFIFC)
Music: Podington Bear
Description: In our third evaluation podcast, Andrew and Ben chat with Jade Huguenin from the OFIFC about their work, the context of evaluation in Indigenous communities, and the importance of a community-driven approach.
:55 – Introducing Jade and the OFIFC
2:30 – How did Jade become a researcher?
5:01 – Evaluation has been a controversial process. Is it an uphill battle to re-frame evaluation or do people get it?
7:48 – The importance of empowering voices and a community-driven approach.
9:53 – What’s in the OFIFC’s new evaluation framework?
12:03 – Elaborating on the “background grid” of the Evaluation Path. Example of the Tree of Peace and how to reach the good life in relation to evaluation.
15:00 – What is the USAI Research Framework and why is it important?
18:01 – An example of a local community using the USAI principles.
20:46 – Andrew plays devil’s advocate. What about “credible data?”
23:20 – Analogy to baking a cake and a reminder that evaluation should always be for the community.
23:41 – If you could change anything about the way evaluation is done, what would you change?
25:17 – Is the intent to also use the USAI principles to speak to funders about evaluation and its issues?
26:43 – Provincial networks like OFIFC and ONN are crucial players in helping to create space to talk about evaluation. “We’re stronger together.”
28:10 – What would Jade like to explore further when it comes to evaluation?
29:26 – What is the role of an evaluator?
31:13 – Where can people find more information about OFIFC and its work?
Episode 2 (2017.01.31)
Interviewee: Marie Zimmerman, Hillside Festival
Music: Podington Bear
Description: In our second evaluation podcast, Andrew and Ben sit down with Marie Zimmerman from Hillside Festival to chat evaluation and how it benefits the festival and its patrons.
1:16 – Introducing Marie and Hillside Festival
1:55 – How does Hillside build its culture of learning and evaluation?
3:02 – Why is it important for Hillside to do evaluation?
4:50 – Do you have to convince people to look at evaluation data?
5:46 – Asking the right questions
7:08 – Example of doing a survey/research that led to insight. Reference: Assessing the Intrinsic Impacts of a Live Performance by Alan Brown and Jennifer Novak
9:54 – Impact of Hillside on Andrew’s family
10:31 – How do you justify evaluation in an era when resources are limited and it is difficult just to keep your head above water?
11:59 – A story of failure
15:38 – Could funders work with you to help you do evaluation better? Or is the situation fine as is?
18:20 – It’s not only about how much we spend.
19:04 – What would you change in the evaluation system to make it work better for you?
19:53 – Where is the nonprofit sector’s Alan Brown?
20:59 – Favourite act or moment from Hillside
Episode 1 (2016.04.13)
Interviewee: Chanel Grenaway, Canadian Women’s Foundation
Music: Podington Bear
Description: In our first evaluation podcast, Andrew and Ben sit down with Chanel Grenaway from the Canadian Women’s Foundation (CWF). Tune in to hear the story of how evaluation is making a difference for the CWF and their grantees.
:41 – How Chanel got into evaluation, what the Canadian Women’s Foundation is all about, and what she loves about her job.
2:18 – When did evaluation really work well and what made it work well?
5:01 – What evaluation framework does the Canadian Women’s Foundation use? (Sustainable Livelihoods)
5:22 – Grantee input into evaluation
6:43 – Factors that led to success
7:56 – Did it make Chanel nervous having grantees and donors at the same table?
9:31 – How is evaluation used?
10:40 – How much communication happens between stakeholders?
11:29 – How are challenges navigated?
13:39 – What can others learn from how the Canadian Women’s Foundation approaches evaluation?
15:40 – What lessons can other grantmakers learn from your experience?
16:44 – How do you deal with rolling up information from different grantees?
17:54 – Misconceptions about evaluation
19:24 – What do you still want to learn?
20:59 – Chanel changes the system of evaluation
22:55 – Final thoughts
Our Blog Posts
Exploring some of the key issues, challenges, and ways to move forward.
- Evaluating Public Benefit: How Changing Evaluation Practice Might Help to Solve an Identity Problem (December 2017)
- Building a Better Nonprofit Evaluation Ecosystem — 7 Recommendations for Cultivating Evaluations that Work – published on AEA365.org (August 2017)
- More useful evaluation for nonprofits? Yes, we can! (June 2017)
- Can a pizza party be evaluation, too? Updating mismatched expectations in nonprofit measurement (February 2017)
- Don’t let evaluation become the elephant in the room – Our 4th Comic (February 2017)
- Evaluation: Expanding design learning (January 2017)
- Can We Talk? Promoting better evaluation conversations between funders and nonprofits – published on AEA365.org (December 2016)
- Who sets the evaluation agenda? Five important discussion questions to make evaluation useful (October 2016)
- Should nonprofits be reporting evaluation findings to funders? (October 2016)
- Our 2020 Vision for Nonprofit Evaluation: Let’s Be Bold! (August 2016)
- Evaluation Comic 3.0! Oranges to Apples: Measuring What Counts (July 2016)
- What We Learned From Talking Evaluation to Funders (July 2016)
- Whaddaya mean, “evaluation?” — Mismatched expectations in nonprofit measurement (May 2016)
- Hot off the press: Our evaluation comic 2.0 (May 2016)
- Making Evaluation Work for Nonprofits: Our Theory of Change (April 2016)
- Move over, Dilbert! Introducing The Evaluation Comic Series (March 2016)
- There’s an Art to It: Exploring Creative Evaluation (March 2016)
- Treating the Cause Rather than the Symptoms: Building an Evaluation Agenda for the Nonprofit Sector – published on AEA365.org (February 2016)
- Learning from the Literature: A First Step Toward Developing a Sector Driven Evaluation Strategy (January 2016)
- Unpacking Nonprofit Evaluation: Who is taking the risks and who is making the decisions? (December 2015)
- Simple tips for communicating about impact – Part 2 (September 2015)
- Simple tips for communicating about impact – Part 1 (September 2015)
- What Evaluation Can Really Do for Nonprofits (August 2015)
Rethinking Evaluation: Developing a Strategy for the Sector, By the Sector (2016.01.27)
In this webinar, we want to hear from you! We have a few ideas for how we can start to change the way evaluation in the nonprofit sector works and we want your feedback on what we might include in a strategy (e.g. a vision and set of principles for evaluation, a negotiation guide to use with funders and other stakeholders, ways we can promote an evaluation culture, and how we can use a network approach to better share and collaborate). More specifically, we want to know what you think needs to change at a systems level and how we can change it together.
2016.01.27 Rethinking Evaluation Webinar Slides
Evaluations that Work: What the Nonprofit Sector Can Learn from ONN and Vibrant Communities (2016.06.22)
Evaluations “work” when they lead to insight and action. We all know that the process can be resource-intensive, so it is important for us to maximize the probability of getting it right! In this webinar, two leading learning institutes, the Ontario Nonprofit Network (ONN) and Tamarack’s Vibrant Communities Canada, will unpack real-life stories from Cities Reducing Poverty members to identify cases where evaluation worked really well. Together we will identify how they achieved exceptional success and draw out top takeaways for the nonprofit sector.
2016.06.22 Webinar Recording & Resources
Five Important Discussion Questions to Make Evaluation Useful (2016.12.07)
Efforts to build evaluation capacity in the nonprofit sector often begin with the assumption that the problem is a lack of skill, resources, or interest on the part of nonprofits. However, based on our research we think the problem may have more to do with the fact that the nonprofit evaluation “system” is not well designed, i.e. the ways evaluation is funded, rewarded, disseminated, and used at a societal level. Join this interactive webinar to explore our brand-new guide to help you get it right. We’ll present some common evaluation scenarios and walk through how you can put this guide into action to get the most out of an evaluation. We’ll push your critical thinking about the purposes that evaluation work is serving and the reasons why it sometimes fails to deliver on its promise.
Adapted Ignite Presentation from AEA Conference (2016.10.28):
A few helpful external resources
- Collective Impact 3.0 (Tamarack Institute)
- Data as a Means, Not an End: A Brief Case Study (SSIR)
- Drowning in Data (SSIR)
- Evaluation and Foundations: Can We Have an Honest Conversation? (NPQ)
- Evaluation issue from the Canadian Government Executive magazine
- Evidence is a journey. Should it lead to proving or improving? (AUE)
- How Evaluation Can Strengthen Communities (SSIR)
- How to Stop Blaming: Six Principles For Accountability Design
- Making Data and Evaluation Work for Foundations and Nonprofits (CEP)
- Markets for Good: Forcing Nonprofits To Lie about Data
- Measuring What Matters (SSIR)
- Putting Grantees at the Center of Philanthropy (SSIR)
- Reconsidering Evidence: What It Means and How We Use It (SSIR)
- Stop (Just) Measuring Impact, Start Evaluating
- Thinking about Nonprofit Evaluation as Affected by Time (NPQ)
- We don’t all need to throw wellingtons: Too much evaluation is a waste of time and money
- What’s in a word? Finding the value in evaluation (The Mandarin)
- Who ever heard of an independent evaluation? (The Mandarin)
- Evaluations that Make a Difference — Case studies
- 5 Mindsets that Hurt Your Small Nonprofit’s Ability to Achieve Results
- [Infographic] 10 Things to know about Evaluation — Overseas Development Institute (ODI)
- A Guide to Evaluating Place-Based Initiatives — Government of Canada
- A Practical Guide to Advocacy Evaluation — Innovation Network
- Admitting Failure
- Advocacy Evaluation Resources — Point K Learning Center
- AEA365 Blog — Daily tips by and for evaluators
- Asking Useful Evaluation Questions
- Balancing Act: A Guide to Proportionate Evaluation
- Blog on evaluation (en français) — ÉvalPop
- Community Tool Box — An evaluation toolkit
- Community Solutions — Planning & Evaluation
- Creative Monitoring & Evaluation — International Platform on Sport & Development
- Creative Strategies for Evaluation — My Peer Toolkit
- Cutting through the jargon
- Data Playbook
- Emerging Tools for Community-Driven Evaluations
- Evaluation for Nonprofits — Nonprofit Answer Guide
- Evaluation Methods and Tools — Evaluation Support Scotland
- Feminist Evaluation
- Financial Literacy Outcome Evaluation Tool — Prosper Canada
- Five Principles for Achieving Impact
- Free Resources for Program Evaluation and Social Research Methods
- Good Evaluation Questions: A Checklist to Help Focus Your Evaluation
- Handbook on Participatory Action Research, Planning, and Evaluation
- How to Create an Effective Monitoring and Evaluation Framework
- Innoweave Impact and Strategic Clarity Module — Webinar
- IssueLab: Nonprofits and Philanthropy
- Kauffman Foundation Evaluation Reporting Guide — Kauffman Foundation
- Making Sense of Evaluation: A Handbook for the Social Sector
- Meaningful Evidence — Resources
- Qualitative Chart Cho
- Practical Tools for Designing and Implementing Culturally Responsive and Inclusive Evaluations
- Project Evaluation Guide — Imagine Canada
- Theory Maker — A free and open source online tool
- Tools for Social Innovators — Spark Policy Institute
- What do we mean by ‘impact’?
- What is the Difference Between Research and Evaluation? — FSG
- Your RFP for Evaluation Services is Terrible—You Can Fix It! — Public Profit
- Youth Leading Community Change: An Evaluation Toolkit — Rural Youth Development Grant Program, U.S. Department of Agriculture
- Advocacy and Social Justice: Measuring Impact — Canadian HIV/AIDS Legal Network
- Developing a Culture of Evaluation — Community Literacy of Ontario
- Evaluation — Ontario Mentoring Coalition
- Evaluation Resources — Sustain Ontario
- Non-Profit Evaluation: A Summary Report of the Partnership Grant Program’s Evaluation Projects — Community Literacy of Ontario
- Package of Evaluation Resources — United Way Toronto & York Region
- Reflections on My Journey as an Evaluator (Blog) and Evaluation Plan – Youth and Philanthropy Initiative Canada
- USAI (Utility, Self-Voicing, Access, Inter-Relationality) Research Framework — Ontario Federation of Indigenous Friendship Centres
Capacity-building support and training:
- Collective Impact & Developmental Evaluation — Innoweave
- Customized Evaluation Supports — YouthREX
- EvalU — Capacity Canada
- IMPACT — Ontario Council for International Cooperation
- Evaluating Community Impact & Collective Impact — Tamarack Institute
- Professional Development — Canadian Evaluation Society-Ontario Chapter
- Program Evaluation Course (Online) — Homeless Hub
- Sharing the Stories (Youth) — The Students Commission
- Benchmarking Foundation Evaluation Practices (CEP)
- Beyond Measure? The State of Evaluation and Action in Ontario’s Youth Sector (YouthREX)
- Evaluating Ecosystem Investments (FSG)
- Review of Evaluation Frameworks for the Saskatchewan Ministry of Education
- Room for Improvement: Foundations’ Support of Nonprofit Performance Assessment (Center for Effective Philanthropy)
- Sharing What Matters: Foundation Transparency (Center for Effective Philanthropy)
- State of Evaluation 2016 (Innovation Network)