I was a panelist on a recent webinar, discussing community and structural interventions to support maternal health equity.
During the conversation, I shared an evaluation struggle:
Evaluation is used to meet funder expectations, instead of serving as a learning tool for organizations.
A few years ago, I worked with an organization that received funding to build their evaluation capacity.
It was fun because I felt like a professor, creating an easy-to-understand curriculum on evaluation. The staff who self-selected into the process were then tasked with training their departmental colleagues to build their evaluation capacity.
Unfortunately, organizations aren’t oriented to see evaluation in this way. Program evaluation is viewed as a means to an end, a funder request to confirm that funding was spent as intended. They’re asked to conduct “rigorous” evaluations on “evidence-based” programs, without any conversation on what rigor and evidence-based actually mean to an organization.
Before an evaluation begins, I recommend working with programming staff to clarify the program’s purpose AND what program success looks like for staff. Then use this to create an evaluation process that balances staff priorities and funder expectations.
This activity is ideal for:
- Anyone responsible for leading data collection and sense making processes
- Anyone interested in applying evaluative thinking into their work
What you’ll need:
- A setup conducive to capturing ideas (laptop, pen and paper, whiteboard, etc.). Make sure your notes are kept somewhere you can refer back to them later
The steps:
The goal of this activity is to explore what it means to move away from funder-driven evaluation.
A funder-driven evaluation centers funder priorities over staff and program participants. In order to move away from funder-driven evaluation, these commitments are needed:
- A commitment to normalizing evaluative thinking and being data driven, using data to inform decision-making
- A commitment to creating and sustaining participatory data processes with stakeholders to gather and interpret the data
- A commitment to understanding the myriad factors that influence a program
- A commitment to learning AND taking action
- A commitment to accountability and long-term impact beyond funder expectations
Moving away from funder-driven evaluation means normalizing evaluative thinking as a strategy. Evaluative thinking champions the value of evidence, identifies assumptions, poses thoughtful questions, and digs deeper.
Moving away from funder-driven evaluation means creating monitoring and evaluation strategies for all programming, with dedicated time throughout the fiscal year to participate in data sense-making, without a funder needing to ask.
Moving away from funder-driven evaluation means no longer prioritizing outcomes and impact over process. You’re committed to looking at the total picture of how a program is operating: whether it’s operating as intended, what’s going well, and the circumstances leading to intended and unintended results.
Moving away from funder-driven evaluation means no longer letting funders define your community’s challenges or creating programs that fit neatly into their portfolios.
This activity is flexible, and I’ll leave the facilitation up to you. Depending on how many people are invited to participate, it may work best to bring everyone together or to meet in groups by stakeholder type. Keep in mind that timing may not be ideal for everyone, AND bringing together people with varying levels of power may create its own dynamics.
For this activity, reflect on the following (and take notes):
- How are we normalizing evaluative thinking and being data driven?
- How are we intentionally engaging our stakeholders?
- How can we interrogate the factors that may be influencing our programming?
- How are we learning AND taking action?
- How are we holding ourselves accountable?
- What do we need to do to ensure each program has its own monitoring and evaluation strategy?
- How can we prioritize process, outcomes, AND impact?
- How can we work with our funders to balance their requests and our priorities?
Let’s process
As you explore these questions, more will emerge. And you’ll notice themes in your answers. Spend time exploring these themes, and as you create your organizational evaluation framework, consider sharing this with your stakeholders, including your funders, for community feedback, suggested changes, and implementation.
Key takeaway
Moving away from funder-driven evaluation calls for reversing the top-down, extractive approach, shifting away from prioritizing quantitative metrics and funder-defined problems. While this is easier said than done, more funders are realizing that philanthropy’s approach to evaluation needs to change. The problems you’re addressing are complex and long-standing, and your evaluations should reflect this.
Try this activity and let me know how it goes for you (or if you need support).
Raise Your Voice: How are you moving away from funder-driven evaluation? Share your thoughts below in the comments section.
Was this useful? Subscribe to the Raise Your Voice newsletter, and explore my consulting services.