When you sit down with your supervisor for your annual review or evaluation, it can go one of two ways.
When your supervisor spends more time on areas that “need improvement”, you may walk out of her office feeling defeated.
And how do you feel when you receive a glowing review? Pretty happy. It makes you feel like you’re excelling.
If you care about what you do, you welcome praise as well as recommendations for how to improve. Too much positive feedback and you don’t feel the need to grow. If you’ve ever asked what you could improve on and didn’t receive much of a response because everything was going well, how did you feel?
I’ve worked with my fair share of executive directors and program directors who wanted me to focus only on positive outcomes. And it makes me suspicious.
Of course, you want the people invested in what you’re doing to be happy. These people, the stakeholders, can be anyone who is impacted directly or indirectly by the programs, services, or initiatives you’ve created for them.
Stakeholders want to see what’s going well. What’s going well can mean more media attention, more opportunities, and more funding. “Negative” findings (and I use quotation marks because negative is subjective) can also lead to more media attention, and a lot of nonprofits fear this. Negative findings can give the impression that things are worse off than they really are.
But too much of the positive can give the impression that nothing needs to change. Let’s face it: Some nonprofits are out here designing surveys, in-depth interviews, and focus group questions that are so biased that one can’t expect anything but positive results. And that’s not valuable either.
How can positive findings give your staff the credit they deserve, and how can you address “negative” findings in a way that allows your stakeholders to see opportunities?
Be Objective and Spot the Themes
Be as objective as possible throughout the evaluation process. This is crucial as you meet with staff and stakeholders who are closely linked to the program.
A great way to increase objectivity and participation in the process is to involve staff and stakeholders in data interpretation. One facilitative data interpretation technique is the data placemat, which lets stakeholders weigh in on the findings before a final report is developed.
As you comb through the findings, work with your staff and stakeholders to identify themes. Some themes are easier to spot than others.
Looking for themes is like reading a story to a child. You can come up with different meanings based on the story. Let’s use the Three Little Pigs as an example. One of the themes (or “the moral of the story”) is the importance of planning well: building a brick house is smarter than building a house made of straw. Another theme could be about making smarter choices. You can come up with a good collection of themes based on the experiences of many stakeholders.
Being objective and working together to identify themes allows for honest discussion on what is considered positive or negative, based on stakeholder feedback.
Determine What’s “Positive” and What’s “Negative”
Speaking of which, what is considered positive or negative?
Administering a survey for a program and receiving responses other than what you expected can be negative.
Or it could be positive. Instead of writing this off, think more about how the information was collected. How were the survey questions designed? How do the results of the survey data compare to regional or national data? Unexpected results can give insights into how your program is impacting stakeholders.
Scheduling focus groups and having no one show up to participate can suck. Having focus group participants who are too similar or too different can also be a negative.
What could be a positive lesson learned? Your staff can improve focus group outreach by allowing enough time to gauge interest in participating, and also having the logistics in place to run a fruitful focus group. Another lesson learned is ensuring that focus group participants have at least 2-3 things in common. If they’re too similar, you decrease the chances of having a variety of perspectives. If they have nothing in common, then what was the point of having focus groups in the first place?
Increase Your Credibility
A good evaluation of a program recognizes your staff’s great work, and it also provides recommendations that move the work to the next level.
There’s nothing inherently wrong with all-positive results. If an expected or unexpected positive outcome occurs and an evaluation of your program, service, or initiative highlights it, celebrate it. While you’re celebrating, identify some reasons why that result occurred, whether you expected it or not, and check whether existing data back up the findings.
Also, instead of being hard on yourself when an evaluation yields unexpected (or even “negative”) findings, use them to your advantage. Involve stakeholders in identifying obstacles and ways to overcome them. A high percentage of participants want to see your program expanded to other parts of the city, but you don’t have the staff capacity to meet the demand? Point this out to a funder as a reason for an increase in funding to hire more staff.
Work with a skilled evaluator to get suggestions for addressing deficiencies, and draw on the data you’ve previously collected for the program as well as the research literature. If negative findings occurred in the past, what steps were recommended to address them that may not have been put in place? Engage with your stakeholders to identify what internal and external factors kept those recommendations from being implemented.
Be intentional in balancing the awesome aspects of your program with areas for improvement. This will go a long way in developing programs that attract the ideal participants (and funding) to your organization.