16 Apr, 2025

Try This: Redefine Your Metrics

April 16th, 2025 | Categories: Research & Evaluation | 0 Comments

Try this activity and let me know how it goes.

Whether your goal is weight loss or muscle gain, the first thing we tend to measure is the number on the scale. But if you want a more accurate view of your progress, it’s time to redefine your metrics.

In program design and evaluation, the indicators we choose shape not only what we measure, but also how we define success. When you learn to redefine your metrics, you evaluate your progress in a more holistic, human way.

This activity is another example of how evaluation shows up in our everyday lives by identifying more meaningful, context-driven indicators of progress. Whether you’re thinking about your own growth or your program’s impact, expanding how you measure can shift what you learn and how you respond.

Objective:

To help staff apply evaluative thinking to both personal and programmatic progress by identifying non-traditional, meaningful indicators of success.

This activity is ideal for:

  • Program staff involved in design, implementation, or evaluation
  • Teams who want to move beyond surface-level outcomes
  • Organizations interested in becoming more data-driven

What you’ll need:

  • Paper and pen, journal, or notes app
  • A shared document for team discussion
  • 90 minutes of uninterrupted staff time

The steps:

(more…)
9 Apr, 2025

The Recipe for a Good Evaluation

April 9th, 2025 | Categories: Research & Evaluation | 0 Comments

Trying a new recipe? That’s evaluation in action.

The recipe for a good evaluation looks a lot like trying a new recipe in the kitchen.

You find a new recipe that looks amazing—maybe it popped up on social media or was handed down from a friend.

You buy the ingredients, follow the steps, and give it a go.

But the final dish is just… okay. Not bad, but not great. So you make a mental note: Less salt next time, longer cooking time, or double an ingredient.

This quick post-dinner reflection is the start of a recipe for a good evaluation.

Like a chef refining a cookbook, recipes are a perfect example of how we gather feedback, make adjustments, and improve things over time.

(more…)
2 Apr, 2025

Ask Nicole: How We Use Evaluation Every Day

April 2nd, 2025 | Categories: Research & Evaluation | 0 Comments

Have a question you’d like to be featured? Let me know.

When people hear “evaluation,” they often picture something dry, technical, and reserved for experts—maybe even a little intimidating. It sounds like one of those tedious processes filled with jargon and reports no one really reads.

In reality, we’re evaluating all the time. From the meals we cook to the shows we watch, we’re constantly assessing what works, what doesn’t, and what to do next.

Evaluation isn’t just a professional tool—it’s a part of how we live, make decisions, and improve things around us.

Here are eight everyday experiences that show how evaluation appears in your daily life:

(more…)
26 Mar, 2025

They Left Your Program—Now Use Their Feedback to Adapt

March 26th, 2025 | Categories: Workshop, Program, & Curriculum Design | 0 Comments

Participant dropouts are program design feedback in disguise.

We don’t often think of former participants as a source of program design feedback.

We mostly tweak our programs and services based on program design feedback from participants who stay.

However, when we gather feedback from participants who leave, we often discover ways to build stronger programs.

It’s a different angle, and maybe even an uncomfortable one. But feedback from former program participants is valuable, regardless of why they’ve chosen to leave.

So instead of trying to get them back in the door, what if we used their exit as a signal for thoughtful adaptation?

(more…)
19 Mar, 2025

Try This: Learning From Program Participants Who Leave

March 19th, 2025 | Categories: Workshop, Program, & Curriculum Design | 0 Comments

Try this out and let me know how it goes.

Learning from program participants who leave? Have you ever considered this?

Understanding why participants leave a program can provide valuable insights for improvement. Instead of viewing dropouts as failures, organizations can learn from them to refine program structure, engagement strategies, and outreach.

Understanding why someone opts out can highlight gaps in accessibility, program design, or expectations that might not be obvious from the inside. Learning from participant exits allows organizations to make informed adjustments for a better, more aligned experience for future participants. It also helps refine who the program is best suited for, so staff can focus on attracting and retaining the right participants rather than simply boosting enrollment numbers.

Despite knowing how beneficial this is, many organizations hesitate to dig into why participants leave, often out of fear of what may be uncovered. Exit feedback that highlights program flaws, misalignment, or weaknesses can feel like criticism rather than an opportunity for growth. Staff may worry that analyzing dropouts will reflect poorly on their work, leading to uncomfortable conversations with funders or stakeholders. They may also feel personally invested in the program’s success and view dropouts as a reflection of their efforts.

Avoiding this doesn’t make the issues disappear. Instead, reframing dropout analysis as a learning tool rather than a failure assessment can empower organizations to create stronger, more effective programs.

Use this activity to analyze participant exit patterns and apply that feedback to strengthen your program’s impact.

Objective:

Analyze participant exit patterns, use feedback to strengthen program impact, and support staff in developing practices for learning from program participants who leave.

This activity is ideal for:

  • Program managers, coordinators, and staff overseeing participant engagement
  • Teams looking to improve retention strategies without relying on numbers alone
  • Organizations that want to align their programs more closely with participant needs
  • Organizations interested in becoming more data-driven

What you’ll need:

  • Recent program participation and dropout data (if available)
  • Exit survey data or interview transcripts from past participants
  • Flip chart paper or a whiteboard
  • Markers or sticky notes
  • 60–90 minutes of dedicated staff time
(more…)
