4 Mar, 2026

Ask Nicole: Why Should Funders Evaluate Their Portfolios?

Categories: Research & Evaluation

Have a question you’d like to be featured? Let me know.

Over the past few years, I’ve found myself increasingly in spaces with funders.

Not only are they hiring me to evaluate individual grantee programs, but I'm also working alongside their grantees, supporting learning agendas, and strengthening strategy implementation.

In one recent engagement, I partnered with a funder to develop a theory of change designed to sharpen and improve their investments in sexual and reproductive health, rights, and justice. That work pushed me to think more deeply about the relationship between how funders award individual grants and the broader ecosystem a funder is trying to influence.

Recently, I’ve shifted my focus toward evaluating at a more strategic level.

Funders shouldn't only ask, "Did this grantee meet their outputs?" They should also ask, "Is this portfolio coherent? Is it equitable? What measurable change is this portfolio driving, and how has it increased grantees' capacity to sustain that change?"

These are bigger questions. To answer them, funders — especially program officers responsible for managing funding portfolios — must step back and examine not only what they fund, but how and why they fund it.

(more…)
25 Feb, 2026

Strong Programs Are Evidence-Based AND Community-Informed

Categories: Program, Service, & Campaign Design


Over the past few weeks, I’ve examined how “evidence-based” standards shape nonprofit work, how they can gatekeep access, and how they strain teams when expectations exceed infrastructure.

Yet one tension continues to surface in these conversations.

Many nonprofit teams feel pressure to choose between being evidence-based and being community-informed, as if rigor and relevance can’t coexist.

Strong organizations don’t abandon being data-driven to honor community voice, and they don’t silence community knowledge to appear credible.

They integrate both.

(more…)
18 Feb, 2026

Try This: Map the Gap Between Evidence & Implementation

Categories: Program, Service, & Campaign Design

Try this out and let me know how it goes for you.

Over the past two weeks, we’ve examined how “evidence-based” standards shape nonprofit work, and how they can sometimes function as gatekeepers. But even when access isn’t the issue, another challenge often emerges: Implementation.

Nonprofit teams are asked to deliver evidence-based programs without the infrastructure to fully support them. The research may be strong and the model may be sound, but staffing is lean, funding is restricted, training is uneven, reporting requirements are heavy, and community needs are evolving.

When expectations outpace infrastructure, the strain doesn’t show up in research articles. It shows up in burnout, adaptation, and quiet improvisation.

This exercise helps you make visible what often goes unnamed: The gap between research and real-world capacity (and what teams lose in the process).

Objective:

To identify where implementation expectations exceed infrastructure and determine what support teams require to close the gap.

This activity is ideal for:

  • Nonprofit staff and leadership teams responsible for implementing research-informed programs
  • Program directors and managers navigating the day-to-day realities of delivery
  • Evaluation and learning staff trying to align rigor with feasibility
  • Executive leaders assessing whether expectations match capacity

(Funders can benefit from the insights generated, but this exercise centers the experience of the teams doing the work.)

What you’ll need:

  • A whiteboard, flip chart, or shared virtual document
  • Sticky notes or a digital commenting tool
  • A copy of the evidence-based model or framework your team must use to guide your program or service
  • 45–60 minutes of uninterrupted time

(more…)
11 Feb, 2026

When “Evidence-Based” Becomes a Gatekeeper

Categories: Program, Service, & Campaign Design


In the nonprofit sector, “evidence-based” is treated as a marker of credibility, signaling rigor, effectiveness, and responsibility (especially in conversations about funding, accountability, and impact).

In theory, this makes sense. Evidence should help ensure that programs and services do what they claim to do. In practice, I’ve seen institutions define and enforce evidence-based standards in ways that quietly shape access to resources, trust, and organizational legitimacy.

Over time, I’ve come to see how “evidence-based” can function as a gatekeeper that shapes participation in ways that aren’t always intentional, transparent, or equitable.

(more…)
4 Feb, 2026

Ask Nicole: Evidence-Based… For Who?

Categories: Program, Service, & Campaign Design

Have a question you’d like to be featured? Let me know.

So, what exactly is “evidence-based”?

I’ve been thinking a lot about this, how often people invoke it, how rarely they interrogate it, and how much weight it carries in nonprofit work.

At first glance, the idea seems straightforward: Programs and services should rely on evidence. In practice, I’ve seen people define, apply, and enforce standards in ways that shape what gets funded and whose evidence counts.

Over time, my own thinking has shifted. I understand the importance of evidence in framing effective programs and services and improving outcomes. At the same time, I’ve grown more attentive to how evidence, when used prescriptively, can flatten complexity, limit innovation, and miss the realities of the communities nonprofits are trying to serve.

(more…)