Reflections following one philanthropic organisation’s exploration of their approach to evaluation and learning.

"We know evaluations are often tricky and it may feel like funders are asking for evidence of change in a grant cycle that actually takes years to unfold, shaped by a complex wider system. When revisiting our strategy, we were clear that we wanted to find an approach that makes organisational self-improvement and learning a priority over accountability to us as the funder.

We also wanted to find this approach together, working closely with our grantees. Dartington was the perfect partner and we are so delighted to start the first step on our own learning journey with Kate and the Dartington team as our partners." - Dr Sophie Flemig, Cattanach Chief Executive


Cattanach commissioned the Dartington Service Design Lab to help it reflect on and refine its approach to evaluation and learning as an independent foundation.

The work emerged from a recognition that voluntary sector organisations face significant financial and administrative pressures, which mean valuable learning about effective practice in the early years may be lost. Cattanach was keen to reflect on opportunities to better understand the context and to imagine alternative futures for evaluation and learning at the Foundation.

A clear need emerged to disrupt the status quo and change the way evidence is predominantly funded, generated, and used across the sector. There was a call to create more equitable relationships that produce meaningful and usable evidence to improve practice and deepen understanding of complex challenges.

As an independent and agile funder, Cattanach has an opportunity to trial more innovative approaches to grant-making and evaluation.


The work led by Dartington involved:

  • Gathering perspectives from service delivery organisations via surveys and virtual sessions to understand evaluation needs, priorities and concerns.
  • Reflection with Cattanach staff and Trustees around values and ambitions for an evaluation and learning approach.
  • Review of existing data around evaluation capacity & evaluation outputs at the Foundation.
  • Speaking with evaluation experts to explore different learning approaches that would be feasible for the organisation to deliver.


The work found:

1. There was a practical cynicism about the way evidence is generated and used

  • Evidence was perceived as something to help ‘sell’ the service and to secure new or maintain existing funding.
  • There was a hesitance to openly share challenges experienced in delivery for fear of financial impact.
  • Evaluations have been largely focused on accountability (i.e. did you do what you said you’d do), and less on improvement and collaborative learning.
  • There was a recognition that evidence generated under these circumstances carried an inherent bias that distorted, or oversimplified, the complex reality of providing early help to families.

"In terms of accountability in evidence, an organisation will always have the tendency to tell the funder what they want to hear..." - Participant

"Any funder will ask what your outcomes are, so then you spend the rest of your cycle trying to prove those outcomes even when there are other outcomes that change. In my experience, sometimes the story is more interesting than the outcome. I’m almost suspicious of things that really back up the outcomes..." - Participant

"A large part of my role is working on funding bids…it’s more trying to prove something for somebody else than considering the whole story." - Participant

2. Commissioning structures make it difficult to design and carry out evaluations for genuine learning and improvement

  • Such commissioning structures mean ‘evidence’ becomes a tool for organisational survival, rather than something meant to build understanding about the issues and foster a culture of learning and improvement.
  • The commissioning climate described does not best serve the families and children who are involved. It risks turning them into passive recipients providing data on their service experiences for purposes that are not fed back to them, or used in a way that could benefit them (or others similar to them).
  • Nor, ironically, does it benefit the funder. Structures that gear evidence generation predominantly towards accountability rather than learning inhibit a deeper understanding of the issues and/or risk oversimplifying potential change mechanisms.

"…a lot of evidence which doesn’t even get looked at which is stressful. It’s alright if we use the information, but for [some] we don’t need to use so it’s just a piece of paper for them [the funder] to have…" - Participant

"Evaluation really should be for the beneficiaries rather than the funders. It’s really hard to communicate that – especially when the ways funders collect some of the information [is] for the benefit of the funder, rather than the benefit of the beneficiary..." - Participant

"Cattanach has probably been the best with this. Often funders give you the money but there’s almost an inherent distrust in that. So they give the money to do something but then they almost don’t believe you…it’s almost micromanaging in something that funders don’t even seem to be that fussed about. It’s almost like they’re also trapped in this system..." - Participant


There is a need for more equitable and trusting relationships between funders and grantees to create the conditions for better evaluation and learning. It was recognised that there was, and always will be, a power imbalance between those awarding funds and those seeking funds. Individual voluntary sector organisations are to a large degree bound by the rules of the system (e.g. working in competition rather than collaboration). Creating fundamental change requires those who hold the power to disrupt and share that power.

Cattanach, in collaboration with others, is in a position to help change the rules of the system. Options explored for doing so included:


  • Unrestricted designated grant funding (which is currently being progressed at the Foundation).
  • Proportionate/minimal reporting requirements for grants under a certain value.
  • Co-ordination with other funders to simplify (or streamline) evaluation requirements.
  • Generation and consolidation of a shared evidence resource/library for the early years (that can be accessed, contributed to and used by others).
  • Removing ‘the middle person’ and hearing evidence directly from families, or seeing practice in situ where ethically appropriate to do so.
  • Investing greater resource in supporting grantees to build learning capacity and the ability to tell their story.
  • Focusing on key moments throughout the Cattanach journey to turn learning into an everyday activity (rather than a one-off reflection at the end of the year or at a midway point).
  • Development and refinement of their philanthropic Theory of Change that will be closely aligned to the evaluation strategy.
  • Peer review of evidence and approaches amongst delivery organisations, rather than funder review – appropriately recompensed.

Written by Kate Tobin, Dartington Service Design Lab, Scotland Director