Phase 5 - Learn

5.1 Evaluations

Your commissioner(s) and even your investor(s) may be undertaking evaluations of their own. It is not usually advisable to rely on these externally-led evaluations, which might focus on the value of the commissioning mechanism itself:

  • Did the SIB framework enable the organisation to better meet specific policy objectives?

  • Did the SIB framework contribute to the project’s impact?

In addition to these, your own evaluation should aim to answer various questions about the delivery of the intervention you are planning. These should include the following:

Evaluation questions

 

Isn’t a SIB its own evaluation?

For many SIBs, a key aim is to encourage innovation, which makes it even more important that they are accompanied by a strong evaluation to assess whether or not that innovation has succeeded. The scope of this may depend on how the SIB was set up. For example:

  • SIBs whose outcomes incorporate benchmarking against national data sets may find the comparison is relatively easy to achieve
  • SIBs whose outcomes orient mainly around individual achievements may have further to go to find a valid control group and establish statistical significance (a minimal sketch of such a test follows this list)
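To make ‘statistical significance’ concrete, here is a minimal, illustrative sketch in Python of a two-proportion z-test comparing the outcome rate of a SIB cohort with that of a comparison group. All figures and function names here are hypothetical: in practice your evaluator would choose the appropriate test and comparison group.

    # Minimal sketch: two-proportion z-test on hypothetical outcome rates.
    from math import sqrt, erf

    def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
        """z statistic and two-sided p-value for the difference between
        two outcome rates (e.g. SIB cohort vs comparison group)."""
        p_a, p_b = successes_a / n_a, successes_b / n_b
        # Pooled rate under the null hypothesis of no difference
        p_pool = (successes_a + successes_b) / (n_a + n_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_a - p_b) / se
        # Two-sided p-value from the standard normal distribution
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
        return z, p_value

    # Illustrative numbers only: 62 of 120 SIB participants achieved the
    # outcome, against 48 of 130 in a comparison group.
    z, p = two_proportion_z_test(62, 120, 48, 130)
    print(f"z = {z:.2f}, p = {p:.3f}")  # p below 0.05 suggests a real difference

The smaller each group, the larger the gap in outcome rates has to be before a test like this can distinguish it from chance, which is why SIBs built around individual achievements often struggle here.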

Regardless, the better your definition of the target population and outcomes and the better your performance management throughout the SIB, the less money and effort you should need to spend on evaluation.

We have listened to delivery staff talk about the evaluations they took part in and have drawn together SIB-specific evaluation lessons from what we heard:

Lessons from past provider evaluations

Providers have said: “People just want to leave and get on with their lives, and not be bothered again.”
Lesson: In designing a SIB evaluation, try to make sure that crucial findings (or indeed outcome payments) don’t rest on getting in touch with participants after they leave the programme. If they do, numbers need to be large enough to accommodate low response rates: if you need 100 completed follow-up contacts and expect a 40% response rate, for example, you will have to approach around 250 leavers.

Providers have said: “There were so many blank spaces in the online forms, you felt a bit like none of it was important.”
Lesson: Each party in a SIB can come with their own data requirements. Try to co-ordinate: argue that it is better to collect a few pieces of information and focus really hard on data quality than to build everything you can think of into your IT system. Forms full of fields that nobody completes put staff and participants off.

Providers have said: “Rubbish in, rubbish out…”
Lesson: Any evaluation will be hamstrung by ‘rubbish in, rubbish out’ if you are guarded in your release of information. Dedicate serious time to onboarding and arming your evaluators: see it as bonus consultancy! Designing the evaluation so that it sits apart from financial incentives will make this easier.

Providers have said: “We had a great little programme, but it didn’t translate well.”
Lesson: A good evaluation should include assessing ‘fidelity’ to a well-documented programme manual. SIBs have been known to stimulate fast growth in small, effective “black box” programmes which didn’t replicate well.

Be aware from the outset that evaluation findings are not always positive: they may show no difference, or even an unintended harmful impact on your beneficiaries and their outcomes.

How you manage a difficult evaluation result tests your resilience as an organisation. Handled well, it lets you show staff, funders and external stakeholders evidence of your flexibility in the relentless pursuit of social impact.

Fresh growth:

If your evaluation throws up something you don’t like, get some moral support from your board and closest colleagues to overcome the bruised pride and emotions – and get on with the job. Improve the programme. Cut out the harmful or wasteful bits. Prune the tree so the fresh growth can blossom.

 – Kathryn Maunders, Ten Thousand Starlings

Masterclass: Using evaluation findings to best effect

Contributed by Bethia McNeill, Centre for Youth Impact

When an evaluation yields an unsought answer, it points either to a management issue or to an interesting opportunity; either is valuable.

For example, say you run an eight-week workshop for 10 young people, but you discover that by the end only four people on average are attending. Either:

  • There has been a logistical problem: perhaps the workshops are not easy for your cohort to get to, or they’re being held too late 

  • Most people drop off after six weeks, but their outcomes aren’t affected. This is an opportunity to cut the duration of the programme

You can triangulate the various metrics – outcomes, participation data, feedback – to work out the cause and the solution.
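As a minimal sketch of that triangulation in Python (the column names and figures are hypothetical; real exports from your case-management system will differ):

    # Minimal sketch: compare outcome rates for completers vs early leavers.
    import pandas as pd

    # Illustrative records: weeks attended (of 8) and whether the agreed
    # outcome was achieved for each participant.
    df = pd.DataFrame({
        "participant_id":   [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
        "weeks_attended":   [8, 8, 7, 8, 6, 6, 5, 6, 4, 6],
        "outcome_achieved": [True, True, True, False, True,
                             True, False, True, True, True],
    })

    # Flag those who stayed (almost) the full course, then compare the
    # outcome rate of completers with that of early leavers.
    df["completed"] = df["weeks_attended"] >= 7
    print(df.groupby("completed")["outcome_achieved"].mean())

If the two rates are similar, that supports shortening the programme; if completers do markedly better, look for the logistical barrier instead.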

It isn’t helpful to undertake or commission an evaluation if there isn’t some kind of space in which to analyse the results and adjust your approach accordingly.

But adjustments have to be made with discipline. Change one thing at a time, make a hypothesis about what difference that will make, and monitor what happens as a result.

 

Further resources

Next: 5.2 After a SIB