
CECAN Syllabus for Evaluation of Complex Policies and Programmes Goes Live!

Jun 4, 2018 | News

Late last year, a new CECAN syllabus was put together as a training module for policy analysts and evaluators. The syllabus outlines ten two-hour sessions, which can be incorporated into a broader postgraduate programme.

This year, we have had two opportunities to try the syllabus out with different audiences. Firstly, Dione Hills and Pete Barbrook-Johnson were invited to present the content in a two-day workshop for senior analysts at the Department for Education in Sheffield. Twenty-two experienced evaluators, researchers and evaluation commissioners attended the workshop, which was delivered in a mix of theory- and practice-based sessions. Tailoring the material to participants with a similar level of experience, all working in the same broad policy arena, provided the opportunity to work on a sector-relevant example throughout the workshop.

The feedback from the workshop was generally enthusiastic, with 90% of participants rating the relevance of the course to their professional work as either 4 or 5 out of 5, and 70% giving a similar score to their ‘ability to plan an evaluation for a complex policy or programme’ following the course. Comments included:

  • All really useful – I found the practical/application aspects easier to get my head around
  • Marrying up complexity with the evaluation approaches – good to break out and be active participants
  • I enjoyed the system mapping exercise as I could now see how I could apply this to my work
  • It helped me to understand some of the thinking behind evaluation design and why certain approaches may be selected
  • It will be good to put some of this into practice and see how it goes – putting it into practice will identify areas I need to understand better

The development of a series of CPD workshops, to be delivered by members of the CECAN team, is providing another opportunity to put the syllabus into practice. The first of these was delivered jointly by Corinna Elsenbroich, Emma Uprichard and Dione Hills, and incorporated material from the first four sessions of the syllabus.

This proved to be a lively and interactive workshop, with eleven participants bringing a wide mix of evaluation and sector experience. Their feedback – both positive and less positive – provides us with a chance to ensure that future courses meet the needs of a similar mix of participants.

This time, 70% rated the course highly in terms of its relevance to their professional or academic work, and 60% gave a similarly high rating to their ability to design an evaluation for a complex intervention. Having a mix of participants on the course was welcomed by some:

  • Very useful to meet with people from different sectors experiencing the same challenges

The mix of presenters was also appreciated:

  • The mixed set-up of different presenters, rather than 1 presenter
  • All the speakers were very dynamic and engaging.

However, this time (perhaps because some participants had less prior evaluation experience) there was a strong call for more practical examples to be provided to illustrate the more theoretical points made.

  • More practical examples of actual evaluations using each of the methods outlined.
  • More case studies. Your course title could possibly include the word “theory”.
  • I would feel like this was even stronger if it had had more ‘real’ examples of “successful” evaluations of complex policies.

Some found the workshop a bit ‘theory heavy’, while others welcomed this element:

  • How complexity theory and measurement theories can enhance my evaluation practice. Especially, helped me to explore more critically what I am going to evaluate and how to measure change.
  • Determining evaluation questions is a critical starting point. Questions before methods. Need to know purpose of evaluation and also the audience who’s interested.
  • Terminology for describing complexity and viewing complexity that can be applied to evaluation.

Both workshops are also providing an opportunity to explore the best way to give analysts and policy makers the knowledge and skills they need to plan and commission complexity-appropriate evaluations. Several weeks after the workshops, participants are being sent a questionnaire asking for further information on their views of the workshop and how they are seeking to apply what they learned. This work is ongoing, but a few responses from participants in the DfE workshop have been received. These give useful insights into both the benefits government analysts can gain from attending a workshop of this kind, and the challenges they face in trying to apply what they have learned in the current policy and evaluation climate.

Generally speaking, the material presented at a workshop of this kind is seen as providing a useful ‘starting point’ for further exploration. Some of the methods presented (theory of change and logic mapping) can be used straight away, while others, like Qualitative Comparative Analysis, will require more study before being used. Although most participants indicated that they now have the confidence to talk to colleagues about complex systems and complexity-appropriate evaluation methods, getting these methods actually used remains a challenge, particularly in a policy environment with a fondness for ‘hard’ impact evaluations and the kind of clear-cut results provided by RCTs. Analysts attending the course were, however, already planning to link up to support each other in the ongoing use of, and learning around, complexity-appropriate evaluation.

For more information about the syllabus for the evaluation of complex policies and programmes and associated CPD courses, see the CECAN website. The next CPD session is on June 19th, focusing on Qualitative Comparative Analysis.
