Learning Lessons From Past Interventions - Collingwood Environmental Planning
Over the last 10 years (2006-2016), CEP has undertaken an extensive range of evaluations in the natural environment arena, for clients including Defra, the Environment Agency, Natural England, the Research Councils, the Scottish Government, the European Commission, Natural Resources Wales, Cefas and the OECD.
These have all been published as publicly available documents, but the core knowledge and learning from delivering these projects still resides with CEP staff. We have extensive experience of applying Magenta Book principles, among other approaches, and have managed and contributed to these projects over a significant timescale, during which evaluation has risen up the political agenda as part of the increasing emphasis on accountability and value for money.
There is therefore a wealth of institutional memory to be mined from these projects on evaluation approaches and methods: for example, the rationale for the choice of methods used; the challenges of undertaking complex evaluations in these areas, including the objectives of the policies and policy interventions and the objectives set for the evaluations themselves; process versus outcome/impact focused evaluations; lessons learned; and the barriers and enablers to evaluating complex policies and policy interventions.
In addition, many of the evaluations CEP has undertaken have combined formative (process) evaluation with ex post (summative) evaluation of outcomes and impacts, often with the explicit objective of providing learning to the organisations and partners involved in policy implementation and policy interventions.
We have also undertaken specific projects on the counterfactual (a particularly challenging aspect in the field of complex policy); many of our evaluations have made use of logic models and/or theories of change; and we have worked across the UK and the devolved administrations, as well as in European and international contexts.
A major gap in policy making is learning the lessons from past interventions and integrating the findings of evaluations that have already been undertaken. This project, a meta-evaluation of a sample of 23 CEP evaluation projects, was therefore undertaken as an intensive piece of research over 4-5 months (July to November 2016).
The purpose of the meta-evaluation, to learn lessons from past policy evaluation around the NEXUS, fitted well within the 'Scoping needs' part of CECAN, hence the intensive timetable. Its outcomes also offer suggestions on evaluation practice for follow-up by CECAN and by researchers and practitioners.