Picking up on the CECAN webinar last year on the topic ‘How to evaluate – or commission – an evaluation when everything is messy’, Dione Hills (Tavistock Institute and CECAN Associate) was asked to give a keynote speech last month at the Norwegian Evaluation Conference.
Alex Penn and Pete Barbrook-Johnson (Senior Research Fellows in CECAN) recently delivered a bespoke 1-day course on participatory systems mapping for researchers and policy makers in the new SIPHER project.
CECAN Fellow, Jayne Cox, has published a project report from her funded fellowship project entitled “How does the commissioning process inhibit the uptake of complexity-appropriate evaluation?”.
People attending CECAN training and events often want to hear about evaluations that ‘failed’ because the wrong (i.e. not complexity appropriate) methods or approach were used. Providing such examples is not easy: accounts of evaluations that ‘fail’ are rarely published, and when we have examples from our own practice, confidentiality – or embarrassment – can make it hard to talk about these in public.
Artists, academics and everyday people who care about the health of the planet have been doing the calculus for a long time. There is a new urgency to these calculations now, partly thanks to the School Strikes and Extinction Rebellion, and underpinned by a wave of action-oriented estimates about the state of the global ecosystem and its capacity to carry all life, including a human race projected to reach nearly 10 billion people by 2050 (UN 2017).
We and our colleagues often talk a good game when it comes to complexity. We want to understand nuance and context, and we know things are messier than they first appear. Yet we still struggle to grapple with complex issues and can feel frustrated with our lack of progress, or with others’ seemingly foggy thinking.
The webinar frames the challenge of creating complexity-informed evaluation as a public management challenge. How can public management adopt a more complexity-informed approach? The session will outline an emerging complexity-informed approach to public management: the Human Learning Systems (HLS) approach.
CECAN Webinar: Co-Creation of Innovation: Group Concept Mapping to Value and Engage More Knowledgeable Others in Authoring and Valuing Complex Systems
Participation in defining the elements or issues within a system is critical to describing that system adequately. In many social policy spaces, hearing the voices of those who are affected, and who have lived knowledge of the issues, is foundational. This is equally important when defining the elements of a complex system to anchor an evaluation.
CECAN Webinar: Using the System Effects Methodology to Understand the User Experience of Complex Systems
This webinar will present an overview of the System Effects methodology, an approach developed to understand the user experience of complex systems. System Effects draws on the methodological approaches of soft systems thinking, fuzzy cognitive mapping, and graph theoretical analysis.
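To give a flavour of one of the ingredients mentioned above, fuzzy cognitive mapping represents a system as concepts linked by signed, weighted influences, which can then be iterated to explore how effects propagate. The sketch below is purely illustrative: the concepts, weights and update rule are our own minimal assumptions for demonstration, not the System Effects methodology itself.

```python
import math

# Illustrative fuzzy cognitive map (FCM). Concepts and weights are
# invented for demonstration only; a real map would be elicited from
# system users and stakeholders.
concepts = ["funding", "service_capacity", "waiting_times", "user_wellbeing"]

# weights[i][j] = signed influence of concept i on concept j, in [-1, 1]
weights = [
    [0.0,  0.8,  0.0,  0.0],   # funding boosts service capacity
    [0.0,  0.0, -0.7,  0.3],   # capacity cuts waiting times, aids wellbeing
    [0.0,  0.0,  0.0, -0.6],   # long waits harm wellbeing
    [0.0,  0.0,  0.0,  0.0],   # wellbeing has no outgoing influence here
]

def squash(x):
    """Sigmoid squashing keeps each activation in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def step(state):
    """One synchronous FCM update: each concept sums its weighted inputs."""
    return [
        squash(sum(state[i] * weights[i][j] for i in range(len(state))))
        for j in range(len(state))
    ]

state = [1.0, 0.5, 0.5, 0.5]   # initial activation levels
for _ in range(20):            # iterate toward a fixed point
    state = step(state)

print({c: round(s, 3) for c, s in zip(concepts, state)})
```

Iterating a map like this until it settles lets participants ask "what if?" questions, for example raising the initial activation of one concept and watching where the effects flow.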
We know research impact unfolds in complex and unpredictable ways, so how on earth do we learn from and evaluate it? In this blog, I will take a look at some of the approaches we have been developing and using in CECAN – a research centre set up to tackle the issue of complexity in evaluation. I will explain how you can use these approaches to do a quick and effective evaluation of complex research impacts, helping you understand what works and why.
Public policy questions are often complex. Decisions rely on both evidence and values. Where should these values come from: politicians, public officials, independent experts, or the public?
This session briefly outlines the origins of theory of change, its purpose, and its wide variety of uses. The main message is that theory of change, as a tool, is instrumental in helping evaluators, intervention managers, and wider stakeholders understand how change happens. In turn, this contributes to better targeted and shaped interventions and, ultimately, improved practice.
The field of complexity related research is growing apace, and CECAN has been at the forefront of exploring how research methods from this field can be incorporated into evaluation practice (given sufficient resources, expertise and willingness on the part of evaluation funders!).
Many policies are ‘complex’, that is, they have multiple objectives and multiple actors with possibly conflicting objectives, feedback loops, and policy outcomes that are dependent on the details of the policy history. Evaluating such policies is correspondingly difficult, because often the impact of the policy is unpredictable and the outcomes may be highly context dependent.
CECAN Webinar – Risk Analysis at the Food Standards Agency Post EU-Exit: The Role of Economics and Social Science in Informing Risk Management
On 6th February, Vanna Aldin (Head of Analytics and Chief Economist at the Food Standards Agency) kindly presented a CECAN Webinar on ‘Risk Analysis at the Food Standards Agency post EU-Exit: The role of economics and social science in informing risk management’.
CECAN Ltd is pleased to be working with Defra to develop a Complexity Evaluation Framework that will equip Defra commissioners of evaluation with a checklist of core considerations to ensure that policy evaluations sufficiently consider the implications of complexity theory.
The rise of data science has opened up a number of opportunities for government policy planning and the evaluation of complex policies. Although both data science and evaluation involve using data to better understand a particular issue, the benefits of using data science for policy evaluation have yet to be established.
There has recently been an upsurge of interest in what constitutes ‘Evaluative Thinking’ (ET). One frequently quoted definition of this term (Buckley 2015) refers to ET being “critical thinking applied to contexts of evaluation”. This blog reflects on ways in which ‘evaluative thinking’ and the application of an understanding of complexity to evaluation can be mutually supportive.
CECAN Workshop: New Approaches to the Participatory Steering and Evaluation of Complex Adaptive Systems
Participation by stakeholders and citizens is increasingly recognised as a helpful and, in some circumstances, crucial ingredient in managing complex, adaptive socio-ecological systems. The scale, interconnectedness and ongoing dynamics of these systems require new approaches that bring together stakeholder knowledge and problem ownership with complexity-appropriate tools. Such approaches have much to offer both ex-post evaluation and ex-ante policy appraisal.
CECAN Webinar: Risk Analysis at the Food Standards Agency Post EU-Exit: The Role of Economics and Social Science in Informing Risk Management
The Food Standards Agency, a non-ministerial government department, is currently revising its risk analysis process as part of the preparatory work for EU Exit. The food and feed safety risk analysis process contributes to the FSA’s objective of ensuring that the high standard of food safety and consumer protection we enjoy in the UK is maintained when the UK leaves the EU.
To understand the complex systems we inhabit, surely we must recognise a plurality of perspectives. After all, everyone gets a different view. However, mainstream economists, the dominant social scientists, accept only one perspective on how economies work, drowning out others.
Helen, Dione and Martha are all part of a CECAN team that has been developing guidance for evaluation practitioners and commissioners struggling with complex interventions – and particularly those working in complex policy areas. The guidance was requested by the government team of evaluators who are revising the Magenta Book (cross-government guidance on evaluation), with the idea of having an annex on complexity-appropriate evaluation ready for when it is published.
We are pleased to announce that CECAN Research Fellows, Pete Barbrook-Johnson and Alex Penn, have published a report presenting a systems mapping project carried out by CECAN with BEIS between June 2017 and December 2018.
Wednesday 16th January 2019, 09:30 - 17:30 (registration 09:00), University of Surrey, Guildford, UK
Tutor: Dr. Pete Barbrook-Johnson, UKRI Innovation Fellow and Research Fellow (University of Surrey)
Course Details: This course is part of the CECAN Module for policy...
CECAN Presentation at the ‘Political Evaluation and Evaluation Politics: Questions and Research Lines’ Conference in Berlin
CECAN Co-Investigator Ian Christie presented the programme’s aims, projects and achievements to date in a talk at a conference in Berlin on 17th November on the political dimensions of policy evaluation.
The pluralistic evaluation framework is a new tool for policymakers that has gradually taken shape during the last 12 months of my CECAN fellowship. It is now ready to be presented at a webinar on 20 November, where I will be explaining the rationale in the space of 45 minutes. Here I want to share a little of the journey that it has been on, building on what I wrote here last January and June.
In a democratic system, how can we show that a policy has improved a complex situation in a way that represents good use of public funds? “Improve” and “good” are very general words, calling for pluralistic evaluation.
By Caroline Oliver, Reader in Sociology, Roehampton University and Research Affiliate at the Centre on Migration, Policy & Society, University of Oxford
We were delighted to have Stuart Astill and Simon Henderson host a CECAN webinar on 16th October on ‘Handling subjective views and bringing rigour to contribution analysis: Bayesian Belief Networks and evaluating likelihoods in action’.
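For readers unfamiliar with the technique, a Bayesian Belief Network encodes subjective judgements as conditional probabilities over a causal structure, so that observing an outcome updates belief in its possible causes. The toy calculation below is our own invented illustration of that general idea, not the model presented in the webinar: the structure and all the numbers are assumptions for demonstration.

```python
# Toy Bayesian Belief Network with an invented structure:
# Policy (P) and Context (C) both influence Outcome (O).
# All probabilities are made-up subjective judgements.

p_policy = 0.5            # prior belief that the policy mechanism fired
p_context = 0.3           # prior belief that a favourable context was present

# P(Outcome = true | Policy, Context)
p_outcome = {
    (True, True): 0.9,
    (True, False): 0.6,
    (False, True): 0.4,
    (False, False): 0.1,
}

def posterior_policy_given_outcome():
    """P(Policy = true | Outcome = true), by enumerating the joint."""
    joint = {}
    for pol in (True, False):
        for ctx in (True, False):
            prior = (p_policy if pol else 1 - p_policy) \
                  * (p_context if ctx else 1 - p_context)
            joint[(pol, ctx)] = prior * p_outcome[(pol, ctx)]
    evidence = sum(joint.values())          # P(Outcome = true)
    return (joint[(True, True)] + joint[(True, False)]) / evidence

print(round(posterior_policy_given_outcome(), 3))
```

In contribution analysis terms, observing the outcome raises the evaluator’s degree of belief that the policy contributed, and the network makes that update explicit and auditable rather than leaving it as an unstated judgement.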
CECAN Syllabus CPD Workshop: Evaluation Fundamentals – Choosing Appropriate Methods and Assessing Overall Quality of Evaluations
This course introduces participants to the fundamental aspects of the evaluation process that precede and follow data collection, analysis, and report drafting: those relevant when drafting Invitations to Tender (ITTs) or Terms of Reference (ToRs), when evaluating proposals, and when assessing the quality of intermediate and final evaluation products and reports.