Complexity and Evaluation Failure

People attending CECAN training and events often want to hear about evaluations that ‘failed’ because the wrong (i.e. not complexity-appropriate) methods or approach were used. Providing such examples is not easy: accounts of evaluations that ‘fail’ are rarely published, and when we have examples from our own practice, confidentiality – or embarrassment – can make it hard to talk about them in public.

Value, Values and Valuing Nature

Artists, academics, and everyday people who care about the health of the planet have been doing the calculus for a long time. There is a new urgency to these calculations now, driven partly by the School Strikes and Extinction Rebellion, and underpinned by a wave of action-oriented estimates of the state of the global ecosystem and its capacity to carry all life, including a human population projected to reach nearly 10 billion by 2050 (UN 2017).

Slipping into Simplicity

We and our colleagues often talk a good game when it comes to complexity. We want to understand nuance and context, and we know things are messier than they first appear. Yet we still struggle to grapple with complex issues and can feel frustrated with our lack of progress, or with others’ seemingly foggy thinking.

CECAN Webinar: The Human, Learning, Systems Approach to Managing in Complexity

The webinar frames the challenge of creating complexity-informed evaluation as a public management challenge: how can public management adopt a more complexity-informed approach? The session will outline an emerging complexity-informed approach to public management: the Human, Learning, Systems (HLS) approach.

CECAN Webinar: Co-Creation of Innovation: Group Concept Mapping to Value and Engage More Knowledgeable Others in Authoring and Valuing Complex Systems

Participation in identifying the elements or issues within a system is a critical requirement for defining that system adequately. In many social policy spaces, hearing the voices of those who are affected, and who have lived knowledge of the issues, is foundational. It is equally important in defining the elements of a complex system to anchor an evaluation.

How to Evaluate Complex Research Impact

We know research impact unfolds in complex and unpredictable ways, so how on earth do we learn from and evaluate it? In this blog, I will take a look at some of the approaches we have been developing and using in CECAN – a research centre set up to tackle the issue of complexity in evaluation. I will explain how you can use these approaches to do a quick and effective evaluation of complex research impacts, helping you understand what works and why. 

CECAN Webinar: Theory of Change – Getting the most out of it – Insights from a Practitioner

This session briefly outlines the origins of theory of change, its purpose, and its wide variety of uses. The main message is that theory of change, as a tool, is instrumental in helping evaluators, intervention managers, and wider stakeholders understand how change happens. In turn, this contributes to better targeted and shaped interventions, and ultimately to improved practice.

Evaluative Thinking in Complexity Research

The field of complexity-related research is growing apace, and CECAN has been at the forefront of exploring how research methods from this field can be incorporated into evaluation practice (given sufficient resources, expertise and willingness on the part of evaluation funders!).

CECAN Workshop: Commissioning Complex Evaluations

Many policies are ‘complex’: they have multiple objectives, multiple actors with possibly conflicting goals, feedback loops, and outcomes that depend on the details of the policy’s history. Evaluating such policies is correspondingly difficult, because the impact of a policy is often unpredictable and its outcomes may be highly context-dependent.

Complexity Evaluation Framework

CECAN Ltd is pleased to be working with Defra to develop a Complexity Evaluation Framework that will equip Defra commissioners of evaluation with a checklist of core considerations to ensure that policy evaluations sufficiently consider the implications of complexity theory.

CECAN Workshop: What Good Data Could Do for Evaluation

The rise of data science has opened up a number of opportunities for government policy planning and complex evaluation. Although both data science and evaluation involve using data to better understand a particular issue, the benefit of using data science for policy evaluation has yet to be established.

Complexity and Evaluative Thinking

There has recently been an upsurge of interest in what constitutes ‘Evaluative Thinking’ (ET). One frequently quoted definition (Buckley 2015) describes ET as “critical thinking applied to contexts of evaluation”. This blog reflects on ways in which evaluative thinking and the application of an understanding of complexity to evaluation can be mutually supportive.

CECAN Workshop: New Approaches to the Participatory Steering and Evaluation of Complex Adaptive Systems

Participation by stakeholders and citizens is increasingly recognised as a helpful and, in some circumstances, crucial ingredient in managing complex, adaptive socio-ecological systems. The scale, interconnectedness and ongoing dynamics of these systems require new approaches that bring together stakeholder knowledge and problem ownership with new complexity-appropriate tools. Such approaches have much to offer both ex-post evaluation and ex-ante policy appraisal.

CECAN Webinar: Risk Analysis at the Food Standards Agency Post EU-Exit: The Role of Economics and Social Science in Informing Risk Management

The Food Standards Agency, a non-ministerial government department, is currently revising its risk analysis process as part of the preparatory work for EU Exit. The food and feed safety risk analysis process contributes to the FSA objective to ensure that the high standard of food safety and consumer protection we enjoy in the UK is maintained when the UK leaves the EU.

What has Economic Pluralism Got to Do With Understanding Complexity?

To understand the complex systems we inhabit, surely we must recognise a plurality of perspectives. After all, everyone gets a different view. However, mainstream economists, the dominant social scientists, accept only one perspective on how economies work, drowning out others.

CECAN Webinar: How to Evaluate – or Commission an Evaluation – When Everything is Messy

Helen, Dione and Martha are all part of a CECAN team that has been developing guidance for evaluation practitioners and commissioners struggling with complex interventions, particularly those working in complex policy areas. The guidance was requested by the government team of evaluators who are revising the Magenta Book (cross-government guidance on evaluation), with the idea of having an annex on complex evaluation ready for when it is published.

CECAN Case Study Report: BEIS Energy Trilemma

We are pleased to announce that CECAN Research Fellows, Pete Barbrook-Johnson and Alex Penn, have published a report presenting a systems mapping project carried out by CECAN with BEIS between June 2017 and December 2018.

The Story of the Pluralistic Evaluation Framework

The pluralistic evaluation framework is a new tool for policymakers that has gradually taken shape during the last 12 months of my CECAN fellowship. It is now ready to be presented at a webinar on 20 November, where I will be explaining the rationale in the space of 45 minutes. Here I want to share a little of the journey that it has been on, building on what I wrote here last January and June.

CECAN Syllabus CPD Workshop: Evaluation Fundamentals – Choosing Appropriate Methods and Assessing Overall Quality of Evaluations

This course introduces participants to the fundamental aspects of the evaluation process that precede and follow data collection, analysis, and report drafting, and that are relevant when drafting Invitations to Tender (ITTs) or Terms of Reference (ToRs), evaluating proposals, and assessing the quality of intermediate and final evaluation products/reports.