
How to carry out a complex evaluation – Resource guide

This guide offers key resources for evaluating complex systems, including toolkits, frameworks, and methods for navigating uncertainty and interdependencies. It provides practical guidance on participatory systems mapping, selecting appropriate evaluation approaches, and managing complexity in policy contexts, helping policymakers, evaluators, and researchers assess complex policy landscapes effectively.

This resource was developed with the assistance of ChatGPT, an AI language model, and was refined and summarised to enhance clarity and conciseness. Please note that, while care has been taken in its preparation, occasional errors or inaccuracies may still be present.

 

1. The Magenta Book 2020 Supplementary Guide: Handling Complexity in Policy Evaluation:

HM Treasury (2020). Magenta Book supplementary guide: Handling complexity in policy evaluation. Available at: https://www.gov.uk/government/publications/the-magenta-book

  • This guide supplements HM Treasury’s Magenta Book. It explains what complexity thinking is, what the features of complex systems are, and how new methodologies and tools can equip policymakers to work with unavoidable complexity.

 

2. The Complexity Evaluation Framework (CEF):

Bicket, M., Hejnowicz, A.P., Rowe, F., Hills, D., Wilkinson, H., Penn, A., Shaw, B., & Gilbert, N. (2020). Complexity Evaluation Framework. Defra, London. Available at: https://sciencesearch.defra.gov.uk/ProjectDetails?ProjectId=20401

  • Commissioned by Defra, the Complexity Evaluation Framework is a framework of key considerations to guide the scoping, commissioning, management and delivery of complexity-appropriate evaluations. This framework is written for commissioners and managers of evaluations. Considerations for managing complexity-appropriate evaluations are embedded throughout each of the three core chapters of the framework: understanding, designing and embedding.

 

3. Toolkits and Methods Guides:

Barbrook-Johnson, P., & Penn, A. S. (2022). The Participatory Systems Mapping Toolkit. CECAN. Retrieved from https://www.cecan.ac.uk/resources/toolkits/ (PSM Toolkit)

  • The Participatory Systems Mapping Toolkit by Barbrook-Johnson and Penn (2022) offers practical guidance for collaboratively creating causal maps with stakeholders to understand complex systems. It aids in policy design and evaluation by visualising interdependencies, fostering shared understanding, and identifying actionable insights through participatory methods. 

 

Penn, A. S., & Barbrook-Johnson, P. (2022). How to design a participatory systems mapping process [CECAN report]. CECAN. Retrieved from https://www.cecan.ac.uk/resources/participatory-systems-mapping-resource-guide/

  • Penn and Barbrook-Johnson (2022) provide a guide on designing participatory systems mapping processes. The report outlines steps for engaging stakeholders in mapping complex systems, from defining the system and goals to selecting appropriate methods. It emphasises the importance of collaboration, inclusivity, and iterative processes to ensure meaningful participation and accurate system representations. This resource is aimed at facilitators and decision-makers working with systems thinking.

 

Bicket, M., Castellani, B., Elsenbroich, C., Gilbert, N., Hills, D., Gurney, M., Rowe, F., & Wilkinson, H. (2021). The Complexity Evaluation Toolkit (Version 1.0) [Toolkit]. CECAN. Retrieved from https://www.cecan.ac.uk/wp-content/uploads/2021/07/Toolkit-2021-web.pdf

  • The Complexity Evaluation Toolkit (Version 1.0), developed by CECAN, assists evaluators in navigating complex systems and addressing unpredictability in policy outcomes. It provides guidance on commissioning, designing, and managing evaluations through real-world case studies. The toolkit is structured into five chapters, covering an introduction to complexity in evaluation; commissioning, designing, and managing evaluations; and achieving impact. It aims to enhance decision-making across interconnected domains like food, water, energy, and the environment.

 

CECAN. (2021). Participatory System Mapper (PRSM) [Software tool]. CECAN (University of Surrey). Retrieved from https://www.prsm.uk/

  • The Participatory System Mapper (PRSM) is a free, open-source web-based tool developed by CECAN to facilitate collaborative system mapping. It enables groups to co-create visual maps of interconnected factors and causal relationships in real time, supporting online workshops and meetings. PRSM is designed for users with varying levels of experience and is compliant with GDPR standards, offering features like chat, export options, and data analytics.

 

Befani, B. (2020). Choosing Appropriate Evaluation Methods: A tool for assessment & selection (Version 2) [Guidance tool]. CECAN. Retrieved from https://www.cecan.ac.uk/resources/toolkits/ (Methods tool v2)

  • Befani (2020) presents a practical tool for selecting suitable evaluation methods based on a project’s complexity, purpose, and context. The tool guides users through a structured decision-making process, helping align evaluation design with specific needs, particularly in complex policy environments. It supports both new and experienced evaluators in identifying methods that are methodologically sound and contextually appropriate.

 

Befani, B. (2016). Bayes Formula Confidence Updater [Spreadsheet tool]. CECAN. Retrieved from https://www.cecan.ac.uk/resources/toolkits/ (Bayesian updater spreadsheet)

  • The Bayes Formula Confidence Updater by Barbara Befani (2016) is a spreadsheet tool designed to assist evaluators in applying Bayesian updating to assess the strength of evidence supporting causal contribution claims. By inputting prior confidence levels and the diagnostic value of new evidence, users can calculate updated confidence levels, enhancing transparency and rigor in theory-based evaluations.
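The arithmetic behind the spreadsheet is Bayes' formula. As an illustration only (the variable names below are ours, not the tool's), updating confidence in a contribution claim after observing a piece of evidence might look like this:

```python
def update_confidence(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' formula: posterior confidence in a contribution claim.

    prior               -- confidence in the claim before seeing the evidence
    p_evidence_if_true  -- probability of observing the evidence if the claim holds
    p_evidence_if_false -- probability of observing it if the claim does not hold
    """
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1.0 - prior)
    return numerator / denominator

# Diagnostic evidence (likely if the claim is true, unlikely otherwise)
# raises a 50% prior to roughly 89%.
posterior = update_confidence(prior=0.5,
                              p_evidence_if_true=0.8,
                              p_evidence_if_false=0.1)
print(round(posterior, 2))
```

Evidence that is much more likely under the claim than without it moves the posterior sharply; evidence that is equally likely either way leaves confidence unchanged.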

 

Mayne, J. (2008). Contribution analysis: An approach to exploring cause and effect. ILAC Brief 16. Institutional Learning and Change (ILAC) Initiative. Retrieved from https://hdl.handle.net/10568/70124 

  • Questions of cause and effect are critical to assessing the performance of programmes and projects. When it is not practical to design an experiment to assess performance, contribution analysis can provide credible assessments of cause and effect. Verifying the theory of change that the programme is based on, and paying attention to other factors that may influence the outcomes, provides reasonable evidence about the contribution being made by the programme. This briefing note provides a good, digestible introduction to the approach.

 

4. Books:
Barbrook-Johnson, P., & Penn, A. (2022). Systems Mapping: How to build and use causal models of systems. Palgrave. Available open access at https://doi.org/10.1007/978-3-031-01919-7  

  • Provides a practical and in-depth discussion of causal systems mapping methods.
  • Provides guidance on running systems mapping workshops and using different types of data and evidence.
  • Orientates readers to the systems mapping landscape and explores how we can compare, choose, and combine methods.
  • This book is open access, which means that you have free and unlimited access.

 

5. Journal Articles:
Breeze, P. R., Squires, H., Ennis, K., Meier, P., Hayes, K., Lomax, N., Shiell, A., Kee, F., de Vocht, F., O’Flaherty, M., Gilbert, N., Purshouse, R., Robinson, S., Dodd, P. J., Strong, M., Paisley, S., Smith, R., Briggs, A., Shahab, L., … Brennan, A. (2023). Guidance on the use of complex systems models for economic evaluations of public health interventions. Health Economics, 32(7), 1603–1625. Available at: https://doi.org/10.1002/hec.4681

  • This paper proposes a definition for identifying and characterising complex systems models for economic evaluations, and offers guidance on key aspects of the modelling process for health economic analysis.

 

Barbrook-Johnson, P., Castellani, B., Hills, D., Penn, A., & Gilbert, N. (2021). Policy evaluation for a complex world: Practical methods and reflections from the UK Centre for the Evaluation of Complexity across the Nexus. Evaluation, 27(1), 4-17. Available at: https://doi.org/10.1177/1356389020976491 

  • This introductory essay outlines the aims, contents, and contribution of this special issue of Evaluation. The issue, written by members of CECAN and some of its collaborators, aims to bring together a set of ideas and methods for handling complexity in evaluation based on extensive practical experience in addressing these issues, from a research perspective, in UK central government evaluation.

 

Barbrook-Johnson, P., & Penn, A. (2021). Participatory systems mapping for complex energy policy evaluation. Evaluation, 27(1), 57-79. Available at: https://doi.org/10.1177/1356389020976153

  • Presents the use of PSM in two real-world energy policy evaluation contexts and considers how this method can be applied more widely in evaluation.

 

Badham, J., Barbrook-Johnson, P., Caiado, C., & Castellani, B. (2021). Justified stories with agent-based modelling for local COVID-19 planning. Journal of Artificial Societies and Social Simulation, 24(1), 8. Retrieved from http://jasss.soc.surrey.ac.uk/24/1/8.html

  • Badham et al. (2021) explore using agent-based modelling (ABM) combined with “justified stories” to support local COVID-19 planning. The study highlights how ABM can simulate diverse social behaviours and interactions, helping policymakers assess potential outcomes of different strategies. The authors emphasise the value of integrating local knowledge and narratives into modelling to ensure more relevant, context-sensitive insights for decision-making during the pandemic.

 

Befani, B., Elsenbroich, C., & Badham, J. (2021). Diagnostic evaluation with simulated probabilities. Evaluation, 27(1), 102–115. https://doi.org/10.1177/1356389020980476

  • Befani, Elsenbroich, and Badham (2021) introduce a diagnostic evaluation approach using simulated probabilities to enhance decision-making in complex systems. The paper combines qualitative evaluation with probabilistic simulations to explore uncertainties and dynamic relationships. By applying this method, evaluators can better understand the impact of different interventions, providing more informed insights for policy development in complex, unpredictable environments.

 

Bicket, M., Hills, D., Wilkinson, H., & Penn, A. (2021). Don’t panic: Bringing complexity thinking to UK Government evaluation guidance. Evaluation, 27(1), 18-31. Available at: https://doi.org/10.1177/1356389020980479 

  • This article describes the development process of the Magenta Book supplementary guide, Handling complexity in policy evaluation, focusing on interdisciplinary collaboration and stakeholder input, and summarises the guide’s content and the challenges of communicating complex concepts.

 

Cox, J., & Barbrook-Johnson, P. (2021). How does the commissioning process hinder the uptake of complexity-appropriate evaluation? Evaluation, 27(1), 32–56. https://doi.org/10.1177/1356389020976157

  • Cox and Barbrook-Johnson (2021) explore how the commissioning process limits the adoption of complexity-appropriate evaluation methods. They identify barriers such as rigid, linear frameworks, lack of flexibility in funding, and insufficient understanding of complexity by commissioners. The authors suggest that to improve the uptake of these evaluations, commissioners must adopt more adaptable, systems-oriented approaches that better reflect the dynamic nature of complex social issues.

 

Gates, E. F., Walton, M., Vidueira, P., & McNall, M. (2021). Introducing systems- and complexity-informed evaluation. In E. F. Gates, M. Walton, & P. Vidueira (Eds.), Systems and Complexity-Informed Evaluation: Insights from Practice. New Directions for Evaluation, 13–25. https://doi.org/10.1002/ev.20466 

  • This introductory chapter explores core systems and complexity concepts and their practical application in evaluation. 

 

Wilkinson, H., Hills, D., Penn, A., & Barbrook-Johnson, P. (2021). Building a system-based Theory of Change using Participatory Systems Mapping. Evaluation, 27(1), 80-101. https://doi.org/10.1177/1356389020980493

  • Presents a methodology for building genuinely holistic, complexity-appropriate, system-based Theory of Change diagrams, using PSM as a starting point.

 

Schimpf, C., Barbrook-Johnson, P., & Castellani, B. (2021). Case-based modelling and scenario simulation for ex-post evaluation. Evaluation, 27(1), 116–137. https://doi.org/10.1177/1356389020978490

  • Schimpf, Barbrook-Johnson, and Castellani (2021) explore the use of case-based modelling and scenario simulation for ex-post evaluation. They demonstrate how these methods can help assess past interventions by simulating different scenarios and understanding their outcomes. The paper highlights how scenario simulations provide insights into the causal mechanisms behind observed results, offering evaluators a powerful tool for understanding the effectiveness and impacts of policies or programs after implementation.

 

Schimpf, C., & Castellani, B. (2020). COMPLEX-IT: A case-based modelling and scenario simulation platform for social inquiry. Journal of Open Research Software, 8(1), 25. http://doi.org/10.5334/jors.298

  • Schimpf and Castellani (2020) introduce COMPLEX-IT, a case-based modelling and scenario simulation platform designed for social inquiry. The platform enables researchers to simulate and analyse complex social systems, helping to understand the dynamics and interactions within them. By offering an intuitive interface and flexible modelling capabilities, COMPLEX-IT supports the exploration of various scenarios, aiding in the development of more informed, evidence-based social policies and interventions.

 

Barbrook-Johnson, P., Proctor, A., Giorgi, S., & Phillipson, J. (2020). How do policy evaluators understand complexity? Evaluation, 26(3), 315–332. https://doi.org/10.1177/1356389020930053

  • Barbrook-Johnson et al. (2020) investigate how policy evaluators understand and address complexity in evaluations. Through interviews with evaluators, the study identifies key challenges, including the need for deeper systems thinking, the difficulty of capturing interdependencies, and balancing multiple perspectives. The authors suggest that evaluators should adopt more flexible, adaptive approaches to better navigate the complexities of real-world policy contexts.

 

Befani, B. (2020). Quality of quality: A diagnostic approach to qualitative evaluation. Evaluation, 26(3), 333–349. https://doi.org/10.1177/1356389019898223

  • Befani (2020) proposes a diagnostic approach to qualitative evaluation, focusing on the “quality of quality” in evaluation processes. The paper critiques traditional evaluation frameworks, advocating for a more nuanced approach that considers context, flexibility, and iterative learning. Befani emphasises the importance of balancing rigor with responsiveness to ensure that qualitative evaluations accurately capture complex social phenomena and deliver meaningful insights for decision-making.

 

Castellani, B. (2020). Making the global complexity turn in population health. In Y. Apostolopoulos, K. Hassmiller Lich, & M. Lemke (Eds.), Complex Systems and Population Health: A Primer. Oxford University Press. (Book chapter)

  • Castellani (2020) discusses the global complexity turn in population health, emphasising the importance of systems thinking to understand health outcomes. He argues that traditional approaches are inadequate for addressing the interconnected, dynamic factors influencing population health. By integrating complexity science, Castellani suggests a more holistic perspective that accounts for social, economic, and environmental determinants, promoting more effective, context-sensitive health interventions at global and local levels.

 

Dister, C., Rajaram, R., & Castellani, B. (2020). Modeling social complexity in infrastructures: A case-based approach to improving reliability and resiliency. In E. Mitleton-Kelly, A. Paraskevas, & C. Day (Eds.), Handbook of Research Methods in Complexity Science: Theory & Application. Edward Elgar. (Conference paper/book chapter)

  • Dister, Rajaram, and Castellani (2020) explore modelling social complexity in infrastructures, focusing on improving reliability and resilience. Through case studies, they demonstrate how systems thinking and computational models can capture complex interactions within infrastructures, highlighting the importance of social dynamics in managing risks. The chapter emphasises the need for interdisciplinary approaches to enhance the adaptability and robustness of infrastructure systems in the face of emerging challenges.

 

Kirsop-Taylor, N. A., & Hejnowicz, A. P. (2020). Designing public agencies for 21st century water–energy–food nexus complexity: The case of Natural Resources Wales. Public Policy and Administration, 35(4), 415–435. https://doi.org/10.1177/0952076720921444

  • Kirsop-Taylor and Hejnowicz (2020) explore how public agencies, using the case of Natural Resources Wales, can be designed to address the complexity of the water–energy–food nexus. They argue for adaptive, collaborative governance structures that integrate cross-sectoral perspectives and systems thinking. The paper emphasises the importance of flexibility, resilience, and coordination across agencies to tackle interconnected challenges in resource management in the 21st century.

 

Kirsop-Taylor, N. A., Hejnowicz, A. P., & Scott, K. (2020). Four cultural narratives for managing social-ecological complexity in public natural resource management. Environmental Management, 66(3), 419–434. https://doi.org/10.1007/s00267-020-01334-0

  • Kirsop-Taylor, Hejnowicz, and Scott (2020) identify four cultural narratives that influence the management of social-ecological complexity in public natural resource management. These narratives (technical, ecological, social, and adaptive) shape how stakeholders perceive and approach environmental challenges. The authors argue that integrating these diverse perspectives can lead to more effective, context-sensitive management practices that balance ecological sustainability with social and economic needs.

 

Barbrook-Johnson, P., Schimpf, C., & Castellani, B. (2019). Reflections on the use of complexity-appropriate computational modeling for public policy evaluation in the UK. Journal on Policy and Complex Systems, 5(1), 55–70. Retrieved from https://www.ipsonet.org/journal-of-policy-and-complex-systems

  • Barbrook-Johnson, Schimpf, and Castellani (2019) reflect on using complexity-appropriate computational modelling for public policy evaluation in the UK. They discuss the potential of methods like agent-based modelling to capture dynamic, nonlinear interactions within systems. The authors highlight challenges in applying these models, including data limitations, the need for interdisciplinary collaboration, and the difficulty of integrating complexity into conventional evaluation practices, but also emphasise their value in addressing policy challenges.

 

Pattyn, V., Molenveld, A., & Befani, B. (2019). Qualitative Comparative Analysis as an evaluation tool: Lessons from an application in development cooperation. American Journal of Evaluation, 40(1), 55–74. https://doi.org/10.1177/1098214017710502

  • Pattyn, Molenveld, and Befani (2019) explore the use of Qualitative Comparative Analysis (QCA) as an evaluation tool in development cooperation. They highlight how QCA helps identify patterns and causal relationships in complex, multi-dimensional contexts. The study reflects on challenges and lessons learned from its application, suggesting that QCA is a valuable method for evaluating interventions with diverse and interrelated factors, offering more nuanced insights than traditional approaches.
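At its core, crisp-set QCA builds a truth table that groups cases by their configuration of conditions and checks how consistently each configuration produces the outcome. A minimal sketch with invented data (the condition names and cases below are hypothetical, not drawn from the paper):

```python
from collections import defaultdict

# Hypothetical crisp-set data: each case records presence (1) or absence (0)
# of two conditions and the outcome of interest.
cases = [
    {"funding": 1, "local_support": 1, "success": 1},
    {"funding": 1, "local_support": 0, "success": 0},
    {"funding": 0, "local_support": 1, "success": 0},
    {"funding": 1, "local_support": 1, "success": 1},
    {"funding": 0, "local_support": 0, "success": 0},
]

# Build the truth table: group cases by configuration of conditions and
# count how many cases in each configuration show the outcome.
table = defaultdict(lambda: [0, 0])  # config -> [n_cases, n_with_outcome]
for c in cases:
    key = (c["funding"], c["local_support"])
    table[key][0] += 1
    table[key][1] += c["success"]

# Consistency = share of cases in a configuration that exhibit the outcome.
for config, (n, n_out) in sorted(table.items()):
    print(config, "consistency:", n_out / n)
```

A configuration with consistency 1.0 (here, funding combined with local support) is a candidate sufficient condition for the outcome; dedicated QCA software (e.g. the R QCA package) then minimises such configurations into simpler causal recipes.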

 

Stewart, B. D., Burns, C., Hejnowicz, A. P., Gravey, V., O’Leary, B. C., Hicks, K., … Hartley, S. E. (2019). Making Brexit work for the environment and livelihoods: Delivering a stakeholder-informed vision for agriculture and fisheries. People and Nature, 1(4), 442–456. https://doi.org/10.1002/pan3.10054

  • Stewart et al. (2019) propose a stakeholder-informed vision for agriculture and fisheries post-Brexit, aiming to balance environmental sustainability with livelihoods. They emphasise the importance of inclusive, cross-sectoral dialogue to design policies that address both ecological goals and the economic needs of communities. The paper highlights the need for adaptive governance frameworks that can respond to changing conditions and support long-term sustainability in the agricultural and fisheries sectors.

 

Penn, A. S. (2018). Moving from overwhelming to actionable complexity in population health policy: Can ALife help? Artificial Life, 24(3), 218–219. Retrieved from https://www.researchgate.net/publication/326458928_Moving_from_Overwhelming_to_Actionable_Complexity_in_Population_Health_Policy_Can_ALife_Help

  • Penn (2018) discusses how Artificial Life (ALife) models can help transform overwhelming complexity in population health policy into actionable insights. By simulating complex systems and interactions, ALife can provide a clearer understanding of health dynamics, supporting more effective policy decisions. Penn argues that these models offer valuable tools for navigating the intricacies of health interventions and improving outcomes in dynamic, interconnected environments.

 

Gilbert, N., Ahrweiler, P., Barbrook-Johnson, P., Narasimhan, K.P., & Wilkinson, H. (2018). Computational modelling of public policy: Reflections on practice. Journal of Artificial Societies and Social Simulation, 21(1), 14. Available at: https://jasss.soc.surrey.ac.uk/21/1/14.html

  • This paper reports on the experience of the authors in designing and using computational models of public policy.

 

Hejnowicz, A. P., & Rudd, M. A. (2017). The value landscape in ecosystem services: Value, value, wherefore art thou value? Sustainability, 9(5), 850. https://doi.org/10.3390/su9050850

  • Hejnowicz and Rudd (2017) examine the concept of value in ecosystem services, exploring how different values (e.g., ecological, economic, social) are assigned to ecosystem functions. They highlight the complexity of valuing ecosystem services, discussing challenges like context dependence and value pluralism. The paper calls for more nuanced approaches to understanding and incorporating diverse values into decision-making processes for sustainable ecosystem management.

 

Gates, E. F. (2016). Making sense of the emerging conversation in evaluation about systems thinking and complexity science. Evaluation and Program Planning, 59, 62–73.

  • Explores how systems thinking and complexity science (STCS) are changing evaluation practices for social interventions. It argues that STCS offers new perspectives on key evaluation activities, including problem-solving, intervention framing, methodology, valuing, knowledge production, and utilisation.

 

Schiller, F., Penn, A., & Basson, L. (2014). Analyzing networks in industrial ecology – A review of Social-Material Network Analyses. Journal of Cleaner Production, 76. Available at: http://doi.org/10.1016/j.jclepro.2014.03.029

  • Reviews methodological challenges in integrating natural and social sciences within industrial ecology, proposing network analysis as a promising solution.

 

Powers, S. T., Penn, A. S., & Watson, R. A. (2011). The concurrent evolution of cooperation and the population structures that support it. Evolution, 65(6), 1527–1543. Available at: https://doi.org/10.1111/j.1558-5646.2011.01250.x

  • This study examines the simultaneous development of cooperation and supportive population structures.

 

6. Reports and CECAN Evaluation Policy and Practice Notes (EPPNs) for policy analysts and evaluators:

Davies, J., & Goldie, M. (2023). Navigating system change evaluation. Social Finance. Available at: https://www.socialfinance.org.uk/insights/navigating-system-change-evaluation 

  • This white paper offers insights on evaluating system change, addressing four key themes: the inherent difficulties of such evaluations; how evaluation methods should adapt while retaining core principles; practical examples of system change evaluation; and strategies for tracking progress.

 

CECAN (2023). CECAN Progress Report (March 2016 – February 2023). Centre for the Evaluation of Complexity Across the Nexus, University of Surrey. (Project overview and impact report)

  • The CECAN Progress Report (2016–2023) outlines the Centre’s achievements in applying complexity science to policy evaluation. It highlights key projects, methodologies developed, and collaborative efforts across environmental, social, and economic sectors. The report emphasises the impact of CECAN’s work in advancing systems thinking and improving evaluation practices, showcasing the Centre’s contribution to more adaptive, evidence-based decision-making in complex policy landscapes.

 

Giorgi, S. (2023). How to improve the evaluation of complex systems to better inform policymaking: Learning from evaluating Defra’s Reward and Recognition Fund [Fellowship project report]. Centre for the Evaluation of Complexity Across the Nexus (CECAN). Retrieved from https://www.cecan.ac.uk/projectreports/

  • Giorgi (2023) explores how to enhance the evaluation of complex systems to better inform policymaking, drawing lessons from evaluating Defra’s Reward and Recognition Fund. The report highlights challenges in evaluating multifaceted programs and emphasises the need for systems-based approaches that capture dynamic interactions. Giorgi advocates for integrating complexity-aware methods to provide more comprehensive insights that can guide more effective and adaptive policymaking in complex environments.

 

Gokhale, S., & Walton, M. (2023). Adaptive Evaluation: A Complexity-Based Approach to Systematic Learning for Innovation and Scaling in Development. (No. 428; CID Faculty Working Paper). Center for International Development, Harvard University. Retrieved from https://www.hks.harvard.edu/centers/cid/publications/faculty-working-papers 

  • This paper introduces Adaptive Evaluation, a complexity-based approach for learning and supporting innovation in evolving human systems. It emphasises understanding complex environments and informing action for meaningful change. Adaptive Evaluation builds hypotheses from the field, prioritises learning, embraces diverse techniques, and values dialogue in navigating complex processes. The paper outlines its theoretical basis, core ideology, process, and applications.

 

Astill, S., & Henderson, S. (2022). Outcome Likelihoods and Causal Analysis (OLCA): Structured influence mapping and Bayesian belief networks for evaluating outcome likelihoods (CECAN Evaluation Policy and Practice Note No. 19). Retrieved from https://www.cecan.ac.uk/resources/ (CECAN EPPN No. 19)

  • Astill and Henderson (2022) introduce Outcome Likelihoods and Causal Analysis (OLCA), a method combining structured influence mapping and Bayesian belief networks to evaluate outcome likelihoods. This approach helps assess the probability of different outcomes based on causal relationships, supporting more informed decision-making in evaluations. By visualising and quantifying uncertainty, OLCA aids in understanding complex systems and the potential impact of interventions in a clear, evidence-based manner.

 

CECAN (2021). CECAN Project Report 2021. Centre for the Evaluation of Complexity Across the Nexus, University of Surrey. (Annual project report)

  • The CECAN Project Report 2021 highlights the Centre’s work on evaluating complexity across the nexus of environmental, social, and economic systems. It summarises key activities, including the development of new evaluation methods and tools, case studies, and collaborations. The report reflects on lessons learned, challenges faced, and the impact of applying systems thinking and complexity science to policy evaluation, offering insights for future research and practice in complex policy areas.

 

Barbrook-Johnson, P., Shaw, B., & Penn, A. (2021). Mapping complex policy landscapes: The example of “Mobility as a Service” (CECAN Evaluation Policy and Practice Note No. 18). Retrieved from https://www.cecan.ac.uk/resources/ (CECAN EPPN No. 18)

  • Barbrook-Johnson, Shaw, and Penn (2021) explore the use of systems mapping to evaluate the complex policy landscape of “Mobility as a Service” (MaaS). They demonstrate how mapping interconnected factors, stakeholders, and dynamics can help reveal insights into MaaS’s potential impacts on transport policies. This approach supports more effective decision-making by clarifying the complexities and interactions that influence the success of MaaS in urban mobility systems.

 

Barbrook-Johnson, P. (2020). Participatory Systems Mapping in action – supporting the evaluation of the Renewable Heat Incentive (CECAN Evaluation Policy and Practice Note No. 17). Retrieved from https://www.cecan.ac.uk/resources/ (CECAN EPPN No. 17)

  • Barbrook-Johnson (2020) discusses the use of Participatory Systems Mapping to evaluate the UK’s Renewable Heat Incentive. The approach engages stakeholders in mapping the complex system of interactions surrounding the program, identifying key factors influencing outcomes. By involving diverse perspectives, this method helps uncover deeper insights into the program’s impacts, improving the understanding of its effectiveness and providing more informed recommendations for future energy policies.

 

Bicket, M., Christie, I., Gilbert, N., Penn, A., Hills, D., & Wilkinson, H. (2020). Complexity and what it means for policy design, implementation and evaluation (CECAN Evaluation Policy and Practice Note No. 16). Retrieved from https://www.cecan.ac.uk/resources/eppns/ (EPPN No. 16)

  • Bicket et al. (2020) discuss the implications of complexity for policy design, implementation, and evaluation. They argue that traditional approaches often fail to account for the dynamic, interconnected nature of policy systems. The paper emphasises the need for complexity-aware methods, such as systems thinking and adaptive approaches, to improve policy effectiveness. These methods help navigate uncertainty and interdependencies, ensuring policies are more responsive to evolving challenges and contexts.

 

Cox, J. (2020). How does the commissioning process inhibit the uptake of complexity-appropriate evaluation? [Fellowship project report]. Centre for the Evaluation of Complexity Across the Nexus (CECAN). Retrieved from https://www.cecan.ac.uk/projectreports/

  • Cox (2020) examines how the commissioning process hinders the adoption of complexity-appropriate evaluation methods. The report identifies barriers such as rigid funding structures, a focus on linear outcomes, and a lack of understanding of complexity by commissioners. Cox argues that overcoming these challenges requires more flexible, systems-oriented approaches in commissioning to enable the effective use of complexity-aware evaluation methods and better address complex policy issues.

 

Allen, R., Bicket, M., & Junge, K. (2019). Using complexity and theory of change to transform regulation: A complex theory of change for the Food Standards Agency’s “Regulating Our Future” programme (CECAN Evaluation Policy and Practice Note No. 15). Retrieved from https://www.cecan.ac.uk/eppn-no-15-food-standards-agencys-regulating-our-future-programme/

  • Allen, Bicket, and Junge (2019) explore using complexity and theory of change to transform regulation in the UK Food Standards Agency’s “Regulating Our Future” programme. They develop a complex theory of change that incorporates systems thinking to better understand the interrelated factors influencing regulatory processes. This approach aims to improve the adaptability and effectiveness of the programme, helping the agency respond to emerging challenges and uncertainties in food safety and regulation.

 

Noble, J. (2019). Theory of change in ten steps. New Philanthropy Capital (NPC). Retrieved from https://www.thinknpc.org/resource-hub/ten-steps/ 

  • A ten-step handbook for creating a theory of change, built on many years of developing them for charities and funders. It focuses on the basics and NPC’s core approach.

 

Sedighi, T., & Varga, L. (2019). A Bayesian network for policy evaluation (CECAN Evaluation Policy and Practice Note No. 13). Retrieved from https://www.cecan.ac.uk/wp-content/uploads/2020/08/EPPN-No-13-A-Bayesian-Network-for-Policy-Evaluation.pdf

  • Sedighi and Varga (2019) introduce a Bayesian network for policy evaluation, which uses probabilistic modelling to assess complex policy outcomes. The approach helps evaluate uncertainties and causal relationships by updating beliefs based on new data. Bayesian networks enable evaluators to better understand the likelihood of different policy impacts and improve decision-making by incorporating uncertainty and evidence into the evaluation process, leading to more robust policy recommendations.
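The kind of updating a Bayesian network performs can be illustrated with a toy example. The sketch below (plain Python, with illustrative probabilities that are not drawn from the paper) builds a two-driver network and computes both the marginal probability of a positive policy outcome and the updated belief about one driver once the outcome is observed:

```python
from itertools import product

# A tiny discrete Bayesian network: two independent drivers (Funding,
# Engagement) influence a policy Outcome via a conditional probability
# table. All probabilities here are illustrative assumptions.
P_FUNDING = {True: 0.6, False: 0.4}
P_ENGAGEMENT = {True: 0.7, False: 0.3}
# P(outcome = True | funding, engagement)
P_OUTCOME = {(True, True): 0.9, (True, False): 0.5,
             (False, True): 0.4, (False, False): 0.1}

def p_outcome():
    """Marginal probability of a positive outcome, by enumeration."""
    return sum(P_FUNDING[f] * P_ENGAGEMENT[e] * P_OUTCOME[(f, e)]
               for f, e in product([True, False], repeat=2))

def p_funding_given_outcome():
    """Updated belief that funding was present, given a positive outcome."""
    joint = sum(P_FUNDING[True] * P_ENGAGEMENT[e] * P_OUTCOME[(True, e)]
                for e in [True, False])
    return joint / p_outcome()

print(round(p_outcome(), 3), round(p_funding_given_outcome(), 3))
```

Real evaluations would use many more nodes and elicited or estimated probability tables, typically via dedicated Bayesian network software.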

 

Abercrombie, R., Boswell, K., & Thomasoo, R. (2018). Thinking big: how to use theory of change for systems change. Lankelly Chase Foundation. Retrieved from https://www.thinknpc.org/resource-hub/thinking-big-how-to-use-theory-of-change-for-systems-change/

  • This report identifies five common pitfalls that organisations fall into when using theory of change, and walks through five rules of thumb that will help organisations to use the approach to tackle complex problems. 

 

Barbrook-Johnson, P., & Penn, A. S. (2018). A participatory systems map of the energy trilemma [Report for BEIS]. Centre for the Evaluation of Complexity Across the Nexus (CECAN). (Also issued as CECAN EPPN No. 12)

  • Barbrook-Johnson and Penn (2018) present a participatory systems map of the energy trilemma, addressing the interconnected challenges of energy security, affordability, and sustainability. The report, created for BEIS, uses systems mapping to engage stakeholders and explore the complex relationships between these issues. This approach provides valuable insights into how policies can balance these competing demands and inform more integrated, effective energy strategies.
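One way to work with such a map analytically is to treat it as a signed directed graph. The sketch below uses hypothetical factors and links (not those in the BEIS report) to find everything downstream of a policy lever; because the map contains a feedback loop, the lever itself appears among its own downstream effects:

```python
# A sketch of analysing a participatory systems map as a signed directed
# graph. Factors and causal links here are illustrative assumptions.
edges = {
    "subsidy": [("deployment", +1)],
    "deployment": [("energy security", +1), ("bills", +1)],
    "bills": [("public support", -1)],
    "public support": [("subsidy", +1)],
}

def downstream(start):
    """All factors reachable from `start` along causal links."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for target, _sign in edges.get(node, []):
            if target not in seen:
                seen.add(target)
                stack.append(target)
    return seen

print(sorted(downstream("subsidy")))
```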

 

Proctor, A., & Greaves, J. (2018). Evaluating complexity in context using Qualitative Comparative Analysis: The Environment Agency and waste crime (CECAN Evaluation Policy and Practice Note No. 11). Retrieved from https://www.cecan.ac.uk/resources/eppns/ (EPPN No. 11)

  • Proctor and Greaves (2018) explore using Qualitative Comparative Analysis (QCA) to evaluate complexity in the context of the Environment Agency’s efforts to address waste crime. By comparing different cases and identifying key factors, QCA helps reveal causal patterns and understand the complex dynamics influencing waste crime. This approach allows evaluators to assess the effectiveness of interventions and better inform policy decisions in complex environmental issues.

 

Kwakkel, J. H. (2018). Managing deep uncertainty: Exploratory modelling, adaptive plans and decision support (CECAN Evaluation Policy and Practice Note No. 9). Retrieved from https://www.cecan.ac.uk/resources/eppns/ (EPPN No. 9)

  • Kwakkel (2018) discusses managing deep uncertainty in decision-making through exploratory modelling, adaptive planning, and decision support. The author emphasises the importance of exploring multiple future scenarios and using adaptive plans that can evolve as new information becomes available. This approach helps decision-makers navigate uncertainty, allowing for more flexible, robust, and informed choices in complex, unpredictable environments, particularly in policy and resource management contexts.
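The core idea, evaluating candidate plans across many plausible futures rather than a single forecast, can be sketched in a few lines. The toy model and payoffs below are illustrative assumptions, not Kwakkel’s tooling: futures are sampled, and the policy with the smallest worst-case regret is preferred:

```python
import random

# Exploratory-modelling sketch: sample many plausible futures and
# compare candidate policies by worst-case regret. All numbers are
# illustrative assumptions.
random.seed(0)
futures = [(random.uniform(0.0, 1.0), random.uniform(0.5, 2.0))
           for _ in range(1000)]  # (demand growth, cost multiplier)

def payoff(policy, demand, cost):
    # Hypothetical payoff: 'invest' pays off in high-demand futures,
    # 'wait' avoids cost overruns in expensive ones.
    return 10 * demand - 4 * cost if policy == "invest" else 3 * demand

def max_regret(policy):
    """Worst shortfall versus the best policy in each sampled future."""
    return max(max(payoff(q, d, c) for q in ("invest", "wait"))
               - payoff(policy, d, c)
               for d, c in futures)

robust = min(("invest", "wait"), key=max_regret)
print(robust, round(max_regret(robust), 2))
```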

 

Scott, K. (2017). Maximising impact from evaluations in complex policy areas (CECAN Evaluation Policy and Practice Note No. 8). Retrieved from https://www.cecan.ac.uk/resources/eppns/ (EPPN No. 8)

  • Scott (2017) discusses strategies for maximising the impact of evaluations in complex policy areas, emphasising the need for flexible, adaptive evaluation methods that account for dynamic systems and evolving contexts. By engaging stakeholders and focusing on learning and adaptation, evaluations can provide actionable insights that influence decision-making, ensuring that policies remain effective and relevant in complex and uncertain environments.

 

Darnton, A. (2017). Revaluation: A participative approach to measuring and making change (CECAN Evaluation Policy and Practice Note No. 7). Retrieved from https://www.cecan.ac.uk/resources/eppns/ (EPPN No. 7)

  • Darnton (2017) presents “Revaluation”, a participative approach to measuring and making change. This method actively involves stakeholders in the evaluation process, encouraging reflection and adaptation. By focusing on collaboration and co-creation, Revaluation helps ensure that evaluations capture the complexity of real-world situations and produce actionable insights. This approach supports continuous learning, enhancing the effectiveness of interventions through collective understanding and involvement.

 

Haynes, P. (2017). Dynamic Pattern Synthesis (CECAN Evaluation Policy and Practice Note No. 6). Retrieved from https://www.cecan.ac.uk/wp-content/uploads/2020/08/EPPN-No-06-Dynamic-pattern-synthesis.pdf

  • Haynes (2017) introduces Dynamic Pattern Synthesis as a method for evaluating complex systems. This approach helps identify and analyse patterns of change within dynamic, interconnected systems by synthesising multiple data sources and perspectives. It supports understanding how different factors interact and evolve over time, providing valuable insights for decision-makers to adapt policies and interventions effectively in complex and shifting environments.

 

Sheate, W., & Twigger-Ross, C. (2017). Learning lessons for evaluating complexity across the nexus: A meta-evaluation of projects (CECAN Evaluation Policy and Practice Note No. 5). Retrieved from https://www.cecan.ac.uk/eppn-no-05-learning-lessons/

  • Sheate and Twigger-Ross (2017) conduct a meta-evaluation of projects focused on complexity across the nexus, aiming to draw lessons for future evaluations. They identify key challenges and successes in evaluating interconnected environmental, social, and economic systems. The authors emphasise the importance of adaptive, holistic approaches that consider interdependencies and evolving contexts, offering valuable insights for improving evaluation practices in complex, multi-dimensional policy areas.

 

Uprichard, E., Penn, A. S., Barons, M., Liddon, A., Byrne, D., Befani, B., … Smith, J. Q. (2016). Dependency models (CECAN Evaluation Policy and Practice Note No. 4). Retrieved from https://www.cecan.ac.uk/eppn-no-04-dependency-models/

  • Uprichard et al. (2016) present Dependency Models as a tool for evaluating complex systems, highlighting the relationships and dependencies between different factors. These models help identify how changes in one element of a system might impact others, supporting a deeper understanding of causal mechanisms. By mapping these dependencies, evaluators can better assess how interventions might influence broader system dynamics and improve decision-making in complex policy contexts.

 

Wilkinson, H., Gilbert, N., Varga, L., Elsenbroich, C., Leveson-Gower, H., & Dennis, J. (2016). Agent-Based Modelling for Evaluation (CECAN Evaluation Policy and Practice Note No. 3). Retrieved from https://www.cecan.ac.uk/eppn-no-03-agent-based-modelling-for-evaluation/

  • Wilkinson et al. (2016) introduce Agent-Based Modelling (ABM) as a tool for evaluation. ABM simulates interactions between autonomous agents to model complex systems and assess how different factors influence outcomes. The paper highlights ABM’s potential to capture dynamic behaviours, predict system responses, and explore various scenarios. It offers valuable insights for evaluating policies in complex environments, where traditional methods may fall short in understanding underlying interactions and causal relationships.
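A minimal illustration of the idea, not any model from the paper: agents on a random network adopt a behaviour once enough of their neighbours have, showing how a macro-level pattern emerges from simple local rules:

```python
import random

# A minimal agent-based sketch (illustrative assumptions throughout):
# agents adopt a behaviour once the share of adopters among their
# neighbours reaches a threshold; adoption is absorbing.
random.seed(1)

N_AGENTS, THRESHOLD, STEPS = 50, 0.3, 20
neighbours = {i: random.sample([j for j in range(N_AGENTS) if j != i], 4)
              for i in range(N_AGENTS)}
adopted = {i: (i < 5) for i in range(N_AGENTS)}  # 5 initial adopters

for _ in range(STEPS):
    snapshot = dict(adopted)  # update all agents synchronously
    for i in range(N_AGENTS):
        share = sum(snapshot[j] for j in neighbours[i]) / len(neighbours[i])
        if share >= THRESHOLD:
            adopted[i] = True

print(sum(adopted.values()), "of", N_AGENTS, "agents adopted")
```

Dedicated ABM toolkits such as NetLogo or Mesa add scheduling, spatial structure, and visualisation on top of this basic loop.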

 

Befani, B., Rees, C., Varga, L., & Hills, D. (2016). Testing contribution claims with Bayesian updating (CECAN Evaluation Policy and Practice Note No. 2). Retrieved from https://www.cecan.ac.uk/eppn-no-02-testing-contribution-claims-with-bayesian-updating/

  • Befani, Rees, Varga, and Hills (2016) introduce Bayesian updating as a method for testing contribution claims in evaluation. By using Bayesian probability to update beliefs based on new evidence, this approach helps evaluate the likelihood of causal contributions and improve the robustness of findings. The method provides a systematic way to assess and refine the understanding of how interventions lead to observed outcomes, enhancing evaluative rigor and decision-making.
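The arithmetic behind this updating is compact enough to show directly. The sketch below uses illustrative probabilities (not the authors’ worked example): each piece of evidence is characterised by how likely it is to be observed if the contribution claim is true versus if it is false, and confidence is updated by Bayes’ rule:

```python
# Bayesian updating of a contribution claim, with illustrative numbers.

def update(prior, sensitivity, type1_error):
    """Posterior confidence in the claim after observing one piece of
    evidence: `sensitivity` is the probability of seeing it if the claim
    is true, `type1_error` the probability of seeing it if it is false."""
    return (sensitivity * prior /
            (sensitivity * prior + type1_error * (1 - prior)))

confidence = 0.5                             # prior confidence in the claim
for sens, err in [(0.8, 0.3), (0.6, 0.1)]:   # two observed evidence items
    confidence = update(confidence, sens, err)
print(round(confidence, 3))
```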

 

Byrne, D. (2016). Qualitative Comparative Analysis: A pragmatic method for evaluating intervention (CECAN Evaluation Policy and Practice Note No. 1). Retrieved from https://www.cecan.ac.uk/eppn-no-01-qualitative-comparative-analysis/

  • Byrne (2016) introduces Qualitative Comparative Analysis (QCA) as a pragmatic method for evaluating interventions. QCA helps identify patterns and causal relationships by comparing cases with different outcomes. The approach is particularly useful in complex settings, allowing evaluators to assess multiple factors and configurations that contribute to success or failure. This method supports more nuanced, context-specific insights into the effectiveness of interventions.
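A crisp-set flavour of the method can be sketched with hypothetical cases (the conditions below are illustrative, not taken from Byrne’s note): cases are grouped into truth-table rows by their configuration of conditions, and rows whose cases consistently show the outcome are flagged as candidate sufficient configurations:

```python
from collections import defaultdict

# Crisp-set QCA sketch with illustrative cases.
# Conditions: E = strong enforcement, P = local partnerships.
cases = [
    {"E": 1, "P": 1, "outcome": 1},
    {"E": 1, "P": 1, "outcome": 1},
    {"E": 1, "P": 0, "outcome": 0},
    {"E": 0, "P": 1, "outcome": 1},
    {"E": 0, "P": 0, "outcome": 0},
]

# Group cases into truth-table rows by their (E, P) configuration.
rows = defaultdict(list)
for case in cases:
    rows[(case["E"], case["P"])].append(case["outcome"])

# Keep configurations whose cases all show the outcome (consistency = 1).
sufficient = sorted(cfg for cfg, outs in rows.items() if all(outs))
print(sufficient)
```

Here the partnerships condition appears in both sufficient configurations, hinting that it alone may be sufficient; full QCA would formalise this with Boolean minimisation and consistency/coverage scores.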

 

7. CECAN Syllabus:

CECAN. (2017). Evaluation of Complex Policy and Programs: A CECAN module for future policy analysts and evaluators. Version 1.0. Available at: https://www.cecan.ac.uk/resources/ 

  • This syllabus outlines strategies and methods for evaluating complex policies and programs, examining complexity in social science research and providing a range of methods to understand complex causality, emergence, and feedback loops.

 

8. Videos:

Bicket, M., Hills, D., & Wilkinson, H. (2020). CECAN Webinar: Handling Complexity in Policy Evaluation – Introducing the new Magenta Book 2020 Supplementary Guide. CECAN. Available at: https://youtu.be/O5ksQwiB0lc 

  • Presents the Magenta Book supplementary guide: Handling complexity in policy evaluation and its key features. Includes reflections on creating the guide and its impact on evaluation practices and policy in the UK.

 

9. Conference Presentations:

Bicket, M., Penn, A., & Christie, I. (2020). An Introduction to the Magenta Book Supplementary Guide on Handling Complexity in Policy Evaluation. Available at: https://www.cecan.ac.uk/conferences/

  • Provides an overview of the Magenta Book supplementary guide: Handling complexity in policy evaluation, discussing the challenges complex systems pose for evaluation.

 

Jarvis, A. (2020). Commissioning Complex Evaluations: The Contractors’ Perspective. Available at: https://www.cecan.ac.uk/conferences/ 

  • Discusses the challenges and considerations from the contractors’ viewpoint when commissioning complex evaluations.

 
