In development studies, evaluation teams face the challenge of combining survey data with semi-structured interview data. In this seminar, Wendy explained and demonstrated how these linkages can be made, using concrete data about villages in Bangladesh and India. The risk is that such complex data can overwhelm the interpreting team; the solutions Wendy offered are methodologically sound.
Professionals in evaluation pay a lot of attention to how to evaluate. We are less thoughtful about exactly what to evaluate, why, and how we define 'success'. Stakeholders' interests and questions tend to determine what is deemed useful to focus on at a particular point in time. Alternatively, we use a predetermined set of criteria, usually the so-called DAC criteria (relevance, efficiency, effectiveness, impact and sustainability), to direct and focus evaluations. Yet this widely accepted practice can prompt us to neglect important issues that should be on our agenda if we are serious about the role of evaluation in supporting sustainable development, and in understanding whether we are on track to achieve the SDGs. Failure to focus our evaluations appropriately can lull us into dangerous complacency about accomplishment and success.
Deadline extended to 14th February 2018.
CECAN has published a new paper, 'Computational Modelling of Public Policy: Reflections on Practice.'
Authors: Nigel Gilbert, Petra Ahrweiler, Pete Barbrook-Johnson, Kavin Preethi Narasimhan and Helen Wilkinson.
CECAN was represented at the recent Social Innovation Conference in Lisbon by Professor Liz Varga of Cranfield University.