A well-designed theory-based evaluation (TBE) helps unpick the often unclear expectations and assumptions which underlie new policies, programmes and initiatives. It can go well beyond measuring what comes out of an intervention to better understand what works (and what doesn’t), and why, and what may be holding it back from working better. Although not a new approach, its use has been held back by confusion about the different approaches to TBE, and a lack of practitioner knowledge about how to apply these promising methods. The course will provide a practical introduction to TBE principles, setting out different ‘layered’ options and approaches, and practical examples of how TBE can be used in often complex social and community development contexts.
By the end of the course participants will:
- Be aware of the origins of TBE and its relationship to other forms of evaluation practice
- Understand the use of TBE to unpick expectations and assumptions underpinning policy, programmes and initiatives and to set out realistic objectives for building these into evaluation
- Be aware of how to develop workable logic charts (models) and to go beyond this to build a credible, and appropriate theory of change into an evaluation design
- Recognise a range of TBE approaches and methods and how to combine different mixes of quantitative and qualitative information to meet evidence needs
- Appreciate how ‘counterfactual’ evidence can be built into TBE approaches to assess causality of interventions in complex community-based settings.
This is a practical and not theoretical course. It uses a mixture of taught sessions, plenary discussion and Q&A, and practical scenario-based exercises to cover:
- The nature of TBE, its underpinning ideas and principles
- How TBE relates to other forms of evaluation practice, and to using ‘theory of change’ to meet evaluation needs
- An introduction to different schools of thought in TBE, and their uses, and how these relate to different evaluation needs and circumstances
- Using ‘layered’ intervention methods to provide fit for purpose ‘theory frames’ for TBE design including assumptions profiling, logic charts and theory of change
- Use of proportionate approaches to engagement with stakeholders to test and refine theory frames, and provide more realistic expectations of the evaluation
- Proportionate methodological choices for evidence gathering and analysis in TBE for both process and impact evaluation, including mixing quantitative and qualitative approaches
- Practical TBE approaches to assessing attribution within both configurational and generative causal contexts, including contribution analysis and process tracing.
Who will benefit
The course is appropriate to those considering the potential use of TBE for both large and small-scale interventions, and for either formative or summative evaluation needs.
The course is aimed at practitioners and managers of evaluation who wish to better understand how TBE approaches may help in supporting the evaluation and programme review needs of policy teams, other colleagues and external stakeholders. It will be relevant to those with a research or analytical background who are designing and conducting complex evaluations, or procuring, specifying or managing policy and programme evaluations in complex intervention circumstances.
*It is important that participants are already familiar with the principles and practice of systematic evaluation methods, including evidence gathering methods and analysis, or have attended the SRA advanced course on (impact) evaluation. Without this foundation, participants will not get the full benefit of the day.*
Prof David Parsons is a longstanding evaluation specialist, consultant and ‘peer’ trainer who has many years’ experience of advising government departments, agencies, regulators and charitable trusts on evaluation. A practising evaluator, in the last two decades he has led over 50 independent evaluations of policy and programmes, many theory-based, across six government departments, various non-departmental agencies, devolved administrations, the European Commission and others, and is an authority on cross-conceptual approaches to evaluation applied in complex community-based settings. David leads the SRA’s course programme for practitioner evaluation training, and is author of Demystifying Evaluation (Policy Press, 2017) and a number of guidebooks on evaluation design.
This course contributes 6 hours to the MRS CPD programme