Finding “hidden treasures” in public service evaluation

The Social Services and Well-being (Wales) Act 2014 and the Well-being of Future Generations (Wales) Act 2015 triggered some thoughtful dialogues among providers of public services about how to meaningfully measure change. In this blog, Leanne Teichner (a Social Researcher at Data Cymru) discusses the challenges faced by a developing Welsh Evaluation Community of Practice.

The background

Both Acts emphasise the need for public services to evaluate the difference they are making to Welsh communities. Data Cymru supported work in this area by holding a National Intelligence Event, ‘Are we making a difference? Understanding our impact on well-being’, in late 2018 and subsequently launching its evaluation guide in 2019.

Data Cymru and partners - the Wales Council for Voluntary Action (WCVA Inspiring Impact programme), the Co-production Network for Wales and Swansea University - want to further understand the current challenges, tensions and opportunities that accompany increased evaluation effort and knowledge across the Welsh public sector. We therefore held a “Community of Enquiry” in early 2020 - a dialogue between the voluntary and public sectors that explored how to realistically assess and measure service effectiveness and impact.

Practitioners involved in the delivery and commissioning of Welsh public services were invited to attend a morning workshop facilitated by Swansea University, where they were encouraged to identify the key challenges surrounding evaluation and to think about ways of addressing them. Further information about the Community of Enquiry as a research method is available online.

The key question and the challenges

The Community of Enquiry led attendees (“the community”) to identify one key question:

What does a positive ‘evaluation culture’ look like?

The community wanted to be able to understand how to establish a culture that could achieve the following:

  • Identify and define good evaluation;
  • Give due consideration to the complex landscapes within which public services are delivered; and
  • Promote positive working and understanding among the diverse stakeholders (including frontline services, service users, commissioners and academics) involved in Welsh public service evaluation.

This key question led the community to identify four themed challenges:

Challenge one: Treasures and their recognition

The community discussed how evaluation can be used to recognise the subtle but meaningful interactions and outcomes (“treasures” and “magic moments”) that occur in everyday public service practice: the reminiscing conversations, the acts of support and kindness, the laughter - the everyday and somewhat mundane magic moments that support feelings of lasting well-being and inclusion.

The community compared these magic moments with the routine monitoring that takes place within service delivery - the handheld devices used by care staff to record patient food and medicine intake, bathroom use, etc. The community reflected on the reality that treasures and magic moments aren’t usually captured within standard data collection processes. We agreed that they should be treated as data because they demonstrate the interactions and events that support improvements in people’s well-being - they demonstrate the human mechanisms of change.

We also thought about the public service improvement needs identified through evaluation - for example, the need for more bespoke, patient-focused support, or for more specialised funding. The community referred to these improvement needs as “tragedies”, but recognised their hidden value in supporting more informed service delivery. We therefore called for these tragedies to be reframed as “hidden treasures” because of their learning value.

Challenge two: Mindsets and cultures

The community discussed all the stakeholders involved in Welsh evaluation, namely:

  • The voluntary sector, who often deliver public services and respond to the lessons identified through evaluation and research.
  • Local authorities and public bodies, who are largely responsible for setting evaluation priorities and monitoring intervention progress.
  • Academia, who have an interest in assessing the social impact of frontline services and making policy and service recommendations.

We considered the different evaluation approaches and terminology each stakeholder group uses, and the tensions that arise from this. For example, the community discussed the common tension that results when evaluation stakeholders must decide between predominantly qualitative or quantitative data to evidence change. We agreed that these two types of data are often treated as dichotomous, and that this over-simplifies evaluation evidence and its uses.

The community concluded that tensions like this result in disconnected evaluation worlds, which limits evaluation effectiveness. We agreed that there needed to be a collective and proactive effort to bring different evaluation stakeholders together to share evaluation lessons and practice, and grapple with the different understandings of evaluation.

Challenge three: Evaluation design and methods

The community discussed evaluation methods and the frameworks used to track progress. We talked about the well-being changes that people (service users and staff) tend to remember: change is usually emergent, messy and unplanned, and so is the evidence of it - for example, the special relationship between a reticent service user and a staff member that results in the service user positively engaging with health care advice (a treasure).

These types of change are not easy to capture within the metric-focused frameworks often used to manage and monitor public service performance, because those frameworks rest on priorities and metrics pre-set when services and their evaluations are designed. This has implications for evaluation design: we agreed that public performance and evaluation frameworks need to accommodate the expression of emergent and complex treasures, as well as pre-set priorities (indicators).

Challenge four: Reconciliation

The community discussed whether there was a way of reconciling challenges one, two and three, described above, and asked whether reconciliation was desirable.

Some were concerned that forcing together methods and approaches that are inherently incompatible could be harmful: it could lead to less well understood methods and approaches becoming diluted by more mainstream ones. For example, in mixed-method designs (where masses of evidence are generated from administrative data and questionnaires), would the messages created by a small number of narrative interviews carry the same weight as the larger-scale evidence bases? We questioned whether full reconciliation was even possible in situations like this.

Conversely, the community also recognised the potential to validate and give credit to lesser-known research methods and evidence bases within public service evaluation. For example, narrative and visual art research methods, ‘Most Significant Change’, participant-led research (such as journals and logs), and more.

Because of these outstanding questions, the community didn’t come to a conclusion as to whether reconciliation was a realistic way forward.

Where is the community now, and what is its future focus?

The collective energy, enthusiasm and empathy within the community has triggered the development of a Welsh-based Community of Practice on Evaluation. The community is in its early stages, with interested members drawn from the Welsh voluntary and public sectors, central government, and academia. While it still needs to establish its aims, it is clear that it would like to serve as a sharing and learning forum where evaluation principles, approaches and methods are presented, case-studied and assessed within a Welsh context. Data Cymru, WCVA, Swansea University and the Co-production Network for Wales also hope that the Community of Practice can serve as a mechanism through which to communicate developments in evaluation thinking and practice across the broad Welsh evaluation community.

There are challenges that the community will need to collectively acknowledge and navigate to become a meaningful contributor to Welsh evaluation practice. A huge part of this will be shaped by its ability to become a credible critical mass that can engage and maintain the commitment of influential evaluation stakeholders, including funders and commissioners who shape the direction of Welsh evaluation practice and principles.

The community’s clout will also be determined by its ability to identify concrete ways of making a dent in national evaluation practice and standards. It has already taken a step in this direction by discussing the four key challenges with authors of the ‘Understanding experience and outcomes’ and ‘Using evidence to inform improvements’ elements of the National social care performance framework guidance. The community is sure that more challenges and opportunities will emerge as the Welsh evaluation community becomes braver and bolder, and it is excited to be a part of these changes. Watch this space!

Biography

Leanne is a social researcher at Data Cymru where her role is to spotlight effective public engagement and social research within public service delivery. She is currently specialising in evaluation and journey planning and is managing a number of client projects. Leanne has a background in community development and co-production and has focused her qualitative research skills in voluntary and public service environments. Leanne has an MSc in Research Methods and Education from Cardiff University, and has an academic and work history in equality and diversity. She is also a qualified Adult Educator. Contact: [email protected]

Further reading

This is the third in a series of blogs from Leanne Teichner about the challenges faced by Welsh public services when measuring their impact on well-being. ‘The measuring change conundrum’ Parts 1 and 2 can be found on the SRA website.