SRA Annual Conference 2015: Workshop Presentations

This year's SRA annual conference, on 14 December at the British Library, asks: 'What makes for research evidence we can trust?' Research experts from a wide range of sectors and topics will present the 18 workshops, run in two sets of concurrent sessions, summarised below:


*** WORKSHOPS 'A' - 12.05 - 13.05 - choose one:

Innovations in qualitative research

Social research in a sociotechnical world: how can social researchers respond to our biggest challenge?
Jonathan Oldershaw, Madano Partnerships and Dr Adam Cooper, UCL

Tackling our warming climate is arguably the defining challenge of our age. Understanding how consumers and organisations can respond requires social researchers and engineers to find a common language and integrated ‘sociotechnical’ methodologies. Drawing on experience of researching energy efficiency with businesses and consumers, we explore challenges for designing qualitative research where social processes are increasingly integrated with technological processes.

Addressing the problems of speaking and listening in research
Dr Alastair Roy, University of Central Lancashire

This paper explores the potential of walking tour interviews in researching lives, identities and social practices. It draws on findings from a research project conducted with The Men’s Room, Manchester, an arts and social care agency working with vulnerable young men. ‘Mobile methods’ can productively alter relations in interviews, harnessing movement and the relationships between people, space and place.

Unconventional methods

Road to representivity: Addressing quality in social media research
Josh Keith, Ipsos MORI and Carl Miller, Centre for the Analysis of Social Media at Demos

This session explores what ‘quality’ means in the context of social media research. Drawing on a year-long project between Ipsos MORI and CASM, Josh and Carl will review primary research into how attitudes and opinions stated offline compare with those presented on social media, and present a new framework for producing rigorous insight.

What can real-time data offer, and are decision-makers ready for it anyway?
Dr Rachel Harris, Glasgow Centre for Population Health

Capturing people’s lived experience of socioeconomic changes in real time is of significant interest to public health decision-makers. Right Here Right Now (RHRN) is piloting methods of doing just that. This paper outlines how RHRN is capturing data from 200 community researchers, and reflects on the value of repeat real-time data collection and on how we judge the ‘quality’ of findings.

Quantitative methods

Towards device-agnostic survey design: challenges and opportunities
Tim Hanson and Peter Matthews, TNS BMRB

As social studies increasingly become mixed-mode, we need to better understand the ways that people want to complete surveys online, how device choice can affect respondent behaviour, the design challenges to overcome, and the measurement opportunities offered by mobile devices. We present a range of evidence and consider what needs to happen next in the move towards device-agnostic design.

Automated vs. manual methods of coding and analysing free text survey responses
Kathy Seymour, Seymour Research Ltd

Free text responses to open-ended survey questions present the researcher with a real challenge. Should they be individually read and coded, or should we opt for automated methods of identifying the main topics discussed? This workshop presents a comparison of the two methods, aimed at helping researchers make balanced decisions about how to analyse free text data.

Defining/maintaining quality

What is high quality social research?
Ivana La Valle, SRA Trustee; William Solesbury, King’s College London; Teresa Williams, Nuffield Foundation

How do you know what good quality research looks like? How do you make a case for investing time and money in ensuring research is of good quality? How do you demonstrate that research quality matters? In the workshop we will discuss how the SRA guidelines can help you answer these and many other questions relating to research quality. We will also look at how the SRA quality principles can be applied to different methodologies and in a variety of research settings, and how they can be used by both researchers and research users to assess research quality.

"There is no evidence" - Use of evidence and research practice in contested spaces: A case study of gambling
Heather Wardle, Heather Wardle Research Ltd/Gambling & Place Research Hub, Geofutures

This presentation explores how research evidence is used (and, potentially, abused) in contested areas. Drawing on my experience of working in gambling research for the past decade, this presentation will highlight how different actors view research and evidence, and discuss the implications for researchers working in complex stakeholder environments where research findings are deeply scrutinised, contested and challenged. I'll share insight into what worked and what didn't, and encourage delegates to share their own experiences.

Contested evidence

Press releases: Take control of your findings
Amy Sippitt and Phoebe Arnold, Full Fact

Full Fact is the UK's only independent factchecking organisation. Accurate and informed public debate relies as much on the effective communication of research, and on how information is used by those arguing about politics, as on the analysis itself. Full Fact will take you through examples where better communication would have reduced the likelihood of inaccurate reporting.

Food bank use in the UK: When is research evidence good enough?
Jane Perry, Independent social researcher

Emergency Use Only, a mixed-mode research project, aimed to point to practical, measured changes in policy to help reduce the need for food banks in the UK. On publication it was well received by many, but dismissed by the Government. This seminar asks what can be learnt from that experience, and when, if ever, research evidence is good enough.


*** WORKSHOPS 'B' - 16.00 - 17.00 - choose one:

Involving participants in evaluations

Evaluating early intervention services using contribution analysis: lessons learned
Cara Blaisdell, University of Edinburgh, Centre for Research on Families and Relationships

What counts as evidence in public service evaluations, and whose voices are heard? How can complicated relationships be ‘translated’ into evidence for evaluation in a climate driven by quantitative targets? This presentation, which focuses on methodological challenges, considers these questions in the context of an evaluation project carried out with an early intervention service for young mothers under 25. Rather than give answers to these questions, the presentation highlights the ‘messiness’ of the experience and looks at some of the lessons learned.

The challenge of Participatory Action Research with children and young people
Catherine Goodall, Nottingham Trent University and Nottinghamshire County Council

The presentation explores methodological challenges and opportunities in conducting Participatory Action Research in a local authority. Preliminary findings and experiences will be described, and learning for methodological improvements and implementation discussed. This research is part of a Knowledge Transfer Partnership project between a University and a County Council.

Ethical challenges in online research

#WhatAreYouDoingWithMyData: a framework for social media ethics
Steven Ginnis, Ipsos MORI and Jamie Bartlett, Centre for the Analysis of Social Media at Demos

In this presentation, we present recommendations to the industry for ethical best practice in social media analysis, developed through the Wisdom of the Crowd project. These will be presented alongside findings from fresh primary research into people's attitudes towards social media research and sharing.

Researching online with drug-using communities: Ethical debates
Claire Meehan, University of Auckland

Using social media and online forums to contribute to research on sensitive topics involving vulnerable populations carries weighty ethical challenges. Not least among these are issues of consent, ownership and authorship of data, and the need to safeguard researchers too. This presentation will consider how we can advance ethically sound and progressive research.

Research under pressure

Representing service users and frontline staff in value for money reports to Parliament
Erin Mansell and Maria-Christina Eskioglou, National Audit Office (NAO)

At the NAO we face numerous challenges when researching service users or frontline staff, including the logistics of reaching certain groups, the vulnerability of particular populations, and our own time and resource limitations. In the workshop we will draw on our experience to discuss these challenges and how we overcame them, giving examples from our work.

Mastering the challenges of impact evaluation in adaptive programmes – lessons learned from Nigeria
Karolin Krause, Coffey International

The “gold standard” of impact evaluation implies a quantitative comparison of changes in a beneficiary and comparison group between the start and end of a development intervention. In the challenging context of an ever-changing programme, how can we be as rigorous as possible? The presentation provides lessons learned from evaluating a market development programme in Nigeria.

Quality assurance

Creating ‘National Statistics’
Donna Livesey, UKSA; Julie Stroud, Health and Social Care Information Centre; Sally McManus, NatCen Social Research

What needs to happen before the UK Statistics Authority will quality kite-mark survey data as ‘National Statistics’? Taking one survey as a case study, we talk through what’s involved in the process from the perspectives of the assessor (the UKSA), a research commissioner (HSCIC) and a research producer (NatCen Social Research).

Implementing quality assurance in research in developing countries
Sally Gowland, BBC Media Action

When working in challenging and insecure developing country contexts, how can quality and integrity in research be assured to provide robust evidence of the role of media in international development? Assuring Integrity in Measurement (AIM) is an approach which aims to ensure the quality of our research. Using real-life examples, we will share how AIM has improved research quality.

To find out more about the conference and to book a place, please go to the main conference page.

With thanks to: