The impact of Covid-19 on high quality complex general population surveys  

With the UK Covid-19 lockdown, all face-to-face interviewing has been halted, and this is having major effects on high quality complex general population surveys. In this blog Patten Smith, Director of Research Methods at Ipsos MORI, argues that the net impact of Covid-19 will be to accelerate ongoing methodological developments, but that, despite this, traditional methods (notably face-to-face interviewing) will remain important once the crisis is over.

Traditionally, in the UK, most high quality random probability complex social surveys have been conducted fully or partly using face-to-face interviewers. Only by including this method have researchers been able to obtain reasonably high response rates whilst administering long and complex questionnaires.  All alternative data collection methods (telephone, online, mail, etc.) when used alone deliver lower response rates, and are also subject to other problems (sampling difficulties for telephone surveys, complexity/length limits for mail surveys, and non-coverage problems for web surveys). 

Now that the Covid-19 lockdown has led to the cessation of all face-to-face interviewing, urgent questions arise concerning (i) how to administer complex surveys with high quality random probability samples in the immediate term and (ii) whether the Covid-19 crisis will irreversibly change how we do these surveys in the longer term, or whether, eventually, we will return to the old ways of doing things as if Covid-19 had never happened. 

In the immediate term

First, we need to ask: would it be possible, in principle, to administer a high quality complex random probability survey without using face-to-face interviewing (our only possibility during lockdown)? Unhelpfully, the answer is yes and no, depending on what we mean by ‘high quality’.

The answer is no if we interpret ‘high quality’ as meaning of the highest quality (assessed using conventional metrics like response rate) that is currently achievable for complex surveys. Response rates will be significantly lower in surveys that do not include at least some face-to-face data collection.

However, we can answer yes if we interpret ‘high quality’ as meaning of sufficient quality to address our survey objectives. If the main purpose of our survey is to measure change (in, say, the prevalence of volunteering), rather than to obtain point estimates of how much something is happening (what percentage of adults volunteer each month), we do not necessarily have to obtain data of the highest possible quality.

Trends and point estimates

In practice, a major focus of surveys that are repeated year after year (like the Crime Survey for England and Wales, the Health Survey for England, and the Scottish Household Survey) is often on the measurement of change despite the use of methods designed to maximise the accuracy of point estimates. It is very likely they could measure change equally successfully with lower response rate non-face-to-face methods even if this led to less accurate point estimates.  

This is because, even if a lower response rate non-face-to-face method produced more biased point estimates than the face-to-face equivalent, provided the survey was rigorously and consistently implemented year on year and used random probability sampling, this bias would probably remain roughly constant over time, at least in the short term. Any observed year-on-year change could then reasonably be attributed to real population change because the biases would cancel out.
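The bias-cancellation argument can be illustrated with a minimal simulation sketch (not from the blog; the volunteering rates and the 5-point bias are hypothetical numbers chosen for illustration): a constant non-response bias shifts each year's point estimate, but it drops out of the year-on-year difference.

```python
# Illustrative sketch: why a roughly constant non-response bias cancels when
# a survey is used to measure change rather than levels.
# All rates and the bias value are hypothetical.
import random

random.seed(42)

def biased_estimate(true_rate, bias, n=100_000):
    """Simulate a survey estimate whose achieved sample over-represents one
    group, adding a roughly constant bias to the estimated rate."""
    hits = sum(random.random() < true_rate + bias for _ in range(n))
    return hits / n

true_2019, true_2020 = 0.30, 0.25   # true volunteering rates (hypothetical)
bias = 0.05                         # constant mode-related non-response bias

est_2019 = biased_estimate(true_2019, bias)
est_2020 = biased_estimate(true_2020, bias)

# Each point estimate is off by about 5 percentage points...
print(est_2019, est_2020)
# ...but the estimated change is close to the true change, because the
# constant bias appears in both years and cancels in the difference.
print(est_2020 - est_2019)
```

The sketch leans on the same assumptions the blog states: consistent implementation year on year (here, the same `bias` in both waves) and random probability sampling (here, independent random draws).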

What this means is that, if you want to start a new survey now, and if change is what you are mainly interested in, then you probably do not need to use face-to-face methods. 

Existing face-to-face surveys

Things are trickier, however, if you are already running a face-to-face survey and your fieldwork has been halted mid-stream. If accurate point estimates really are your principal concern, you will have little choice: you will either have to live without collecting data for the lockdown period or accept some loss of accuracy until (hopefully) normal face-to-face fieldwork is reinstated (although, as we discuss later, the new normal may not be the same as the old normal).

If measuring trends is your main aim, you have three options:

  1. Mothball your survey until life returns to normal post-Covid-19, and then return to (new) normal face-to-face data collection
  2. Phase in face-to-face fieldwork using Covid-19-adapted protocols as lockdown relaxes, gradually returning to (new) normal face-to-face data collection as the crisis subsides
  3. Move to non-face-to-face data-collection methods permanently, accepting that this will introduce a discontinuity in your trend measures

Option 1: Mothball your survey

There are two potential problems with the first option, pausing fieldwork and waiting for a return to normal face-to-face data collection:

  • You may end up with a substantial gap of uncertain duration in your trend line (life may not return to normal until a vaccine has been administered to the bulk of the population, which could easily be as late as 2022)
  • It may turn out to be unrealistic to assume that new and old variants of ‘normal face-to-face data collection’ will be identical because public acceptability may decline, and because the UK survey delivery infrastructures may have changed irreversibly during the pandemic (see next section) 

Option 2: A phased return to face-to-face fieldwork

The second option, phasing in fieldwork using Covid-19-adapted interviewing protocols, is hard to discuss in any detail because we cannot know what those protocols would look like until we receive clearer guidance from Government. Currently, we don’t know how lockdown will be phased out, and we don’t know how the public will initially react to interviewers knocking on their doors. Will interviewers be able to enter people’s homes or will they have to stand outside front doors? Will they be required to wear protective equipment? Will they be required to show proof of Covid-19 immunity? Will interviewers still ask questions themselves (and if so, face-to-face or remotely), or will their role be limited to recruiting respondents for web/computer-based data collection? And so on… However, even without knowing these details, two obvious shortcomings attend phasing in fieldwork in this way:

  • The adoption of Covid-19-adapted protocols will probably add substantially to survey costs.
  • Their adoption will almost certainly affect at least some survey estimates. It is likely that response rates will decline and that the characteristics of achieved samples will change in ways that bias some survey measures; furthermore, changed data collection protocols may also affect measurement (how people answer questions). 

If adapted protocols do indeed affect survey estimates, this approach will put survey funders in the unenviable position of having to take account of two discontinuities in their time series: one resulting from the adoption of Covid-19-adapted protocols and one resulting from the return to (new) normal protocols.

Option 3: A permanent move to non-face-to-face methods

Given the problems likely to attend the options just discussed, the third option, moving the survey permanently to non-face-to-face methods, may, for survey funders interested in trends, turn out to be the least-bad option. It removes the uncertainties associated with the first two approaches, eliminates the risk of a double trend discontinuity associated with phasing in fieldwork, and, of course, would be resilient in the face of any Covid-19 resurgence. It will, however, produce its own trend discontinuity, and this may be significant - but at least there will only be one of them!

Longer term impact

As just discussed, there will be no completely satisfactory way of dealing with the sudden loss of face-to-face interviewing capacity, either for survey funders who require the highest response rates and (presumed) accuracy in their point estimates (although, as noted later, a high response rate by itself is not a guarantee of quality), or for survey funders already measuring trends using face-to-face methods. I suggested that for many survey funders wanting to set up a survey from scratch, or currently hit by the loss of face-to-face data collection, a permanent move to a non-face-to-face data collection method may prove relatively attractive.

But which non-face-to-face method? The choice reduces to three – telephone, mail and web-based. 

Telephone surveys

Nowadays telephone is no longer considered viable for high quality random probability surveys, partly because response rates have been declining for many years, and partly because it has now become much harder to draw rigorous samples that remain comparable over time because people’s calling preferences have shifted from landline to mobile.
Mail surveys

Mail surveys seemed slightly old fashioned even when I started working in survey research in the 1980s, and for this reason they are likely to be dismissed from consideration, but this would be a mistake. Mail surveys can cover random samples of addresses easily and relatively cheaply, and can still deliver respectable response rates (certainly higher than those obtained for telephone surveys). For these reasons I remain a fan of the mail survey method despite its appearance of belonging to another era. However, the method does have one major drawback – it can only be used with relatively short and uncomplicated questionnaires (i.e. ones lacking complex routing). Unfortunately, these are not generally the kinds of questionnaire used in the surveys for which face-to-face methods are used or being seriously considered.
Web-based methods

This brings us to web-based methods. Over recent years there has been an accelerating shift from face-to-face / telephone data collection methods to web-led mixed mode methods for quality social surveys. Examples of surveys making such changes include Understanding Society, the Community Life Survey, the Active Lives Survey, the Food and You survey, and the Wellcome Monitor. This shift has happened because:

  • web-based data collection is, in principle, cheaper and faster than the alternatives
  • client budgets have flatlined  
  • over recent decades face-to-face fieldwork has become more costly whilst delivering declining response rates (although still higher than the alternatives) 
  • much of the population has now become comfortable with online activities (and 90% of households are now connected) 
  • the Government Digital Transformation is encouraging online data collection.  

Of course, despite the foregoing considerations, web-led methods deliver substantially lower response rates than face-to-face surveys, but methodological research over the past two decades offers reassurance. The relationship between response rate and non-response bias is extremely weak, and it now appears that for most purposes much lower response rates are acceptable than was previously thought.

Web-based methods have other difficulties too, most notably that, on their own, web surveys deliver more biased samples than mail or face-to-face methods. Those who do not respond to web surveys, either because they are not web-connected or because they are uncomfortable responding online, are typically older, less educated and poorer than web responders. 

This bias will almost certainly decrease over time as the population ages and becomes more comfortable with the internet, but for the moment it requires us always to supplement web data collection with a secondary method specifically targeted at web non-responders. 

Most commonly, to minimise costs, mail questionnaires are used for this, but for surveys like Understanding Society that genuinely require the highest of response rates, face-to-face interviewers follow up web non-responders. The bias problem is therefore solvable (although it has to be recognised that, with a mail follow-up, the mail questionnaire may have to be simpler than the main web questionnaire).

To summarise the immediate pre-Covid-19 situation, funders of high quality random sample social surveys have been increasingly moving from off-line methodologies to web-led ones - the Active Lives survey and the Community Life survey provide good examples. And now the Covid-19 crisis provides a further big push in that direction. I therefore believe that one impact of the Covid-19 crisis will be to accelerate these already existing trends. We will see funders of high quality random probability social surveys moving away from exclusively face-to-face data collection and towards web-led data collection. 

But I most certainly do not believe that this will lead to the death of face-to-face random probability social surveys. As I said above, if a survey funder needs to deliver long and complex questionnaires to a random probability sample with the highest possible response rate, the inclusion of some face-to-face data collection is essential. And surveys with these requirements will continue to exist.

However, although none of us knows exactly what face-to-face fieldwork will look like post-lockdown, it is unlikely that it will look exactly as it did before the crisis. Public attitudes to doorstep visits and data collection protocols may have changed irreversibly, interviewers may prove harder to recruit, the overall demand for face-to-face interviewing may have shrunk to the extent of requiring changes to the UK face-to-face interviewing infrastructure. Face-to-face fieldwork will eventually return to normal, but it will be a new normal.  

In summary, to distil 2,000 words to two bullet points:
  • The Covid-19 crisis will immediately accelerate the pre-existing shift from face-to-face to web-led data collection methods 
  • Although face-to-face methods will change we have good reasons for thinking they will not go away.


AUTHOR BIO: Patten Smith is Director of Research Methods at Ipsos MORI and has worked on high quality social surveys for around 40 years. Patten is a visiting professor at Surrey University and was Chair of the SRA from 2011 to 2017.