1. Summary

This article presents high-level findings from several pilot tests conducted over the last two years that were designed to establish the impact of transforming the Opinions and Lifestyle Survey (OPN). The results of the testing suggest that the transformed survey, which will be mixed-mode (online first with telephone follow-up), will continue to meet user needs by producing high-quality statistics.

Specifically, the findings from the pilot tests demonstrated that there were minimal differences in the estimates produced from the alternative designs relative to the long-standing OPN design (a random probability-based sample with face-to-face data collection). A small number of significant differences were observed; however, these are likely due to improvements made to the design of certain questions. The impact of the new design on these estimates will continue to be monitored.

Offering the option of online completion will make taking part in the survey more flexible and convenient for respondents, and the new questionnaire design will more accurately capture the data required.

None of the estimates produced in any of the tests detailed in this report are official statistics, and they should not be used in place of any official data output from the OPN survey by the Office for National Statistics (ONS) or any stakeholder departments.

2. Background

The Census and Data Collection Transformation Programme (CDCTP) aims to rebalance the Office for National Statistics (ONS) data collection activity significantly towards wider, more integrated use of administrative and other non-survey sources, thereby reducing our reliance on large population and business surveys. While this will not eliminate the need for surveys, it does mean that the ONS’s traditional approach to surveys is likely to change. The CDCTP is an important enabler of the UK Statistics Authority’s Better Statistics, Better Decisions strategy; the main features of the programme include:

  1. changing existing processes so that survey data are predominantly collected using online methods rather than through face-to-face interviews

  2. using administrative data alongside survey data in an integrated manner to reduce the size of the residual survey samples and the number of variables that require a survey response

  3. rationalising our portfolio of social surveys into an integrated framework

Within the CDCTP, the Social Survey Transformation Division is responsible for transforming the statistical design of the household survey portfolio, and all current household surveys, including the OPN, are within the scope of the programme.

3. Opinions and Lifestyle Survey (OPN) transformation

The transformation of the Opinions and Lifestyle Survey (OPN) has been taken forward in several stages and some changes have been implemented already. Details of changes made in 2018 have already been published (see the OPN methodology guide).

A comprehensive programme of research and testing has been conducted to inform the transformation of the survey, and this article provides an overview of this work and the main findings. This work includes several quantitative tests, which have considered the impact of mode change, redesigned questions and an alternative approach to sampling for the survey. The research aims to provide customers and stakeholders of the OPN with assurance that the quality of data collected on the survey has not been and will not be undermined by the changes that are being made.

The first stage of the transformation of the OPN saw the survey move from being a face-to-face interview to one that is based solely in the telephone unit. At the same time, the approach to sampling for the survey also changed and the survey is now integrated into the last wave of the Labour Force Survey (LFS) and LFS boost as a follow-up survey. This approach has been used by other surveys that the ONS has conducted in the past, including the European Health Interview Survey and the Adult Education Survey.

In advance of making these changes, two pilot tests were conducted to establish the impact of this alternative methodology, and the main findings from these tests are presented in this article. These changes were implemented in April 2018 and, since then, response to the survey has improved and the total number of achieved interviews has risen. Details of the change to OPN sampling can be found in the Appendix.

The second stage of the transformation of the survey aims to deliver a fully mixed-mode OPN (online first with telephone follow-up of online non-responders) in November 2019. To establish the feasibility of taking a sequential approach and to understand the potential impact on data quality, a further pilot test of the mixed-mode OPN was conducted in February and March 2018. Details of this test are also outlined in this article.

It should be noted that, given the nature of the survey (a multi-purpose Omnibus survey), the tests conducted were based on relatively small sample sizes. Any cross-analysis should therefore be treated with caution because of the low numbers and the lack of statistical power to detect significant differences. Collection of further data after moving to mixed-mode collection may highlight differences not detected in the pilot tests. Monitoring of the data for mode effects will continue when the mixed-mode survey is operational, and the OPN research team will work with stakeholders to help understand and address any concerns.

4. Pilot Test 1: April and May 2017

The first pilot test conducted as part of the transformation of the Opinions and Lifestyle Survey (OPN) aimed to establish the impact of changing the OPN collection mode from face-to-face to telephone interviewing. This test was conducted in April and May 2017, when an issued sample of 2,010 individuals per month was approached for interview by ONS telephone interviewers. The pilot test included socio-demographic questions, as well as modules on smoking and internet access; these matched those carried on the face-to-face survey in the same months.

As well as changing mode of collection, the sample for the pilot test was drawn from the last wave of the Labour Force Survey (LFS) and LFS boost. Using this approach ensured that it was possible to pre-select the required individual for interview in the telephone unit (the face-to-face survey used a Kish grid approach).

Main findings

  • Average response to Pilot Test 1 was 50% compared with 48% in the face-to-face survey over the same two months (for more information, please see the Appendix).

  • There was a significant difference between the two modes in the sex distribution of respondents: on the face-to-face interview, 55% of the sample were female and 45% male, while on the telephone interview the distribution was more even, at 52% female and 48% male.

  • The proportion of achieved response by age group was broadly similar on the telephone survey compared with the face-to-face OPN, with two exceptions: a smaller proportion of 25- to 34-year-olds responded on the telephone (10%) compared with the face-to-face interview (15%); and a greater proportion of respondents aged 60 years and over responded on the telephone (44%) compared with the face-to-face interview (37%).

  • No significant differences were observed between the telephone and the face-to-face surveys in terms of response by ethnicity, housing tenure, access to a vehicle, and employment status.

  • No significant differences were observed between the telephone and face-to-face surveys in terms of smoking status or overall e-cigarette usage.

  • No significant differences were found between the telephone and face-to-face surveys for the main internet access variables.

All variables and demographic breakdowns used in the significance testing can be found in the results table.

Although the achieved age distributions showed statistically significant differences, weighting the dataset to population estimates helps to remove any potential bias that could result from this. In addition, while the size of this pilot test was relatively small, there was no evidence of mode effects on the main smoking and internet access estimates.

5. Pilot Test 2: October and November 2017

Following on from the first pilot test, work was progressed to optimise the questionnaire for mixed-mode data collection. Kantar Public was commissioned to redesign several modules of the Opinions and Lifestyle Survey (OPN) face-to-face questionnaire for both online and telephone collection. Further information on the redesign of the questionnaire can be found in the Appendix.

A second pilot test was conducted in October and November 2017. This test again aimed to understand any mode effects when moving data collection from face-to-face to telephone collection but this time using the redesigned telephone questionnaire. As per the earlier test, a sample of 2,010 individuals per month was drawn from the last wave of the Labour Force Survey (LFS) and LFS boost and respondent data were compared with face-to-face data collected over the same period. The focus of the analysis was on socio-demographic, smoking and internet usage variables.

Main findings

  • Average response on Pilot Test 2 was 57% on the telephone survey compared with 52% on the face-to-face survey (for more information, please see the Appendix).

  • Similar to Pilot Test 1, there was a significant difference between the two modes in the sex distribution of respondents: on the face-to-face interview, 57% of the sample were female and 43% male, while on the telephone interview the distribution was more even, at 52% female and 48% male.

  • The proportion of achieved responses by age group was broadly similar on the telephone and face-to-face surveys, except for 25- to 34-year-olds (fewer responded via the telephone) and those aged 60 years and over (more responded via the telephone).

  • In contrast with Pilot Test 1, significant differences in response were observed for some categories of ethnicity: a significantly greater proportion of respondents to the telephone interview reported being White British (85.8%) compared with respondents on the face-to-face interview (77.9%); and a significantly smaller proportion of respondents to the telephone interview reported being African (0.6%) compared with the face-to-face interview (2.7%).

  • In terms of housing tenure, a greater proportion of respondents on the face-to-face interview reported living rent free (2.9%) compared with the telephone interview (0.9%).

  • Differences were observed in cigarette consumption at the weekend and on weekdays; significantly more respondents to the telephone interview aged 16 to 44 years reported smoking fewer than 10 cigarettes at the weekend (62.9%) compared with respondents to the face-to-face interview in the same age group (36.5%).

  • The reverse pattern was observed in the same age group (aged 16 to 44 years) in terms of smoking 20 or more cigarettes at the weekend (9.0% on the telephone compared with 25.9% on the face-to-face interview).

  • For weekdays, significantly more respondents aged 45 years and over reported smoking fewer than 10 cigarettes in the face-to-face interview (44.4%) compared with respondents in the same age bracket on the telephone interview (25.6%).

  • In terms of e-cigarette usage, while there was no significant difference between modes in current use, a significantly lower proportion of respondents on the telephone interview reported having used one in the past compared with the face-to-face interview.

  • No significant differences were found between the telephone and face-to-face surveys for the main internet access variables.

All variables and demographic breakdowns used in the significance testing can be found in the results tables.

Although differences were observed in some smoking estimates, these are possibly due to the changes made to the question designs (see Appendix), which were accepted as being an improvement on the long-standing designs. The size of the test was also relatively small, and there is typically some volatility in smoking estimates between months. These estimates will continue to be monitored and stakeholders will be consulted on any potential impact on the time series.

Phase 1 of transformation complete

Following the successful completion of the first two pilot tests and subsequent engagement with the main stakeholders, the OPN moved to telephone mode in April 2018. The sample for the survey has been selected from the last wave of the LFS and LFS boost since then, and the redesigned questionnaire has also been implemented. Since launching in April 2018, response rates have on average been higher than those achieved on the face-to-face survey in recent years.

6. Pilot Test 3: February and March 2018

The final pilot test aimed to establish any mode effects from changing to mixed-mode collection (online first with telephone follow-up of online non-responders) using the redesigned online and telephone questionnaires. This test was conducted in February and March 2018 and the sample size in each month was 2,010 individuals selected from the last wave of the Labour Force Survey (LFS) and LFS boost.

Data from these two months were compared with data collected from the telephone-only survey from April and May 2018 (for the socio-demographic and smoking variables). As the internet access module is carried on the Opinions and Lifestyle Survey (OPN) in January, February and April only, the data for these months were compared with the mixed-mode data from February and March 2018.

Main findings

  • Average response on the mixed-mode pilot test was 61% compared with 57% for the telephone survey (for more information, please see the Appendix).

  • The proportion of achieved responses by age group was broadly similar between the telephone and mixed-mode surveys; however, in contrast with the previous tests, a greater proportion of 25- to 34-year-olds responded to the mixed-mode survey.

  • For all other socio-demographic variables considered, including ethnicity, housing tenure, access to a vehicle and employment status, no significant differences were found in terms of responses.

  • Of the smoking variables analysed, significant differences were only found for the top-level estimates (males) for the grouped number of cigarettes smoked on weekdays; these questions will continue to be monitored in close engagement with customers.

  • No significant differences were found for e-cigarette use.

  • No significant differences were found between the estimates for internet access.

All variables and demographic breakdowns used in the significance testing can be found in the results tables.

7. Conclusion

The testing conducted over the last two years as part of the transformation of the Opinions and Lifestyle Survey (OPN) has aimed to establish what impact, if any, there might be from changing the main design features of the survey. All of the tests conducted were based on small sample sizes because of the nature of the survey, and the estimates produced do not reflect official estimates.

Overall, the findings have demonstrated that there has been minimal impact on the data as a result of switching from face-to-face to telephone data collection, taking a different approach to sampling and redesigning some questions. Where changes have been observed they will continue to be monitored and stakeholders will be consulted.

There has been an overall improvement in response rates from the changes made to the design of the survey and the number of achieved interviews has also increased. The costs associated with data collection on the survey have also fallen. Based on the findings from Pilot Test 3 in 2018, we can expect response rates to improve further when we launch the mixed-mode OPN later in the year.

8. Appendix

Sample design and weighting

The Opinions and Lifestyle Survey (OPN) was initially conducted using face-to-face interviewing, with a sample drawn from the Royal Mail’s postcode address file (PAF) and a Kish grid approach used to select an individual. Mixed-mode collection requires telephone numbers for potential respondents; therefore, the new design involves a move to the Labour Force Survey (LFS) wave 5 or the local LFS boost as the sampling frame. In the last wave of the LFS and LFS boost, respondents are made aware that they may be contacted for future research. The new OPN sampling frame includes all individuals who have not objected to future research.

To ensure that the achieved age distribution in the sample selected from this frame is not too dissimilar from that achieved in the face-to-face sample, certain age groups are over- and under-sampled.
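
As an illustration of this differential sampling, the sketch below applies hypothetical sampling fractions by age group when drawing the OPN sample from the LFS-based frame; the age bands and fractions shown are invented for the example and are not the values used by the ONS.

    # Illustrative only: hypothetical age bands and sampling fractions, not ONS values.
    import pandas as pd

    # Frame of individuals eligible for re-contact from the last wave of the LFS and LFS boost
    frame = pd.DataFrame({
        "person_id": range(10000),
        "age_group": pd.Series(["16-24", "25-34", "35-59", "60+"]).sample(10000, replace=True, random_state=1).values,
    })

    # Hypothetical fractions: over-sample younger groups, under-sample the oldest group
    fractions = {"16-24": 0.30, "25-34": 0.30, "35-59": 0.18, "60+": 0.12}

    sample = (
        frame.groupby("age_group", group_keys=False)
             .apply(lambda g: g.sample(frac=fractions[g.name], random_state=1))
    )
    print(sample["age_group"].value_counts(normalize=True))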

The sample design maintains a two-stage approach, with selection of a sample of households followed by selection of one individual from each sampled household. Any selection bias introduced by the change in sampling frame, together with non-response, is controlled for through a two-stage weighting method. A model-based approach (logistic regression) is applied to the OPN data (from April 2018 onwards) for non-response adjustment using information from LFS data (based on characteristics such as region, age, sex, tenure and economic status). Calibration factors are then computed to ensure that the cases gross up to the Office for National Statistics (ONS) population totals of age group by sex and Government Office Region, as well as LFS estimates for tenure, National Statistics Socio-economic Classification, economic activity and smoking.
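
The sketch below illustrates the general shape of this two-stage weighting approach: a logistic regression response-propensity model for non-response adjustment, followed by calibration (raking) of respondent weights to known margins. The covariates, category labels and margin totals are simplified assumptions for illustration, not the ONS production specification.

    # Illustrative sketch of a two-stage weighting approach (not the ONS production code).
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    # Stage 1: non-response adjustment via a response-propensity model.
    # 'sample' holds all issued cases with LFS covariates and a response indicator (simulated here).
    rng = np.random.default_rng(0)
    sample = pd.DataFrame({
        "age_group": rng.choice(["16-34", "35-59", "60+"], 5000),
        "sex": rng.choice(["M", "F"], 5000),
        "responded": rng.integers(0, 2, 5000),
    })
    X = pd.get_dummies(sample[["age_group", "sex"]], drop_first=True)
    model = LogisticRegression().fit(X, sample["responded"])
    sample["propensity"] = model.predict_proba(X)[:, 1]
    respondents = sample[sample["responded"] == 1].copy()
    respondents["weight"] = 1.0 / respondents["propensity"]

    # Stage 2: calibration (raking) of respondent weights to known population margins.
    # The margins below are hypothetical totals, not real ONS population estimates.
    margins = {
        "age_group": {"16-34": 1500, "35-59": 2000, "60+": 1500},
        "sex": {"M": 2450, "F": 2550},
    }
    for _ in range(20):                      # iterate until weighted totals match the margins
        for var, totals in margins.items():
            weighted = respondents.groupby(var)["weight"].sum()
            factors = {k: totals[k] / weighted[k] for k in totals}
            respondents["weight"] *= respondents[var].map(factors)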

For further information on OPN methodology, please see the OPN methodology guide.

Questionnaire transformation

The OPN well-being, smoking and internet access face-to-face questions were redesigned by Kantar Public for telephone and online data collection. The project included a period of gathering data requirements from the stakeholders who commission the questions.

Kantar Public completed a desk review of the questions and proposed amendments to test. They conducted an office pilot for initial feedback before completing several rounds of cognitive and usability testing to develop the final question set. Kantar Public met with the ONS after each round to review the findings and agree the changes required for the next iteration, contacting stakeholders for clarification on data needs when required.

As well as being optimised for mode, the questions were designed to be more relevant for users and to more accurately collect the data that clients required, in line with a set of design principles agreed between the ONS and Kantar Public. This included replacing outdated terminology with more current language and breaking questions down to make them more manageable.

Mode effect analysis

A statistical test at the 5% level of significance was applied to identify significant mode effects. As multiple tests were applied, a Bonferroni correction for multiple comparisons was used to reduce the risk of Type I error. A critical value was set, and any test statistic whose absolute value exceeded this critical value was considered a significant mode effect.
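
As a minimal sketch, and assuming a two-proportion z-test as the underlying statistical test, the example below applies a Bonferroni-adjusted significance level; the counts and the number of comparisons are made up for illustration.

    # Illustrative two-proportion z-test with a Bonferroni correction (made-up counts).
    from math import sqrt
    from scipy.stats import norm

    def two_prop_ztest(x1, n1, x2, n2):
        """Two-sided z-test for the difference between two independent proportions."""
        p1, p2 = x1 / n1, x2 / n2
        p_pool = (x1 + x2) / (n1 + n2)
        se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        return z, 2 * norm.sf(abs(z))

    alpha = 0.05
    n_comparisons = 20                       # hypothetical number of estimates tested
    alpha_bonferroni = alpha / n_comparisons

    # e.g. proportion reporting a characteristic: telephone versus face-to-face (made-up data)
    z, p = two_prop_ztest(x1=180, n1=1000, x2=150, n2=1000)
    print(f"z = {z:.2f}, p = {p:.4f}, significant = {p < alpha_bonferroni}")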

For Pilot Test 3, a chi-squared test was also run on some variables (for example, frequency of internet use) to investigate whether there was a significant difference in the distribution of responses by category. The variables and demographic breakdowns used in the analysis can be found in the results tables.
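
A chi-squared test of this kind can be run on a mode-by-category contingency table, as in the sketch below; the counts are invented for illustration.

    # Illustrative chi-squared test comparing a categorical distribution across modes.
    import numpy as np
    from scipy.stats import chi2_contingency

    # Rows: collection mode; columns: frequency-of-internet-use categories (made-up counts).
    observed = np.array([
        [420, 310, 150, 80],   # mixed-mode
        [400, 330, 160, 90],   # telephone-only
    ])
    chi2, p, dof, expected = chi2_contingency(observed)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")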

Response rates

Table 1 presents response rates for the three pilot tests.

Differences between face-to-face and redesigned telephone questions (Pilot Test 2)

The differences found for the ethnicity question in Pilot Test 2 may be due to the change made to the showcard face-to-face question to make it suitable for telephone data collection. As such, the ethnicity derivation used to compare estimates across modes was not optimised for the comparison. The face-to-face and redesigned telephone questions used can be found in Table 2.

The redesigned question is in two steps: respondents first identify the broad ethnic group category they belong to, and then the more detailed ethnic group category. Additionally, the original face-to-face question assumes that those who report they are British are also White, so the redesigned question collects better-quality data and is better suited to its purpose.

The new question format also seems to have reduced the number of “other” responses by better capturing ethnic group with the options available. However, it is also possible that this change in estimate is a result of the new sampling method. This should, therefore, be closely monitored during future mixed-mode data collection.
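
Because data collected with the two-step question are delivered in the original 18 categories (see the notes to Table 2), the derivation maps each pair of broad and detailed responses back to a single code. The sketch below is illustrative: the category labels follow Table 2, but the code is not the ONS derivation.

    # Illustrative mapping of the two-step ethnicity question back to the 18
    # face-to-face categories (labels follow Table 2; this is not the ONS derivation).
    F2F_CATEGORIES = {
        ("White", "Welsh, Scottish, English, Northern Irish or British"): 1,
        ("White", "Irish"): 2,
        ("White", "Gypsy or Irish Traveller"): 3,
        ("White", "Another White background"): 4,
        ("Mixed or multiple ethnic groups", "White and Black Caribbean"): 5,
        ("Mixed or multiple ethnic groups", "White and Black African"): 6,
        ("Mixed or multiple ethnic groups", "White and Asian"): 7,
        ("Mixed or multiple ethnic groups", "Another mixed or multiple ethnic background"): 8,
        ("Asian or Asian British", "Indian"): 9,
        ("Asian or Asian British", "Pakistani"): 10,
        ("Asian or Asian British", "Bangladeshi"): 11,
        ("Another ethnic group", "Chinese"): 12,
        ("Asian or Asian British", "Another Asian background"): 13,
        ("African, Caribbean or Black British", "African"): 14,
        ("African, Caribbean or Black British", "Caribbean"): 15,
        ("African, Caribbean or Black British", "Another Black, African or Caribbean background"): 16,
        ("Another ethnic group", "Arab"): 17,
        ("Another ethnic group", "Another background"): 18,
    }

    def derive_ethnicity(broad, detailed):
        """Return the 18-category face-to-face code for a two-step response."""
        return F2F_CATEGORIES[(broad, detailed)]

    print(derive_ethnicity("White", "Irish"))  # -> 2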

For the differences found in Pilot Test 2 for the e-cigarette response category “I have used one in the past but I no longer use one”, further investigation indicated that this change is likely due to the redesigned question. The redesigned telephone question used to derive the variable (to allow comparison with the face-to-face response categories) clearly distinguishes between “regularly used” and “just tried” responses. The face-to-face variable does not use “regularly” as part of its distinction (see Table 2). It is therefore possible that some respondents in the face-to-face sample were identifying as past “users” of e-cigarettes, when by definition they had only “tried” a device. This suggestion is also supported by lower face-to-face estimates for the “I tried one, but I did not go on to use it” category compared with telephone estimates (for nearly all demographic breakdowns).
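
To compare estimates across modes, the telephone responses are derived into the face-to-face response categories. The sketch below shows one plausible mapping based on the response options in Table 2; the handling of the future-intention question (MEG_1c) is an assumption, and the code is not the ONS derivation.

    # Illustrative derivation of the face-to-face e-cigarette categories from the
    # redesigned telephone questions MEG_1a to MEG_1c (based on the response options
    # in Table 2; this is not the ONS derivation).
    def derive_ecig_status(meg_1a, meg_1b=None, meg_1c=None):
        """Map MEG_1a/MEG_1b/MEG_1c responses to the face-to-face MEG_1 categories."""
        if meg_1a in (1, 2, 3):               # currently uses e-cigarettes and/or a vaping device
            return "Yes, I currently use one"
        if meg_1b == 1:                       # regularly used one in the past
            return "Yes, I have used one in the past but I no longer use one"
        if meg_1b == 2:                       # just tried one
            return "I tried one, but I did not go on to use it"
        if meg_1b == 3 and meg_1c in (4, 5):  # never used; fairly or very unlikely to in future (assumed cut-off)
            return "No, I have never used one and I will not use one in the future"
        if meg_1b == 3:                       # never used; may use one in the future
            return "No, I have never used one but I might use one in the future"
        raise ValueError("unexpected combination of responses")

    print(derive_ecig_status(meg_1a=4, meg_1b=2))  # -> I tried one, but I did not go on to use it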

For the significant differences for smoking consumption (at the lower demographic breakdowns), it is again important to note that the redesign of the question for telephone collection may have had an impact on the responses. The telephone questions used to derive this variable ask for the usual amount smoked on each day of the week, while the face-to-face questionnaire asks for the usual amount smoked “at the weekend” and “on weekdays” (see Table 2). How respondents interpret or define “the weekend” (for example, whether it includes Friday, or focuses on more “social” occasions) may explain the higher average number of cigarettes reported at the weekend in the face-to-face sample.
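
For comparison, the per-day telephone responses (M210_Mon to M210_Sun) can be summarised into usual weekday and weekend amounts, as sketched below; treating the weekend as Saturday and Sunday, and treating the 999 and 998 codes as missing, are assumptions made for the example and not the ONS derivation.

    # Illustrative derivation of weekday/weekend consumption from the per-day telephone
    # responses M210_Mon to M210_Sun (assumptions: weekend = Saturday and Sunday;
    # 999/998 treated as missing). Not the ONS derivation.
    MISSING = {998, 999}

    def usual_daily_amount(responses, days):
        """Average cigarettes per day over the given days, ignoring missing codes."""
        values = [responses[d] for d in days
                  if responses.get(d) is not None and responses.get(d) not in MISSING]
        return sum(values) / len(values) if values else None

    responses = {"Mon": 12, "Tue": 10, "Wed": 11, "Thu": 9, "Fri": 15, "Sat": 18, "Sun": 999}

    weekday = usual_daily_amount(responses, ["Mon", "Tue", "Wed", "Thu", "Fri"])
    weekend = usual_daily_amount(responses, ["Sat", "Sun"])
    print(f"usual weekday amount: {weekday:.1f}, usual weekend amount: {weekend:.1f}")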

Table 2: Face-to-face and redesigned telephone questions for the Opinions and Lifestyle Survey

Ethnicity

Face-to-face question: Ethnicity

To which of these groups do you belong?

1) English, Welsh, Scottish, Northern Irish, British
2) Irish
3) Gypsy or Irish Traveller
4) Any other White background
5) White and Black Caribbean
6) White and Black African
7) White and Asian
8) Any other Mixed/Multiple Ethnic background
9) Indian
10) Pakistani
11) Bangladeshi
12) Chinese
13) Any other Asian background
14) African
15) Caribbean
16) Any other Black/African/Caribbean background
17) Arab
18) Any other Ethnic group

Refusal
Don’t know

Redesigned telephone question used in Pilot Test 2: Ethnic¹

What is your ethnic group? Is it…

1) White
2) Mixed or multiple ethnic groups
3) Asian or Asian British
4) African, Caribbean or Black British, or
5) Another ethnic group, for example Chinese, Arab or any other background?

Ask if Ethnic = 1
Is your White ethnic group…
1) Welsh, Scottish, English, Northern Irish or British
2) Irish
3) Gypsy or Irish Traveller, or
4) Another White background? (Ask to specify)

Ask if Ethnic = 2
Is your mixed or multiple ethnic group…
1) White and Black Caribbean,
2) White and Black African,
3) White and Asian, or
4) Another mixed or multiple ethnic background? (Ask to specify)

Ask if Ethnic = 3
Is your Asian or Asian British ethnic group…
1) Indian
2) Pakistani
3) Bangladeshi, or
4) Another Asian background? (Ask to specify)

Ask if Ethnic = 4
Is your Black ethnic group…
1) African
2) Caribbean, or
3) Another Black, African or Caribbean background? (Ask to specify)

Ask if Ethnic = 5
Is your ethnic group…
1) Chinese
2) Arab, or
3) Another background? (Ask to specify)

Smoking consumption

Face-to-face questions: M210_2 and M210_3

M210_2: How many cigarettes a day do you usually smoke at weekends?
Please exclude electronic cigarettes.

M210_3: How many cigarettes a day do you usually smoke on weekdays?
Please exclude electronic cigarettes.

Redesigned telephone questions used in Pilot Test 2: M210_Mon to M210_Sun

How many cigarettes a day do you usually smoke on each day of the week? How many do you smoke on…
1) Monday [Enter number] 999) Don’t know 998) Prefer not to answer
2) Tuesday [Enter number] 999) Don’t know 998) Prefer not to answer
3) Wednesday [Enter number] 999) Don’t know 998) Prefer not to answer
4) Thursday [Enter number] 999) Don’t know 998) Prefer not to answer
5) Friday [Enter number] 999) Don’t know 998) Prefer not to answer
6) Saturday [Enter number] 999) Don’t know 998) Prefer not to answer
7) Sunday [Enter number] 999) Don’t know 998) Prefer not to answer

E-cigarette use

Face-to-face question: MEG_1

Showcard if MEG_SelfCom <> 1

Have you ever used an electronic cigarette (e-cigarette)?

1) No, I have never used one and I will not use one in the future
2) No, I have never used one but I might use one in the future
3) Yes, I have used one in the past but I no longer use one
4) Yes, I currently use one
5) I tried one, but I did not go on to use it
6) I don’t know what an electronic cigarette is (Spontaneous only)

Redesigned telephone questions used in Pilot Test 2: MEG_1a to MEG_1c

MEG_1a
Do you currently use…
1) E-cigarettes
2) A vaping device
3) Both
4) Or neither

Ask if MEG_1a = 4

MEG_1b
Have you ever regularly used or tried an e-cigarette or a vaping device?
1) Yes, I used it regularly
2) Yes, I just tried it
3) No

Ask if MEG_1b = 2, 3²

MEG_1c
How likely are you to use e-cigarettes or a vaping device in the future? Are you…
1) Very likely
2) Fairly likely
3) Neither likely nor unlikely
4) Fairly unlikely
5) Very unlikely

Source: Office for National Statistics

Notes:

  1. Ethnicity data are collected using these redesigned questions. The data are then derived into the 18 categories used in the face-to-face question and delivered to customers on this basis.

  2. An issue was identified with the routing for MEG_1c and has been corrected from June 2019. The routing displayed reflects the correct routing for MEG_1c.

  3. Telephone questions may have been modified since Pilot Test 1.

Limitations

The analysis reported in this article is based on small sample sizes and a snapshot of two months’ worth of data, and should therefore be treated with some caution. Collection of further data after moving to mixed-mode collection may detect differences not found in the pilot studies. Monitoring of effects on response rates and trends will continue after the mixed-mode launch, and any changes will be communicated to customers.

No explicit adjustments were applied for mode of data collection; the weighting was adjusted for selection effects but not for measurement effects.
