1. About this Quality and Methodology Information report

This quality and methodology report contains information on the quality characteristics of the data (including the five European Statistical System Dimensions of Quality), as well as the methods used to create them.

The information in this report will help you to:

  • understand the strengths and limitations of the data

  • learn about existing uses and users of the data

  • reduce the risk of misusing data

  • decide suitable uses for the data

  • understand the methods used to create the data

Back to table of contents

2. Important points

  • The Occupational Pension Schemes Survey (OPSS) provides a detailed view of the nature of occupational (trust-based) pension provision in the UK.

  • Data for OPSS are sourced from an annual survey, which is conducted using online and paper questionnaires, sent to occupational pension scheme administrators.

  • Information collected includes scheme membership, benefits and contributions; membership includes active (current employee) participation, those with deferred rights in pension schemes and pensioner members.

  • OPSS covers both private and public sector occupational pension schemes registered in the UK.

  • OPSS excludes contract-based arrangements known as group personal pensions (GPPs) and State Pensions.

  • OPSS is published annually in September.

Back to table of contents

3. Quality summary

Overview

Results from the annual Occupational Pension Schemes Survey (OPSS) provide a detailed view of the nature of occupational pension provision in the UK. One of the main outputs of the survey is estimated membership of occupational pension schemes; estimates of active membership date back to 1953. The OPSS also provides estimates of employer and employee contribution rates. It covers both private and public sector occupational pension schemes registered in the UK and the results are published in September of each year.

Uses and users

The survey outputs are used, for example, for monitoring policy changes, by government departments (the Department for Work and Pensions (DWP), HM Revenue and Customs (HMRC)) and regulatory bodies (The Pensions Regulator). Other users include trade associations and industry bodies (the Pensions and Lifetime Savings Association, Pensions Policy Institute), charities (Age UK) and research institutes (the Institute for Fiscal Studies). OPSS data have been used by these organisations as context for research or think-tank papers and for political lobbying.

Assessment of user needs and perceptions

(The processes for finding out about uses and users, and their views on the statistical products.)

The surveys were originally devised by the Government Actuary’s Department (GAD) but have been developed, and questions have been redesigned or added, by us in consultation with DWP, HMRC and the Pensions Regulator. Meetings are held at least twice a year to discuss the quality of the outputs and any problems with the questions. Users from the Pension Statistics Advisory Group (PSAG), which includes stakeholders from academia, the private sector and other government departments, are contacted regularly to discuss any developments with OPSS, gain feedback and answer queries about the outputs. Any questions raised by users of OPSS are monitored to understand how the statistics are being used and whether the outputs are useful in aiding policy decisions and pension reform strategies.

Contact details for the survey statistician and a request for user feedback are included in the statistical bulletin.

A consultation on the future of OPSS, considering potential improvements that could be made to OPSS, was carried out between April 2011 and June 2011. Five issues were outlined and users were asked to give their views. A summary of responses was published outlining users’ feedback on our website.

OPSS was assessed by the UK Statistics Authority to ensure it was meeting user needs and not placing any unnecessary burden on respondents. This involved consulting with the users and respondents as well as reviewing all aspects of the survey. The Assessment Report on Pensions was published on 28 October 2010. In response to the assessment two documents were produced looking at the users and uses of OPSS statistics. Further details are provided in the ‘How pension statistics are used’ and ‘Meeting the needs of pension statistics users’ documents.

A further consultation considering the presentation of statistics from OPSS was carried out between December 2012 and January 2013 resulting in the annual report being discontinued.

Strengths and limitations

The main strength of the survey:

  • the survey is very comprehensive, covering many different scheme types

The main limitation of the survey:

  • as OPSS is an annual survey, there are restrictions on the timeliness of the data available

Back to table of contents

4. Quality characteristics

Relevance

(The degree to which the statistical outputs meet users’ needs)

The Occupational Pension Schemes Survey (OPSS) questionnaires are reviewed each year to accommodate, where possible, any changes to pension legislation and to ensure content and coverage are appropriate to user needs. We engage with users, including the Department for Work and Pensions (DWP), to discuss any questionnaire changes before they take place.

The primary estimates from the survey are membership of occupational pension schemes and employer and employee average contribution rates.

For further information on how the OPSS data are used, see the Other information – Assessment of user needs and perceptions section.

Accuracy and reliability

(The degree of closeness between an estimate and the true value.)

Non-sampling error

There is the potential for non-sampling error, which cannot be easily quantified. For example, undetected deficiencies may occur in the survey register and errors may be made by the respondents when completing the survey questionnaires.

Questionnaires are usually despatched to schemes five months after the end of the reference year. Two written reminders are subsequently sent to non-responding groups. Further telephone chasing is undertaken to try to minimise non-response and thereby any non-response bias. There is also the option of using the legal powers of the Statistics of Trade Act 1947 to compel response, although we prefer to work together with respondents to collect the necessary information.

Returned information is run through a series of validation checks to identify errors. These include tests to ensure that all required questions are completed, that the responses to individual questions are consistent within the questionnaire, and that the returned data are consistent with historical data from the scheme. Data that fail the validation checks are queried with respondents to confirm or correct the original data.

Sampling error

Sample surveys are used rather than censuses because the census process is too lengthy and costly to be viable. Standard errors illustrate the spread of results that would be expected from estimates derived from successive samples selected by chance from the same population using the same sample specification. While each sample is designed to produce the “best” estimate of the true population value, different equal-sized samples covering the population would generally produce varying population estimates.

Response rates are published each year alongside the statistical bulletin. Response can vary between the private and public sector and between the different scheme sizes.

Estimates of scheme numbers were withdrawn in 2008 because of an unusual pattern in the proposed final output. These estimates were re-introduced in 2010, making use of a new methodology. Estimates from 2007 onwards were produced on the new basis. The methodology for weighting estimates of scheme numbers was improved, but the problem of sampling variability, which produced a set of unusual results in 2008, was not fully resolved.

Despite further efforts by us to improve these estimates, most notably by increasing the number of private sector forms sent to schemes with between 2 and 11 members, the sample size remains smaller than that required to produce reliable estimates for scheme numbers. It is important to note, therefore, that the estimates of numbers of very small schemes continue to be subject to considerable uncertainty and are no longer produced as part of the regular publication.

Calculation of standard errors

The standard error measures the precision of a sample estimate. In statistics, a sample mean deviates from the unobserved actual mean of a population; this deviation is measured by the standard error. The smaller the standard error, the more precise the sample mean will be. Estimates of standard errors are released in a dataset alongside the statistical bulletin.
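As a minimal illustration of the concept only (the published OPSS standard errors are calculated under the survey’s stratified design, not the simple random sampling formula used here, and the data below are hypothetical):

```python
import math

def standard_error(values):
    """Standard error of the mean for a simple random sample.

    Illustrative sketch only: the published OPSS standard errors
    reflect the survey's stratified design, not this formula.
    """
    n = len(values)
    mean = sum(values) / n
    # Sample variance with Bessel's correction (n - 1 denominator)
    variance = sum((x - mean) ** 2 for x in values) / (n - 1)
    return math.sqrt(variance / n)

# Hypothetical contribution rates (percentages) from a small sample
rates = [4.0, 5.5, 6.0, 4.5, 5.0, 6.5, 5.5, 4.0]
se = standard_error(rates)
```

A larger sample drawn from the same population would, other things being equal, produce a smaller standard error and hence a more precise estimate of the mean.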

Output quality

This report provides a range of information that describes the quality of the output and details any points that should be noted when using the output.

We have developed Guidelines for Measuring Statistical Quality; these are based upon the five European Statistical System (ESS) Quality Dimensions. This report addresses these quality dimensions and other important quality characteristics, which are:

  • relevance

  • timeliness and punctuality

  • coherence and comparability

  • accuracy

  • output quality trade-offs

  • assessment of user needs and perceptions

  • accessibility and clarity

More information is provided about these quality dimensions in the individual sections.

The focus of the survey is upon provision of membership estimates. Scheme numbers estimates would benefit from a larger sample of very small schemes. However, unless additional resources are allocated to the survey so that the sample size could be increased, this would be to the detriment of the quality of the membership estimates. We do not consider this to be a priority in terms of resource allocation at a time of tight budgets and therefore no longer release the scheme numbers estimates as part of the regular publication.

Coherence and comparability

(Coherence is the degree to which data that are derived from different sources or methods, but refer to the same topic, are similar. Comparability is the degree to which data can be compared over time and domain, for example, geographic level.)

Coherence

Another of the main data sources we produce on pensions is the Annual Survey of Hours and Earnings (ASHE). Estimates from ASHE include the proportions of employees currently contributing to a workplace pension by type of pension. The figures also show estimates of the number of jobs from which the proportions are derived, although these figures are for indicative purposes only.

A broad comparison between the active membership from OPSS and the number of employees (approximately equivalent to the number of jobs) contributing to a pension is therefore possible, provided the ASHE analysis is restricted to the occupational types of pension only. Published estimates from OPSS do not cover group personal (or stakeholder or self-invested personal) pensions. Differences between the two sources arise from differing survey methodology and differences in the time of reporting, but mainly because respondents to OPSS are schemes, whereas respondents to ASHE are employers. The ASHE weighting methodology is also optimised for estimating pay data rather than pensions data.

Reports are also available from the Pension Protection Fund (PPF) and the Pensions Regulator covering different aspects of occupational pension scheme data: the Purple Book covers defined benefit schemes, while the DC trust report covers defined contribution schemes.

Comparability

The statistical bulletin highlights the headline figures of membership and contribution rates. Prior to 2012, a more detailed annual report was published alongside the bulletin that provided an analysis of the nature of occupational pension provision in the UK. Publication of the annual report ceased following a user consultation. Previous publications are available (from 2006 to 2011), alongside the 2000, 2004 and 2005 survey reports produced by the Government Actuary’s Department. Several short stories were also produced to complement the statistical bulletin.

Some of the time series from OPSS are available back to 1953 – although given methodological changes over time, direct comparisons should be made with caution. Estimates for active and pensioner membership have been collected from 1953 with preserved pension entitlements from 1983. The survey has only been conducted on an annual basis since 2004 (although the 2005 survey only covered the private sector).

Since we took over the OPSS survey in 2006 some changes have been made to improve the methodology of the survey. During work on the 2007 survey some adjustments were made to the 2006 and 2007 survey results, which means that caution should be exercised when comparing results from 2006 onwards with previous years. This was partly to do with adjustments relating to late returns for the 2006 survey, and partly to do with a review of the estimation methods applied to the survey.

Each year the OPSS questionnaires are reviewed and questions may be added, removed or amended as necessary to meet user needs (as far as possible), and improve the questionnaire and resulting data quality. This means that the data collected may vary over time, although consistency over time is aimed for wherever possible.

In the 2008 survey, questions asking schemes to provide details of the total numbers of pensioner members and members with preserved pensions were added. Before 2008 these totals were derived by summing their constituent components. Where any components were missing, the total of the available components was used. The approach used since 2008 has led to an improvement in the estimation of total membership figures. It may be responsible for part of the increase in numbers of pensions in payment and preserved pension entitlements between 2007 and 2008.

Experience from previous surveys has led to improvements to some questions and changes to others, which were no longer appropriate because of changes to pension schemes or pensions legislation. For example, as part of the 2010 survey, the questionnaires were redeveloped to capture cases where schemes (or sections) had groups of members with differing contribution or accrual rates. Respondents were asked to record each rate and estimate the proportion of active members contributing, or accruing benefits, at each rate. Weighted-average contribution rates across all schemes were calculated based on the estimates for numbers of active members contributing at each rate.
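The weighted-average calculation introduced with the 2010 redevelopment can be sketched as follows. This is an illustrative reconstruction of the approach described above, not the production code, and the rates and member counts are hypothetical:

```python
def weighted_average_rate(groups):
    """Average contribution rate weighted by the number of active
    members contributing at each rate.

    `groups` is a list of (rate_percent, active_members) pairs, one
    per group of members within a scheme or section. Illustrative
    sketch only; figures are hypothetical.
    """
    total_members = sum(members for _, members in groups)
    weighted_sum = sum(rate * members for rate, members in groups)
    return weighted_sum / total_members

# A scheme section with two member groups on different rates
groups = [(5.0, 300), (8.0, 100)]  # (rate %, active members)
avg = weighted_average_rate(groups)  # (5*300 + 8*100) / 400 = 5.75
```

The same principle extends across schemes: each scheme’s rates contribute to the overall average in proportion to the estimated number of active members contributing, or accruing benefits, at each rate.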

The changes to the questionnaire in 2010 described previously mean that comparisons between 2010 and earlier years should be treated with caution. A similar approach, capturing differences between groups of members within schemes (or sections of schemes), was introduced for normal pension age estimates in the 2015 questionnaire.

In 2010, some of the largest public sector schemes in the sample, which had previously provided information only at scheme level rather than for each of their sections separately, were asked to provide section-level information. Before 2010, the members of closed sections of these schemes would have appeared in the “open” category if the scheme status as a whole was open. From 2010, such membership has been classified in the “closed” category. From the 2011 survey onwards, all public sector schemes in the sample were asked to return information for each of their sections. Changes in the definition of the public and private sectors also mean that estimates differ prior to 2000, as some organisations such as the Post Office and the BBC were reclassified at this stage.

The Pensions Regulator holds information for some schemes at multiple levels. The overarching scheme is referred to as the “parent” level with the lower-level structures referred to as “child” levels. Historically, the OPSS sample has been drawn at the parent level. As part of refinements made to the 2015 survey, the decision was taken to sample at the child level.

This shift was made on advice from the Pensions Regulator, based primarily on the fact that scheme contact information should be more up to date at child level. As well as reducing the number of forms that have to be redirected, a further benefit may have been improved accuracy of the information reported, particularly detailed data that may not always be held at the parent scheme level.

The table showing membership of private sector defined benefit schemes by type of member and contracted out status (Table 8 in OPSS 2015) was discontinued in 2016 as contracting out ended on 6 April 2016 and the data are therefore no longer applicable.

The table showing the number of pensions in payment in private sector defined benefit occupational pension schemes: by date pensions accrued and increase paid each year (Table 18 in OPSS 2015) was also removed in 2016. The decision to remove this table was made because of concerns over data quality and issues with interpretation of the information.

Concepts and definitions

(Concepts and definitions describe the legislation governing the output, and a description of the classifications used in the output.)

Survey data are collected under the statutory powers of the UK Statistics of Trade Act 1947.

OPSS is one of a very limited number of National Statistics surveys of pension schemes. “Pensions” is a sub-group of the cross-governmental programme of work considering harmonising inputs and outputs. The possible creation and use of harmonised standards on pensions will therefore be kept under review.

Accessibility and clarity

(Accessibility is the ease with which users can access the data, also reflecting the format in which the data are available and the availability of supporting information. Clarity refers to the quality and sufficiency of the release details, illustrations and accompanying advice.)

Our recommended format for accessible content is a combination of HTML webpages for narrative, charts and graphs, with data being provided in usable formats such as CSV and Excel. Our website also offers users the option to download the narrative in PDF format. In some instances, other software may be used, or may be available on request. For further information please refer to the contact details at the beginning of this report.

For information regarding conditions of access to data, please refer to:

In addition to this Quality and Methodology Information, quality information relevant to each release is available in the statistical bulletin.

Timeliness and punctuality

(Timeliness refers to the lapse of time between publication and the period to which the data refer. Punctuality refers to the gap between planned and actual publication dates.)

The statistical bulletin is published annually and consistently meets the scheduled publication date. The output is published around 18 months after the period to which the data refer. The reference date for the survey is either 6 April or the date of the last set of trustees’ report and accounts. The survey fieldwork occurs between September and December, with validation continuing into the spring, and the results are published in September or October.

The timetable for publication of any additional analysis (which would always be released after the statistical bulletin) is intentionally flexible so that analyses can be produced if users highlight a particular topic. They would, however, be pre-announced on the UK National Statistics release calendar at least four weeks in advance.

In terms of timing, although there appears to be a long gap between the reference date of April and the publication of estimates (around 18 months later), it is not feasible to reduce this gap significantly without impacting upon the data quality. Many schemes do not receive reports from their actuaries or advisors until near the end of the calendar year. Bringing forward the fieldwork may result in schemes responding with data from earlier years where that is the only information they have.

For more details on related releases, the UK National Statistics release calendar is available online and provides 12 months’ advance notice of regular release dates. If there are any changes to the pre-announced release schedule, public attention will be drawn to the change and the reasons for the change will be explained fully at the same time, as set out in the Code of Practice for Statistics.

Why you can trust our data

Office for National Statistics (ONS) is the UK’s largest independent producer of statistics and its National Statistics Institute. The Data Policies and Information Charter, available on the ONS website, detail how data are collected, secured and used in the publication of statistics. We treat the data that we hold with respect, keeping it secure and confidential, and we use statistical methods that are professional, ethical and transparent. You can find out more about our data policies on our website.

The OPSS has National Statistics status, designated by the UK Statistics Authority in accordance with the Statistics and Registration Service Act 2007. This designation signifies compliance with the Code of Practice for Statistics, which has recently been updated and focuses on trustworthiness of data in greater depth.

Back to table of contents

5. Methods

How we collect the data, main data sources and accuracy of data sources

Data collection

The Occupational Pension Schemes Survey (OPSS) is conducted annually using online and paper questionnaires. Paper questionnaires are sent to the occupational pension schemes selected, with a link to a facility that enables respondents to complete their questionnaires online. Different questionnaires are sent according to the nature, benefit structure and size of the scheme. Where schemes have more than one section, a separate questionnaire is issued for each section, usually with a maximum of four questionnaires per scheme.

There are eight different OPSS questionnaires:

  • Private sector defined benefit single section

  • Private sector defined contribution single section

  • Private sector defined benefit multi section

  • Private sector defined contribution multi section

  • Public sector defined benefit single section

  • Public sector defined benefit multi section

  • 11 or fewer members

  • Winding up

In 2017, of all responses, 70% were received via the online questionnaires, with electronic responses more common among the schemes with more than 100 members.

Sampling frame

The Pension Scheme Register, maintained by The Pensions Regulator, is used as the sampling frame. The frame covers all occupational pension schemes in the UK with two or more members. There are approximately 44,000 schemes in the population.

Sample size

The sample consists of approximately 1,600 occupational pension schemes, of which 1,400 are in the private sector and 200 in the public sector. Response rates are included in the Validation and quality assurance – Accuracy section.

Sample design

A stratified random sample is taken from the Pension Scheme Register, with scheme membership and sector (public or private) used as stratification variables. Scheme membership refers to total membership – that is, active members, pensioner members and those with deferred entitlements.

Six strata are created for each sector by dividing the population of occupational pension schemes into size bands A to F. Around 80% of schemes are very small with fewer than 12 members; however, the bulk of scheme membership is concentrated in a much smaller number of very large schemes.

In creating the stratified sample to measure variables associated with membership, a higher sampling fraction is used for the largest schemes. Pension schemes with more than 5,000 members are always included in the sample and stratified random sampling is used for schemes with fewer than 5,000 members. The six size bands are shown in Table 1.
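The selection logic described above can be sketched as follows. The band labels, population sizes and sampling fractions here are hypothetical placeholders (the actual bands and proportions are given in Tables 1 and 2), and the code is an illustrative sketch rather than the production sampling system:

```python
import random

def select_sample(population, fractions, seed=1):
    """Stratified random selection sketch.

    `population` maps size band -> list of scheme IDs; `fractions`
    maps size band -> sampling fraction. A fraction of 1.0 means the
    band (for example, schemes with more than 5,000 members) is fully
    enumerated. Bands and fractions here are illustrative only.
    """
    rng = random.Random(seed)
    sample = {}
    for band, ids in population.items():
        f = fractions[band]
        if f >= 1.0:
            sample[band] = list(ids)  # largest schemes always included
        else:
            k = max(1, round(f * len(ids)))
            sample[band] = rng.sample(ids, k)
    return sample

# Hypothetical frame: 10 very large schemes, 1,000 very small schemes
population = {"F": [f"large-{i}" for i in range(10)],
              "A": [f"small-{i}" for i in range(1000)]}
fractions = {"F": 1.0, "A": 0.02}  # hypothetical sampling fractions
sample = select_sample(population, fractions)
```

Sampling the largest band in full while taking a small fraction of the smallest band reflects the design aim stated above: membership is concentrated in a small number of very large schemes, so those schemes must all be observed to measure membership variables precisely.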

For the public sector, the Pension Scheme Register has a list of “administrative units” rather than a list of schemes. These units may be schemes, sections or other administrative units. For example, there are many entries for the Local Government Pension Scheme as this is administered at a local level. This means that although the survey reports membership for the public sector, it is not possible to report on the number of schemes in the public sector.

The proportions sampled from each size band are shown in Table 2.

Outliers

Within the survey there is currently no treatment of outliers. However, any extreme values are verified with respondents at the validation stage.

Imputation

There is currently no imputation used within the survey; non-responders and schemes not sampled are both accounted for by the estimation approach detailed in the following section.

How we process the data

The OPSS collects information from occupational pension schemes (consisting of two or more members), about scheme membership and contributions. It includes sections on schemes of various sizes (ranging from small schemes with 2 to 11 members, to schemes with 10,000 or more members) and those schemes that are winding up. The OPSS covers both private and public sector occupational pension schemes registered in the UK.

Each year, once the sample has been selected, a “cleaning” exercise is carried out to confirm schemes’ details (for those newly selected to the survey), prior to the forms being sent out. During this exercise, schemes are asked to confirm their benefit structure (defined benefit or defined contribution), the size band appropriate to their scheme, their contact details and their status.

Weighting and estimation

Membership weights are calculated at a stratum (size band) level and for public and private schemes separately, using auxiliary information on total scheme membership from the Pension Scheme Register. While this membership is similar to the total membership collected by OPSS, the two differ because of a delay in the reporting of membership to the Pensions Regulator.

The sampling fraction (for each stratum) is the number of schemes in the sample divided by the number of schemes in the universe. The membership response rate (for each stratum) is the membership of the schemes that responded divided by the total scheme membership of those selected from the Pension Scheme Register.

The weights for each stratum are calculated by taking the inverse of the sampling fraction and dividing by the membership response rate. Further weights are applied for specific analyses to take account of non-response to specific questions (item non-response).
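The two-step weight calculation described above can be sketched as follows. The scheme and membership counts are hypothetical; the calculation simply follows the definitions given in this section:

```python
def stratum_weight(n_universe, n_sample,
                   responding_membership, selected_membership):
    """Stratum-level membership weight, as described above: the
    inverse of the sampling fraction, divided by the membership
    response rate. All figures used here are hypothetical.
    """
    sampling_fraction = n_sample / n_universe
    membership_response_rate = responding_membership / selected_membership
    return (1.0 / sampling_fraction) / membership_response_rate

# e.g. 50 of 500 schemes in the stratum are sampled; responding
# schemes cover 40,000 of the 50,000 members selected from the
# Pension Scheme Register
w = stratum_weight(n_universe=500, n_sample=50,
                   responding_membership=40_000,
                   selected_membership=50_000)
# sampling fraction = 0.1, response rate = 0.8, weight = 10 / 0.8 = 12.5
```

In this hypothetical stratum, each responding scheme’s membership would therefore be multiplied by 12.5 when forming population estimates, compensating both for schemes never sampled and for sampled schemes that did not respond.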

Statistical disclosure

Statistical disclosure control methodology is applied to the OPSS data. This ensures that data attributable to individual respondents are not disclosed in any publication or dataset. The Code of Practice for Statistics, specifically Trustworthiness principle T6 (data governance: confidentiality), sets out how to protect data from being disclosed. This principle includes a guarantee to survey respondents that “official statistics do not reveal the identity of an individual or organisation or any private information relating to them”.

How we analyse the data

Once the data are collected, responses are analysed and, where substantial data changes are identified, schemes are contacted for clarification. The data are then aggregated and the Results and Publication Team conducts a further phase of validation checks using SAS. These checks represent a thorough micro-level investigation, and respondents may receive further queries if important changes that would affect published data are highlighted.

How we quality assure the data

Data are quality assured throughout the collection, processing and analysis stages through regular consistency checks, investigation of anomalies, application of disclosure control procedures and review of data sources. These checks are presented at regular curiosity meetings, where important internal stakeholders are able to interrogate the data and explore any anomalies or interesting findings. External stakeholders, for example, other government departments, also have regular opportunities to analyse the data and share feedback with ONS colleagues.

How we disseminate the data

All OPSS statistics and analysis are disseminated primarily through publication of statistical bulletins on the ONS website. These are based on aggregate data only and are available to the public free of charge. Publication dates are planned in advance and pre-announced on the statistics release calendar on both the GOV.UK and ONS websites. External stakeholders receive data ahead of release as part of the quality assurance process.

How we review the data

Every year we undertake a review of the OPSS questionnaire. Among other things, this review ensures that the questions remain relevant, and allows us to remove questions that are no longer needed and add questions that address current issues. Relevant stakeholders are contacted during the process, and decisions about questions are made in consultation with them.


Back to table of contents

6. Other information

How to cite this document

“Office for National Statistics, Occupational Pension Schemes Survey (2017). Quality and Methodology Information.”

Back to table of contents