Purpose 

Our statistical practice is regulated by the Office for Statistics Regulation (OSR). OSR sets the standards of trustworthiness, quality and value in the Code of Practice for Statistics (the Code) that all producers of official statistics should adhere to. 

This statement explains how the Office for National Statistics (ONS) applies the Code in the production and release of its statistics. We have measures in place to ensure that our official statistics are produced in line with the principles and standards set out in the Code. For other releases, such as research papers and quality and methodology information, we aim to apply the principles of the Code to ensure confidence, quality and public value. 

We welcome feedback from users on how well we meet these standards. You can contact us directly by emailing info@ons.gov.uk. Alternatively, you can contact OSR by emailing regulation@statistics.gov.uk or via the OSR website.

This statement sets out how we work in line with the three core principles of the Code (trustworthiness, quality and value) and how we adhere to the Standards for the Public Use of Statistics.

Trustworthiness 

Confidence in the people and organisations that produce statistics and data 

We are committed to producing statistics that are independent, transparent and produced in the public interest, and to ensuring that their release and use support public confidence. 

Our statistics are preannounced publicly via the ONS Release Calendar. Statistics are released as soon as is practicable and are normally published at 9:30am on the ONS website. Some market-sensitive publications are published at 7:00am, with specific agreement from OSR. Further information is available in our Release Practice Policy.

Changes or updates to published statistics, analysis or data are sometimes required for various reasons, and we use different terminology depending on the reason for the change. ONS regularly makes scheduled revisions to previously published statistics, analysis or data; these improve quality by incorporating improved methods, additional data sources or statistics that were unavailable at the point of initial publication. There are also occasions when we must correct errors identified after initial publication. Our approach to both revisions and corrections is set out in our Revisions and Corrections of Errors Policy, which explains how and when users will be informed of changes.

Equality of access to official statistics is a fundamental principle of statistical good practice. As of 1 July 2017, pre-release access to ONS statistics was removed by the UK Statistics Authority (UKSA) in all but exceptional circumstances. Whenever a decision is taken to grant exceptional pre-release access, details are published on the ONS website.

We take seriously our responsibility to hold all our data and statistics in a safe and secure manner, in line with data protection legislation and our internal policies. This includes adherence to our Data Protection Policy and our data strategy.

Our statistical outputs are produced by professionally independent analysts who are members of the Government Statistical Service (GSS). Many staff also belong to professions within the Analysis Function and are recruited via professional competency frameworks. As GSS members, staff adhere to high professional standards and undertake continuous development, including mandatory training on the Code of Practice for Statistics. They regularly develop their skills to ensure they demonstrate sound judgement, apply the principles of the Code and act with integrity, honesty, objectivity and impartiality.

Quality  

Data and methods that produce assured statistics 

We aim to produce statistics that are based on sound methods and assured data, and to be transparent about their strengths and limitations. Our statistical releases include clear information about the quality of the underlying data, including known limitations, sources of uncertainty, and the likely impact these may have on the results. This helps users to interpret the statistics appropriately. 

We have a structured quality management framework, which supports colleagues in identifying challenges and risks to quality, mitigating them, and identifying opportunities for continuous improvement. The framework brings together objectives, activities, tools and policies for ONS to deliver against its quality mission. This is complemented by extensive related guidance and training, available or in development, to support quality management throughout the data lifecycle.

The aim of the framework is to support the application of the Code of Practice, particularly, though not exclusively, its Quality pillar, and to ensure that ONS is able to identify and mitigate quality issues in a proactive manner. The framework also enables ONS to report on its quality position, be accountable, and seek external support where needed. We will be publishing an overview of our framework on our website in the summer.

We recognise that producing statistics often involves trade‑offs between quality, timeliness and scope. At ONS, these trade‑offs are considered explicitly, with decisions taken at the appropriate level. Where necessary, we prioritise statistical quality and assurance over the volume or speed of outputs, to ensure that published statistics remain trustworthy and fit for purpose. Our current decisions and priorities were detailed by the Permanent Secretary in his letter to the UK Statistics Authority Interim Chair.  

We have also introduced a new tiering model to help us regularly review, prioritise and sequence work. The model is based on the level of impact that a statistical output is expected to have on users and the decisions it informs. This ensures that resources are focused on improving the statistics that carry the greatest weight for the UK. More information can be found on our website.

Where errors occur, or where data are missing or delayed, we are open and transparent, and we act promptly to correct or revise the statistics. Further details are set out in our Revisions and Corrections of Errors Policy. We also publish blogs and podcasts to demonstrate this transparency.

We sometimes publish statistics as "official statistics in development" when using new methods or exploring new data sources. In these cases, we clearly label them as in development to be transparent about strengths and limitations, and to seek feedback from users.

Value 

Statistics that support society's needs for information 

Users are at the centre of our approach to producing statistics. Our statistics are designed to meet user needs, and as such our data and commentary are presented clearly and accessibly to support use by a wide range of audiences. We follow relevant accessibility standards and guidance, such as those from the Government Analysis Function, to ensure our statistics can be used by as wide a range of users as possible. We maintain an ONS Service manual that brings together guidance from across several design teams to support accessible and consistent services. We are also improving the ONS website, taking a user-centred approach to delivering our data and statistics.  

In line with the Standards for the Public Use of Statistics, we provide clear explanations of the key messages and their relevance, ensuring that statistics are not presented in a way that is misleading. Commentary and analysis are objective, impartial and evidence based, and we take care to preserve context and communicate limitations when statistics are reused or repurposed. Where available and appropriate, we publish disaggregated data to support deeper analysis and understanding.

We are committed to continually reviewing our statistics to ensure that they continue to meet user needs. This includes consulting users on proposals to introduce, withdraw or make substantial changes to official statistics, data collections or outputs.  

Where users request analysis that is not already published, we aim to respond promptly, subject to data availability and resources. Any ad hoc analysis produced for external users is published so that it is accessible to all. Users can also request additional breakdowns of data already in the public domain. These outputs are published as User Requested Data (URD) pages, which can include further breakdowns of published data, or data linked or combined from multiple sources that are all already publicly available.