1. Overview

1.1 About the public sector finances

Public sector finance statistics are compiled and published monthly in the Public Sector Finances (PSF) statistical bulletin, which aims to provide users with an indication of the current state of the UK’s fiscal position.

As well as being of wider interest to the general public, PSF statistics form a vital input to the policy and forecasting work of HM Treasury and the Office for Budget Responsibility (OBR).

PSF statistics are published jointly by the Office for National Statistics (ONS) and HM Treasury with each organisation’s responsibilities and accountabilities published on the ONS website.

1.2 Main public sector finance measures

The statistical aggregates published in the PSF bulletin are defined using national accounts concepts and rules. ONS produces the UK National Accounts on an internationally comparable basis, following the European System of Accounts 2010 (ESA2010), which in turn is largely consistent with the United Nations System of National Accounts 2008 (SNA2008).

There are 4 main measures:

  • public sector current budget deficit (PSCB) measures the gap between current receipts and current expenditure, less depreciation on capital assets

  • public sector net borrowing (PSNB), or net lending if there is a surplus, measures the gap between total revenue and total spending, including revenue and expenditure related to capital investment

  • public sector net debt (PSND) consists of the public sector’s financial liabilities (in the form of loans, debt securities, deposit holdings and currency) less its liquid financial assets (mainly foreign exchange reserves and cash deposits), both measured at face value

  • the central government net cash requirement (CGNCR) is a cash measure closely related to net borrowing (an accrued measure); it measures the government’s need to raise cash through, for example, issuing gilts or running down liquid assets
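
To make the relationships between these definitions concrete, the sketch below expresses 3 of the measures as simple identities. It is a minimal illustration only: all figures are hypothetical, the variable names are ours, and CGNCR is omitted because it is a cash measure compiled by balancing government cash accounts rather than derived from these components.

```python
# Minimal illustrative sketch of 3 of the main fiscal measures, following
# the wording of the definitions above. All figures are hypothetical
# (£ billion) and variable names are ours, not the PSF production systems'.

current_receipts = 700.0         # taxes, interest, dividends and so on
current_expenditure = 720.0      # day-to-day spending
depreciation = 40.0              # consumption of fixed capital
capital_receipts = 10.0          # revenue related to capital investment
capital_expenditure = 60.0       # expenditure related to capital investment

financial_liabilities = 1800.0   # loans, debt securities, deposits, currency
liquid_financial_assets = 150.0  # foreign exchange reserves, cash deposits

# PSCB: current receipts less current expenditure, less depreciation
# (a negative value is a current budget deficit)
pscb = current_receipts - current_expenditure - depreciation

# PSNB: total spending less total revenue, including the capital flows
# (a positive value is net borrowing; a negative value is net lending)
psnb = (current_expenditure + capital_expenditure) - (current_receipts + capital_receipts)

# PSND: financial liabilities less liquid financial assets, at face value
psnd = financial_liabilities - liquid_financial_assets

print(f"PSCB: £{pscb:.0f}bn, PSNB: £{psnb:.0f}bn, PSND: £{psnd:.0f}bn")
```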

Fiscal aggregates are published as 2 sets of measures. The first, the “ex measures”, excludes the public sector banks (currently comprising only the Royal Bank of Scotland (RBS) Group, although other banks have been excluded in the past) and is discussed further in section 3.2 of the methodological guide. These “ex measures” are the government’s preferred measures of the fiscal position and are used to set fiscal policy. A second set of measures includes the public sector banks. Both sets of aggregates are identical before the financial year ending March 2008.

In addition, in December 2016, 2 supplementary fiscal aggregates were introduced:

  • public sector net debt excluding both public sector banks and the Bank of England (PSND ex BoE)
  • public sector net financial liabilities (PSNFL), which consists of the public sector’s financial liabilities less its financial assets

At the time of writing, PSNFL is designated an Experimental Statistic, which means that its data sources and compilation methodologies are subject to further quality assurance before the published data can be considered as robust and accurate as the other statistics published in the PSF statistical bulletin.

Further detail on the context and methodological background of the PSF statistics can be found in the related methodological guide.

1.3 Background to this report

The statistics in the PSF bulletin (apart from PSNFL – see section 1.2) are designated as UK National Statistics, which means that they are produced, managed and disseminated according to the Code of Practice for Official Statistics.

The UK Statistics Authority (The Authority) has a statutory duty to assess (or reassess) all designated National Statistics against its Code of Practice. Its latest assessment report for the PSF bulletin was published in October 2015.

One of the requirements resulting from this assessment concerns the quality assurance of administrative data sources, where The Authority highlighted that:

“there is a clear need for ONS and HM Treasury to identify and investigate the quality assurance arrangements for the administrative data sources for public sector finances statistics”

To this end, the ONS PSF team has conducted an assessment of the administrative data sources used in the compilation of the PSF statistics in accordance with The Authority’s Administrative Data Quality Assurance Toolkit.

The Administrative Data Toolkit is the mechanism used by The Authority to determine whether the quality assurance arrangements, which are required for statistics compiled using administrative data, comply with the Code of Practice.

Quality assurance of administrative data is an ongoing process covering the entire statistical production process, from the point where data are collected through to the production of the final output and the sharing of information with users. As such, the Administrative Data Toolkit assesses 4 practice areas associated with data quality.

  1. Operational context and administrative data collection: this covers the environment and processes for compiling administrative data; factors which affect data quality and may cause bias; safeguards which minimise the risks; and the role of performance measurements and targets.

  2. Communication with data supply partners: this covers the collaborative relationships with data collectors, suppliers, IT specialists, policy and operational officials; formal agreements detailing arrangements; and engagement with collectors, suppliers and users.

  3. Quality assurance (QA) principles, standards and checks by data suppliers: this covers the data assurance arrangements for data collection and supply; quality information about the data from suppliers; and the role of operational inspection and internal or external audit in the data assurance process.

  4. Producers’ QA investigations and documentation: this covers QA checks carried out by producers of statistics, and quality indicators for input data and output statistics; strengths and limitations of the data in relation to use; and explanations for users about data quality and its impact on statistics.

In conjunction with these 4 practice areas, the Administrative Data Toolkit encourages consideration of potential quality issues associated with data, which may affect the quality of the statistics, as well as the nature of the public interest served by the statistics. Considered together, these areas form the “QA matrix”; assurance levels are assigned to data sources and/or datasets and then assessed on these bases.

The remainder of this report is structured as follows:

  • section 1 provides an overview of the outcomes of this work, detailing improvements or changes made as a result, and further quality assurance
  • section 2 provides an overview of the data sources used in the PSF statistics
  • section 3 discusses the approach we took to determine the level of quality assurance of the administrative data required
  • section 4 discusses our actions that provide evidence in support of the level of quality assurance for the administrative data sources
  • section 5 discusses our quality management actions

1.4 Approach to the assessment of the quality assurance of administrative data used in the public sector finances and improvements made during this project

1.4.1 Approach to the assessment

In order to assess the quality assurance of the administrative data used in the PSF statistics, we established a consistent and fair process, which allowed us to gather sufficient information from data suppliers while minimising burden. The approach taken was as follows:

  • all suppliers and data supplied were listed and prioritised by level of contribution to the PSF statistics
  • all suppliers received a “Public Sector Finances Supplier Questionnaire”, which examined various aspects of their administrative data, including but not limited to, data collection methods and quality assurance procedures
  • information within these questionnaires was collected and clarified through written communication, teleconferences or workshops
  • where Data Access Agreements or Service Level Agreements were not already in place, we have drafted these and are working with suppliers to agree them
  • main areas to consider in determining the level of risk of data quality concerns and public interest profile were established, in line with the Administrative Data Toolkit (our approach here is further explained in section 3)
  • assessment templates were created based on these main areas and information in the Administrative Data Toolkit
  • assessment templates were completed using information gathered from suppliers and initial drafts were sent to The Authority for review
  • once any additional information needed was gathered from suppliers, a re-drafted report was sent to suppliers for review and sign-off

We found this approach effective in gathering relevant information and in engaging with suppliers.

1.4.2 Improvements made as part of this assessment

As part of this project and the ongoing quality assurance of the administrative data used in the PSF statistics, we have made a number of changes to either our quality assurance procedures or the data used, which have improved the quality of the PSF statistics.

These are summarised in Table 1.

1.5 Statement of errors

Producers of National Statistics and other official statistics are required, by the Code of Practice, to correct errors discovered in statistical reports and alert stakeholders promptly. ONS identified and corrected errors in 2 PSF bulletins during The Authority’s 2015 assessment of these statistics, and 2 others after its report was published.

In October 2014, ONS and HM Treasury identified errors in the back history of the central government net cash requirement (CGNCR) series, relating to the period December 2012 to April 2014. Users were notified within 5 days of discovery, via a correction announcement on the ONS website, and the corrected time series were published in the 21 October 2014 bulletin. Colleagues at HM Treasury and the Debt Management Office (DMO) alerted their contacts in the media, the City of London and elsewhere to ensure maximum possible awareness of the announcement.

The issues and subsequent revisions to CGNCR reported in the 21 October 2014 bulletin were identified through work undertaken to reconcile 3 of the 4 main fiscal measures (that is, net cash requirement, net borrowing and net debt) and to reconcile the central government net cash requirement with cash reported in audited resource accounts.

To mitigate the risk of further similar errors we have built these reconciliation processes into the monthly production systems. The first of these new reconciliations, Table REC3 in the Public Sector Finances Tables 1 to 10: Appendix A dataset, reconciles CGNCR and net debt.
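
As an illustration of what such a reconciliation check involves, the sketch below compares the cash requirement plus known adjustment lines against the movement in net debt. The figures, adjustment lines, tolerance and function name are all hypothetical assumptions; the published Table REC3 contains more adjustment lines than shown here.

```python
# Minimal sketch of a CGNCR-to-net-debt reconciliation check, in the spirit
# of Table REC3. All figures and the tolerance are hypothetical.

def reconcile_cgncr_to_net_debt(cgncr, adjustments, change_in_net_debt,
                                tolerance=0.05):
    """Check that the cash requirement plus known adjustments (all in
    £ billion) explains the movement in net debt, within a tolerance."""
    explained = cgncr + sum(adjustments)
    discrepancy = change_in_net_debt - explained
    return abs(discrepancy) <= tolerance, discrepancy

ok, gap = reconcile_cgncr_to_net_debt(
    cgncr=8.2,                # cash raised in the period
    adjustments=[0.4, -0.1],  # for example, gilt premia and revaluations
    change_in_net_debt=8.52,
)
print(f"reconciled: {ok}, unexplained difference: £{gap:.2f}bn")
```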

In March 2015, ONS corrected an error in central government liquid assets data that was identified through quality assurance work at ONS and HM Treasury. The correction of the error resulted in a £5.5 billion decrease in central government bank deposits from October 2014 and a corresponding increase in public sector net debt. Users were notified of this error through the February 2015 PSF bulletin.

To mitigate the risk of further similar errors we have reviewed and improved our communication with our data suppliers.

Both of these errors are discussed in The Authority’s 2015 assessment report. Since The Authority’s assessment report was published in October 2015, errors have been identified and corrected in 2 PSF monthly statistical bulletins.

In September 2016, ONS and HM Treasury identified a processing error that led to the publication of incorrect data for the “net acquisition of company securities” component of the cash account for the financial year ending 2016. In the same month a processing error also led to figures for “current transfers received from abroad” being published with the wrong sign. Neither of these errors affected the 4 main fiscal measures described in section 1.2; both were limited to components within the detailed annex tables of the PSF bulletin (that is, Tables 6C, 6E, REC1 and REC2 in the Public Sector Finances Tables 1 to 10: Appendix A dataset). The errors were investigated and corrected, with new tables published along with appropriate correction notices.

In late October 2016, ONS and HM Treasury identified that a processing error had led to the gilt holdings of the Asset Purchase Facility (APF), published in the PSF bulletin of 21 October 2016, being overstated at the end of September 2016 by £11.0 billion. This affected the published public sector net debt estimates. The processing error arose because a gilt that had reached maturity was not recorded as redeemed within the APF assets. Correctly removing the gilt from the APF assets not only increased public sector net debt, but also slightly increased public sector net borrowing and the net cash requirement, due to the removal of interest payments to the APF relating to this gilt.

Senior management in ONS and HM Treasury considered carefully the possible impact of the correction on users of PSF statistics and concluded that it was unlikely to have an impact on the markets or other decisions based on the PSF statistics. Following this, the corrected PSF bulletin was published on the morning of 1 November 2016. Once the corrected bulletin had been published, ONS and HM Treasury informed their respective media contacts and PSF stakeholders of the correction, both through email and social media, to maximise awareness.

To mitigate the risk of similar errors to those described in this section recurring, additional checks have been incorporated into the relevant parts of the monthly PSF processing. Although the risk of errors can never be fully eliminated in a process as complex as the monthly production of the PSF statistics, the measures set out in this report are expected to significantly reduce the risk of future errors. Specifically, the work involves:

  • building stronger relationships with data suppliers
  • extending Data Access Agreements
  • improving the understanding of ONS, HM Treasury and data suppliers of the strengths and weaknesses of specific data sources
  • identifying additional quality checks to be carried out by ONS, HM Treasury and data suppliers

In addition to these aspects of the quality assurance framework, we believe that the risk of errors in published PSF statistics is reduced through the ongoing work to improve the published reconciliations between the different fiscal measures (and the new supplementary fiscal aggregates), described in section 1.2. These reconciliations are such important quality tools because the data sources for net cash requirement, net debt and net borrowing are largely distinct and separate. The process of reconciling the different fiscal measures therefore allows the consistency and coherence of different data sources to be examined and potential issues to be identified prior to publication.


2. Summary of data sources for the public sector finances

2.1 Overview of administrative data sources

Administrative data refers to information collected primarily for administrative reasons (not research). These types of data are collected by government departments and other organisations for purposes such as registration, conducting transactions, and record-keeping, usually in the context of delivering a service. Administrative data are primarily used for operational purposes and their statistical use is secondary.

We recognise that there are limitations with administrative data and that these can create complications when compiling official statistics. However, their use is central to the production of official statistics and the existence of such challenges places a premium on proactive quality assurance to investigate the data, manage identified issues, and clearly communicate any limitations to users.

An overview of the data sources used in the public sector finances is provided in this section; further detail is available in our methodological guide.

2.1.1 Central government

Central government’s contribution to the public sector borrowing and debt aggregates is largely compiled by HM Treasury, mainly using administrative data sources.

Central government expenditure data

The main source of central government expenditure data is HM Treasury’s public spending database, OSCAR (Online System for Central Accounting and Reporting), which contains financial information from central government departments. Prior to financial year ending 2013, the corresponding HM Treasury database was called COINS (Combined Online Information System).

A small number of expenditure items are not sourced from OSCAR. The largest of these, expenditure on debt interest, is calculated from a variety of sources – mostly the Debt Management Office (DMO) and HM Treasury finance systems, with some smaller contributions from other places, including National Savings and Investments (NS&I). Depreciation on assets is derived using the Office for National Statistics (ONS) perpetual inventory model.

Central government subsidies to public corporations are also mainly sourced from ONS public corporation data rather than through the central government collection. Additional data come from bodies such as the DMO, HM Revenue and Customs (HMRC), and the Bank of England.

Central government income data

Most central government income takes the form of tax receipts, the vast majority of which are collected by HMRC. Therefore, data comprising the majority of receipts in the PSF bulletin are collated and quality-assured by HMRC analysts from their administrative data sources before delivery to HM Treasury and ONS.

Additional income data sources include:

  • non-HMRC tax or levy raising bodies that supply data directly to ONS; these include vehicle excise duty, national non-domestic rates, and information from utilities regulators

  • HM Treasury administrative sources, including OSCAR, which cover the majority of dividend and interest receipts, as well as the TV licence fee receipts, and a number of smaller receipts items

  • ONS modelling for the gross operating surplus; by convention, government gross operating surplus is assumed to be equal to depreciation (more precisely, consumption of fixed capital), which is derived from ONS models that use average asset lives (a minimal sketch of this convention follows this list)
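
The following sketch illustrates the convention described in the final bullet, deriving consumption of fixed capital from past investment and an average asset life in the spirit of a perpetual inventory model. The straight-line profile and all figures are illustrative assumptions, not the actual ONS model.

```python
# Minimal sketch of consumption of fixed capital (CFC, i.e. depreciation)
# derived from past investment and an average asset life. Illustrative
# assumptions throughout; not the ONS perpetual inventory model itself.

def consumption_of_fixed_capital(investment_by_year, asset_life):
    """Straight-line CFC for the latest year: each vintage still in service
    contributes investment / asset_life; older vintages have worn out."""
    vintages_in_service = investment_by_year[-asset_life:]
    return sum(vintage / asset_life for vintage in vintages_in_service)

# 10 years of hypothetical gross investment (£ billion) and a 25-year
# average asset life; with fewer years of data than the asset life, the
# truncated history understates CFC
investment = [30 + year for year in range(10)]
print(f"CFC: £{consumption_of_fixed_capital(investment, 25):.1f}bn")
```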

Cash data

The majority of cash data comes from HM Treasury’s cash management systems, supplemented with data from the DMO, Bank of England, and other sources.

Estimates of the central government net cash requirement are produced via a system of balancing a number of central government accounts for which complete balances are produced each month: these include the Consolidated Fund, National Loans Fund, and the Debt Management Account.

2.1.2 Local government

Most local government data are annual and relate to financial years, although the Department for Communities and Local Government (DCLG) does have in-year collections for England. Detailed annual returns of expenditure and income are compiled by local authorities and collected by DCLG, Scottish Government, Welsh Government, and the Northern Ireland Executive.

Data for the current year are generally based on local government budgets and in-year data and are therefore prone to revision once finalised data are available. Finalised figures are based on audited accounts, which are available for England around 8 months after the end of the financial year and somewhat later for the devolved administrations.

2.1.3 Public corporations

ONS collects quarterly data directly from the 8 largest public corporations via survey questionnaires. Data for the remaining public corporations come from their published annual accounts or from data used for public corporations in HM Treasury’s Whole of Government Accounts. Data for public corporations are also prone to revision until final audited accounts are published.

2.1.4 Administrative data grouped by supplier

For ease of assessment and presentation, we have grouped administrative data sources by supplier. Appendix A provides a summary of all administrative data used in the compilation of the public sector finances.

Appendix A: Administrative data grouped by supplier

2.2 Overview of other data sources

In addition to the administrative data sources mentioned in Appendix A, we use a number of other datasets. These are listed in Table 2.


3. Determining levels of quality assurance for administrative data

3.1 Level of quality assurance and public interest profile

To determine the required quality assurance level of each administrative data source, we used the level of risk and public interest profile matrix set out in the UK Statistics Authority’s Administrative Data Quality Assurance Toolkit. We recognise that the toolkit should be used as a guide and that it does not contain an exhaustive list of factors to consider when determining the appropriate level. Therefore, we considered additional factors that we felt applied specifically to the public sector finances (PSF). In this section we discuss the process we undertook, and the aspects we considered, in order to determine the quality assurance level of each administrative data source.

To ensure that a consistent approach was applied to all of our administrative data sources, we identified, with the aid of the Administrative Data Toolkit, 6 main dimensions to determine the level of quality concern and 6 main areas to determine the public interest profile.

Dimensions used to determine the level of quality concern

Agreements in place for data supply: Service Level Agreements or Data Access Agreements ensure that data are supplied to an agreed specification and timetable; they also ensure certain practices are formally agreed between the producer and supplier, for example, information sharing, quality assurance procedures, communication or regular review of data supplied. We considered a lack of any formal agreements to constitute a high risk to the quality of the data.

Supplier awareness of how their data are used: as part of ensuring data supplied are correctly quality assured and fit-for-use, the supplier needs to have an awareness of how their data are used and the context in which they are used. We considered the supplier having no awareness of how their data are used within the PSF as high risk and good awareness as low risk.

Producer satisfaction with level of supplier quality assurance: as part of this project we asked all data suppliers to provide up-to-date information on their quality assurance procedures. If the supplier provided sufficient information and its quality assurance procedures ensured data supplied were fit-for-purpose, then a low or medium risk was assigned. However, if insufficient quality assurance procedures were in place, this was considered high risk.

Level of communication with data supplier: the level of communication needed between a producer and supplier can vary depending on the content of the delivery and the frequency at which it is supplied. We considered a lack of communication with data suppliers as high risk, and assigned a low or medium rating, depending on the frequency and depth of communication needed and achieved for a particular dataset or data source.

Moderation of high risk factors: data sources can have high risk factors associated with them; for example, multiple or complex data collection procedures, complicated systems or few checks in place. High risk factors can be mitigated in ways such as external independent audit of data, additional quality assurance, or effective communication. We took into consideration any potential high risk factors associated with a data source and/or dataset, and ascertained whether any actions were taken by the supplier to moderate these. If they had done so, a low or medium risk level was applied. If they had not, then this was considered high risk.

Level of contribution to PSF aggregates: so as not to place undue burden on data suppliers and to maintain an appropriate level of investigation, all of these dimensions were considered in the context of each dataset’s contribution to the main PSF aggregates (mentioned in section 2). Higher impact datasets were held to a greater level of scrutiny as even minor lapses in quality could result in large impacts on the final aggregates. Smaller impact datasets were considered low risk in many instances in accordance with their contribution to the final aggregates. The boundaries we applied were: less than 5% as low contribution; 5% to 25% as medium contribution and greater than 25% as high contribution.
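
As a concrete illustration, the contribution banding described in the previous paragraph can be expressed as a simple classification; the thresholds are those stated above, while the function name is ours.

```python
# Minimal sketch of the contribution banding: less than 5% is low,
# 5% to 25% is medium, and greater than 25% is high.

def contribution_band(share_of_aggregate):
    """Classify a dataset's contribution to the main PSF aggregates."""
    if share_of_aggregate < 0.05:
        return "low"
    if share_of_aggregate <= 0.25:
        return "medium"
    return "high"

print(contribution_band(0.03))  # low
print(contribution_band(0.12))  # medium
print(contribution_band(0.40))  # high
```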

Once all of these dimensions were taken into consideration, we were able to determine the overall level of quality concern.

Areas used to determine the public interest profile

Legislation: if data are required to fulfil a legal obligation, the Administrative Data Toolkit describes this as high public interest. Although the main PSF aggregates are not mandatory, the UK is legally obligated to supply government finance statistics to the European Commission as part of the excessive deficit procedure (EDP). Much of the data supplied for use within the PSF are also used to compile EDP returns. This increases the public interest profile for most datasets received.

Political sensitivity: the PSF are politically sensitive statistics; however, this does not mean that all of the underlying data are. Datasets or data sources related to issues that have high prominence in the political arena can be considered more politically sensitive.

Influence of public issues: public issues are those that are considered to impact the well-being of the general public. For example, the PSF have a high level of influence over public issues as they are used for monitoring and planning fiscal policy. However, not all of the datasets used to compile the PSF have the same level of influence. In this area we have looked into other uses and availability of the data supplied and relevance to public issues.

Media interest: the PSF can attract a lot of media interest; however, not all of the underlying data do. High media interest in statistics contributes to a higher public interest profile.

Market sensitivity: there is greater risk associated with market sensitive data as they can influence movements in the financial markets, so extra care is needed when handling these statistics. As with the other areas, the PSF are market sensitive statistics; however, the market sensitivity of the underlying data can vary.

Level of contribution to PSF aggregates: similarly to determining the level of quality concern, the contribution of each dataset to the PSF aggregates has been considered in conjunction with the other areas listed.

Once the public interest profile and level of quality concern had been determined, a quality assurance level was assigned. Although we assessed each dataset individually, the final assurance level assigned to a group (supplier) corresponded to the highest level for any dataset within that group (illustrated in the sketch below). Table 3 summarises the overall assurance levels assigned for each supplier, while Table 4, Appendix B and Appendix C provide details of the levels of quality concern and public interest profiles assigned.

The public interest profile and level of quality concern matrix is summarised in Table 4.
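
The sketch below illustrates the roll-up rule described above: a supplier group takes the highest assurance level assigned to any of its datasets. The level labels and their ordering are illustrative assumptions rather than the Toolkit’s exact terminology.

```python
# Minimal sketch of rolling dataset-level assurance up to supplier level.
# The labels and their ordering are illustrative assumptions.

LEVEL_ORDER = {"basic": 1, "enhanced": 2, "comprehensive": 3}

def group_assurance_level(dataset_levels):
    """Return the highest assurance level among a supplier's datasets."""
    return max(dataset_levels, key=lambda level: LEVEL_ORDER[level])

# A supplier with one enhanced-assurance dataset is treated as requiring
# enhanced assurance overall
print(group_assurance_level(["basic", "enhanced", "basic"]))  # enhanced
```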

3.2 Overall assurance level, by supplier

Appendix B: Level of risk of quality concern assigned

Appendix C: Public interest profile assigned


4. Evidence and actions supporting the level of quality assurance for administrative data sources based on the 4 practice areas

4.1 Overview

The quality assurance matrix comprises 4 working practice areas related to the quality assurance of official statistics and of the administrative data used to produce them.

These practice areas demonstrate the need for the quality assurance of statistics obtained from administrative sources to extend beyond the checks made by statistical producers on the data they receive. These 4 practice areas were briefly explained in section 1.3 of this report.

The Authority requires that producers demonstrate knowledge of the operational context in which the data are recorded, and an understanding of the impact that the motivations of data suppliers can have on data. Also, producers should have good communication links with data supply partners and understand their partners’ data quality processes and standards.

Table 5 provides a breakdown of the 4 practice areas: operational context and administrative data collection; communication with data supply partners; quality assurance principles, standards and checks applied by data suppliers; and producers’ quality assurance investigations and documentation.

The main mechanism used to gather information from suppliers has been through the use of a “Supplier Questionnaire”, which asked questions on the various aspects of their administrative data. This questionnaire was initially sent to suppliers in 2013, and as part of this project, suppliers were asked to review the information they had previously provided.

Using the assurance levels assigned for each supplier, we assessed our actions and those of our suppliers based on the 4 practice areas. Detailed information, by supplier, is provided in Table 8. An overview of our actions and evidence at a collective level is provided in this section.

Operational context and administrative data collection

We used a “Supplier Questionnaire” to learn about the operational context of the administrative data and data collection procedures. In this questionnaire suppliers were asked to provide information on:

  • how the data supplied are collected and the administrative context in which they are collected
  • potential bias in the data that could, for example, result from incomplete coverage of the data, and how they managed this (for example, via safeguards, imputation, estimation or forecasting)
  • quality assurance procedures applied to their source data as well as the aggregate data prior to sending to the PSF Branch
  • other areas, such as timeliness, coherence and revisions policies

Approximately 70% of suppliers responded to the questionnaire sent this year. For those that did not respond, we used previously supplied information or information available on their websites. Suppliers were generally very responsive to follow-up questions, whether through emails, teleconferences or quality workshops, and their input proved invaluable in the compilation of this document.

Communication with data supply partners

To ensure that the public sector finances (PSF) are fit-for-purpose, we communicate regularly with both suppliers and users. We communicate with them in a number of ways, including:

  • regularly scheduled meetings with important stakeholders, often to discuss data or methodological issues, or user needs (see Table 6)
  • ad-hoc meetings with important suppliers and users, which are usually either to discuss one-off events, or to discuss the need for additional analysis or outputs, or to inform suppliers of methodological changes that impact the data they supply
  • written communication or teleconferences with suppliers and users, when face-to-face meetings are not possible or necessary
  • a mailing list of PSF users, which is used to notify them about the latest release, publications or news releases
  • consultations and user engagement events – the PSF Branch consults on significant changes to methodology or presentation of data and runs user events either to increase user awareness or as part of a consultation

Quality assurance principles, standards and checks by data suppliers

We have investigated the quality assurance procedures of our data suppliers in the ways described in section 3. Commonalities we have identified amongst our suppliers are summarised below (fuller descriptions for each supplier are detailed in Table 10):

  • suppliers generally have quality assurance procedures in place at points where raw data are taken from administrative systems, during editing and manipulation stages, as well as prior to sending to the PSF Branch or publication
  • quality assurance procedures include validation checks such as those related to data changes, revisions, growth rates and checking for plausibility
  • data are often signed off by senior management prior to sending to the PSF Branch or publication
  • where suppliers publish their own data, quality assurance information is generally published alongside
  • revisions policies exist
  • most data suppliers are subject to some form of external auditing, but also have their own internal audit procedures

Producers’ quality assurance investigations and documentation

We have specific quality assurance procedures in place that begin at the point we receive data from our suppliers, carry through the processing stages and extend to publication. These generally fall into the following areas:

  • when data are received from suppliers, we check that the data are complete, that is, that all items are delivered for the relevant time periods
  • data are analysed by magnitude of revisions and growth rates in both monetary terms and percentages of the prior value – this is completed over several iterations for supplied data and during processing
  • quality assured data and a draft bulletin are sent to senior staff and main internal (within ONS and HMT) data suppliers for sign-off
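
As an indication of what the first two of these checks might look like in practice, the sketch below flags incomplete deliveries and large revisions against the previous publication. The column names, data and threshold are illustrative assumptions, not our production system.

```python
# Minimal sketch of two producer checks: completeness of a delivery and
# the size of revisions relative to the previous publication. Column
# names, figures and the threshold are illustrative assumptions.

import pandas as pd

def check_delivery(data: pd.DataFrame, expected_items: set,
                   revision_threshold_pct: float = 10.0) -> pd.DataFrame:
    """Flag missing items and large revisions in a supplier delivery."""
    # Completeness: every expected item should be present for the period
    missing = expected_items - set(data["item"])
    if missing:
        raise ValueError(f"incomplete delivery, missing: {sorted(missing)}")

    # Revisions: compare new values against the previous publication, in
    # both monetary terms and as a percentage of the prior value
    data = data.assign(
        revision=data["value"] - data["previous_value"],
        revision_pct=100 * (data["value"] - data["previous_value"])
                     / data["previous_value"].abs(),
    )
    return data[data["revision_pct"].abs() > revision_threshold_pct]

delivery = pd.DataFrame({
    "item": ["VAT", "income tax", "fuel duty"],
    "value": [12.1, 18.9, 2.3],
    "previous_value": [12.0, 18.8, 2.0],
})
# Flags the fuel duty row, which has a 15% revision
print(check_delivery(delivery, {"VAT", "income tax", "fuel duty"}))
```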

Flow charts providing an overview of this process are included in the Annex.

Details of quality assurance procedures for each dataset are described in Appendix D.

4.2 Evidence and actions, by supplier

Appendix D: Evidence and actions supporting the level of quality assurance required


5. Quality management actions

The Authority’s Administrative Data Toolkit highlights 3 quality management actions that producers should meet to ensure the suitability of administrative data. Table 7 lists the actions that the PSF Branch have taken to ensure this suitability.


6. Next steps

The Office for National Statistics (ONS) has committed to follow up this assessment with an administrative data quality assessment review every 2 years, with the first review taking place in early 2019.

As a part of this process, each supplier will be presented with the documentation they submitted for this assessment and asked whether their approach to the quality assurance of administrative data has changed in the intervening period. Any changes will be appropriately recorded within an updated suite of documentation and an overarching administrative data quality assessment report published.

As and when any new supplier of data is identified, we will carry out an individual assessment of its quality assurance procedures in line with this report.


7. Public sector finance conclusion

We are confident that we apply a high level of assurance to the published public sector finances statistics, from the source data being thoroughly validated as part of our routine operational work, to the revisions analysis (against the previous publication) prior to publication.

Public sector finance is a complex area and each publication contains a substantial number of supporting tables and reconciliations. Our statistical bulletin contains a summary section aimed at the inquiring citizen user along with detailed information targeted at expert users. We have worked with our digital publication teams to make our statistics accessible and have embraced social media to reach out to users and answer individual questions.

While we have identified 4 publication errors in recent years (see section 1.5, Statement of errors), none of these significantly impacted public sector net borrowing (the statistic that attracts the greatest media interest), and further quality assurance procedures have been introduced to reduce the risk of similar issues recurring.


8. Authors

Foyzunnesa Khatun, Emily Knock, Fraser Munro, Sarah Nightingale and Scott Symons


Contact details for this Methodology

Fraser Munro
fraser.munro@ons.gov.uk
Telephone: +44 (0)1633 456402