The Office for National Statistics (ONS) publishes annual estimates of public service productivity, which are badged as National Statistics. We produce estimates of inputs, output and productivity growth of nine service areas, four of which are adjusted for quality.
This article covers:
new data used and improvements to the quality adjustment of education
changes applied to healthcare data
changes in children's social care
These changes will be incorporated in Public service productivity, healthcare, England: financial year ending 2020 and Public service productivity: total, UK, 2019. They are in accordance with the Code of Practice for official statistics and follow discussion with government departments, the devolved administrations and relevant experts.
We welcome feedback to email@example.com. We will take this into consideration when developing the measures and methods of public service productivity.
Details of the quantity and quality output of education can be found in the Sources and methods article.
The quality adjustment for education has been developed over a number of years, in line with the recommendations published in the Atkinson review. Attainment data were previously used as a proxy for change in the quality of education, and the GCSE (or equivalent) results for a given year were applied to quality adjust the output of primary and secondary education for that year.
This method was changed in 2019, when a new "cohort split" approach was introduced. Using this method, the GCSE attainment data published for each academic year reflected the quality of teaching from Year 7 to Year 11.
Additional improvements were included in last year's publication, when the cohort split method was extended to account for GCSE results being "the outcome of 11 years of compulsory schooling" and the bullying indicator was introduced.
We propose incorporating the methodological changes discussed in this report into the existing measures.
Proposed changes: extending the cohort split to Reception (or equivalent) year
The current cohort split methodology considers 11 years of compulsory schooling. However, Reception year (in England and Wales), P1 (in Scotland) and Year 1 (in Northern Ireland) are currently excluded from the cohort split. We propose extending the cohort split to account for the first year of primary schooling.
To introduce the Reception year, we plan to keep the secondary school splits (from Year 7 to Year 11) and split the remaining 15% contribution from Reception to Year 6.
Year group | Current (%) | Proposed (%)
Reception (or equivalent) | - | 2

Table 1: Current and proposed weights for each school year's contribution to GCSE (or equivalent) attainment
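The cohort split can be illustrated in code. The sketch below is a minimal illustration of the idea, not the published method: it spreads each exam cohort's attainment growth back over the school years that taught that cohort. The weights are hypothetical placeholders (equal 17% secondary splits are assumed for illustration; only the 2% Reception share appears in Table 1), keyed by the number of years before the GCSE exam year.

```python
def cohort_split_contributions(attainment_growth_by_exam_year, weights):
    """Spread each exam cohort's attainment growth back over the school
    years that taught it; weights are keyed by years before the exam."""
    contributions = {}
    for exam_year, growth in attainment_growth_by_exam_year.items():
        for years_before, weight in weights.items():
            teaching_year = exam_year - years_before
            contributions[teaching_year] = (
                contributions.get(teaching_year, 0.0) + weight * growth
            )
    return contributions

# Hypothetical weights (offset 0 = Year 11, the exam year; offset 11 = Reception):
# secondary years share 85% (assumed equal here), and the remaining 15% is
# spread over Reception to Year 6, with Reception at the proposed 2%.
weights = {0: 0.17, 1: 0.17, 2: 0.17, 3: 0.17, 4: 0.17,   # Years 11 down to 7
           5: 0.03, 6: 0.02, 7: 0.02, 8: 0.02, 9: 0.02,   # Years 6 down to 3
           10: 0.02, 11: 0.02}                            # Year 2 to Reception

# 1% attainment growth for the cohort sitting GCSEs in 2019 is attributed
# to the teaching years 2008 (Reception) through 2019 (Year 11).
contributions = cohort_split_contributions({2019: 0.01}, weights)
```

Because the weights sum to one, the total growth attributed across teaching years equals the cohort's measured attainment growth.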
The primary school years of the cohort split are held constant, with the exception of Year 6. This is because, while there is evidence that primary school continues to influence later academic attainment up to the end of Year 11 (PDF, 745KB), the exact percentage contribution is less easily disentangled.
There is a wide range of evidence suggesting that the early years are the most important for skills development in primary school, as skills are more "malleable" at this age (see, for example, the Organisation for Economic Co-operation and Development (OECD) report on Fostering and Measuring Skills (PDF, 2.79MB) and The Effective Pre-school, Primary and Secondary Education Study (PDF, 745KB)). Evidence also suggests that factors outside the schooling system can have a larger impact on attainment (such as parents choosing schools (PDF, 767KB) and parental income). Additionally, as pupils age through the school system, they acquire subject-specific knowledge that may affect GCSE attainment more directly. Given these mixed conclusions in the literature, we hold the primary school splits constant.
Proposed changes: primary school attainment
The current method of quality adjustment in education considers attainment at the GCSE or equivalent level and the bullying adjustment (as published in our 2021 article). Although we recognise the importance of these measures, they may not fully reflect quality across the whole school system. Combining them with additional measures could help to develop a more comprehensive and holistic quality adjustment. Furthermore, the current measures focus on secondary school. For this reason, we have researched measures of quality of the earlier years of schooling.
The Atkinson report (PDF, 1.1MB) (recommendation 9.3) states that it would be beneficial to "measure, if possible, the quality of education delivered at younger ages rather than relying on examinations of those aged 16 years to proxy the whole education output". At the primary school stage, we consider skills development as a key outcome and goal of education. In particular, literacy and numeracy are the foundational skills most targeted by policy. The Scottish government's National Improvement Framework and Improvement Plan cites "ensuring that every child achieves the highest standards in literacy and numeracy" as one of their key objectives. Similarly, the Department for Education's (DfE) previous Single Departmental Plan considers a "good level of development" as those achieving at least the expected levels in literacy and numeracy, among other skills.
Accordingly, we propose extending current quality adjustments for primary schools by using Key Stage 2 attainment measures alongside existing measures. These are nationally representative, publicly available measures of attainment of those aged 11 years across the UK, which allow us to consider attainment in literacy and numeracy. You can find more information on the data used for each nation in our Sources and methods article, which will be updated in due course.
Consistent with the GCSE attainment cohort split, we also apportion primary school attainment equally between the seven years of primary schooling, since the year in which the test is taken is not the only year that contributes to this attainment. We propose assigning 14% of the national curriculum assessment attainment to each year of primary school from Reception (or equivalent) to Year 6, using the same methods applied in the past.
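The equal apportionment of primary attainment can be sketched as follows. This is an illustration of the proposed 14% split only; the function name and example figure are ours, and seven years at the published rounded share of 14% sum to 98% rather than 100%.

```python
def split_ks2_growth(ks2_growth):
    """Assign 14% of Key Stage 2 attainment growth to each primary year
    from Reception to Year 6, as proposed (14% is the rounded published
    share, so the seven shares sum to 98%)."""
    years = ["Reception", "Year 1", "Year 2", "Year 3",
             "Year 4", "Year 5", "Year 6"]
    return {year: 0.14 * ks2_growth for year in years}

# 1% growth in KS2 attainment, apportioned equally across primary years
primary_contributions = split_ks2_growth(0.01)
```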
Proposed changes: Disadvantaged pupil attainment gap index
Closing the attainment gap between disadvantaged and non-disadvantaged pupils has long been a priority for the UK education system. As a measure of attainment, the DfE stated that the disadvantaged gap index "is more resilient to changes to grading systems and assessment methods" and hence may be more comparable between years (Workless Households and Educational Attainment Statutory Indicators). The OECD report Equity and Quality in Education (PDF, 4.1MB) found that students from low socio-economic backgrounds are twice as likely as their peers to be low performers. Therefore, when considering attainment in school, it is important to also consider how equitable improvements in attainment are.
We propose to use the disadvantaged attainment gap index. This measure defines disadvantaged pupils as those attending primary school who have been eligible for free school meals at any point in the last six years, children looked after by a local authority, and children who have left local authority care in England and Wales. You can find more information in the DfE's methodology documents.
In incorporating the disadvantaged attainment gap index, we take the inverse of the growth rate, such that a fall in the index (as it gets closer to 0) reflects an improvement in quality. To reflect its relative importance, we use the DfE's pupil premium funding information: the proportion of total school funding assigned to the pupil premium (funding specifically targeted at supporting disadvantaged pupils) informs the weighting choice.
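The inversion and weighting described above can be sketched as a small calculation. This is an illustrative simplification, not the production method: the 5% pupil premium share and the index values are invented, and the combination with other quality indicators is reduced here to a single weighted average.

```python
def gap_quality_growth(index_previous, index_current):
    """Invert the gap index growth so that a narrowing gap (the index
    moving towards 0) registers as a quality improvement (ratio > 1)."""
    return index_previous / index_current

def combined_quality_growth(attainment_growth, gap_growth, pupil_premium_share):
    """Combine attainment growth with the inverted gap-index growth,
    weighting the latter by the pupil premium's share of school funding."""
    return ((1 - pupil_premium_share) * attainment_growth
            + pupil_premium_share * gap_growth)

gap_growth = gap_quality_growth(3.2, 3.0)  # gap narrowed, so growth > 1
quality = combined_quality_growth(1.005, gap_growth, pupil_premium_share=0.05)
```

Because the pupil premium is a small share of total school funding, the gap index receives a low weight, which is why its impact on the overall quality adjusted output index is minimal.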
This quality adjustment is based on data for England, as no equivalent measures are available covering other parts of the UK. Nevertheless, we have applied it to the output measure for all parts of the UK.
Including these two new measures leads to lower growth in quality adjusted output than under our current method. There are two reasons for this. Firstly, the weight of the GCSE attainment index for primary schools is reduced significantly to accommodate the introduction of the primary school attainment index. The primary school attainment index has grown at a slower rate than the GCSE attainment index since 1997, resulting in slower growth in the quality adjusted output estimate. Secondly, as primary schools have the largest expenditure weight in the non-quality adjusted education output series, any change to the quality adjustment for primary schools has a large impact.
The impact of the Key Stage 2 disadvantaged attainment gap index on the quality adjusted output index is minimal, because of the low weight given to the indicator. However, we still consider it to be an important addition in telling the story of quality in primary schools.
Future improvements to education quality adjustment
Development of the quality adjustment will continue, with additional data on well-being, mental health and other indicators to be considered. We are also keen to progress our analyses of employment-related measures, aiming to develop an accurate model that better disentangles their association with the quality of the education system.
However, understanding the impact of restrictions arising from the coronavirus (COVID-19) pandemic on education output is our main priority, as the first public service productivity estimates for 2020 will be published at the beginning of 2023.
Our public service healthcare productivity statistics use a wide range of output activities and sources, as described in our methods article.
In previous years, NHS England's National Cost Collection (NCC) has been used as the main data source to measure hospital and community healthcare services (HCHS) output. However, changes to data collection and challenges presented by the coronavirus (COVID-19) pandemic have affected the comparability of these data between the financial year ending (FYE) 2019 and FYE 2020 for some services. As a result, alternative data sources have been introduced to estimate growth for some services within HCHS.
To measure healthcare output growth between FYE 2019 and FYE 2020, two strategies have been followed in choosing an appropriate alternative data source.
In the first instance, we drew on the data sources used to produce the more timely estimates of output within our Quarterly National Accounts (QNAs) and monthly gross domestic product (GDP) estimates. These data are generally less comprehensive and less granular than NCC data.
We then supplement this with additional data sources that capture growth in healthcare components not measured in the timely National Accounts measures.
Using these additional data sources means that our annual healthcare estimate for FYE 2020 will be a more comprehensive estimate of healthcare output than that available from the timely estimates in the QNA. However, because of the limitations in the availability of our regular annual data sources, the annual healthcare estimate for FYE 2020 will be based on less-detailed data than estimates produced in earlier years.
Growth rates using alternative data sources were applied to expenditure in the FYE 2019 National Cost Collection, to minimise the impact on the weight of different services.
The new data sources are only used to estimate output growth in FYE 2020 and have not resulted in revisions to estimates for healthcare output growth in earlier years.
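The approach of applying alternative growth rates to FYE 2019 expenditure amounts to a base-weighted (Laspeyres-type) calculation, which can be sketched as below. The service names and figures are invented for illustration; the real calculation uses the full National Cost Collection service breakdown.

```python
def output_growth_fye2020(expenditure_fye2019, growth_rates):
    """Base-weighted output growth: each service's FYE 2019 expenditure
    is scaled by the activity growth measured from its alternative data
    source, so the relative weights of services stay anchored to the
    FYE 2019 National Cost Collection."""
    base = sum(expenditure_fye2019.values())
    grown = sum(spend * growth_rates[service]
                for service, spend in expenditure_fye2019.items())
    return grown / base

# Invented expenditure (in £ million) and FYE 2020 growth factors
expenditure = {"elective": 500.0, "emergency": 300.0, "mental_health": 200.0}
growth = {"elective": 0.98, "emergency": 1.03, "mental_health": 1.01}

index = output_growth_fye2020(expenditure, growth)
```

Anchoring to FYE 2019 expenditure means the new sources only contribute growth rates, not levels, so they do not disturb the relative weight of each service in overall healthcare output.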
Elective care, day cases and non-elective care
A fall in the number of trusts reporting activity data contributed to abnormal activity growth rates within the NCC, which contradicted other, more comprehensive sources of activity data. As a result, we have instead used the growth in the number of finished admission episodes reported in the Monthly Activity Returns (MAR), taken from NHS Digital's hospital episode statistics (HES) dataset. We also use this data source in our timely estimates of healthcare output within the QNA. The data are highly aggregated, providing activity for all specialities by elective care, day cases and non-elective care.
Critical care

Monthly situation reports from NHS England were used to estimate growth in critical care output. The change in the average number of occupied adult, paediatric and neonatal critical care beds was used as an alternative to the NCC data. This source is also used to inform our QNA estimates.
Accident and emergency
New reporting categories have had an impact on comparability to FYE 2019. NCC data reported a drop in emergency care activity in FYE 2020, whereas data from HES and NHS England monthly situation reports both reported an increase. Data from the NHS England situation reports on attendances have been used instead. This is consistent with our QNA estimates.
Mental health care
In FYE 2020 most NHS trust mental health care activity was derived from patient-level information costings systems (PLICS). PLICS will ultimately provide more robust estimates of activity and unit costs than the previously reported reference costs, but the data in FYE 2020 are not comparable to data presented for FYE 2019. Consequently, we have used alternative sources to provide information on mental health service activity growth for these services.
We have used a set of indicators from NHS Digital's monthly mental health statistics release to measure growth in mental health care cluster activity and secure services. We measure the former through the increase in the number of people assigned to a care cluster, and the latter by the increase in the number of people subject to detention at the end of the reporting period.
To measure the growth in services provided under the Improving Access to Psychological Therapies (IAPT) programme, we have used NHS Digital's IAPT statistics on the number of appointments attended and the number of first assessments completed.
Ambulance services

Recording practices for ambulance calls have changed, so data from NHS England on calls are used instead of NCC data. For the rest of ambulance services, NCC data are still used.
Community health services
A larger number of trusts than usual did not submit data for the full set of community health services in FYE 2020. The growth rate in output for this year was therefore calculated for each subset of community health services (groups of services such as community nursing or midwifery) using data from trusts that did submit for that subset of services in both FYE 2019 and FYE 2020. This enabled each subset of community health services, and by extension overall community health services, to be estimated. However, because of missing data, the overall weight of community health services in healthcare output will be reduced in FYE 2020.
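The matched-trust approach described above can be sketched in a few lines. The trust names and activity figures are invented for illustration: only trusts appearing in both years' submissions contribute to the growth rate for a service subset.

```python
def matched_trust_growth(activity_fye2019, activity_fye2020):
    """Growth for a subset of community health services, computed only
    from trusts that submitted data in both FYE 2019 and FYE 2020."""
    matched = activity_fye2019.keys() & activity_fye2020.keys()
    base = sum(activity_fye2019[trust] for trust in matched)
    latest = sum(activity_fye2020[trust] for trust in matched)
    return latest / base

# Invented activity counts for one service subset
fye2019 = {"trust_a": 100.0, "trust_b": 80.0, "trust_c": 60.0}
fye2020 = {"trust_a": 105.0, "trust_b": 76.0}  # trust_c did not submit

growth = matched_trust_growth(fye2019, fye2020)  # (105 + 76) / (100 + 80)
```

Non-submitting trusts drop out of both the numerator and denominator, which keeps the growth rate unbiased by missing returns but reduces the effective weight of the service in overall output, as noted above.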
High-cost drugs

In FYE 2020, high-cost drugs data were not disaggregated by whether the drugs were administered in admitted patient, outpatient or other settings. Output growth was therefore determined at a less granular level than usual.
In the forthcoming release, we will be using government expenditure data consistent with Blue Book 2021.
We have also made some more general systems improvements to maintain best practice and improve consistency across the different aspects of our processing. The additions in this article, and in the forthcoming publication, would not have been possible without these system improvements.
With particular thanks to the Frontier Economics team for the collaboration and their significant contribution to the children's social care article.
We are grateful to colleagues at the Office for National Statistics (ONS), academics, and colleagues in various government departments for providing helpful comments over the last year. Developing measures of public service productivity has been recognised as particularly challenging (since the goods and services are provided free at the point of delivery), and continuing the discussion with experts will be crucial to improving our statistics.
Contact details for this Methodology

Telephone: +44 1633 455759