1. Overview of the improved methods for public service productivity

The Office for National Statistics (ONS) publishes annual estimates of public service productivity, which are badged as National Statistics. We produce estimates of inputs, output and productivity growth of nine service areas, four of which are adjusted for quality.

This article covers:

  • new data used and improvements to the quality adjustment of education

  • changes applied to healthcare data

  • changes in children's social care

These changes will be incorporated in Public service productivity, healthcare, England: financial year ending 2020 and Public service productivity: total, UK, 2019. They are in accordance with the Code of Practice for official statistics and follow discussion with government departments, the devolved administrations and relevant experts.

We welcome feedback at productivity@ons.gov.uk and will take it into consideration when developing the measures and methods of public service productivity.


2. New quality adjustment for primary education

Details of the quantity and quality output of education can be found in the Sources and methods article.

Current method

The quality adjustment for education has been developed over a number of years, in line with the recommendations published in the Atkinson review. Attainment data were previously used as a proxy for change in the quality of education, and the GCSE (or equivalent) results for a given year were applied to quality adjust the output of primary and secondary education for that year. 

This method was changed in 2019, when a new "cohort split" approach was introduced. Using this method, the GCSE attainment data published for each academic year reflected the quality of teaching from Year 7 to Year 11.

Additional improvements were included in last year's publication, when the cohort split method was extended to account for GCSE results being "the outcome of 11 years of compulsory schooling" and the bullying indicator was introduced.

We propose incorporating the methodological changes discussed in this report into the existing measures.

Proposed changes: extending the cohort split to Reception (or equivalent) year

The current cohort split methodology considers 11 years of compulsory schooling. However, Reception year (in England and Wales), P1 (in Scotland) and Year 1 (in Northern Ireland) are currently excluded from the cohort split. We propose extending the cohort split to account for the first year of primary schooling.

To introduce the Reception year, we plan to keep the secondary school splits (from Year 7 to Year 11) and split the remaining 15% contribution from Reception to Year 6.

The primary school years of the cohort split are held constant, with the exception of Year 6. This is because, while there is evidence that primary school continues to influence later academic attainment up to the end of Year 11 (PDF, 745KB), the exact percentage contribution is less easily disentangled.

There is a wide range of evidence suggesting that the early years are most important for skills development in primary school, as these skills are more "malleable" at this age (see, for example, the Organisation for Economic Co-operation and Development (OECD) report on Fostering and Measuring Skills (PDF, 2.79MB) and The Effective Pre-school, Primary and Secondary Education Study (PDF, 745KB)). Evidence also suggests that factors outside of the schooling system can have a larger impact on attainment (such as parents choosing schools (PDF, 767KB) and parental income). Additionally, as pupils age through the school system, they acquire subject-specific knowledge that may affect GCSE attainment more directly. Given these mixed conclusions in the literature, we hold the primary school splits constant.

Proposed changes: primary school attainment

The current method of quality adjustment in education considers attainment at the GCSE or equivalent level and the bullying adjustment (as published in our 2021 article). Although we recognise the importance of these measures, they may not fully reflect quality across the whole school system. Combining them with additional measures could help to develop a more comprehensive and holistic quality adjustment. Furthermore, the current measures focus on secondary school. For this reason, we have researched measures of quality for the earlier years of schooling.

The Atkinson report (PDF, 1.1MB) (recommendation 9.3) states that it would be beneficial to "measure, if possible, the quality of education delivered at younger ages rather than relying on examinations of those aged 16 years to proxy the whole education output". At the primary school stage, we consider skills development a key outcome and goal of education. In particular, literacy and numeracy are the foundational skills most targeted by policy. The Scottish government's National Improvement Framework and Improvement Plan cites "ensuring that every child achieves the highest standards in literacy and numeracy" as one of its key objectives. Similarly, the Department for Education's (DfE) previous Single Departmental Plan defines a "good level of development" as achieving at least the expected levels in literacy and numeracy, among other skills.

Accordingly, we propose extending current quality adjustments for primary schools by using Key Stage 2 attainment measures, alongside existing measures. These are nationally representative, publicly available measures of attainment of those aged 11 years across the UK, which allow us to consider attainment in literacy and numeracy. You can find more information on the data used for each nation in our Sources and methods article, which will be updated in due course.

Consistent with the GCSE attainment cohort split, we also apportion primary school attainment equally across the seven years of primary schooling, since the year in which the test is taken is not the only year that contributes to this attainment. We propose assigning 14% of the national curriculum assessment attainment to each year of primary school from Reception (or equivalent) to Year 6, using the same methods applied in the past.
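As a rough illustration of this apportionment, the sketch below spreads one exam cohort's Key Stage 2 attainment equally across the seven contributing school years. This is a minimal sketch only: the function name and attainment value are hypothetical, and it does not reproduce how this index is combined with the GCSE cohort split in the published method.

```python
# Illustrative sketch only: hypothetical values and function names, not the
# ONS production code. It shows the idea of apportioning a cohort's Key
# Stage 2 attainment equally (roughly 14% each) across the seven school
# years, Reception to Year 6, that contributed to it.

PRIMARY_YEARS = ["Reception", "Year 1", "Year 2", "Year 3",
                 "Year 4", "Year 5", "Year 6"]

def apportion_ks2_attainment(exam_year: int, attainment_index: float) -> dict:
    """Spread one exam cohort's attainment equally over its seven school years."""
    share = 1 / len(PRIMARY_YEARS)  # about 14% per school year
    contributions = {}
    for offset, school_year in enumerate(reversed(PRIMARY_YEARS)):
        # Year 6 pupils sat the test in exam_year; the same cohort was in
        # Reception six academic years earlier.
        teaching_year = exam_year - offset
        contributions[(teaching_year, school_year)] = share * attainment_index
    return contributions

# Example with a hypothetical attainment index value for the 2019 exam cohort.
print(apportion_ks2_attainment(2019, 102.0))
```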

Proposed changes: Disadvantaged pupil attainment gap index

Closing the attainment gap between disadvantaged and non-disadvantaged pupils has long been a priority for the UK education system. As a measure of attainment, the DfE stated that the disadvantaged gap index "is more resilient to changes to grading systems and assessment methods" and hence may be more comparable between years (Workless Households and Educational Attainment Statutory Indicators). The OECD report Equity and Quality in Education (PDF, 4.1MB) found that students from low socio-economic backgrounds are twice as likely to show low performance as their peers. Therefore, when considering attainment in school, it is important to simultaneously consider how equitable improvements in attainment are.

We propose to use the disadvantaged attainment gap index. This measure defines disadvantaged pupils as those attending primary school who have been eligible for free school meals at any point in the last six years, are looked after by a local authority, or have left local authority care in England and Wales. You can find more information in the DfE's methodology documents.

In incorporating the disadvantaged attainment gap index, we take the inverse of the growth rate, such that a fall in the index (as it gets closer to 0) reflects an improvement in quality. To reflect its relative importance, we use the DfE's pupil premium funding information: the proportion of total school funding assigned to the pupil premium (funding specifically targeted at supporting disadvantaged pupils) informs the weighting choice.
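This calculation can be sketched as follows. It is a minimal illustration with made-up figures: the gap index growth rate is inverted so that a narrowing gap registers as a quality improvement, and the pupil premium share of total school funding informs its weight. The function names and funding values are assumptions for exposition only.

```python
# Illustrative sketch only, with made-up numbers. A fall in the disadvantaged
# attainment gap index is treated as an improvement, so its growth rate is
# inverted; the weight is informed by the pupil premium share of school funding.

def gap_index_quality_growth(gap_previous: float, gap_current: float) -> float:
    """Inverse of the gap index growth: a narrowing gap gives growth above 1."""
    return gap_previous / gap_current

def pupil_premium_weight(pupil_premium_funding: float,
                         total_school_funding: float) -> float:
    """Weight for the gap index, based on its share of total school funding."""
    return pupil_premium_funding / total_school_funding

# Hypothetical values: the gap index narrows from 3.2 to 3.1, and pupil premium
# funding is 2.4bn out of 45bn of total school funding.
growth = gap_index_quality_growth(3.2, 3.1)   # about 1.03, an improvement
weight = pupil_premium_weight(2.4, 45.0)      # about 0.05
print(growth, weight)
```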

This quality adjustment is based on data for England, as no equivalent measures are available covering other parts of the UK. We have therefore applied the England-based adjustment to the output measure for all parts of the UK.

Including these two new measures leads to lower growth in quality adjusted output compared with our current method. There are two reasons for this. Firstly, the weight of the GCSE attainment index is reduced significantly for primary schools to account for the introduction of the primary school attainment index. The primary school attainment index has grown at a slower rate than the GCSE attainment index since 1997, resulting in a slower rate of growth in the quality adjusted output estimate. Secondly, as primary schools have the largest expenditure weight in the non-quality adjusted education output series, a change to the quality adjustment for primary schools is likely to have a large impact.

The impact of the Key Stage 2 disadvantaged attainment gap index on the quality adjusted output index is minimal, because of the low weight given to the indicator. However, we still consider it to be an important addition in telling the story of quality in primary schools.

Future improvements to education quality adjustment

Development of the quality adjustment will continue, with additional data on well-being, mental health and other indicators to be considered. We are also keen to progress our analyses of employment-related measures, aiming to develop an accurate model to better disentangle their association with the quality of the education system.

However, understanding the impact of restrictions arising from the coronavirus (COVID-19) pandemic on education outputs is our main priority, as the first public service productivity estimates for 2020 will be published at the beginning of 2023.


3. Children’s social care

Current and proposed methods for quantity output

Children's social care (CSC) includes the provision of social work, personal care, protection or social support services to children in need or at risk.

The current CSC output measures include looked-after children (LAC), children in need, Sure Start schemes, adoption and other activities. Both direct and indirect measures are adopted for CSC.

Approximately one-third of output, covering LAC services, is measured directly. The remaining two-thirds of CSC output, covering non-looked-after children (non-LAC), is measured indirectly using the output=inputs convention. This approach is not ideal, since measured productivity growth is then always zero by construction. In addition, the current approach is not quality adjusted.

To improve our measurement, additional measures for direct quantity output, quality adjustment and casemix have been developed for CSC.

We have identified four areas of CSC activity as direct measures of output. These are adoptions, special guardianship orders (SGO), care leavers services and safeguarding services. The inclusion of these additional service areas increases the percentage of CSC services that are directly measured from a third to approximately two-thirds in 2018.

Adoptions

CSC services are responsible for the processes that determine when it is appropriate to place a looked-after child for adoption. We capture the output of adoption services as the number of looked-after children adopted over the year. Children who are adopted cease to be reported in the data collection for looked-after children.

Special guardianship orders

Special guardianship orders (SGOs) are a type of care order used in England and Wales to place a child who needs to be looked after with a long-term guardian. Children with SGOs are recorded as having left care. We record the number of SGOs as an output, capturing the services carried out by CSC to place a child with a guardian under an SGO.

Care leavers

These are defined as children who were previously looked after for at least 13 weeks after their 14th birthday in England, and between the ages of 16 and 19 years in the other nations. CSC services have statutory responsibilities to provide care and support to care leavers, who may have specific needs. We capture the output of care leavers' services as the number of care leavers reported in the year. Where possible, this is limited to the number of care leavers in receipt of services (rather than all those eligible), or those care leavers still in touch with the local authority, indicating they are more likely to be receiving some level of support.

Safeguarding services

These services cover statutory services provided to children who, for various reasons, need support from children's social care services, and to protect children from harm. In England, these services fall under two main categories: services provided to children in need (CIN) and services provided to children with a child protection plan (CPP). CIN may need services because of family circumstances or for multiple other reasons, including having a special educational need or disability (SEND). Children with a CPP have been assessed as being at significant risk of harm, and their cases require more intensive monitoring and engagement.

We identified two appropriate data series to measure the output of safeguarding services: the number of CIN and the number of children on a CPP. We propose to sum the number of CIN and the number of children on a CPP to estimate the total number of children receiving safeguarding services. It is not possible to estimate the two service areas separately, because expenditure data are not available for services provided to CIN separately from those provided to children with a CPP.

Figure 2 shows the difference in output between the current method and the proposed method.

Including the new direct measures of output leads to a flattening of output from 2014 to 2017, with an uptick in 2018. This is primarily because of the introduction of safeguarding activity, which exhibits slower growth in the latest years.

Proposed methods for casemix adjustment for direct quantity output

In public service productivity, activity measures are combined using expenditure weights to create a cost-weighted activity index (CWAI). Our Sources and methods article explains our methodology in more detail.

Weighting by the cost of an activity assumes that the value of the activity is reflected in the cost. For example, if a simple count of the number of fostering activities stays the same but more inputs are needed to provide these activities because of an increase in the age of children being taken into foster care, productivity will fall. However, if the same fostering activities are measured in a more granular way, with larger cost weights attached to fostering activity associated with older children, output would rise too, and productivity may not fall at all. More granular measurement of activities and costs can sometimes provide a better measure of output when the mix of high or low-cost activities changes, and therefore provide a more accurate estimate of productivity.
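To illustrate the cost-weighting described above, the sketch below computes growth in a cost-weighted activity index for two hypothetical fostering categories. All category names, activity counts and unit costs are invented for exposition; the production CWAI covers many more activity categories.

```python
# Minimal sketch of a cost-weighted activity index (CWAI) growth calculation.
# Activity categories, counts and unit costs are hypothetical.

def cwai_growth(activity_prev: dict[str, float],
                activity_curr: dict[str, float],
                unit_cost_prev: dict[str, float]) -> float:
    """Output growth as a base-period cost-weighted average of activity growth."""
    expenditure_prev = {k: activity_prev[k] * unit_cost_prev[k] for k in activity_prev}
    total_exp = sum(expenditure_prev.values())
    return sum(
        (expenditure_prev[k] / total_exp) * (activity_curr[k] / activity_prev[k])
        for k in activity_prev
    )

activity_2017 = {"fostering_under_10": 800, "fostering_over_10": 400}
activity_2018 = {"fostering_under_10": 780, "fostering_over_10": 460}
unit_cost_2017 = {"fostering_under_10": 30_000, "fostering_over_10": 45_000}

print(cwai_growth(activity_2017, activity_2018, unit_cost_2017))  # about 1.05
```

With a single undifferentiated fostering category the total count would be flat, but attaching the larger cost weight to the older-children category registers output growth when the mix shifts towards them.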

In the absence of unit cost information, it is possible to approximate a unit cost approach using publicly available data on casemix, as described below.

The casemix adjustment includes:  

  • safeguarding adjustment: the percentage of CIN with CPPs

  • LAC adjustment (for non-secure, secure, adoptions and SGOs): the age of looked after children

Only data for England have been used. The final casemix deflator is applied to each of the directly measured activity categories for each of the devolved administrations. We adopt a regression-based method to create casemix adjustment weights (see Section 6 for details). This regression estimates the relationship between local authority expenditure and casemix factors such as the age of the child, their primary need code (PDF, 667KB) and child protection status, separately for safeguarding and for the LAC adjustment. No casemix adjustment is made to care leavers due to a lack of data.
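As a rough illustration of how regression-estimated incremental costs can feed a casemix index, the sketch below uplifts measured safeguarding output when the share of children with a child protection plan rises. The coefficient values and casemix shares are hypothetical, not the estimates used in the published series.

```python
# Illustrative sketch: turning regression-estimated incremental costs into a
# casemix index for safeguarding activity. Coefficients and casemix shares
# are hypothetical.

def expected_cost_per_case(base_cost: float,
                           extra_cost_cpp: float,
                           share_cpp: float) -> float:
    """Expected cost of an average case given the share with a child protection plan."""
    return base_cost + extra_cost_cpp * share_cpp

def casemix_index(share_cpp_prev: float, share_cpp_curr: float,
                  base_cost: float = 5_000.0,
                  extra_cost_cpp: float = 7_500.0) -> float:
    """Ratio of expected cost per case between two years: > 1 means a costlier mix."""
    return (expected_cost_per_case(base_cost, extra_cost_cpp, share_cpp_curr)
            / expected_cost_per_case(base_cost, extra_cost_cpp, share_cpp_prev))

# If the share of children in need with a child protection plan rises from
# 12% to 14%, this hypothetical casemix index uplifts output by about 2.5%.
print(casemix_index(0.12, 0.14))
```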

Figure 3 shows that applying the casemix adjustment to the direct output measure leads to an increase in casemix-adjusted CSC output from 2014 onwards. This is because there is a higher proportion of adolescents in care over time, leading to higher expected costs and higher casemix-adjusted output.

Proposed methods for quality adjustment

For the first time, we introduce quality adjustment to CSC output. Quality adjustment accounts for possible changes in the quality of the service over time, as described in our guide.

The following is a summary of the shortlisted quality adjustment series identified for the service areas of safeguarding, secure and non-secure care services and care leavers services.

Re-referrals and re-registrations

These occur when a child who has previously been referred to CSC services is referred again within 12 months (applying primarily to children in need). Similarly, for CPP, re-registrations occur when a child who had previously had a child protection plan is re-registered as at risk and starts a new child protection plan. Re-referrals and re-registrations suggest that the initial service was unsuccessful, and so reflect lower quality. We therefore treat an increase in re-referrals and re-registrations, as reported by the DfE and StatsWales, as a decrease in quality. Given that this measure relates to the quality of the initial service, we lag it by 12 months.

Placement stability

This is measured as the percentage of children moving between placements two or more times within a year, and it is widely considered a key measure of CSC effectiveness; for example, it is used by the Children's Commissioner for England to produce a Stability Index for children in care. Since more placement moves within a short period of time can have negative impacts on children's outcomes, stability is considered a primary objective of CSC services, as reported by the DfE. As such, we treat an increase in stability (a fall in the percentage of children with two or more placement moves) as an increase in quality.

Care leavers

The care leaver outcomes reported in England, Scotland and Wales are a key measure of the effectiveness of care leavers' services at supporting children moving on from placements. The outcomes include the percentage of care leavers living in suitable accommodation and the percentage who are not in employment, education or training (NEET). Providing suitable accommodation reflects positive quality of CSC service provision, since it can be used to monitor whether care leavers receive adequate support to transition to adulthood successfully. NEET outcomes for care leavers have been associated with negative long-term consequences, including higher rates of homelessness, mental health problems and imprisonment. As such, we use data from the DfE and StatsWales, with an increase in the percentage of care leavers who are NEET reflecting a fall in quality.

Quality measures were shortlisted if they were fully attributable to CSC services, as opposed to other public service areas or external factors. We excluded other measures (for example, the percentage of child protection processes carried out within the recommended timeframes) since they were not necessarily reflective of year-on-year improvements in outcomes for children, and relate to processes rather than our preferred focus on child outcomes. A sketch of how such indicators can be combined is shown below.
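The sketch combines the three shortlisted indicators into a single quality factor. The simple weighted-average form, the weights and the indicator values are all assumptions for exposition, not the published quality adjustment method.

```python
# Illustrative sketch of combining quality indicators into a single quality
# adjustment factor. Values, weights and the weighted-average form are
# hypothetical.

def indicator_growth(prev: float, curr: float, lower_is_better: bool) -> float:
    """Quality growth implied by one indicator; re-referrals, placement moves
    and NEET rates are 'lower is better', so their growth is inverted."""
    return prev / curr if lower_is_better else curr / prev

def quality_factor(indicators: dict[str, tuple[float, float, bool]],
                   weights: dict[str, float]) -> float:
    """Weighted average of indicator growth rates."""
    total_w = sum(weights.values())
    return sum(weights[name] * indicator_growth(*vals)
               for name, vals in indicators.items()) / total_w

indicators = {
    # (previous value, current value, lower_is_better); re-referrals lagged 12 months
    "re_referral_rate": (22.0, 21.5, True),
    "pct_two_or_more_moves": (10.5, 10.2, True),
    "pct_care_leavers_neet": (39.0, 38.0, True),
}
weights = {"re_referral_rate": 0.4, "pct_two_or_more_moves": 0.3,
           "pct_care_leavers_neet": 0.3}

print(quality_factor(indicators, weights))  # slightly above 1: quality improved
```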

The impact of introducing quality adjustment into CSC output can be seen in Figure 4. This is shown alongside the impact of the casemix approach.

Introducing quality adjustment and casemix uplifts output, as most measures of quality improved between 2014 and 2018.


4. Healthcare output data

Our public service healthcare productivity statistics use a wide range of output activities and sources, as described in our methods article.  

In previous years, NHS England's National Cost Collection (NCC) has been used as the main data source to measure hospital and community healthcare services (HCHS) output. However, changes to data collection and challenges presented by the coronavirus (COVID-19) pandemic have affected the comparability of these data between the financial year ending (FYE) 2019 and FYE 2020 for some services. As a result, alternative data sources have been introduced to estimate growth for some services within HCHS.

To measure healthcare output growth between FYE 2019 and FYE 2020, two strategies have been followed in choosing an appropriate alternative data source.

In the first instance, we used the data sources that produce the more timely estimates of output within our Quarterly National Accounts (QNA) and monthly gross domestic product (GDP) estimates. These data are generally less comprehensive and less granular than NCC data.

We then supplemented these with additional data sources that capture growth in healthcare components not measured in the timely National Accounts estimates.

Using these additional data sources means that our annual healthcare estimate for FYE 2020 will be a more comprehensive estimate of healthcare output than that available from the timely estimates in the QNA. However, because of the limitations in the availability of our regular annual data sources, the annual healthcare estimate for FYE 2020 will be based on less-detailed data than estimates produced in earlier years.

Growth rates derived from alternative data sources were applied to expenditure in the FYE 2019 National Cost Collection, to minimise the impact on the weights of different services.
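As a simplified illustration of this approach, the sketch below applies growth factors from alternative sources to FYE 2019 expenditure shares. The service categories, expenditure figures and growth rates are hypothetical.

```python
# Minimal sketch of applying growth rates from alternative sources to FYE 2019
# National Cost Collection expenditure weights. Service names, expenditure and
# growth figures are hypothetical.

ncc_expenditure_fye2019 = {          # £ million, hypothetical
    "elective_and_daycase": 20_000,
    "non_elective": 25_000,
    "critical_care": 4_000,
    "ambulance": 3_000,
}

growth_fye2020 = {                   # growth factors from alternative sources
    "elective_and_daycase": 0.98,    # e.g. Monthly Activity Returns / HES
    "non_elective": 1.01,
    "critical_care": 1.03,           # e.g. occupied beds from situation reports
    "ambulance": 1.02,               # e.g. NHS England call volumes
}

total = sum(ncc_expenditure_fye2019.values())
output_growth = sum(
    (ncc_expenditure_fye2019[s] / total) * growth_fye2020[s]
    for s in ncc_expenditure_fye2019
)
print(output_growth)   # expenditure-weighted output growth for FYE 2020
```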

The new data sources are only used to estimate output growth in FYE 2020 and have not resulted in revisions to estimates for healthcare output growth in earlier years.

Elective care, day-cases and non-elective care

A lower than usual number of trusts reporting activity data contributed to abnormal activity growth rates within the NCC, which contradicted other, more comprehensive sources of activity data. As a result, we have instead used the growth in the number of finished admission episodes reported in the Monthly Activity Returns (MAR), taken from NHS Digital's hospital episode statistics (HES) dataset. We also use this data source in our timely estimates of healthcare output within the QNA. The data are highly aggregated, providing activity for all specialities by elective care, day cases and non-elective care.

Critical care

Monthly situation reports from NHS England were used to estimate growth in critical care output. The change in the average number of occupied adult, paediatric and neonatal critical care beds was used as an alternative to the NCC data. This source is also used to inform our QNA estimates.

Accident and emergency

New reporting categories have affected comparability with FYE 2019. NCC data reported a drop in emergency care activity in FYE 2020, whereas data from HES and NHS England monthly situation reports both reported an increase. Data from the NHS England situation reports on attendances have been used instead. This is consistent with our QNA estimates.

Mental health care

In FYE 2020, most NHS trust mental health care activity was derived from patient-level information and costing systems (PLICS). PLICS will ultimately provide more robust estimates of activity and unit costs than the previously reported reference costs, but the data for FYE 2020 are not comparable with the data presented for FYE 2019. Consequently, we have used alternative sources to provide information on mental health service activity growth for these services.

We have used a set of indicators from NHS Digital's monthly mental health statistics release to measure growth in mental health care cluster activity and secure services. We measure the former through the increase in the number of people assigned to a care cluster, and the latter by the increase in the number of people subject to detention at the end of the reporting period.

To measure the growth in services provided under the Improving Access to Psychological Therapies (IAPT) programme, we have used NHS Digital's IAPT statistics on the number of appointments attended and the number of first assessments completed.

Ambulance services

Recording practices for ambulance calls have changed. Data from NHS England on calls are therefore used instead of the NCC. For the rest of ambulance services, NCC data are still used.

Community health services

A larger number of trusts than usual did not submit data for the full set of community health services in FYE 2020. The growth rate in output for this year was therefore calculated for each subset of community health services (groups of services such as community nursing or midwifery) using data from trusts that submitted data for that subset of services in both FYE 2019 and FYE 2020. This enabled each subset of community health services, and by extension overall community health services, to be estimated. However, because of missing data, the overall weight of community health services in healthcare output will be reduced in FYE 2020.
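The matched-trust calculation can be sketched as follows; the trust identifiers and activity counts are hypothetical.

```python
# Illustrative sketch of the matched-trust approach: growth for each subset of
# community health services is calculated only from trusts that submitted data
# in both FYE 2019 and FYE 2020. Trust identifiers and activity are hypothetical.

def matched_growth(activity_prev: dict[str, float],
                   activity_curr: dict[str, float]) -> float:
    """Growth based only on trusts present in both years."""
    matched = set(activity_prev) & set(activity_curr)
    return (sum(activity_curr[t] for t in matched)
            / sum(activity_prev[t] for t in matched))

community_nursing_fye2019 = {"trust_A": 1200, "trust_B": 900, "trust_C": 1100}
community_nursing_fye2020 = {"trust_A": 1250, "trust_B": 880}  # trust_C missing

print(matched_growth(community_nursing_fye2019, community_nursing_fye2020))
# (1250 + 880) / (1200 + 900), about 1.01
```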

High-cost drugs

In FYE 2020, high-cost drugs data were not disaggregated by those administered in admitted, outpatient or other patient settings. Therefore, output growth was determined at a less granular level than usual.


5. Data sources in National Accounts

In the forthcoming release, we will be using government expenditure data consistent with Blue Book 2021.

We have also made some more general systems improvements to maintain best practice and improve consistency across the different aspects of our processing. The additions described in this article, and in the forthcoming publication, would not have been possible without these system improvements.


6. Regression model used for local authority expenditure casemix adjustment

The data are at local authority level, indexed by LA, and year, indexed by t, for a given spend category s. The regression specification is:

$$C_{LA,t,s} = \beta_{n,t,s}\, n_{LA,t,s} + \beta_{f_1,s}\, f_{1,LA,t,s} + \beta_{f_2,s}\, f_{2,LA,t,s} + e_{LA,t,s}$$

Where:

  • LA is the local authority
  • t is the year considered
  • s is the spend category
  • C is the deflated expenditure
  • n is the volume of output (safeguarding or LAC)
  • f₁ is the volume of output with casemix factor 1
  • f₂ is the volume of output with casemix factor 2
  • e is the error term, and the observations are weighted by the volume of outputs n_{LA,t,s}

Then β_{n,t,CIN} is the incremental expenditure associated with one unit of output that has no casemix factors, and β_{f₁,CIN} is the incremental spend associated with one unit of output that has casemix factor 1, relative to a unit of output that does not have casemix factor 1.

The variables include deflated expenditure C, the volume of outputs to be casemix adjusted, and the casemix adjustment factors which describe features of the outputs that impact expected expenditure.
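A minimal sketch of estimating this specification with weighted least squares is shown below, pooling a single spend category and year for simplicity. The toy data and column names are hypothetical; the production estimates use local authority expenditure returns.

```python
# Minimal sketch of the weighted least squares regression described above,
# using statsmodels. The toy data and column names are hypothetical.

import pandas as pd
import statsmodels.api as sm

# Toy local-authority data for one spend category (safeguarding) in one year.
df = pd.DataFrame({
    "deflated_expenditure": [4.1e6, 6.0e6, 3.2e6, 7.5e6],  # C
    "n_output":             [800, 1_100, 650, 1_400],      # volume of output
    "f1_output":            [90, 160, 70, 210],            # output with casemix factor 1
    "f2_output":            [40, 80, 30, 110],             # output with casemix factor 2
})

X = df[["n_output", "f1_output", "f2_output"]]
y = df["deflated_expenditure"]

# Observations are weighted by the volume of outputs, as in the specification.
wls = sm.WLS(y, X, weights=df["n_output"]).fit()

# Estimated incremental spend per unit of output, with and without the
# casemix factors; these feed the casemix adjustment weights.
print(wls.params)
```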

Based on recommendations of academics and data owners, and after investigating different regression specifications, we included:

Children in need (CIN) primary need code

This is a code that is recorded by a social worker for referrals to children’s social care (CSC) that require further action. The possible codes are:

  • abuse or neglect (1)
  • child’s disability (2)
  • parental disability or illness (3)
  • family in acute stress (4)
  • family dysfunction (5)
  • socially unacceptable behaviour (6)
  • low income (7)
  • absent parenting (8)
  • other cases (9)

Child protection status

Child protection (CP) cases require more support and are expected to need more resources compared with non-CP CIN.

This adjustment applies to England safeguarding and looked after child (LAC) care day outputs only; it could in future be used in other nations or outputs if the necessary public data become available.


7. Acknowledgements

With particular thanks to the Frontier Economics team for our collaboration and their significant contribution to the children's social care article.

We are grateful to colleagues at the Office for National Statistics (ONS), academics, and colleagues in various government departments for providing helpful comments over the last year. Developing measures of public service productivity is recognised as particularly challenging (since the goods and services produced are free at the point of delivery), and continuing the discussion with experts will be crucial to improving our statistics.


Contact details for this Methodology

Sara Zella, Meera Parmar
productivity@ons.gov.uk
Telephone: +44 1633 455759