1. Overview
Measuring the extent and nature of child abuse is challenging. Abuse is often hidden and can take many forms, with children not always able to report what is happening to them. However, there is still a strong and long-standing need for robust, up-to-date information on how many children are affected today and in what ways.
Administrative data from authorities and support organisations provide some insight, but only into cases that are reported or identified. As a result, these figures underestimate the scale of the issue and are an unreliable guide to how patterns of abuse change over time.
Currently, the best estimates of child abuse come from the Crime Survey for England and Wales (CSEW). This asks people aged 18 years and over about abuse they experienced during their childhood. While valuable, these answers reflect events that may have occurred decades ago and, therefore, cannot fully describe the experiences of children today. In addition, competing demands on available survey space mean we are unable to run this module in future.
To address this evidence gap, we have been conducting a feasibility study to assess whether a new national survey could provide reliable data on the current scale and nature of child abuse. If achievable, this evidence would be highly valuable to policymakers, service providers and practitioners. Over time, it could contribute to efforts to reduce the prevalence of child abuse and to improve victims’ experiences of support services.
Based on the research we previously published, we proposed two components for the survey: a school survey of those aged 11 to 16 years and an online survey of those who have left school, aged 16 to 25 years.
Since our last publication, we have carried out more research, including:
- engaging with schools, parents and ethics advisors on the school survey’s design, materials and safeguarding procedure
- developing and cognitively testing the survey questionnaire
- field-testing the online survey at a small scale
Following this research, we have now paused the development of the school survey. We have focused on developing the online survey with a plan for roll-out later this year, subject to securing the necessary funding from outside the Office for National Statistics.
This article outlines the main findings that led to this decision and the progress we have made on the online survey.
2. The school survey
In our last research update, we outlined the survey design and proposed a safeguarding procedure for the school survey. This included:
- completing the survey within a classroom setting with children aged 11 to 16 years
- the survey being completely anonymous, with no safeguarding action taken based on survey responses
- providing respondents with a range of opt-in support options, which would be explained in detail to respondents before, during and after the survey takes place
- tailored safeguarding messages within the questionnaire, based on survey responses, and encouraging respondents to seek support
- providing parents or carers and teachers with information on the survey and support options for them and the children
- children aged 13 years and over being able to give their own consent to participate in the survey
- an opt-out permission process for parents or carers
Since our last research update, we have engaged with:
- parents, carers and schools
- the National Statistician’s Data Ethics Advisory Committee and the National Society for the Prevention of Cruelty to Children’s (NSPCC’s) research ethics committee
- a regional director of education and a school in Bristol, with a focus on the survey materials
- a school in Norwich and another in Scarborough, about implementing the survey
- an independent advisor from the Department for Education, on consent, anonymity and safeguarding
Despite our progress, there are still significant challenges and risks to running our survey in schools. Therefore, we have decided to pause our work in developing the school survey to focus on the online survey for those aged between 16 and 25 years.
While this decision is disappointing, we continue to assess the need for data collected directly from those aged under 16 years, including evaluating whether the online survey for those aged 16 to 25 years can provide sufficient evidence. This could remove the need to ask younger children such sensitive questions, given the associated risks, operational challenges and safeguarding requirements. For some abuse categories, it may be possible to collect data through an alternative online survey for those aged under 16 years.
We are grateful to all who gave their time and feedback on the survey during this process. The following information highlights some of the main areas of challenge and concern in response to the proposed survey design.
Concerns and developments with the school survey design
We spoke with representatives from five schools about the survey design. At this stage, schools were not asked to run the survey, which may have affected their feedback. Their feedback was as follows:
- most schools felt they could run the survey if given enough planning time
- they understood why the survey was anonymous but were worried about not knowing which students were at risk or needed support
- they liked the support options, but some were unsure whether the proposed drop-in support room would be used in practice
- they were concerned about students in their first year at the school, as they may not have covered the relevant Personal, Social, Health and Economic (PSHE) education topics yet and may not yet have the necessary support networks in school
- many were worried about how parents and carers might react to the survey
- there were mixed views on the opt-out process and on the age at which children could give their own consent
- they said students with Special Educational Needs and Disabilities (SEND) would need extra support, which would differ for each child
We also spoke with parents and carers across four group sessions. This was their feedback:
- many were unsure about first-year students taking the survey
- they shared ideas on how to make the survey materials easier to understand
- most agreed that children aged 13 years and over should be able to give their own consent
- some were worried about whether the survey questions were right for their child
Ethics committees shared similar concerns about first-year students completing the survey. This concern is more pronounced when the survey is run earlier in the school year to avoid the exam season. As a result, the design was changed to include only those aged 12 years (and in at least their second year at the school) to 16 years.
We discussed with the ethics committees the practicality of a design in which those aged 13 years and over could give their own consent. As both the ethics committees and schools expressed concerns with this approach, we decided to change it to a process in which parents or carers of children of any age could opt their child out. However, we acknowledge that this would pose a risk to the survey’s quality, given its potential to reduce response rates.
Alongside feedback from schools and parents or carers, we included feedback from a school in Bristol and from our Behavioural Insights team when updating the survey materials.
Challenges when recruiting schools
Our approach to finding schools to pilot the survey relied on two main strategies. We sent invitation letters to a random sample of schools, and we drew on the NSPCC’s experience of working with schools, using their established relationships to ask schools to participate.
We found that uptake from schools was very low, and that a substantial amount of resource is required to recruit and support them, as well as to prepare and run the survey. This view is supported by the NSPCC’s assessment of how much support schools would require. Together, these factors raise concerns about the potential to scale up the survey and the possibility of bias within the sample.
Challenges implementing information sessions
The proposed design included information sessions with staff, parents or carers, and young people. From our discussions with ethics committees on ordering and timing, and from working with schools on delivery plans, we found substantial challenges to implementing these.
Organising the survey’s end-to-end design requires considerable time and resource from schools, which have limited time to dedicate to preparing for it. Additionally, running these sessions would require a large survey team with specialist communication and safeguarding skills, especially for potentially challenging sessions with concerned parents or carers. Both factors also raise concerns over the potential to scale up the survey.
Concerns about survey devices
Our design assumed that the survey would be completed using ONS tablets, with one class completing it at a time. However, some schools wanted the option to run the survey with multiple classes at the same time. To enable this, we would either need considerably more tablets, which has cost implications, or to allow the use of school devices. If we were to use school devices, we would need to ensure that students’ screens could not be seen by others (as in exam conditions) and that privacy screens were used.
Ethics committees raised concerns about whether these measures would be sufficient or consistently implemented across schools at scale. For ONS devices, they raised concerns about whether children might use the tablets inappropriately during the survey collection, for example to take photos or send emails from the device, and whether sufficient controls could be applied.
Outstanding questions on support options
Under the survey’s design, children would be encouraged to seek support at multiple points. The support options include an anonymous online chat and helpline, provided by a specialist organisation, and support from the school safeguarding team. There were no concerns related to these options.
The original proposal allowed children to ask a member of the support organisation or the school safeguarding team to find them by the end of the day or to phone them. This raised practical and ethical concerns, including the risk of being identified by peers, uncertainty about who the child would be with when phoned, and challenges in handling personal data.
The proposal also included a drop-in support room, staffed by specialists from a support organisation on both the day of the survey and the day after. Many schools said they could not provide a dedicated room for multiple days, and some felt a room like this was unnecessary or ineffective. They felt school safeguarding teams would offer more appropriate, long-term support from people already known to the children. Some schools also felt this was unlikely to be used by students, given its visibility.
Independent experts in the field support these views and have also raised concerns about the scalability of a support room. Similar international studies rarely use this approach. We have not completed an ethical review of not including these support options. However, we recognise the challenges raised by the schools and the concerns about the approach.
Most schools felt their safeguarding teams had the right skills to support students throughout the survey process, but some were concerned about capacity. Most schools pointed out that an increase in support needs is likely after the survey’s completion, and one school highlighted that it already has a waiting list for support services.
3. Developing the questionnaire
Reviewing the initial questionnaire
Since the development of the initial questionnaire, we have received feedback from the National Statistician’s Data Ethics Advisory Committee, the National Society for the Prevention of Cruelty to Children’s (NSPCC’s) research ethics committee, and relevant stakeholders and experts on ways to streamline the survey and enhance data quality. While the main abuse types have not changed, we have rigorously reviewed the survey and made several improvements.
As part of this review, we have ensured the survey aligns with up-to-date terminology and definitions of child abuse and neglect, in line with user needs and NSPCC definitions.
We have removed the “witnessing community violence” and “experiencing community violence” sections from the survey. These were considered the least essential modules, given our aim to measure child abuse, and removing them has helped reduce the survey’s length. We see this as essential in reducing respondent burden and increasing the number of completed surveys.
Cognitive testing of the questionnaire
Following the review and redesign of the survey questions, we implemented a programme of cognitive testing. This was extremely valuable for the redesign, helping to ensure data quality and reduce respondent burden by checking that questions are clearly understood and appropriate for drawing out responses that accurately reflect lived experience. As we have paused the school survey component, our cognitive testing focused on the online survey, aimed at respondents aged 16 to 25 years.
We conducted cognitive testing over two waves, involving a total of 20 online interviews with a range of participants, including those with lived experience of child abuse and neglect. Table 1 shows a summary of participants involved in the testing.
| Characteristic group | Personal Characteristic | Number of participants | Percentage of participants |
|---|---|---|---|
| All | All | 20 | 100 |
| Sex | Male | 8 | 40 |
| Sex | Female | 11 | 55 |
| Sex | Prefer not to say | 1 | 5 |
| Age (years) | 16 to 17 | 9 | 45 |
| Age (years) | 18 to 21 | 6 | 30 |
| Age (years) | 22 to 26 | 5 | 25 |
| Ethnicity | White | 11 | 55 |
| Ethnicity | Mixed | 1 | 5 |
| Ethnicity | Asian | 4 | 20 |
| Ethnicity | Black | 3 | 15 |
| Ethnicity | Other | 1 | 5 |
Table 1: Characteristics of cognitive testing participants
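As an illustrative consistency check (our own sketch, not ONS code), the percentage column in Table 1 can be recomputed from the participant counts; the snippet below covers only the sex breakdown:

```python
# Illustrative sketch (not ONS code): recompute the percentage column of
# Table 1 from the participant counts. Counts are taken from the table;
# the variable names are our own.
sex_counts = {"Male": 8, "Female": 11, "Prefer not to say": 1}
total_participants = 20

percentages = {
    group: 100 * count / total_participants
    for group, count in sex_counts.items()
}
print(percentages)  # {'Male': 40.0, 'Female': 55.0, 'Prefer not to say': 5.0}
```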
Wave 1 of cognitive testing
Wave 1 of cognitive testing included 10 interviews conducted in November 2025. The aim of this wave was to assess understanding, confidence, and reactions to module introductions, screener questions (the experience-based questions) and response options.
Findings from the analysis of Wave 1 data
Some participants found it challenging to distinguish between the two screener questions assessing exposure to domestic abuse. They also struggled to respond accurately, particularly because the questions lacked a timeframe and because of the severity of the experiences involved.
It was difficult to capture valid experiences of neglect. Many participants gave responses indicating neglect but later said this did not feel accurate. Some participants described experiences that were more reflective of the natural progression of responsibility that children gain as they grow up (for example, being left at home on their own), as opposed to neglect.
The “criminal exploitation” screener did not effectively capture data as intended, as participants often answered based on experiences of peer pressure, which do not constitute criminal exploitation.
In each sexual abuse module, many participants were unclear that both consensual and non-consensual sexual experiences should be included in their answers to the initial screener questions.
Screener question changes from Wave 1
We made several formatting changes to highlight relevant words and increase accessibility. This included amending language to improve comprehension and appropriateness.
The screener questions in the "exposure to domestic abuse" module were changed to specify a timeframe, and their formatting was improved to emphasise the difference between the two screeners.
Two substantial changes were made to the screener question in the "neglect" module. A frequency Likert scale was added to replace the “yes” and “no” response options to capture more context on the nature of the experience, and to establish if this constitutes neglect. The sub-questions were also reversed, to reflect positive experiences, as participants found reflecting on negative experiences to be burdensome.
The screener question in the "criminal exploitation" module was amended to ask about behaviours specifically in relation to “adults”, rather than “anyone”. Because of this change, the module was moved to an earlier position within the survey, following the "neglect" module. This reduces the potential for confusion caused by questions switching too often between asking about specific perpetrators and "anyone" across modules.
We also added a sentence to the introduction of each sexual abuse module to encourage respondents to include both consensual and non-consensual experiences.
Wave 2 of cognitive testing
We conducted Wave 2 of cognitive testing in February and March 2026. Involving another 10 interviews, the main aim of this wave was to assess comprehension of the follow-on questions and to determine the appropriateness of the changes we made, based on the feedback from Wave 1.
Findings from the analysis of Wave 2 data
Despite our attempts to improve clarity and comprehension in the "exposure to domestic abuse" module, the first three interview participants continued to struggle with the distinction between the two screener questions. We revised the questions during the interview period, merging them into one, which led to a considerable improvement in comprehension. Following this change, the seven remaining participants reported no issues in their understanding of the question stem.
There was a marked improvement in participants’ ability to respond to the “neglect” screener. However, follow-on questions were subject to misinterpretation because of uncertainty about whether the subject of the questions was general experience of care or specific neglectful experiences.
The "criminal exploitation" module’s placement had been revised to follow several modules focusing on parents, carers, and adults within the household. This caused many participants to misinterpret the reference to “adult” as referring specifically to parents and carers.
When answering follow-on questions that related to a specific (potential) perpetrator, participants often found it challenging to recall who the follow-on questions related to.
Some participants found the question “Did you ever feel pressured or forced to do these things or was it ever unwanted?” burdensome. This was because of a difference in their perception of the event while reflecting back on it during the survey, compared with their experience of it at the time when it happened.
One participant expressed concern that grouping consensual and non-consensual sexual experiences may be invalidating for those with lived experience of abuse.
Screener and follow-on question changes from Wave 2
After our analysis of Wave 2 cognitive interview data, we made further formatting and language refinements to improve question clarity and comprehension.
As noted previously, the “exposure to domestic abuse” module was redesigned, with the two screener questions being merged into one.
The “Neglect” module’s follow-on questions were redesigned to contain additional context and add clarity, so they more accurately reflect neglectful experiences.
The introduction to the “Criminal exploitation” module has been amended to ensure participants consider experiences with any adults, not only parents or carers.
All follow-on questions that relate to a specific (potential) perpetrator now indicate this before the question stem.
The question about being pressured, forced, or having an unwanted experience was amended to clarify that respondents should report their retrospective feelings about the experiences, rather than how they felt at the time.
The sentence requesting the inclusion of both consensual and non-consensual sexual experiences was amended to improve appropriateness, by clarifying that the experiences would be differentiated.
Our analysis of data from both waves of cognitive testing resulted in important changes to improve the survey’s questions and structure. The final, redesigned survey is published alongside this article.
4. The online survey
In our last research update, we outlined the survey design and proposed a safeguarding procedure for an online survey with those aged 16 to 25 years. Our activity has since focused on testing and refining the operational design. As the school survey has been paused, all 16-year-olds are now eligible for the online survey.
Respondents aged 16 and 17 years would be asked about experiences of abuse in the last year, as well as over their lifetime. This would provide an estimate of current prevalence.
Respondents aged 18 to 25 years would be asked about experiences of abuse before the age of 18 years. This gives a retrospective view from a population close in age to childhood, which is reflective of current prevalence. It could also lead to more reliable responses than those gathered from older respondents, a view that is supported by past studies.
In our last research update, we included the need to agree the sampling frame to be used. Since then, we have agreed with NHS England to use their Personal Demographics Service, which includes individuals’ dates of birth and addresses.
Scotland and Northern Ireland do not currently have a sampling frame in place with appropriate permissions. Therefore, our testing has currently focused on England and Wales only. We continue to work with colleagues in Scotland and Northern Ireland to update them on our progress.
Field test
In January 2026, we conducted a small-scale field test with an issued sample of approximately 800 participants. The questionnaire used for this test included only screener questions for each abuse type and did not include any follow-up questions. The response rate for this survey was 31.4%. We collected quantitative and qualitative feedback from this, which we are continuing to evaluate, and will publish our findings in a future update.
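As a rough check on scale (our own arithmetic, not part of the study), the quoted response rate implies approximately how many completed responses the field test yielded, assuming the issued sample was exactly 800:

```python
# Back-of-the-envelope sketch: estimate the number of responses implied by
# the issued sample and the quoted response rate. The article describes the
# sample as "approximately 800", so this figure is itself approximate.
issued_sample = 800        # approximate, per the article
response_rate = 0.314      # 31.4%

approx_responses = round(issued_sample * response_rate)
print(approx_responses)  # roughly 251 responses
```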
5. Future developments
Our next steps are to run a second field test of the online survey with those aged 16 to 25 years, using the full questionnaire. Subject to a successful evaluation of that test, it would then be possible to roll out the live survey to a full sample.
7. Cite this article
Office for National Statistics (ONS), released 6 May 2026, ONS website, article, Title: Exploring the feasibility of a survey measuring child abuse: May 2026