1. Foreword

LDI.Evaluation@ons.gov.uk
Local Data and Insights Project Evaluation team
Release date: 22 September 2025

Introduction

The Local Data and Insights (LDI) project aimed to improve the government’s subnational data capabilities and to support the delivery of the Government Statistical Service (GSS) subnational data strategy ambitions, which were to:

  • produce more timely, granular and harmonised subnational statistics
  • build capability and capacity for subnational statistics and analysis
  • improve the dissemination of subnational statistics

It also supported evidence needs for the Levelling Up White Paper and later the English Devolution White Paper.

Delivered through collaboration between the Office for National Statistics (ONS) and the Ministry of Housing, Communities and Local Government (MHCLG), the project – formerly known as the Levelling Up Subnational Data (LUSD) project – sought to better inform policy and local decision making through improved subnational data.

The project was split into four main work packages:

  • more granular local statistics
  • new data sources and methods
  • the ONS Local service
  • new data platforms, including the Explore Local Statistics (ELS) service

Outline of successes

Over the last three years, a wide range of new and innovative statistics, methods and analyses have been created in response to user needs, harnessing administrative and commercial data sources to publish impactful analysis on: 

  • geographical mobility of young people across English towns and cities
  • local employment change dynamics influenced by job creation and job destruction
  • access to local amenities, including post offices, cashpoints and parks, as well as travel time estimates to railway stations and libraries
  • consumer card spending trends by time of day across the UK, including spend in a range of merchant categories for every postal district
  • clustering and statistical nearest neighbours, grouping UK local authorities with similar characteristics and outcomes
  • new Rural Urban Classification with improved methodology and International Territorial Level (ITL) geographies

We have developed new regular local statistics, which we are currently embedding into our core statistical outputs, such as gross value added (GVA), gross disposable household income (GDHI) and household final consumption expenditure (HFCE) statistics.

We have embedded the ONS Local service across the English regions and in Wales, Scotland and Northern Ireland, providing analytical support and advice to over 200 organisations, delivering 81 webinars to increase knowledge and capability across local government, giving regular updates to over 450 people via our newsletter, and producing 78 requested datasets. 

Our local analytical projects have generated critical insights across a range of policy domains, including economic inactivity, job quality, growth sectors, child poverty, business ownership, active travel and rural challenges. A strategic priority has been the identification of local data gaps through sustained stakeholder engagement, ensuring these gaps are escalated and addressed either through the LDI project or in collaboration with relevant teams across the ONS and the wider GSS, where appropriate.

We have delivered the Explore Local Statistics service, which consolidates a wide range of subnational statistics into a single, user-friendly digital platform, making it easier for users to find, visualise, compare and download data about local areas. By the end of the project, it included 77 indicators across themes like population, economy, health, education and life satisfaction. 

Extensive monitoring and evaluation has been conducted during the course of the LDI project. Annual reports have been published, summarising insights gathered from interviews and feedback from external stakeholders. Using these insights, we have continued to mature and evolve our services and products to ensure they remain relevant, and that they continue to support government priorities for broadening and deepening English devolution and have strong links into the place pillars of the government missions.

While we have made improvements in the breadth and depth of statistics and data available, there are still barriers in the timeliness, quality and accessibility of subnational data. Falling response rates to our core surveys remain a challenge to producing subnational data. 

Response to development areas and recommendations

We welcome the findings from this final evaluation, and the resulting development areas and recommendations.

We recognise the need to continue to promote and raise awareness of our services and products to ensure they are well-used and have the greatest impact on policy development and delivery across central and local government, and are already beginning to embed new practices to do so. We will also continue to work with policy makers centrally and locally to understand the impact of our new statistics and analyses on policy changes and effectiveness.

We are also beginning to develop new processes to ensure we gather and prioritise user needs and develop new products in response, and to define targeted service levels to meet the needs of different user groups.

Overall, we are proud of what we have delivered in partnership with MHCLG over the last three years. This evaluation helps to highlight the strengths and successes we have had and will help us continue to evolve to deliver value to our stakeholders.

Becky Tinsley
Senior Responsible Owner for the LDI project


2. Main findings

The evaluation of the three-year Local Data and Insights (LDI) project was based on three main evaluation questions, and this report is structured around them.

Evaluation question 1: Did ONS Local contribute to filling knowledge gaps by providing analytical support and advice?

ONS Local established dedicated analysts based in every English region in March 2023, and across the devolved governments in 2024. ONS Local played an important role in understanding local governments’ analytical needs and responding to requests for support, advice and filling evidence gaps.

Of the 35 local government organisations we interviewed in our evaluation interviews, almost all had used analytical support from ONS Local (89%).

Webinars and workshops were frequently mentioned as useful for knowledge sharing, training and learning about new analytical techniques and tools, such as Power BI. These training sessions helped teams use local data more effectively, which led to improved confidence and accuracy in decision making.

The impacts from ONS Local analytical support were wide-ranging across the local government group, with some stakeholders describing the support as invaluable and instrumental to their work. Many users developed regular contact with their regional lead and saw this relationship as helpful, with the regional structure allowing for context-aware advice tailored to specific policies or demographic environments.

We also found that some expert users in local government felt they did not need analytical support from ONS Local because they handled their analysis internally.

Limited awareness of the full-service offer was a theme in our evaluation interviews. This was particularly relevant for teams with limited analytical capacity, who may have attended ONS Local webinars but remained unaware of the broader range of analytical support available to them.

ONS Local resource was prioritised towards supporting local government. For central government projects, some support was given, for example, evidence gathering on the main data gaps for the Department for Digital, Culture, Media and Sport. This support mainly comprised small-scale, short-term requests made by departments as they became aware of the service.

Streamlining the routing of local intelligence to relevant government departments and raising awareness of ONS Local will aid in embedding local insights into central decision making.

Evaluation question 2: Did new data platforms provide a valuable repository to find, visualise, compare and download subnational data and statistics?

A suite of advanced public data dissemination products – known collectively as the Explore Subnational Statistics (ESS) service – has been progressively developed over the three years of the project, building up to the release of Explore Local Statistics (ELS) in March 2024. When ELS launched, the service hosted 57 different indicators, such as employment, school attendance and life expectancy. This increased to 77 indicators over the year, and the service was refreshed on an ongoing basis with updated datasets and increased functionality.

Website analytics showed a steady increase in ELS use over the 12 months since its launch, with an average of 2,600 sessions per week (note 1). The service was also at the forefront of the ONS website transformation.

ELS has also received two highly regarded awards:

Many respondents to our evaluation interviews described ELS as a valuable tool for quick, practical data discovery. Users reported saving time and effort, and ELS was generally helpful for analysts in resource-constrained environments who lacked the tools or capacity to create dashboards or draw insights from spreadsheets. Having one curated platform helped build confidence and independence among less technical teams. The clear, user-friendly interface was widely praised, especially by non-data experts. The service was commended for making local statistics easy to explore, interpret and share.

ELS was often not used on its own; users typically needed other data and resources alongside it, or complemented it with their own expertise and processes. Although some users embedded ELS into their regular workflow, others were unsure how it could support their work. ELS was less used by expert analysts or organisations with bespoke analytical setups, as these users preferred access to raw data or familiar tools, or required more complex geographies and variables.

Notes for Evaluation question 2

  1. A session can be defined as a website visit, initiated when a user views a page on the website and interacts with it for a period of time. Only users who accept cookies are tracked, so the number of sessions reported is likely to be significantly lower than the actual number of sessions.

Evaluation question 3: Are local decisions better informed by subnational data and methods?

Governments at all levels need to assess local growth and performance, develop local strategies and make decisions. To do this effectively, they require a wide range of detailed, granular data about their local areas to strengthen or supplement existing evidence. The project explored innovative techniques to disaggregate existing data to more granular levels and published these new statistics along with the methods and guidance to support effective use.

The granular local statistics work package covered the transformation of economic and social indicators by breaking them down into smaller, more local geographical areas. It also aimed to improve the ability to compare subnational data across all parts of the UK. This included, but was not limited to:

Another work package covered the publication of new data and methods, made possible by expanded use of data sources and innovative techniques, with the aim of improving the timeliness and granularity of available insights. This included published data on:

Some evaluation interview respondents explained that more granular statistics have helped inform various aspects of decision making – such as planning and services, policy and interventions, evaluation, and monitoring. Examples from local government included:

  • sharing with elected members to help guide internal planning and the structuring of future changes in the area
  • forming part of the strategy and policy narrative presented to government about recent growth in the economy and household income
  • being used alongside other data for funding-related activities, such as evidence in business cases for projects, funding bids and investment decisions

While it was difficult to directly assess the immediate impact of the project within the scope of the evaluation, respondents shared that they had incorporated project data and insights into reports, dashboards and analyses. Evidence of data informing specific decisions was not consistent; however, there was recognition of the data's potential value.

Some participants expressed interest in future use, and while engagement with outputs varied, often because of limited awareness or perceived relevance, the groundwork has been laid for longer-term impact. It is likely that the full benefits will emerge beyond the timeframe of the project and evaluation.

The ONS Local work package also played a clear role in supporting decision making within local government. By improving the use of data and analysis, local government had greater access to critical data supporting decisions in areas such as policy development, economic evaluation and targeted interventions.

ELS served as a valuable support tool, particularly for government analysts with limited analytical capability. While it was not yet central to decision-making processes, it did save users time, enabling simple queries to be answered quickly.


3. Evaluation approach

The Office for National Statistics’ (ONS’s) Evaluation Strategy sets out the ONS’s vision for embedding evaluation best practice across all work. To align with this, the purpose of monitoring and evaluation for the Local Data and Insights (LDI) project was to provide learning points throughout the project delivery and ensure continued accountability. Evaluating the success of the levelling up policy was not within scope of this project.

This report describes the impact evaluation and progress of the LDI project, covering the entire three years – April 2022 to March 2025. It builds on the previous evaluation findings and recommendations published at the end of each project year:

Evaluation methods

A theory of change was created in summer 2022, describing how the project was expected to deliver the desired changes and impacts.

A realist evaluation approach, with elements of contribution analysis, was applied. This addressed whether the project caused an impact, for whom, and how and why it occurred.

Using data triangulation and realist evaluation principles, an overall assessment was given for each outcome using the following categories:

  • achieved: strong, consistent evidence confirmed the outcome was met with clear connection to the project theory of change
  • partially achieved: strong, consistent evidence showed some achievement of the outcome; or full achievement but only for certain user groups; or full achievement but with limited evidence
  • not achieved: strong, consistent evidence confirmed the outcome was not met
  • inconclusive: evidence was insufficient or unclear (including no evidence) to determine if the outcome was met

Two types of statements are presented in this report, having been identified and developed during the year 2 evaluation and subsequently refined as part of the final evaluation process:

  • context-intervention-mechanism-outcome (CIMO) statements for each priority outcome – these form the main structure of the realist analysis and describe the circumstances under which the project was successful or failed to generate the intended impact
  • six alternative theories across the whole project – these consider any external contextual factors, beyond the scope of the project itself, that may have influenced the outcomes observed

See Section 8: Evaluation methods for more detail of the evaluation approach.

Evaluation data

The evaluation was mainly based on quantitative and qualitative data collected via a range of methods including interviews, surveys and focus groups.

Sixty in-depth interviews or focus groups with external users and stakeholders were held during the last six months of the project (October 2024 to March 2025), covering all the project's activities and outputs completed by the time of the interviews. Some project deliverables were released slightly after the interview period, so their impact could not be evaluated.

The ONS annual stakeholder satisfaction survey (January to March 2025) was used to gather further information on ONS Local and subnational statistics from a wider audience; a total of 155 responses were received from a range of stakeholders following wide promotion to ONS users and producers.

See Section 7: Evaluation data, for more on the data sample.


4. Evaluation question 1: Did ONS Local contribute to filling knowledge gaps by providing analytical support and advice?

Service reach

The target users for ONS Local were analysts, collaborators and policy makers in local and devolved government. Although not direct users of the service, central government departments were intended beneficiaries of the local insights it generated.

Of the 35 local government organisations we interviewed, almost all had used analytical support from ONS Local (89%). Nearly two-thirds of the organisations we spoke to had attended webinars or workshops hosted by ONS Local. Nearly half received regular support from their regional lead.

These findings were consistent with the results of the ONS stakeholder satisfaction survey (January to March 2025), with 76% of survey respondents who had used the service being satisfied with it (35 out of 46).

Outcome: Improved dissemination of subnational statistics and analysis products to ensure take-up in local government

Overall assessment

Outcome achieved.

There was strong evidence, particularly from local government interviewees, that ONS Local successfully delivered this outcome through its various forms of analytical support.

CIMO (context-intervention-mechanism-outcome) statement

Context: Local government need to access and understand local data to inform their decision making. Teams with limited resource and analytical capability cannot commit much time, meaning they have limited awareness of local data and how best to use it.

Intervention: ONS Local established dedicated analysts based in every English region in March 2023, and in each of the devolved governments over the following year. ONS Local representatives engaged with local government teams in their area in a variety of ways to understand and support their analytical needs. They hosted regular webinars and workshops to build analytical capability by reaching the wider local government group.

Mechanism: Local government interacted with ONS Local to share their analytical needs and to request support and advice. The level of engagement with ONS Local depended on their resource, capability and needs, as well as their awareness of the support available.

Outcome: Local government spent less time and resource looking for subnational data, and had increased analytical support, to inform their decision making. Analytical capability and capacity were improved.

Evidence towards the outcome

Around two-thirds of local government officials interviewed met this outcome (23 out of 36). Most were users with higher analytical capability.

Many highlighted regular ONS Local meetings, webinars and email updates as crucial for staying informed and connected with the ONS. This ongoing communication acted as a time-saver by reducing the effort to track down relevant data or information themselves.

Some respondents benefitted from ONS Local connecting them with ONS experts for direct analytical support, including methodology guidance and validation. Connecting them to other local organisations was also highlighted as a valuable part of the service.

Our ONS Local representative was responsive, helpful and had expertise to advise us or refer us to people within ONS. It is a good way to access expertise within the wider ONS organisation.

Economic evidence manager, local government, north of England

Webinars and workshops were frequently mentioned as useful for knowledge sharing, training and learning about new analytical techniques and tools, such as Power BI. These training sessions helped teams use local data more effectively, which led to improved confidence and accuracy in decision making.

I've found those [webinars] invaluable, because it just stops you working in isolation, instead of 300 local authorities all doing things in duplicate. We can follow best practice, or we have a platform to demonstrate where we’ve done something and sense check things we’re doing all the time.

Economic analyst, local government, north of England

The impacts from ONS Local’s analytical support were vast across the local government group. They:

  • supported local plans, strategy and evaluation – for example, ONS Local data were foundational to several strategic initiatives for one local organisation, including their plans for growth and local skills improvement plan
  • improved analytical capability – for example, the service helped improve the robustness of a local team’s analysis, and gave them confidence in the quality of evidence used in decision making
  • enabled more informed or easier decisions – for example, analytical support towards a tax and spend project to assist devolution discussions
  • boosted analytical resource – for example, ONS Local provided vital capacity to a small local team, enabling them to deliver evidence-based work they otherwise could not manage

ONS Local has probably changed that perception from many by bringing forward how our priorities at a local level might be different to a national level and developing that and bringing it together. Every council has got the bricks, and ONS Local can be the cement in connecting things together a bit better.

Local government official, south of England

In the ONS stakeholder satisfaction survey, we found that satisfaction with the ONS Local service was generally positive, with only a few respondents expressing concerns.

Evidence for further development

Out of the 35 local government organisations we interviewed, only four had not used analytical support from ONS Local (11%). Some of these were in the devolved nations, where the ONS Local service was established later and took a different approach from the English regions.

Many interviewees showed a lack of awareness of, or engagement with, the full service offer, which may have resulted in it not being fully used. This was particularly relevant for teams with limited analytical capacity, such as those who regularly attended ONS Local webinars but were unaware of the broader support available to them.

There’s so many things out there from different organisations that sometimes it’s difficult for some things to stick and to know exactly where you can use them and how you can use them.

Local government official, London

A few felt they did not need additional support because they handled their analysis internally.

It’s really not that useful for us as a local authority, would prefer the resources to be used to provide more timely data for subnational geographies that are useful to us.

Local government official

Outcome: Supporting central government projects and interventions with local intelligence and analysis

A broader ambition of ONS Local was to expand its impact by sharing local intelligence and feedback with central government. This outcome was brought into the theory of change mid-way through the project.

Overall assessment

Outcome partially achieved.

There was some positive evidence towards this outcome; however, this was a longer-term aim and service resource was prioritised towards local government support.

CIMO statement

Context: Central government officials working on local growth or other subnational policy areas could benefit from local intelligence and analytical collaboration.

Intervention: ONS Local’s primary audience was local government, and this is where the resource was prioritised. The service did not actively engage with central government departments but was able to support a small number of requests where resource and capacity allowed.

Mechanism: There were some instances of central government requesting support from ONS Local. However, most central government officials interviewed did not engage with ONS Local for local insights or analytical support, as they were either unaware of the service or did not know it was available to them.

Outcome: ONS Local provided minimal support to central government projects and collaborations, mainly because of limited service resource and because this was a longer-term goal. Greater awareness and engagement are needed to demonstrate how ONS Local could support central government projects and inform interventions by providing local insight and analysis.

Evidence towards the outcome

ONS Local provided support to several government department projects. These were mainly small-scale, short-term requests made by departments as they became aware of the service. Examples included:

  • qualitative work in local areas to support potential local plans to increase employment outcomes, in collaboration with the Department for Work and Pensions
  • work on the Vulnerable Persons Framework for the Cabinet Office
  • evidence gathering on the main data gaps for the Department for Digital, Culture, Media and Sport

The Ministry of Housing, Communities and Local Government (MHCLG) has engaged with all parts of the LDI project, including ONS Local. In the evaluation interviews, we heard from a few MHCLG officials that close collaboration with ONS Local representatives was helpful for sharing project developments and insights into the support being provided at a local level. Larger analytical projects, such as the Night Time Economy project, have also gained momentum through ONS Local input, which played an important role in launching the project and ensuring secure data access.

ONS Local, along with other LDI project teams, also supported MHCLG with any emerging needs, such as the UK government’s Long-Term Plan for Towns: data packs for 55 towns (published 28 March 2024 by MHCLG). These packs contained local insights and intelligence to support the delivery of Town Boards, build capacity and understanding, and inform local residents.

The senior subnational data group in its current form was launched by the LDI project in May 2022, meeting quarterly and chaired by the National Statistician, with the aim of driving improvements in subnational data and providing coordination and support to efforts across government. This was beneficial in linking up people at a senior level across government and enabled progress towards improving local insights.

Evidence for further development

Feedback in the evaluation interviews with central government was consistent with the way ONS Local had prioritised local government support and engagement. Most central government interviewees did not engage with the ONS Local service. Many considered that they were not the target audience and were therefore put off pursuing it further. They were unaware of the services provided by ONS Local, such as sharing local insights and bespoke analysis, and of how to engage.

My sense is that we don't need to [use ONS Local], because we have a good handle on the data that exists. I see them more as a service for people who don’t have the in-house capability.

MHCLG official

Some central government officials engaged with ONS Local only passively (for example, subscribed to mailing lists or attended webinars) because their existing resources and processes were sufficient.

Similarly, in the ONS stakeholder satisfaction survey (January to March 2025), 7 out of 34 central government respondents had used the ONS Local service. The main reasons for not using it were a lack of awareness or a lack of relevance to their work.

Extent of contribution from ONS Local

For local government, ONS Local provided significant added value, for example:

  • they addressed capacity constraints by providing significant support that helped teams overcome limited resources and enable better decisions with clear evidence
  • they enabled proactive discussions and advice on data needs, analysis and interpretation, and provided connections to ONS teams to directly share feedback
  • they gave tailored, in-depth advice that greatly improved the analytical ability of local government teams
  • they offered guidance to use new data more effectively in planning and decisions

Further examples were highlighted where ONS Local delivered more short-term analytical support, for example, resolving data issues, validating analysis, and helping during busy periods to address immediate problems.

The extent to which ONS Local contributed to analytical support was smaller for some local government teams. They were not using the full ONS Local analytical service because they used other statistical resources, or relied on their own expertise and established ways of working.

We do use ONS Local, but we do quite often just go directly to the bits of ONS that we need. If I need housing statistics; I’ll go to the housing team.

Senior researcher, local government, north of England

There was also feedback from local government users suggesting that the analytical support could be improved with:

  • clarity over the broader service offering
  • timely advice to feed into decisions or work
  • use of non-technical language in webinars to help teams to apply the advice

Progress since the start of the project

What happened before the LDI project (before April 2022)?

Before the LDI project, coherence of subnational data across statistical producers, particularly in devolved government, was limited and was deprioritised because of resource constraints and data complexity.

The previous ONS Cities team of four operated reactively, supporting local leaders and stakeholders, with regular engagement with the nine English Mayoral Combined Authorities and quarterly Combined Authority Liaison Group meetings.

Dataset requests varied by scale: resource-intensive ones were backlogged, while smaller ones were handled ad hoc, without a formal commissioning or prioritisation process. ONS messaging was shared via a monthly newsletter to around 150 local contacts, highlighting recent and upcoming subnational releases.

What can be done now that could not be done before ONS Local? 

Interview respondents were asked what difference ONS Local has made, if anything. Several local government users highlighted the increased resource and expertise they now have because of ONS Local.

Without [ONS Local] support, it probably takes a lot more time to try and find what we need and not having that sounding board then puts a lot more pressure for us to build up the resource internally.

Research and evaluation officer, local government, north of England

We also heard that without ONS Local input, they would be dependent on other data that were not as good, would have no evidence to back up statements or decisions, or would have a narrower view of a topic.

We can access data that wasn’t readily available before. That means we’ve been able to enhance our analytical capability and identify issues within our area’s economy and assess them in a way [we] couldn’t before, which has improved the quality of our planning and our objectives.

Economic evidence manager, local government, north of England

ONS Local also enabled easier access to other ONS teams.

[On connecting to others in ONS] That's been really helpful, because I’d have no idea where [in] the ONS I would go for that [input from the population team]. Whereas ONS Local was that front door service, to help us reach the right people in the ONS.

Local government official, Midlands

Other influences

For this evaluation question, evidence backed up three of the project’s alternative theories, or other influences, which may have affected the outcomes and impacts we observed.

Alternative theory 4

Some central and local government analysts do not need to use ONS Local, as they already have good expertise and understanding of ONS statistics or already have direct contact with statistical production teams.

Alternative theory 5

Less engaged local government officials are not aware of new services and products, or they have not used them, and they continue to do things as they have previously.

Alternative theory 6

Central government officials are unaware or unclear on what services ONS Local offers and assume it is for local government only.


5. Evaluation question 2: Did new data platforms provide a valuable repository to find, visualise, compare and download subnational data and statistics?

The evaluation of this question focused on the main Explore Subnational Statistics (ESS) product – Explore Local Statistics (ELS) – which was launched in March 2024.

Service reach

The target users for ELS were analysts and policy makers in local government, followed by those in central and devolved government. Enquiring citizens interested in finding out more about their local area were also considered.

Website analytics showed a steady increase in ELS use over the 12 months since its launch, with an average of 2,600 sessions per week (note 1).

Over half of local government organisations that were interviewed had used ELS (19 out of 35), which met the target reach. Users were typically those within resource-constrained settings who lacked the tools to build dashboards or extract insights from spreadsheets.

Reasons that local government used ELS included:

  • gaining a general understanding of the local area
  • monitoring
  • sharing information with others
  • responding to queries
  • accessing or identifying subnational data
  • supporting decision making

ELS is very good at finding a quick reference point. When you need a number quickly of where we are, and this feeds into a policy; sometimes it’s a very quick turnaround.

Economic advisor, local government, south of England

Only a small number of local government officials who were interviewed had not heard of the service. Other reasons for not using ELS included:

  • a preference for accessing raw data
  • being unaware of the data or functions it provides
  • not being sure what they could use it for
  • not having time to see what it could do
  • believing that the data or geographies needed were not available
  • finding it daunting
  • finding it too challenging

Notes for Service reach

  1. A session can be defined as a website visit, initiated when a user views a page on the website and interacts with it for a period of time. Only users who accept cookies are tracked, so the number of sessions reported is likely to be significantly lower than the actual number of sessions; an illustrative example of the possible scale of this undercounting follows.
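
For illustration only: if, hypothetically, around 60% of users accepted cookies, the reported average of 2,600 sessions per week would correspond to roughly 2,600 ÷ 0.6 ≈ 4,300 actual sessions per week. The true cookie acceptance rate is not known, so this calculation is purely indicative of the possible scale of undercounting.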

Outcome: Easy and seamless access to data on people and place, and improved presentation and dissemination of local-level statistics

Overall assessment

Outcome partially achieved.

There is strong evidence from local government users with limited analytical capability that this outcome was achieved. For others (such as central and devolved governments, as well as expert local analysts), there was evidence that they did not need to use the service because they preferred access to raw data or familiar tools, required more complex geographies, or were unsure how it could support their work.

CIMO (context-intervention-mechanism-outcome) statements

Context: Analysts who need to use local data can face challenges in finding them and may lack the resource or expertise to analyse and visualise the data effectively.

Intervention: The ONS implemented and continually developed the ELS data platform to meet the needs of government analysts and enquiring citizens. ELS was promoted through a variety of channels to try to reach these groups.

Where the outcome was achieved

Mechanism: Government analysts who generally had limited analytical capability or capacity chose to use ELS, or to share it with colleagues, to answer quick and simple queries about a place.

Outcome: ELS provided an easy, convenient and consistent way for analysts to access and compare local data. It saved them time by having a lot of data in one place and the processing already completed.

Where the outcome was not achieved

Mechanism: Local government teams with more analytical capability or capacity did not use ELS because they preferred to use source datasets or other tools. This was the same for central and devolved government officials. For some users, it was not needed for their role.

Outcome: ELS was generally not used by central and devolved governments, or by expert local analysts, who were content with their existing approaches to finding and using local data. Clarifying the target user and promoting recent improvements could boost engagement, especially among non-users.

Evidence towards the outcome

Feedback on the look of ELS was generally positive, including on how user friendly it was. For some users, the tool gave them confidence, for example, because they knew they were using the same data as others.

Interviewees frequently cited the tool’s ease and convenience in accessing local data, valuing its ability to provide a clear snapshot of area-specific context. Its existence saved people time by having data in one place and the processing already completed.

The Explore Local Statistics is there in one place, we can quickly refer to it and make decisions based on that or take straight from there and feed into our own products, without having to do heavy lifting, because it's done for us.

Delivery analyst specialist, local government, south of England

Although some more capable analysts did not use ELS themselves, they would signpost it to others, particularly when colleagues asked simpler queries, which left the analysts free to focus on more technical work.

Some users valued the visualisations for their guidance on data presentation and the selection of appropriate measures, helping to align their approach with others.

ELS gives me a context for my work. It gives me a sounding board, and it gives me verification of the things that we should be looking at and how we should be analysing them.

Economic analyst, local government, north of England

The comparison aspect of ELS was highly valued by users, with ready-made comparators across different areas.

If we need to do anything around comparisons, ELS is fantastic for that. You just pop it in, and you’ve got it ready-made, you can just basically paste it straight into the background, craft the answer and it has made our lives a lot easier.

Economist, devolved government

Outside of the main user group, there was positive feedback on the usefulness of ELS, including from charities and local media.

Evidence for further development

Most central and devolved government interviewees did not use ELS, mostly because they preferred to use source datasets or existing tools. Some also felt that ELS was not needed for their role.

Of the local government organisations that were interviewed, 15 out of 35 did not use ELS, including 10 who were aware of the service. Reasons for not using the service included:

  • needing to use the source dataset to do their work
  • having existing knowledge of how to access data
  • continuing to use existing data services (for example, LG Inform and Nomis)
  • preferring to go to local data providers, for those based in devolved nations
  • having no time to look at it

Issues with ELS data and functionality were also highlighted, but as the service was continuously updated, some of these may have been resolved after users looked at it. Examples included outdated data, being unable to retrieve data for new geographies, and the required level of geography not being provided. There was also no clear list of available indicators or a way to download all data in one go, which may have been a legacy barrier, as these features were included when ELS was launched.

You can waste time [using ELS] because we’ve looked at it and we’ve just now parked it to one side because it doesn’t come out with what we need.

Analyst, local government, north of England

Presentational challenges also meant that several users did not use ELS. For example, it did not present the data in the way they needed, and quality caveats, which help minimise misinterpretation, were not provided.

We have to pretty much recreate everything, all the hard work you’ve done, in the internal style, and then mix it in with all the other data that [our colleagues] want. They don’t want to go here for this and there for that. They want one something.

Local government official, Midlands

While respondents acknowledged improved access to subnational statistics, some still experienced difficulties locating data, indicating an opportunity to further enhance data navigation and user support.

I know it’s something that has been worked on, but I still don’t find it easy to find anything. I know that the data’s there, but sometimes it feels like such a slog to find it.

Policy analyst, local government, London

Extent of contribution from ELS

ELS did contribute towards saving users time by having data in one place and data manipulation already done. It enabled users to:

  • access data more quickly and easily than through internal systems
  • reduce their own need to bring datasets together
  • draw quick conclusions
  • access headline data quickly
  • gain initial context on an area easily

Some analysts would share ELS with colleagues to help answer their queries. So, although they did not use ELS directly, it saved them time, allowing them to prioritise more specialist advice in their workload.

The data service was not the only influence; several users used it in conjunction with other tools for a variety of reasons:

  • wanting all their data in one place
  • being able to create their own visuals
  • an easier-to-navigate interface
  • a simpler data download process
  • needing flexible geographies
  • reliance on tools they were already familiar with

It was also highlighted that other data and resources were needed in addition to ELS, or that it was complemented by users' own expertise and processes.

The extent to which ELS contributed was limited by the amount of data included and by users' work remits. Others also highlighted the consistent engagement needed to build familiarity.

Progress since the start of the project

What happened before the LDI project (before April 2022)?

Before the LDI project, subnational data were scattered across various releases and government websites, often inconsistently presented with outdated geography boundaries.

There was no central platform for local analysts to access, analyse and visualise data, prompting multiple teams to consider building their own dashboards.

The ONS website lacked a seamless user experience for finding, visualising, comparing and downloading up-to-date local data across topics.

What can be done now that could not be done before ELS? 

Interview respondents were asked what difference ELS has made, if anything.

Positive responses from local government teams highlighted that they now spent less time on various parts of the analysis process because of ELS. For example, they spent less time looking for or collecting data, searching the ONS website, and manipulating and analysing data. An added benefit was a reduced risk of human error, through using the ready-made data and visuals in ELS.

Before we had ELS, we spent so much time looking for data, like what’s the latest release, etc., and then having to do manual manipulation on the datasets released.

Delivery analyst specialist, local government, south of England

Finding what data were available for their area was now easier with the development of ELS, and users gained more awareness of some measures.

Other influences

For this evaluation question, evidence backed up three of the project’s alternative theories, or other influences, which may have affected the outcomes and impacts we observed.

Alternative theory 1

The data are also available in alternative data products where analysts can access them in a way that meets their needs (such as Nomis, LG Inform and products from the private sector).

Alternative theory 3

Some of the more capable analysts do not need to use ELS to present and visualise data. They can, and prefer to, do this themselves with access to the source data so that they can present it in their own way without ONS branding.

Alternative theory 5

Less engaged local government officials are not aware of new services and products, or they have not used them, and they continue to do things as they have previously.


6. Evaluation question 3: Are local decisions better informed by subnational data and methods?

More than half of the users interviewed reported improved decision making because of the project's services and statistics, with the majority being from local government. Tailored data analysis delivered through the ONS Local request process, and the availability of more granular data, were the primary drivers of this improvement.

The long-term value and impact of the project’s services and data are still emerging, with ongoing assessment expected to provide further insights. Also, when interviewed, some analysts were unsure of the wider implications of the statistics they had used.

Outcome: Improved utilisation of data and analysis for decision making in local government across all regions and devolved governments

Overall assessment

Outcome achieved.

There was strong evidence from local government that decision making benefitted from the ONS Local service. There was not enough information from devolved government to make an assessment.

CIMO (context-intervention-mechanism-outcome) statement

Context: Local government need access to reliable local data to support informed decision making. However, data gaps and limited analytical resources can hinder this process.

Intervention: ONS Local representatives engaged with government teams in their area in a variety of ways to understand and support their analytical needs. They provided support with analytical data requests, including publishing bespoke datasets.

Mechanism: Local government interacted with ONS Local to share their analytical needs and to request support and advice. The level of engagement with ONS Local depended on their resource, capability and needs, as well as their awareness of the support available.

Outcome: Local government had greater access to critical data supporting decisions in areas such as policy development, economic evaluation and targeted interventions. ONS Local provided analytical support throughout, saving local government time and resource.

Evidence towards the outcome

ONS Local produced bespoke data and analysis, including larger analytical projects, which were used by several local government interview respondents. This added to the evidence base for planning, interventions and evaluation.

Insights from the data helped shape policy development, performance tracking and monitoring economic trends.

[ONS Local data] has set a direction of lower-level understanding that we would not have had otherwise on our workforce. So, the value for money there is creating a completely different lens that we can tailor projects to, to support those people that are not just on benefits that are out of work if they want to come back into work as well. And to also unpick why they would not want to come back into work too.

Senior analyst, local government, north of England

ONS Local acted as the front door to enable technical and methodological support to local teams. It also allowed collaboration and networking, including on larger analytical projects across local and central government.

ONS Local had wider impact, with similar insights emerging from the small group of interviewees in non-government organisations, such as regional partnerships and academic institutions.

Evidence for further development

Section 4: Evaluation question 1 describes the use of the ONS Local service. Data and analysis provided by ONS Local were not used by some interviewees who had limited awareness of ONS Local's full analytical offer. Some of these interviewees expressed interest in using this function in future after learning about it during the evaluation.

The use of ONS Local's data was also affected by local government analysts using their own resources and alternative data sources or tools.

Outcome: Easier access to allow comparisons of subnational data, presented and visualised in the same place

Overall assessment

Outcome partially achieved.

Users of Explore Local Statistics (ELS) provided some positive evidence of its impact, though it was primarily used as a supporting tool, with limited evidence that it directly aided decision making.

CIMO statement

Context: Analysts who need to use local data can face challenges in finding and making comparisons. They may lack the resource or expertise to analyse and visualise the data effectively.

Intervention: The ONS implemented and continually developed the ELS data platform to meet the needs of government analysts and enquiring citizens. ELS was promoted through a variety of channels to try to reach these groups. 

Mechanism: Government analysts who generally had limited analytical capability or capacity chose to use ELS, or to share it with colleagues, to answer quick and simple queries about a place.

Outcome: ELS played a supporting role, with limited evidence that it directly aided decision-making processes. However, it did save users time by having a lot of data in one place and the processing already completed.

Evidence towards the outcome

Section 5: Evaluation question 2 showed how government analysts who generally had limited analytical capability or capacity valued the ELS service for answering quick and simple queries. Users spent less time and resource looking for subnational data to support their analysis, which could ultimately inform local decisions.

A few interviewees were clear about how they had integrated ELS into decision making and policy work. They relied on it for structured insights and benchmarking across regions.

Monitoring performance, including making comparisons, was a common benefit of using ELS, even if this did not always directly feed into decision making.

[ELS helps to] understand what your priorities are, and just very quickly being able to see where a local authority sits in the distribution of all local authorities, to understand relatively where you're performing well and not well and getting that information really quickly is very helpful.

Regional partnership, south of England

The clustering similar areas release helped organisations identify areas facing common challenges and provided a basis for benchmarking. Some also adapted the ONS's clustering code to build their own model.
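
As a rough illustration of this type of approach (a minimal, hypothetical sketch in Python, not the ONS's published methodology or code; the indicators, values and number of clusters are invented for the example), clustering typically standardises a set of local authority indicators and then groups areas with similar profiles:

  # Illustrative sketch only: group local authorities by standardising a few
  # indicators and applying k-means clustering. All values are hypothetical.
  import pandas as pd
  from sklearn.preprocessing import StandardScaler
  from sklearn.cluster import KMeans

  # Hypothetical input: one row per local authority, columns of indicators
  df = pd.DataFrame({
      "area": ["A", "B", "C", "D"],
      "median_pay": [560, 610, 540, 630],
      "employment_rate": [74.2, 78.1, 71.5, 79.0],
      "healthy_life_expectancy": [61.3, 64.8, 60.2, 65.5],
  })

  indicators = ["median_pay", "employment_rate", "healthy_life_expectancy"]
  scaled = StandardScaler().fit_transform(df[indicators])  # put indicators on a common scale

  kmeans = KMeans(n_clusters=2, n_init=10, random_state=0)
  df["cluster"] = kmeans.fit_predict(scaled)  # label each area with its cluster

  print(df[["area", "cluster"]])

In practice, the choice of indicators, the number of clusters and the clustering method would need to follow the published methodology and the user's own analytical needs.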

Evidence for further development

Section 5: Evaluation question 2 describes the reasons why some analysts did not use the ELS service.

For ELS users, it helped with queries and comparisons, but its impact on decision making was either in a supporting role or unclear. It was mostly used for verification rather than direct strategy formulation.

A small number of interviewees used the clustering methodology produced by the project, which helped them identify similar areas for comparative analysis. There were still many analysts who relied on alternative products to enable comparisons with other areas, such as LG Inform, Chartered Institute of Public Finance and Accountancy (CIPFA) nearest neighbours tool, and NHS benchmarking groups.

Outcome: Delivery of timely cross-cutting insights to show performance for local growth

Overall assessment

Inconclusive.

The new data sources work package delivered innovative insights, and while their usage and impact within the evaluation sample were not yet widespread – mainly because of timing and limited awareness – there is clear potential for greater uptake, with improved promotion and integration.

CIMO statement

Context: Governments at all levels need to assess local growth and performance, develop local strategies and make decisions. To do this effectively, they require a wide range of detailed, granular data about their local areas to strengthen or supplement existing evidence.

Intervention: The ONS explored and developed statistics from new data sources to provide new or timely local insights, prioritised by user needs.

Mechanisms: Local and central government analysts working with localised data generally did not engage with the outputs of new data sources, because of a lack of awareness or because the outputs were not directly relevant to their work or remit.

Outcome: New insights were delivered following extensive exploration and development – most published in the latter stages of the project. The evaluation sample showed limited evidence of current usage. With more time and visibility, there is potential for these outputs to make a meaningful contribution.

Evidence towards the outcome

Only a few interviewees had engaged with the outputs of new data sources, even though the evaluation group included a wide range of interested and influential users and stakeholders. Those who had used the data described how it supported planning and deepened their understanding of how different places are performing, highlighting its potential value.

Evidence for further development

On seeing the list of project outputs in the interviews, most people were not aware of the new data sources' existence or potential use. Some showed familiarity with the outputs but had no active engagement with them.

A likely reason for this is the amount of time required to understand, develop and then publish statistics from the new data. Most of the outputs from this work package were published later in the project (from March 2024). Local users also need time to find, understand and use the data, and then to see any potential impacts.

Other reasons for not using the new local data included a lack of relevance to interviewees' work or role, or that they were already using similar data sourced elsewhere.

Outcome: More granular statistics for measuring progress of local growth activities and telling the story of place

Overall assessment

Outcome partially achieved.

Some outputs supported decision making, especially alongside other data. Usage varied across the outputs and was limited in part because of a lack of awareness among some local analysts. Wider impact is also expected to emerge over a longer timeframe.

CIMO statement

Context: Governments at all levels need to assess local growth and performance, develop local strategies and make decisions. To do this effectively, they require a wide range of detailed, granular data about their local areas to strengthen or supplement existing evidence.

Intervention: The ONS explored innovative techniques to disaggregate existing data to more granular levels. The ONS published these more granular statistics along with the methods and guidance to support effective use.

Mechanisms: Some government officials found the new, more granular data and methods and made use of them, often alongside other data. Others did not use them because of a lack of awareness, a lack of relevance to their work scope, time constraints, or because they did not fully understand them.

Outcome: More granular statistics supported some local planning and decision making, especially when used with other data. However, usage was inconsistent across the outputs. While stakeholders were generally satisfied with subnational data, there was still strong demand for greater detail of local areas.

Evidence towards the outcome

Some interviewees explained that more granular statistics have helped inform various aspects of decision making – such as planning and services, policy and interventions, evaluation, and monitoring. Examples from local government included:

  • sharing with elected members to help guide internal planning and the structuring of future changes in the area
  • forming part of the strategy and policy narrative presented to government about recent growth in the economy and household income
  • serving their customer base with granular geographical information, which allowed them to design targeted interventions across local authorities

Across the board, the data that you’re producing is helping us provide better evidence, better analysis and better advice to ministers. [This] helps us shape and evaluate our policy.

Ministry of Housing, Communities and Local Government official

Others explained how the granular data have fed into economic or local growth strategies.

We’ve recently published an economic strategy that sets out the county’s plans for the future. A lot of this information fed into the evidence base that was informed by and then helps set out some of our priorities going forward.

Local government official, south of England

More granular statistics were used alongside other data for funding-related activities, such as evidence in business cases for projects, funding bids and investment decisions.

The data is often part of the scene setting within a business case analysis, or part of how we’re looking to influence a project through an investment.

Local government official, Scotland

In the ONS stakeholder satisfaction survey (January to March 2025), 95% of local government respondents who had used subnational data said it had met their needs (20 out of 21). This was slightly higher than the year before (17 out of 20, 85%).

Evidence for further development

More granular local statistics had been used by many interviewees. Some barriers to wider use were highlighted, such as limited awareness, varying relevance to individual work scope, time constraints, or the need for clearer guidance to aid understanding. Addressing these areas could help unlock greater value from the outputs.

For a few respondents, improvements made to the more granular data did not go far enough. For example, data were not at the localised level needed, or there were concerns with data reliability at lower levels.

It's only at the regional level, so you can’t necessarily disaggregate it at lower levels to understand which particular local authority districts are contributing much.

Regional partnership, south of England

In response to questions about the ONS’s future focus, many respondents emphasised the importance of more granular geographic data. Some expressed a need for continued access, while others called for even more localised information. A few asked the ONS to deliver fundamental statistics ahead of new or more granular data.

Reliability of the core datasets and consistency is the most important thing so we can plan around those. Then adding these other datasets which have been interesting, but we need to have some [data] that we can really rely on to make plans off the back of.

Ministry of Housing, Communities and Local Government official

Extent of contribution from the LDI products and services

For local users, ONS Local provided significant added value through its provision of data and analysis, for example by:

  • filling data and analytical gaps; without ONS Local, respondents said they would not have had access to the necessary datasets or analytical insights
  • influencing policy and strategic decisions, such as policy development, economic evaluations and government lobbying efforts
  • playing an important role throughout the entire analysis process, including supporting interpretation and queries
  • helping to alleviate resource constraints within local teams by enhancing their analytical capacity or saving them time and effort

All of that [ONS Local] data is critical. Without it, we cannot show success. We would be completely in the dark.

Local government official, London

With new data platforms, the contribution towards local decision making was more uncertain. Where ELS did help was in providing quick access to subnational data, enabling comparisons and improving efficiency. Often, other tools or analytical expertise were needed to deepen the analysis. Local analysts sometimes shared ELS with colleagues to help address their queries, even if they did not use it directly themselves.

For more granular statistics, there were many examples where the data had contributed to decision making, benchmarking and analysis. However, respondents also relied on additional data sources or faced some limitations with the data that reduced their impact.

The GVA [gross value added] data helps fill a gap, but it’s also a good sense check against any data that we might also be producing because it gives you that benchmark and maybe shows you that your own figures are too far off track.

Local government official, Scotland

Some respondents found it challenging to clearly identify the specific contribution of more granular or new data sources to decision making. This was often because they were not directly involved in how the data were used or lacked visibility of their application.

For this short period, in such a politically fast-moving environment, it’s not actually informed decisions. I would say it’s informed the direction of thinking; so how we are pitching our local growth plan.

Senior economic advisor, local government, south of England

Many respondents spoke more broadly about the value of the ONS subnational statistics offering. This suggests that, while the data are seen as useful, the individual components of the offering may benefit from clearer communication of their impact.

Progress since the start of the project

What happened before the LDI project (before April 2022)?

Before the Local Data and Insights (LDI) project, limited social and economic statistics at local geographies hindered timely analysis of contemporary issues. ONS data reflecting levelling up priorities were sparse and mostly produced reactively. Access to granular data was restricted, and lags in data collection and production led to outdated insights.

Comparing areas was challenging because of inconsistent geographic disaggregation. Data were released based on use case, policy area, or country-specific standards. Technically skilled users often needed data for custom geographies built from smaller units, but this was limited by sample size or methodology.

Before the LDI project, levelling up data existed but lacked coverage for some missions and were scattered across departments, geographies and formats. No central place offered dashboarding or visualisation tools to support shared development or easy access to metrics.

What can be done now that could not be done before the project? 

In the ONS stakeholder satisfaction survey (January to March 2025), two-thirds of local government respondents (21 out of 33) were satisfied with the geographic breakdowns in ONS statistics compared with January 2022. Fewer than half of central government respondents (14 out of 34) were satisfied, though only five expressed dissatisfaction.

In the evaluation interviews, respondents were asked what difference ONS subnational statistics and services had made, if anything.

More granular statistics provided new data insights and enabled comparisons. Several people valued having granular data to explore an area in more depth. They appreciated that ONS data lend credibility, provide a reliable foundation, and enable broad data use.

[Without some LDI data] we would have been less well informed in our conversations with places so less able to target those policy levers that were most likely to support the government’s regional and national growth ambitions.

Ministry of Housing, Communities and Local Government official

The ONS Local service enabled deeper understanding of topics and analysis that was not previously possible. Without ONS Local input, some said they would have had no evidence to back up statements or decisions, whereas others felt the work would probably still have happened as they could have gone directly to the ONS.

It's not just a dry exercise of allowing us to track, [ONS Local data] means that we could take intervention and reprioritise, and that means stopping something and putting money towards this.

Local government official, London

With new data platforms, some local government teams with less analytical capability now spent less time on the analysis process because of the Explore Local Statistics (ELS) service.

Other influences

For this evaluation question, evidence backed up all of the project’s alternative theories, or other influences, which may have affected the outcomes and impacts we observed.

Alternative theory 1

The data are also available in alternative data products where analysts can access them in a way that meets their needs (such as Nomis, LG Inform, or products from the private sector).

Alternative theory 2

Some devolved government analysts are put off using ONS products because they assume their data are not included and that the products are therefore irrelevant to them.

Alternative theory 3

Some of the more capable analysts do not need to use ELS to present and visualise data. They can, and prefer to, do this themselves with access to the source data so that they can present it in their own way without ONS branding.

Alternative theory 4

Some central and local government analysts do not need to use ONS Local as they already have good expertise and understanding of ONS statistics or already have direct contact with statistical production teams.

Alternative theory 5

Less engaged local government officials are not aware of new services and products, or they have not used them, and they continue to do things as they have previously.

Alternative theory 6

Central government officials are unaware or unclear on what services ONS Local offers and assume it is for local government only.


7. Evaluation data

The evaluation was based on quantitative and qualitative data collected via in-depth interviews with external users and stakeholders, the Office for National Statistics (ONS) annual stakeholder satisfaction survey and an internal survey of the Local Data and Insights (LDI) project delivery teams. It also built on evaluation data collected across the three years of the project.

These were supplemented by other data, such as management information on ONS Local events and the Google Analytics of publications on the ONS website.

Stakeholder interviews

Interviews with external users and stakeholders, including the Ministry of Housing, Communities and Local Government (MHCLG), were held between October 2024 and March 2025 to gather their views on and experiences of using the services and outputs across the entire project duration.

Requests for interviews were emailed to 191 known stakeholders and collaborators, and promoted through meetings, ONS working groups, and ONS Local events. While the sample was largely familiar with the ONS, it included people with limited or no prior engagement with the project’s services and statistics.

The interview discussion focused on:

  • the ONS subnational data they use or used, including how, why and any impact
  • experiences with ONS Local, including any resulting impact
  • why they may not use the services and outputs, and any improvements needed
  • what other services and statistics they use, outside of ONS
  • future focus of local data and insights from ONS

Initially, open questions were asked to identify which ONS subnational data interviewees used, before drilling down into specific project outputs. This involved sharing links to the project outputs to help focus the discussion and verify the attribution of the feedback.

Sample

Sixty interviews were held in total. The sample included individuals with varying levels of engagement, from minimal involvement to extensive use of all project components, and a range of data expertise, including collaborative partners. Ten participants had previously been interviewed in the year 2 evaluation.

ONS annual stakeholder satisfaction survey

Questions about the project’s services and statistics were included in the ONS annual stakeholder satisfaction survey, which was open between 29 January and 12 March 2025. It aimed to understand users’ and stakeholders’ awareness of, satisfaction with, and use of ONS statistics since January 2024, including questions on ONS Local and subnational statistics for evaluation use.

A total of 155 responses were received from a range of stakeholders following wide promotion to users and producers. The most common professional sectors of respondents were local government (21%) and national government departments (14%). ONS statistics were used for different reasons, including policy and decision making (43%), informing organisational strategy (34%) and personal interest only (19%).

Year 2 data collection (April 2023 to March 2024)

An evaluation was carried out at the end of year 2, which also acted as a mid-point evaluation. The data for this were collected via an evaluation survey, completed by 144 external stakeholders and users. Several follow-up interviews with survey respondents were held to gather an in-depth understanding of their experiences.

In July 2024, our year 2 local data and insights project evaluation report was published, which provides a summary of the evaluation data, findings and recommendations at that time point.

Baseline and year 1 data collection (April 2022 to March 2023)

The data for the baseline were collected in autumn 2022 by combining desk research, stakeholder meetings and a benefits workshop. This provided insight into the current state of subnational data use, identified user needs, and highlighted the main issues the project should aim to address. Meetings were held with stakeholders from across the ONS and MHCLG, as well as local stakeholders across UK regions.

In September 2023, our Levelling up subnational data project monitoring update for year 1 was published, which provides a summary of the status at baseline and year 1.


8. Evaluation methods

Theory of change

A theory of change (ToC) for the Local Data and Insights (LDI) project was created in summer 2022, describing how the project was expected to deliver the desired changes. To develop this, user needs were defined and translated into intended outputs, outcomes and impacts, mapped against the inputs and activities required to achieve them.

It was reviewed and updated before the year 2 evaluation (October 2023 to January 2024), which was previously described in our year 2 evaluation report. Minor updates of the ToC also happened before the end of the project.

The overarching and intended impact was that policies and local decision making are better informed by subnational data. This aligns with several outcomes and impacts from the Office for National Statistics’ organisational theory of change 2024 to 2025 (PDF, 93KB).

The ToC was prioritised for the analysis, to focus on the main causal pathways for each evaluation question. These prioritised outcomes are provided in Sections 4 to 6.

Methods

Impact and process evaluation were used for this project.

Theory-based evaluation methods were considered the most appropriate, alongside quantitative and qualitative research methods, as described in Section A1 of the Magenta Book Annex A: Analytical methods for use within an evaluation (PDF, 495KB).

For the impact evaluation, the principles of realist evaluation were applied to the main evaluation questions, with different user groups considered. Elements of contribution analysis were also used to explore the wider context.

The methods helped show whether the project made a difference, how and why it happened, and how external factors affected the results.

The approach and methods were based on discussion with, and advice from, the Evaluation Trial and Advice Panel (ETAP) in April 2024.

Reach in an evaluation setting is defined as the extent to which the target audience encounter the intervention. The reach of the ONS Local and Explore Local Statistics (ELS) services was assessed for those interviewed in local government. Central and devolved government participants were not assessed because of the smaller number interviewed.

Analysis plan

Survey, interview and focus group data were analysed using content analysis to organise unstructured text, and thematic coding to group insights by common topics.
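
The following Python sketch is purely illustrative of the kind of thematic coding step described above, tagging free-text responses against a keyword-to-theme lookup. The themes, keywords and function names are invented for the example and do not reflect the coding frame or tools used in the evaluation.

```python
# Illustrative sketch only: simple keyword-based thematic coding of free-text
# responses. Themes and keywords are invented for this example and are not
# the evaluation's actual coding frame.
from collections import Counter

THEME_KEYWORDS = {
    "awareness": ["aware", "heard of"],
    "granularity": ["granular", "local authority", "ward level"],
    "decision making": ["decision", "strategy", "business case"],
}


def code_response(text: str) -> set[str]:
    """Return the set of themes whose keywords appear in a response."""
    lowered = text.lower()
    return {
        theme
        for theme, keywords in THEME_KEYWORDS.items()
        if any(keyword in lowered for keyword in keywords)
    }


def theme_counts(responses: list[str]) -> Counter:
    """Count how many responses touch on each theme."""
    counts = Counter()
    for response in responses:
        counts.update(code_response(response))
    return counts


if __name__ == "__main__":
    sample = [
        "We were not aware the data existed until the webinar.",
        "The granular figures fed into our business case.",
    ]
    print(theme_counts(sample))
```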

Sequencing of the evaluation methods followed a previous project (Evaluation of Building Capacity to Use Research Evidence (GOV.UK)) by using the following steps:

  • establishing that an outcome did (or did not) happen, and for whom
  • identifying evidence that the project did (or did not) contribute, and the influence of other factors
  • looking deeper as to how and why the project did (or did not) contribute

Context-intervention-mechanism-outcome (CIMO) statements were developed during year 2 and refined in the final evaluation.

The contribution from the project was assessed using a scale from crucial contribution to no contribution. Where the evidence of contribution was insufficient, this was acknowledged in the analysis.
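
As a reading aid only, the sketch below shows one way a CIMO statement and the contribution scale could be represented as simple data structures. The intermediate scale labels and field names are assumptions made for the example; this code was not part of the evaluation’s analysis.

```python
# Illustrative sketch only: a CIMO statement and contribution rating as simple
# Python data structures. Intermediate scale labels are invented; the report
# describes only the endpoints (crucial contribution to no contribution).
from dataclasses import dataclass
from enum import Enum


class Contribution(Enum):
    CRUCIAL = "crucial contribution"
    SOME = "some contribution"          # assumed intermediate label
    MINIMAL = "minimal contribution"    # assumed intermediate label
    NONE = "no contribution"
    INSUFFICIENT_EVIDENCE = "insufficient evidence"  # acknowledged separately


@dataclass
class CIMOStatement:
    context: str
    intervention: str
    mechanisms: str
    outcome: str
    contribution: Contribution


example = CIMOStatement(
    context="Governments need granular data to assess local growth.",
    intervention="The ONS published more granular statistics with guidance.",
    mechanisms="Some officials used the data alongside other sources.",
    outcome="Granular statistics supported some local decision making.",
    contribution=Contribution.SOME,
)
```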

Limitations of the evaluation

The findings reported here are limited by the following aspects of the evaluation.

Sample

The evaluation sample mainly included engaged, data-savvy users, with less representation from those less involved.

While interview numbers were sufficient overall, small subgroup sizes (except for local government) limited detailed analysis, which made it hard to draw firm conclusions about which users benefitted from the services and statistics.

Impacts on the public and research community were not assessed because of limited resources and their less direct involvement with the project.

Timing

The evaluation covered activity up to March 2025, with data collected in the preceding six months. Some services and projects were not fully assessed because of later completion or ongoing development, including the ELS service, which was continuously developed.

For evaluating decision making, more time was needed for people to use the data and services, and to observe any impact from this.

Data collection

The LDI project spans multiple outputs on the ONS website, and the ONS also produces subnational data not covered by the project. This made it hard to pinpoint exactly what interviewees had used, especially for the new and more granular data.

Interview questions focused on perceived impact, but responses may be affected by recall bias (not accurately remembering and reporting past experiences).

Citations of project outputs were not included in the evaluation because references were inconsistent (for example, in the use of full titles or authorship), which made systematic tracking unreliable.


9. Glossary

Devolved governments

A collective term for the executive bodies in Northern Ireland, Scotland and Wales: the Northern Ireland Executive, the Scottish Government and the Welsh Government.

For some topic areas, such as where policy is devolved, each country may have several bodies responsible for producing statistics. This can make finding data on a particular topic difficult. Our Statistics across the UK web page provides more support and guidance on the matter.

Subnational

The term “subnational” refers to all data that are produced for the 12 International Territorial Level 1 (ITL1) areas in the UK and smaller geographical areas.


10. Future developments and recommendations

In the 2025 to 2026 spending review period, the Office for National Statistics (ONS) and the Ministry of Housing, Communities and Local Government (MHCLG) will continue their partnership to deliver Local Data and Insights. The project will focus on continuation of the ONS Local service, maintenance of the Explore Local Statistics (ELS) service and further development of granular data and local insights.

Based on the final evaluation, the following activities are recommended to guide the future of this work.

ONS Local should:

  • define engagement levels to meet different user groups’ needs
  • raise central government awareness of the service and its benefits
  • prioritise outreach to local teams with limited analytical capacity
  • maintain and monitor parts of the service where the outcomes were achieved

New data platforms (ELS) should:

  • refine target users and tailor future product features to their needs
  • promote the service and updates to boost engagement, especially among non-users

New data sources, methods and more granular local data should:

  • increase awareness and usage of these data through targeted promotion
  • continue delivering improved granular data based on user needs
  • continue to work with local and central government to understand the impact of these sources on policy changes and effectiveness

12. Cite this article

Office for National Statistics (ONS), released 22 September 2025, ONS website, article, Local Data and Insights project, final evaluation report, UK: April 2022 to March 2025
