1. Executive summary
This report presents a systematic investigation of public trust in the context of social surveys at the Office for National Statistics (ONS). The report covers three key chapters:
Behavioural Insights (BI) workshops
Pilot Communications Campaign
Evaluation
The BI workshops identified barriers to trust through engagement with internal stakeholders, interviewers, and members of the public. They examined these barriers using Behavioural Insights, an approach that applies psychology and behavioural science to understand and influence how people make decisions.
The report then describes a novel pilot communications campaign designed to build trust in the ONS and increase participation in social surveys.
Finally, it covers an evaluation of the impact of this pilot communications campaign on both trust and respondent participation levels.
Pre-live campaign testing of direct mail postcards, broadcast video on demand (BVOD) and digital out of home (DOOH) billboards revealed key insights into public engagement and design effectiveness. It also provided insights into the role of trust, demographics, and socio-economic status in communication outcomes.
Inclusive design and multi-channel strategies were found to be essential to resonate with diverse audiences, though further research is needed on socio-economic impacts. Institutional trust emerged as a critical factor: participants with low trust were less likely to find materials relevant or memorable, underscoring the need to build and maintain trust for successful survey engagement. Any future campaigns should make the most of opportunities for more targeted messaging based on audience segmentation and wider insights.
Operational limitations affected the trial's evaluation approach, making it difficult to attribute changes to specific interventions or identify affected demographics. However, contribution analysis suggested that direct mail postcards helped to:
raise awareness of the ONS
reassure recipients of survey legitimacy
highlight participation importance
While there were indications that direct mail postcards may have contributed to increased participation, further research is required to confirm this effect, because of the challenges in isolating their specific impact and the scale of the pilot.
Recommendations
Building on these findings, the research identifies 10 recommendations, which both consolidate existing best practice and provide a focus for future trust-building and survey engagement efforts.
Simplify messaging, make calls to action stand out, and clearly show why the message matters to people's everyday lives.
Use a phased, multi‑channel communications approach to reach people in different ways over time.
Make branding clear and easy to verify, giving people simple ways to confirm authenticity and understand what the ONS is and what it does.
Address scam concerns directly but cautiously, as fear of scams was the biggest barrier to trust.
Design inclusively for diverse audiences, tailoring campaigns and using flexible, audience‑sensitive designs where possible.
Build trust in the organisation before asking people to take part, since low trust strongly reduces engagement.
Segment audiences carefully, managing trade‑offs so that tailored messages for one group do not reduce engagement in others.
Pre‑test materials thoroughly with diverse groups, as the pilot showed major differences in how groups responded.
Create evaluation frameworks that can isolate the impact of each intervention, since the pilot made it difficult to link outcomes to specific actions.
Monitor external events that may affect trust, as contextual factors, such as local disruptions like bin strikes, may have shaped people's attitudes.
2. Overview of trust barriers to survey participation
Globally, government departments and national statistical institutes (NSIs) often rely on voluntary public participation in social surveys to produce reliable, high-quality data that inform policy decisions. Research from the ONS on engaging with the public on data suggests that there is high public trust in official statistics. However, this research also highlights that trust is only one factor influencing willingness to take part in surveys. Some barriers may be related to trust, but others reflect practical considerations, such as respondents being time poor or having a lower level of digital literacy.
Additional insight from Office for National Statistics (ONS) interviewer feedback and refusal data provides further context showing that despite high trust in official statistics, participation decisions are shaped by a broader set of attitudes and experiences. These include scepticism about survey authenticity, uncertainty about purpose and varying awareness of the ONS's role in collecting social survey data. Together, these evidence sources show that trust in the ONS does not necessarily extend to trust in social surveys or data collection. This demonstrates the complex and multifaceted nature of trust in the context of public engagement.
According to the Organisation for Economic Co-operation and Development's (OECD's) 2017 Guidelines on Measuring Trust, trust is defined as "a person's belief that another person or institution will act consistently with their expectations of positive behaviour". For survey respondents, trust relates both to interpersonal trust, meaning confidence in the individuals that represent the organisation, and institutional trust in the organisation itself. Interpersonal trust is often formed during initial contact, whether face-to-face or via telephone. Institutional trust may also be influenced by wider perceptions of government, data protection and institutional independence.
The UK National Statistician's Inclusive Data Taskforce (IDT) was set up to recommend how best to make improvements to the inclusivity of UK data and evidence. The first recommendation of the IDT's Leaving no one behind report, in 2021, was "to create an environment of trust and trustworthiness which allows and encourages everyone to be counted in UK data and evidence". The report highlighted that "trust is crucially important for the collection and use of data and for inclusion in statistics".
Building trust with survey respondents is a complex and challenging process. Understanding both the opportunities for, and barriers to, trust within the survey process is therefore not only of theoretical interest, but also of practical importance for NSIs.
Four main barriers related to trust emerged from reviewing previous ONS research. These were:
limited awareness of the organisation and its role in collecting data through social surveys
limited understanding of the purpose of data collection and the organisation's independence from government
concerns about data handling and data protection, including privacy, security and anonymity
issues experienced during the survey process
These barriers are interconnected and influence both interpersonal trust (for example, in interviewers) and institutional trust (for example, in the ONS as an organisation). They reflect wider societal trends, including:
declining trust in institutions
concerns about data security and privacy
growing scepticism towards unsolicited contacts
The findings are in line with the ONS's ongoing work on engaging the public with data, which has shown that public acceptance of data depends on factors such as:
the type of data being used
the purpose of the data
which organisations have access to the data
whether the data are handled safely and securely
A further challenge relates to how trust-related barriers are distributed across different areas and demographic groups. Underrepresented groups are not geographically concentrated, and traditional advertising channels do not allow the precise targeting needed to reach households experiencing low institutional trust or unfamiliarity with the ONS. This creates a practical constraint for addressing trust barriers; even if messaging is well-designed, it often cannot be directed towards the audiences who would benefit from it most.
These findings highlight that building trust requires an integrated and adaptive approach across all stages of the respondent journey to meet respondent needs, not a single intervention. By proactively addressing trust to meet the needs of different groups, surveys have the potential to become more inclusive, improving participation and strengthening the quality of official statistics.
3. Behavioural insights workshops
We conducted a workshop in December 2023 using the TESTS framework (Target, Explore, Solution, Trial, Scale) to further examine trust-related barriers to participation in Office for National Statistics (ONS) social surveys. The session focused on identifying practical and process-related barriers within the survey journey and exploring opportunities to improve participation. The workshop was an ideas session and was not intended to define or implement specific interventions.
The workshop included ONS staff with expertise in:
field interviewing
communications and public engagement
survey operations
research and methodology
TESTS framework – Target phase
The problem that we identified was low response rates in ONS social surveys, which affects data quality and representativeness. The target behaviour was increasing participation by addressing trust-related barriers.
Internal research shows persistent underrepresentation among certain demographic and regional groups across ONS surveys, including:
young people (aged 16 to 30 years)
ethnic minority groups
single (unmarried) individuals
households with three or more people
mortgage holders
those living in the most deprived areas
people in routine or intermediate occupations
residents of London, the North West, Scotland and the West Midlands
These patterns highlight persistent participation gaps.
In workshops, stakeholders discussed whether to prioritise:
increasing overall response rates across all groups
improving representativeness among underrepresented groups
pursuing a balanced strategy that achieves both goals
In defining the project objectives for improving survey participation, it was essential to ensure they were specific, measurable, achievable, realistic and timely (SMART). Clear, measurable goals helped to align strategic priorities, guide intervention design, and set realistic expectations for impact.
We explored two further potential objectives:
Objective 1: raising overall response rates by a set percentage
Objective 2: closing participation gaps for key underrepresented groups
Objective 2 was selected as the overall goal for the campaign.
TESTS framework – Explore phase
The workshop was designed to understand the context in which a target behaviour takes place and to identify the key barriers to it, as well as the factors that encourage and facilitate it. In this case, the target behaviour is increasing participation in social surveys, with substantial interest in focusing on Objective 2: closing participation gaps for key underrepresented groups.
A range of trust-related barriers that influence whether people decide to take part were discussed and summarised into the following high-level themes:
limited awareness of the organisation
limited understanding of the purpose of data collection
lack of understanding about data handling and protection
issues experienced during the survey process
During the workshop, the four barriers were examined using the COM-B model (Capability, Opportunity, Motivation – Behaviour) and mapped to the respondent journey:
when barriers occur (for example, at invitation stage, during interviewer contact, at follow-up)
where they occur (for example, letters, doorstep, helpline, online platform)
how they manifest (for example, fear of scams, lack of understanding, mistrust, accessibility issues)
The main trust barriers identified using the COM-B model are shown in the following subsection.
Key trust barriers identified using the COM-B model
Barrier 1: Limited awareness of the ONS
Capability
Respondents may lack knowledge and understanding of the organisation, its independence and its role in running surveys.
Opportunity
The way materials are presented (for example, envelopes, letters, branding) affects whether respondents can easily verify authenticity and engage.
Motivation
Low familiarity can lead to mistrust, fear of scams or the perception of low relevance, reducing willingness to participate.
Barrier 2: Limited understanding of the purpose of data collection
Capability
Respondents may not fully understand what data are collected or how this contributes to policy decisions.
Opportunity
Limited clear communication at early touchpoints means that respondents have fewer chances to learn why the survey matters.
Motivation
When respondents cannot see the purpose or impact of their contribution, or feel disillusioned, they are less motivated to engage.
Barrier 3: Lack of understanding about data handling and protection
Capability
Gaps in knowledge about anonymisation, legal protections and how data are stored reduce confidence.
Opportunity
Lack of clear, accessible explanations about data security limits reassurance opportunities.
Motivation
Concerns about privacy and data misuse can trigger fear or anxiety, reducing trust and willingness to participate.
Barrier 4: Issues experienced during the survey process
Capability
Respondents may face barriers linked to accessibility, such as understanding instructions, navigating online platforms, or verifying an interviewer's identity.
Opportunity
Poor experiences with interviewer interactions, delays with incentives, or inaccessible materials reduce the opportunity to participate comfortably.
Motivation
Negative emotional responses – such as anxiety, discomfort or frustration – can lead to attrition or refusal to take part in future waves.
TESTS framework – Solution phase
Potential solutions were identified using the Behavioural Insights Team's EAST framework (PDF, 573KB). One proposed solution was a paid communications campaign, building on the success of the ONS's Census 2021 campaign, which demonstrated that targeted paid media effectively engaged audiences and increased response rates. It was thought that a similar approach could help to overcome trust barriers and increase survey participation, so a trial communications campaign was agreed as the next step.
4. Trial pilot communications campaign
As a result of previous research into the role of trust in shaping public engagement in social surveys and insights from the Behavioural Insights workshops, the Office for National Statistics (ONS) ran a pilot communications campaign in Birmingham Local Authority. The campaign was designed to address trust-related barriers affecting participation in ONS social surveys. The campaign aimed to test how targeted, locally tailored communication could help build trust, increase awareness of the ONS, and improve response rates. A key innovation of the pilot was to use channels that allowed postcode-level targeting – something that had not been previously explored in survey-related communications.
OASIS Framework
The campaign was developed using the Government Communication Service's (GCS's) recommended OASIS framework.
OASIS is a series of steps that can help bring order and clarity to planning campaigns. The aim is to help make the planning process rigorous and consistent. The five steps required to create a campaign using OASIS are:
objectives
audience/insight
strategy/ideas
implementation
scoring/evaluation
Objectives
The communications campaign was designed to effectively test whether targeted communication could improve public trust in and engagement with ONS social surveys.
The campaign aimed to address priority trust barriers by using GCS's COM-B behaviour change model.
Barriers to trust and engagement
Capability
difficulty identifying genuine survey materials
concerns about scams
uncertainty of data security and confidentiality
Opportunity
lack of visible, verifiable communication
uncertainty about legitimacy of surveys in the local environment
Motivation
low motivation to participate
doubts about societal value
concerns about safety and trustworthiness of communications
How the campaign aimed to address barriers to trust and engagement
Capability
prominently displaying ONS branding
signposting to the ONS website landing page through multiple channels to help respondents recognise materials and understand secure data handling
Opportunity
using multiple visible and verifiable communication channels to prime households to expect the survey, reinforce legitimacy, and provide consistent cues for recognition and trust
Motivation
maintaining consistent and positive messaging across all materials to highlight the societal value of contributions and to reassure households that ONS communications are safe and trustworthy
Audience and insight
A questionnaire was run with ONS field operations staff to gather insights into the trust barriers they most frequently encounter at first contact with respondents. To narrow the focus of the campaign, participants ranked a set of 10 predefined trust barriers and reported how often each occurred (low, medium or high frequency). They also provided open-text feedback on how the ONS could build more public trust.
Based on 204 responses, the four most common trust barriers (in order of prevalence) were:
"I am worried that the survey could be a scam."
"I am unsure if my data will remain confidential and if my anonymity will be protected."
"It is not clear why my contribution is important and will make a difference."
"I don't know who the ONS is/that the ONS runs surveys."
Geographic pilot area
The pilot area needed to reflect a region where survey participation was particularly challenging and where an intervention could provide meaningful insights. Research shows that groups that are underrepresented in social surveys – including those living in deprived areas, younger people, and ethnic minority groups – tend to report lower levels of both generalised and institutional trust. The initial objective was to target underrepresented groups, but this would have made the target audiences for the campaign very small. It was decided instead to focus on all audiences, while ensuring that underrepresented groups could relate to the advertising materials.
A large volume of sampled cases was necessary to generate sufficient data for evaluation of the campaign's impact within the budget available, and to ensure that any observed effects could be meaningfully analysed. For more information, see Section 6: Overview of methodology and evaluation.
Birmingham was selected as the pilot area because of its:
low response rates (below 35%)
high deprivation levels (high Index of Multiple Deprivation score)
higher proportions of underrepresented groups (including younger households, larger households, ethnic minorities)
large volume of sampled cases that were suitable for piloting a communication campaign within available budget and timescales
Strategy and ideas
A comparative analysis of trust and survey communication strategies across statistical agencies (Canada, Ireland, Australia, Netherlands, and the UK) revealed common practices aimed at building public confidence. These practices include:
strong messaging on data security and confidentiality
assurances of independence and transparency
proactive measures against scams through official verification channels
These agencies emphasise the civic value of participation by framing surveys as essential for informed social and economic decisions, while expressing gratitude to respondents. Many agencies feature dedicated trust webpages, branding, videos, and leadership statements to reinforce credibility, with memorable taglines such as "Count on us, we count on you", which is included in a Central Statistics Office (CSO) of Ireland promotional video. Commercial companies adopt similar tactics for identity verification and fraud prevention.
Key recommendations for the ONS include:
creating a clear, accessible trust webpage
consolidating trust-related content
using videos and engaging taglines to enhance public trust and participation
Multi-channel approach
The campaign was deliberately designed as a multi-channel intervention. This is because delivering an integrated campaign with consistent messaging across multiple channels is more effective than using single channels alone. Using multiple channels also allowed for the impact and effectiveness of different channels to be evaluated.
Outcomes from the ONS's Census 2021 communications campaign showed that a phased approach to engaging with audiences was most effective at encouraging participation, moving from direct mail (a postcard) to a paid-for media approach across multiple channels. This allowed the campaign to deliver engagement points with the right messaging at each phase of the wave of contact.
The channels were chosen because they enabled postcode-level targeting. This was critical, because the goal of the campaign was to trigger behaviour change using campaign communications that were immediately followed by an invitation to participate in an ONS survey.
Paid media strategy
Precise targeting was essential, because the campaign needed to target only 762 households out of approximately 423,500 households in the Birmingham area (according to Census 2021 data).
The campaign focused on paid media channels that allowed audiences to be targeted using the full postcode. After consideration, we ruled out many channels, such as social media and other digital advertising, because they only allowed targeting at district-postcode level. This would have meant serving adverts to an average of 8,200 households per district when the campaign needed to target a small number, and would have led to a large amount of wastage.
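The scale of this targeting problem can be sketched with some back-of-envelope arithmetic. The household figures below are from the report; the wastage calculation itself is an illustrative estimate, not ONS methodology.

```python
# Back-of-envelope sketch of targeting wastage (illustrative only).
# Household figures are from the report; the calculation is our own.
target_households = 762          # sampled households in Birmingham
birmingham_households = 423_500  # approximate total (Census 2021)
district_reach = 8_200           # average households per postcode district

# Share of Birmingham households actually in the sample
sample_share = target_households / birmingham_households

# Even in the best case (every target household in one district),
# district-level targeting wastes most impressions on non-sampled households
wastage = 1 - min(target_households, district_reach) / district_reach

print(f"Sampled share: {sample_share:.2%}")                 # roughly 0.18%
print(f"Minimum wastage at district level: {wastage:.0%}")  # roughly 91%
```

With at least roughly 91% of district-level impressions wasted, only channels supporting full-postcode targeting could deliver adverts efficiently to the 762 sampled households.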
After researching all media channels, and considering the budget and time required to make these operational, three routes provided the possibility of targeting using the full postcode. These were:
direct mail (postcard)
broadcast video on demand (BVOD)
digital out of home (DOOH) billboards
The channels operated strategically across the implementation period, aligning with the operational phases of survey roll out, particularly the invite letters and the field work.
Messaging across adverts was adapted to align to the three campaign phases: prime, prompt, and remind.
Phase 1: prime
We sent a postcard letting households know the invitation letter was coming and to look out for it. The postcard featured language that highlighted this and used an image of the letter. It also directed them to the landing page (via QR code/URL) for further information. The postcard was distributed at least one week before the invite letter.
Phase 2: prompt
A 30-second BVOD advert was shown on ITVX, Channel 4 and Sky to prompt participation during the fieldwork period, addressing trust barriers. The video went live approximately two weeks before fieldwork took place.
Phase 3: remind
DOOH billboards were placed within 1 kilometre of target postcodes to remind households to complete surveys and to increase general trust in the ONS across the wider Birmingham population. Adverts went live during the final few weeks of the campaign, in conjunction with the fieldwork phase.
Development of the campaign proposition
The ONS's Marketing, Survey and Design teams developed the creative campaign proposition. A creative workshop generated a long list of ideas, which were refined through an online survey. Some ideas were ruled out as too complex, or because there was not enough time to fully develop them. A shortlist of two key concepts was taken forward for testing:
"It starts with you (ISWY)"
"Make yourself heard (MYH)"
The objective was to determine which approach most effectively addressed trust barriers and encouraged participation. Two postcard designs were developed for each of the key concepts.
Three DOOH billboard designs were also created for each concept. The first was designed for impact and included an example of how survey data are used. The second was a generic design aimed at encouraging survey participation. The third focused on breaking down the trust barrier by addressing concerns around confidentiality.
Evaluating creative material testing
Before finalising the creative materials for the trust campaign, we conducted testing to ensure that the proposed messages, imagery and taglines effectively addressed the identified trust barriers. The objective was to identify which creative approach most effectively promoted trust and increased the likelihood of someone taking part in an ONS survey.
The testing was conducted through an online survey with a nationally representative sample of 1,004 respondents across England, Scotland and Wales, based on age, gender and region, and using quota sampling. An additional 113 participants from Asian, Asian British or Asian Welsh ethnic backgrounds were included because Asian ethnic groups are the most underrepresented in ONS surveys. This sample also aligns with the ethnic makeup of Birmingham, which ensures diverse representation and facilitates subgroup analysis.
Half of participants saw one postcard and three DOOH billboard adverts including the tagline "It starts with you (ISWY)". The remaining half of respondents saw one postcard and three DOOH billboard adverts, all including the tagline "Make yourself heard (MYH)".
Respondents were asked to rate each creative concept on several measures, including:
clarity
trustworthiness
memorability
relevance
appeal
likelihood to act
Qualitative feedback was also collected to understand emotional responses and suggestions for improvement.
Postcard testing findings
Both postcard designs were seen as clear, informative, and professional. Most participants felt the postcards explained why they should take part and that the postcards would encourage participation. Relatable content, such as references to cost of living, was positively received.
Some found the text on the back too dense and suggested a simpler layout. Colour feedback was mixed, and preferences varied, highlighting the need for inclusive design. The image of shoes on the ISWY postcard was described as "irrelevant" by some.
Demographic variation in response to the ISWY postcard
There was notable demographic variation in responses to the ISWY postcard. Respondents aged 55 years and over were significantly more likely to rate the ISWY postcard as "memorable" (54%), compared with those aged under 35 years (42%). Older participants also found the postcard more relevant. A higher proportion of younger respondents rated the postcard as "unappealing". The proportions rating the postcard as unappealing were:
12% of those aged under 35 years
15% of those aged 35 to 54 years
6% of those aged 55 years and over
White respondents were significantly more likely than Asian, Asian British or Asian Welsh respondents to rate the ISWY postcard as "relevant" (63%, compared with 43%) and "eye-catching" (59%, compared with 43%).
These findings suggest that while the ISWY postcard was generally received well, its effectiveness varied across demographic groups. This highlights a need for further refinement and development of targeted communication strategies to ensure inclusivity and resonance with diverse audiences in future iterations. Because of the small geographic scope and limited number of participants targeted in this campaign, it was not feasible to tailor the postcard to specific audience groups.
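For readers interested in how differences like the 63% versus 43% "relevant" ratings might be assessed, the following is a minimal sketch of a two-proportion z-test. The percentages are from the findings above; the subgroup sizes (n1, n2) are hypothetical, chosen only for illustration, and this standard test is an assumption, not necessarily the specific method used in the analysis.

```python
import math

# Hypothetical illustration of the kind of test behind "significantly more
# likely": a two-proportion z-test. The 63% and 43% figures are from the
# report; the subgroup sizes below are assumed for illustration only.
def two_prop_z(p1, n1, p2, n2):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# 63% of an assumed 800 White respondents vs 43% of an assumed 200 Asian,
# Asian British or Asian Welsh respondents rated the postcard "relevant"
z, p = two_prop_z(0.63, 800, 0.43, 200)
print(f"z = {z:.2f}, p = {p:.4f}")  # |z| > 1.96 implies significance at 5%
```

With these assumed sample sizes, |z| comfortably exceeds 1.96, consistent with a difference reported as statistically significant at the 5% level.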
Trust and perceived effectiveness of postcards
Participants who expressed low or no trust in the ONS were significantly less likely to rate the postcards as "memorable" or "relevant", compared with those who reported any level of trust. They were also considerably less likely to indicate that they would read either postcard.
This suggests that if trust does not exist or has been lost, people are unlikely to engage with any materials. The campaign aimed to reassure households that ONS communications are safe and trustworthy. However, this finding shows that reassurance alone may not work if confidence is already lost. Trust clearly shapes how materials are received; even well-designed content may fail to engage people who lack confidence in the organisation. Building and maintaining institutional trust is therefore essential for effective public engagement.
However, it is likely that there is an overlap between people who do not trust the ONS and people who do not trust the government more broadly. This means we must recognise that a single institutional intervention may be limited in its impact.
Digital out of home billboards findings
Attention and engagement
Positive responses – DOOH billboards successfully captured attention, with approximately half of respondents expressing interest.
Negative responses – a small number of respondents found the visuals plain and lacking attention-grabbing features, saying:
"It does not stand out."
"Boring, isn't it."
Clarity of purpose and call to action
Negative responses – several respondents felt that the purpose of the billboard was unclear, saying:
"It does not explain why people should do the survey."
"It expects you to know what an ONS survey is."
Content relevance
Positive responses – younger respondents engaged more with the ISWY DOOH billboard, including 41% of those aged under 35 years, compared with 27% of those aged over 35 years.
Negative responses – several respondents did not believe that a survey response could have an impact on things like mortgage rates or the cost of living, which were cited in the impact example, saying:
"No survey is going to change that fact."
Other respondents felt that examples lacked relevance.
Visual design and imagery
Positive responses – several respondents appreciated the use of colour.
Negative responses – respondents viewed the image of shoes unfavourably and described the images of chains and locks as "strange and off-putting".
Font size and readability
Negative responses – there was a recurring concern about the size of the text, with respondents saying:
"The font is tiny."
"Print at the bottom is tiny."
Additional features
Many respondents suggested adding a QR code for more information.
Tone and messaging
Positive responses – respondents considered both versions to be clear and approachable.
Negative responses – the tone of the DOOH billboard raised suspicion among some respondents, who said:
"Looks like pushy propaganda."
Comparing the two taglines
The ISWY tagline performed slightly better across several engagement metrics (though differences were not statistically significant in most cases). Notably:
respondents were more likely to recall the ISWY postcard and report actions, such as looking out for the survey letter
the ISWY postcard was rated as more memorable, eye-catching, and appealing, compared with the MYH postcard
younger respondents (aged under 35 years) were more likely to engage with the ISWY DOOH billboard
Direct comparisons of respondents' tagline preference showed that they slightly favoured MYH. However, we decided to prioritise behavioural indicators – such as reported actions – over self-reported preferences when we evaluated the effectiveness of the materials.
These findings suggest that while both taglines have merit, ISWY may be more effective in prompting action and engagement.
Final creative proposition
The ISWY postcard had stronger overall engagement, compared with the MYH postcard, particularly in prompting respondents to look out for their survey letter, encouraging participation and being retained for future reference. It was also rated slightly higher across key design metrics, including clarity, memorability and visual appeal. However, effectiveness varied across demographic groups and younger and ethnic minority respondents were less likely to find the postcard appealing or relevant.
The ISWY DOOH billboard had generally better positive emotional impact and engagement among younger respondents. However, all DOOH billboard versions had issues relating to clarity of purpose, visual appeal, and accessibility of information. Additionally, reactions to the barrier DOOH billboard highlighted the importance of tone and framing when addressing sensitive topics like confidentiality. Participants emphasised a need for clearer messaging about the survey's purpose and benefits, and more inclusive and audience-sensitive design elements. Better ONS branding and clearer signposting (for example, via QR codes) could improve comprehension and impact, more effectively building trust.
Based on the testing results, we selected the ISWY creative route for the campaign. We made several adjustments before rollout, including:
addressing findings on the images by replacing them with more relevant visuals (for example, the impact DOOH billboard referred to mortgage rates, which was felt to be irrelevant for some, so the message was changed to appeal more to younger people)
clarifying the call to action and the purpose of the campaign
simplifying text and ensuring font sizes were easily readable
adding QR codes
revising the DOOH billboard examples to ensure broader relevance and inclusion of the ONS logo
The final concepts are illustrated in the following figures, showing campaign images.
It starts with you billboard

Source: Office for National Statistics
It starts with you postcard

Source: Office for National Statistics
5. Implementation
The communications campaign pilot was launched between 4 February and 28 March 2025 in Birmingham.
Broadcast video on demand (BVOD)
BVOD was viewed 1.40 million times across Sky, ITVX and Channel 4 TV on-demand services.
Digital out of home (DOOH)
Digital billboards placed up to 1 kilometre from target postcodes were projected to be seen by over half a million people and viewed 1.22 million times (as some people would see them more than once).
6. Overview of methodology and evaluation
As the pilot communications campaign was a test campaign, it was vital that an evaluation was conducted to inform decisions about whether the targeted campaign should and could be implemented more widely, across the UK. The evaluation aimed to understand:
whether a targeted campaign is an effective approach to overcoming the key trust barriers, which could in turn contribute to increased survey participation
whether respondents could be targeted effectively
what we learnt from the delivery of the campaign
A Theory of Change (ToC) was developed, mapping campaign inputs – postcards, video broadcasts, digital out of home billboards (DOOH) and the Office for National Statistics (ONS) landing page – to outputs, outcomes and impacts.
The evaluation used a theory-based approach, specifically Contribution Analysis, as experimental or quasi-experimental designs were not feasible owing to the lack of randomisation, budget, and time constraints. Contribution Analysis offers a structured way to assess causal contribution without experimental conditions, using the ToC to examine how the campaign influenced outcomes and to rule out alternative explanations.
The ToC outlined expected outcomes for households, including:
awareness of ONS surveys
understanding of why participation matters
trust that data will remain confidential
feeling motivated to engage
recognising the survey letter
not perceiving the survey as a scam
The ToC outlined the intended impacts as:
increased trust in the ONS
greater awareness of surveys
improved response rates in the targeted Local Authority (LA)
This aligns with the ONS organisational theory of change model (PDF, 69.7KB) outcomes:
"more responsive approach to user needs on priority issues"
"better quality statistical outputs"
It also aligns with the impacts:
"the reputation of the ONS is enhanced"
"users are more satisfied with ONS statistics"
Evaluation questions
Evaluation questions were defined to align with outcomes from the ToC. There were originally two overarching evaluation questions.
Question 1: Did the campaign contribute to addressing trust barriers relating to participation in ONS social surveys?
Question 2: Did the campaign contribute to increased social survey participation?
When developing the evaluation these were then broken down further, into eight questions:
Were campaign materials received and/or viewed by sampled households?
Did the campaign contribute to increasing awareness of ONS surveys?
Did the campaign contribute to increasing understanding of why individual contribution is important?
Did the campaign contribute to increasing confidence that the data will remain confidential?
Did the campaign contribute to reassuring respondents that the survey is not a scam?
Did the campaign contribute to increased social survey participation?
Did the campaign contribute to increased motivation to participate in ONS surveys?
Did exposure to broadcast video on demand (BVOD) and/or digital out of home (DOOH) billboard campaign materials among non-sampled individuals increase awareness and trust in ONS social surveys?
Evaluation data collection
The ONS communications team collaborated with the media buyers, OmniGOV, to gather metrics on household exposure to the BVOD advertisement and DOOH billboards. Data on landing-page visits were collected, which provided evidence that recipients engaged with the campaign by using either the provided link or the QR code. Access to information about the ONS was provided through the postcards, the BVOD advert and a DOOH billboard.
For survey response rate analysis, two key comparisons were made against the Birmingham trial period (covering 24 February 2025 to 13 April 2025). The first was a comparison with the cohorts from seven weeks before the trial. The second was a comparison with the same period as the trial, but from the previous year (24 February 2024 to 13 April 2024). Manchester was selected as a comparator local authority because it has similar characteristics to Birmingham: comparable age and ethnicity profiles, and a sufficiently large response size.
Owing to sample-size considerations, the analysis was conducted using Transformed Labour Force Survey (TLFS) data only, with a sample size of 586 respondents.
Focus groups
Two focus groups with field interviewers were conducted to gather insights into interviewer experiences during the Trust Campaign. All field interviewers involved in the Trust Campaign were invited to participate.
Group 1
Group 1 consisted of three participants. These participants were all TLFS interviewers, who were involved in approaching potential respondents to encourage participation, also known as "knocks to nudge".
Group 2
Group 2 consisted of nine participants. These participants were all General Field Force (GFF) interviewers, who were face-to-face interviewers engaging with potential respondents at any point of contact.
Transcripts from the focus groups were analysed using thematic analysis.
Respondent Evaluation Survey
Finally, a Respondent Evaluation Survey was designed to gather feedback on the trust campaign, focusing on respondents' experiences and perceptions. Individuals must previously have been selected for an ONS face-to-face household survey in the Birmingham Local Authority, including:
TLFS
Labour Force Survey (LFS)
Wealth and Assets Survey (WAS)
Living Costs and Food Survey (LCF)
Family Resources Survey (FRS)
Invitation methods were different depending on the survey:
TLFS – online invitations were sent via email to those who consented to re-contact; a £5 incentive was offered for online completion
LFS – respondents were invited to complete the survey during any doorstep contact
TLFS, Knock to Nudge (K2N) – some interviews were conducted in person during field visits
Evaluation analysis
Were campaign materials received and/or viewed by sampled households?
We measured results across:
delivery of postcards to sample households
delivery of adverts to sample households
delivery of DOOH billboards
all planned outputs delivered on time and within budget
the number of landing page views
Postcards were delivered to 762 sampled households. Field interviewers who participated in later focus groups reported that postcards were not received during the first two weeks of the campaign. However, no evidence was found to substantiate these claims.
The non-skippable advert (shown when selecting on-demand services) was shown 1.4 million times in 762 postcode areas. ITVX delivered 1,298 clicks and a 0.22% click-through rate, against a benchmark of 0.05%.
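As a rough sanity check, the click-through figures above can be reconciled with simple arithmetic. This is an illustrative sketch: the implied ITVX impression count is derived from the rounded 0.22% click-through rate, not a figure taken from the campaign data.

```python
# Back-of-envelope check on the reported ITVX figures.
# Clicks, CTR and benchmark are from the report; the impression count
# is derived, so it is only an estimate (the 0.22% CTR is rounded).
clicks = 1_298
ctr = 0.0022            # 0.22% click-through rate
benchmark_ctr = 0.0005  # 0.05% benchmark

implied_impressions = clicks / ctr
print(round(implied_impressions))      # roughly 590,000 ITVX impressions
print(round(ctr / benchmark_ctr, 1))   # CTR is about 4.4 times the benchmark
```

This also shows why the 1,298 clicks cannot be divided by the full 1.4 million impressions: that total covers Sky, ITVX and Channel 4 combined, whereas the clicks and CTR relate to ITVX only.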
We estimate that the DOOH billboards reached 60.2% of the target audience, with an impact level of 1.72 million. This was an over-delivery of 40.3% (this means that the advert was shown more than was planned). The remaining 39.8% could not be reached because of a lack of available inventory in those postcode areas. However, there was a degree of spillover, with individuals from non-targeted households also being exposed to the DOOH billboards.
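The over-delivery figure can be unpacked with simple arithmetic. In this minimal sketch, the planned impact count is derived from the reported actuals rather than taken from the campaign plan itself:

```python
# Deriving the planned DOOH impact count from the reported actuals.
# The 1.72 million impacts and 40.3% over-delivery are from the report;
# "planned_impacts" is a derived estimate, not a campaign-plan figure.
actual_impacts = 1_720_000
over_delivery = 0.403   # delivered 40.3% more impacts than planned

planned_impacts = actual_impacts / (1 + over_delivery)
print(round(planned_impacts / 1_000_000, 2))   # roughly 1.23 million planned
```

Allowing for rounding, this is broadly consistent with the 1.22 million projected views quoted in the Implementation section.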
The landing page was seen 282 times in total (177 views from the postcard, 97 views from the advert and 8 views from the DOOH billboard).
The sample size of the evaluation survey was very limited, with 125 interviewer-led responses and 33 online follow-up responses from TLFS. This strictly limits the reliability and representativeness of the findings. However, in total:
44% of respondents recalled seeing the postcard
13% of respondents recalled seeing the BVOD advert
4% of respondents recalled seeing the DOOH billboard
The vast majority of TLFS online respondents (91%) who completed the evaluation survey reported noticing the postcard. In contrast, only about a third (32%) of those asked by interviewers on the doorstep said they had seen it. This discrepancy was also reflected in the qualitative feedback gathered by interviewers, suggesting that while the postcard is visible to many, it may not be seen by every member of the household.
Under half of TLFS online respondents (45%) who completed the evaluation survey reported seeing the BVOD advert. In contrast, 4% of those who were interviewed face to face recalled having seen it. This pattern was repeated in feedback from field interviewers during focus groups, who noted that some respondents they spoke to on the doorstep did not recall the advert. These reports contrast with web metrics that tell us that it was shown 1.4 million times. However, evaluation survey figures should be interpreted with caution because of the small sample size of the research.
Advertising and materials recall is often underestimated: people frequently claim not to have seen or heard advertisements that they have in fact been shown, owing to passive exposure and selective recall. It is also possible for advertisements that people say they do not recall to influence their attitudes and behaviours. Nevertheless, the BVOD advert was shown 1.4 million times in the postcard areas only, which is a substantial number. In addition, the advert received 1,298 clicks, in excess of the target, with a 0.22% click-through rate compared with a target of 0.05%. This indicates interest in the content of the advert, although web metrics tell us that the landing page was viewed only 181 times in total.
While the small-sample feedback is mixed, it is clear that the advert was shown at a high volume in target postcodes. However, this does not tell us how many times the advert was actually seen, or whether it led directly to survey participation.
Additionally, web tracking is only possible if users accept cookies. Some reports show that as many as 70% of users reject cookies, meaning that the 181 viewings of the landing page may represent only a small proportion of the actual number of people who visited the website.
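If the 70% cookie-rejection figure is taken at face value, the tracked view count can be scaled up to a rough estimate of true visits. This is a minimal sketch: the rejection rate is an assumption from third-party reports, not a measured property of this landing page.

```python
# Hypothetical scaling of tracked landing-page views for cookie rejection.
# The 181 tracked views is from the report; the 30% acceptance rate is an
# assumption (reports cited suggest up to 70% of users reject cookies).
tracked_views = 181
cookie_acceptance = 0.30

estimated_total_views = tracked_views / cookie_acceptance
print(round(estimated_total_views))   # roughly 600 actual visits
```

Even under this optimistic assumption, estimated visits remain small relative to the 1.4 million times the advert was shown.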
A small proportion of TLFS online respondents (12%) who completed the evaluation survey reported seeing the DOOH billboard. Just 2% of those interviewed face to face recalled noticing it. This low level of visibility was also reflected in feedback from field interviewers during focus groups, who reported that most respondents they spoke with did not recall seeing the DOOH billboard and several interviewers expressed frustration at not being able to locate them themselves during fieldwork.
The postcard had strong visibility among respondents in the online evaluation, with a high percentage recalling having seen it. It is important to note that these individuals had already completed the main survey and agreed to be re-contacted, which may have influenced recall.
While the interviewer-led group showed a lower recall rate overall, the postcard still performed better than other media formats – notably the BVOD advert and DOOH billboard, which had the lowest levels of recall.
Did the campaign contribute to increasing awareness of ONS surveys?
There is evidence of an awareness boost from the campaign, with respondents who reported seeing the postcard saying, when interviewed face to face:
- "I am more aware of the ONS and the work they do."
Interviewers also reported in the focus groups that they experienced positive reactions when respondents were familiar with the ONS. They felt the postcard played a key role in building recognition, stating:
- "They knew already about the ONS."
They also said that it had acted as a primer, helping respondents to feel more receptive to the survey. Overall, they reported that the postcard had reduced uncertainty by introducing the ONS and its purpose in advance.
A similar pattern was observed for the BVOD advert, among those who reported seeing it. However, there is limited evidence of impact from the BVOD advert and the DOOH billboards used in the campaign because of very low sample sizes.
Among the 29% who refused to take part in the main survey, a few reported that they had never heard of the ONS before, but said that the postcard made them more aware. However, these cases were very limited and 22% of those who refused participation said that they did not recognise the ONS as an organisation or know about its work conducting surveys.
It is therefore difficult to draw conclusions as to whether the campaign increased awareness of ONS surveys.
Did the campaign contribute to increasing understanding of why individual contribution is important?
Individual contributions were promoted via postcards and DOOH billboards, with the BVOD advert reinforcing their importance.
Nearly half of online respondents who saw the postcard selected "I understand why it is important to be counted in an ONS survey" in the follow-up survey. This means that more than half of the respondents who saw the postcard did not select this option. The postcard includes key information about the importance of participation, so this raises questions, such as:
Is the message unclear?
Are people not reading the postcard?
Does the format or placement affect engagement?
Among those who saw the BVOD advert or DOOH billboard, the survey showed understanding was low. For the BVOD advert, the sample size was only 15, limiting conclusions. Understanding was lowest among those unfamiliar with the ONS who only reported seeing the postcard. Understanding was slightly higher among those who were aware of the ONS, but still low overall.
In conclusion, while a minor increase in understanding was observed, the small sample size means that we cannot confidently determine the true impact of the campaign materials.
Did the campaign contribute to increasing confidence that the data will remain confidential?
The QR code and link on the campaign materials led respondents to a landing page where there was detailed information on how the ONS protects respondents' information and the confidentiality of their responses and data. The landing page was viewed 181 times in total. In the follow-up evaluation survey respondents were asked: "did you scan a QR code on the postcard, or follow a web link from the BVOD advert or DOOH billboard, to visit the ONS website to find out more information?"
Web metrics indicate that individuals who interacted with campaign materials, either scanning the QR code or following links, did visit the ONS website. However, low numbers reported scanning the QR codes, following links or visiting the website across both the online respondents and those asked by interviewers. Very few respondents (both online and interviewer-led) reported feeling reassured about data confidentiality after visiting the site.
While web metrics suggest engagement, self-reported data do not strongly support this and do not indicate reassurance of data confidentiality. It is difficult to draw conclusions as to whether the campaign contributed to increased confidence that data remain confidential.
Did the campaign contribute to reassuring respondents that the survey is not a scam?
For those who reported seeing the postcard, 65% of interviewer-led respondents and 50% of online respondents agreed that the postcard made the survey appear legitimate and therefore reassured them that it was not a scam. Sample sizes for those who reported seeing the BVOD advert and DOOH billboard were too small to draw reliable conclusions.
Interviewers reported in the focus groups that concerns about scams made it difficult to establish trust with some respondents. Interviewers explained that, while general campaign awareness helped reassure certain individuals, the interviewers themselves felt the materials lacked clear information about the ONS, which limited their sense of reassurance. However, interviewers suggested that if materials were official or personally addressed, it could increase respondent confidence that surveys are legitimate and would make them more likely to engage.
It is important to note that because of word-count restrictions on materials, it is difficult to cover all relevant information.
Did the campaign contribute to increased social survey participation?
There was a significant increase in the overall response rate over the Birmingham trial period (24 February 2025 to 13 April 2025), compared with the seven weeks before the trial and one year before the trial. The trial response rates showed:
no significant change in full completions
no significant change in partial completions
no significant change in the proportion of partial responses
Only response rates for the TLFS were analysed because the sample sizes for the other surveys were too small to allow meaningful analysis.
| | Birmingham trial | Seven weeks before trial | One year ago |
|---|---|---|---|
| Household Overall Response Rate | 33.3% | 27.9% | 27.0% |
| Household Full Completes | 21.8% | 17.4% | 18.8% |
| Household Partial Completes | 11.4% | 10.4% | 8.3%* |
| Proportion of Household returns that are partial | 34.4% | 37.4% | 30.6%* |
| Sample Size | 586 | 556 | 400 |
Table 1: TLFS Return Rates
In comparison to Manchester, there was no significant change in the overall response rate in the same period as the Birmingham trial (24 February 2025 to 13 April 2025), compared with the seven weeks before the trial or one year before the trial. This indicates that the trial intervention could be a factor in improving response rates in Birmingham.
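For readers wanting to check the headline comparison in Table 1, a standard two-proportion z-test gives a similar picture. This is an illustrative sketch only: it assumes simple random samples and uses response counts implied by the rounded percentages, and the ONS's own significance testing may differ.

```python
import math

# Illustrative two-proportion z-test for the Table 1 headline comparison
# (Birmingham trial vs seven weeks before). Sample sizes and rates are
# from the report; counts are back-calculated from rounded percentages.
n1, p1 = 586, 0.333   # trial period
n2, p2 = 556, 0.279   # seven weeks before the trial

x1, x2 = round(n1 * p1), round(n2 * p2)          # implied response counts
p_pool = (x1 + x2) / (n1 + n2)                   # pooled response proportion
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (x1 / n1 - x2 / n2) / se

print(round(z, 2))   # about 1.98, just above the 1.96 threshold at p < 0.05
```

Under these assumptions the difference sits just beyond the conventional 5% significance threshold, consistent with the report's description of a significant increase.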
Response rates during the Birmingham trial were higher from the first day of the trial, and continued to be higher than comparators for the entire period:
Birmingham had a 7.3% response rate on day one of the trial
Manchester had a 3.2% day-one response rate over the same period
Birmingham had a 2.2% day-one response rate in the seven weeks before the trial
This suggests that pre-trial postcards, which were sent three weeks before the cohort start date, may have had a positive impact on starting response rates.
Knock to nudge
Knock to nudge (K2N) visits are allocated to cases that are considered difficult to reach because, for example, they are based in:
rural locations with increased travel requirements
urban locations where access can be difficult
areas with high indices of multiple deprivation (IMD), where response rates can be lower
Among those receiving a K2N visit, the response rate is understandably much lower than for those who do not receive a visit. For those receiving a visit, there was no significant difference in the overall response rate between the Birmingham trial, the seven weeks before the trial, and one year before the trial. We were unable to test full completions and partial completions because of small sample sizes.
Among those who did not receive a visit, there was a significant difference in the overall response rate (52.6%), compared with 38.7% in the seven weeks before the trial, and 31.6% from one year before the trial.
During the Birmingham trial, the majority (77.7%) of those who were eligible for a visit received one. This proportion was significantly higher in the trial than in the seven weeks before the trial and in the same period one year before the trial. Eligibility for K2N is determined through adaptive survey design.
A significantly lower proportion of households in the Birmingham trial were eligible for K2N, compared with the cohort surveyed seven weeks before the trial. This suggests that the Birmingham trial may have had a lower proportion of cases considered to be difficult to reach.
The proportion of households refusing to participate significantly increased in the trial period, compared with the cohort surveyed one year before the trial. There was no significant change in the proportion refusing to participate compared with the seven weeks before the trial. If the trial had an impact on trust, we might have expected a significant decline in the proportion of refusals. While the data do not show this decline, no increase relative to the immediately preceding cohorts was observed either.
| | Birmingham trial | Seven weeks before trial | One year ago |
|---|---|---|---|
| Circumstantial refusal | 7.5% | 8.8% | 3%* |
| Outright refusal | 1.4%** | 1.4%** | 2%** |
| Non refusal | 91.1% | 89.7% | 95.0% |
| Sample Size | 586 | 556 | 400 |
Table 2: Proportion of refusals
Sample composition
There is a significant difference in the age-group composition of the sample between the Birmingham trial and the cohort surveyed seven weeks before the trial. Sample composition findings show the following changes in participation by age group:
a significant increase in those aged 65 years and over
a significant decrease in those aged 16 to 64 years
no change in those aged 15 years and under
The trial aimed to increase engagement from the underrepresented groups, and this finding suggests that the trial did not increase participation from those age groups that are considered difficult to reach.
Compared with the seven weeks before the trial, the Manchester sample composition findings show the following changes in participation by age group:
a significant decrease among those aged 15 years and under
a significant increase among those aged 16 to 64 years
no change in those aged 65 years and over
This suggests that the trial may have made an impact in increasing participation among those aged 65 years and over, as this effect was not seen in Manchester (the comparator local authority), where no trial took place.
There are no significant changes within the sample composition across sex, ethnicity, household size or IMD Decile, compared with one year before the trial or compared with seven weeks before the trial. The trial's aims included an objective to reach a larger proportion of rarely heard ethnic groups (those not categorised in the White ethnic group). However, increased participation among these groups is not reflected in sample composition findings.
Did the campaign contribute to increased motivation to participate in ONS surveys?
Respondents were asked to select statements that they agreed with from a list. One of these was:
- "I am more likely/it made me more likely to participate in the ONS survey I was invited to."
Of those who reported seeing the postcard, 35% of interviewer-led respondents and 40% of self-reporting online respondents said they agreed with this statement.
Sample sizes for the BVOD advert and the DOOH billboard were too small to allow us to draw conclusions. However, receiving the postcard could be associated with an increased likelihood of participating, suggesting that it may have motivated recipients to take part.
The interviewer focus groups reported that the postcard acted as a primer and respondents were more receptive to the survey after seeing it:
- "I did receive some positive reaction after when they've seen the postcards, and they were prepared about what ONS is."
However, the focus groups also mentioned that there may have been external factors that had an impact on motivation to participate, including cultural and seasonal events and local disruptions. Ramadan and Eid were reported to have affected availability and engagement in some communities. Other interviewers mentioned that they thought the local bin strikes had had an impact and noted that media coverage of the strikes portrayed them as more severe than they thought was the case in most areas. However, the strike triggered unexpected discussions marked by a rise in anti-government rhetoric that interviewers had not seen before.
- "For the first time, although I hear people talk about anti-government during our surveys. The last couple of months has been the first time ever I've heard people say to me, you know, it's clearly anti-government, that they don't want to help any government whatsoever."
This shift in sentiment may have broader implications and could be evidence of:
eroding trust in public institutions and official communications
reduced willingness to participate in future initiatives or data collection efforts
Overall, these reports signal a need to better understand and address public concerns to rebuild confidence in surveys.
Did exposure to BVOD/DOOH campaign materials among non-sampled individuals increase awareness and trust in ONS social surveys?
There were 1,300 landing-page clicks from the BVOD advert. This indicates that non-respondents were engaging with the advert to find out further information about ONS surveys. However, response rates have significantly declined since the trial, from 33.3% to 27.0%, suggesting that the campaign did not have a lasting effect on the local authority.
Evaluation limitations
It is possible that there is a cumulative effect where repeated exposure to different materials leads to better recall in participants, and that this gradual impact is not captured by the current evaluation design.
Across the evaluation, we received small sample sizes, which make the results for the BVOD adverts and DOOH billboards unreliable and difficult to interpret.
It was also not possible to use a robust quantitative impact assessment method, owing to a lack of randomisation, and budget and time constraints.
The Birmingham trial included a significantly lower proportion of households eligible for K2N. It is also possible that cultural and seasonal events, such as Ramadan and Eid, and local disruptions, such as the bin strikes, had an impact on participation through the campaign period.
Finally, because of budget and time constraints, the campaign was only run for one month and only utilised two paid-for advertising channels.
7. Findings from our pilot trust campaign
Summary of findings
There was a significant increase in the overall response rate in the Birmingham trial (24 February 2025 to 13 April 2025), compared with responses from the same area seven weeks before the trial and one year before the trial (24 February 2024 to 13 April 2024). Full completions made up most of this increase. Manchester did not have a similar increase.
At the cohort level, those cohorts that had exposure to the adverts for the entirety of their collection periods had higher response rates than cohorts that only had a partial overlap. This indicates a possible positive impact of the trial.
Cohorts in the Birmingham trial started with a higher day-one response rate, compared with seven weeks earlier. Compared with Manchester response rates, this indicates a possible positive impact of the postcard.
The greatest impact on response rates was on those cases where a knock to nudge (K2N) visit was not made. This suggests that the trial may have had a greater impact on those cases that were already easier to reach.
There was no significant change in the proportion of refusals in the survey, compared with the seven previous cohorts. This suggests that the trial might not have affected trust among those who were most reluctant to take part.
There was a significant increase in the proportion of respondents aged 65 years and over and a significant decrease in those of working age. All other demographics tested showed no significant changes.
Analysis of findings
The pre-campaign testing of both postcard and digital out of home (DOOH) billboard materials revealed valuable insights into public engagement, design effectiveness, and the influence of trust, demographics and socio-economic status on communication outcomes. These findings highlight the importance of inclusive design and the need for a range of channels and engagement strategies that resonate with diverse audiences. However, more research is needed to better understand the specific impact of socio-economic status on engagement and response.
Prioritising behavioural outcomes over subjective preferences provides a more robust basis for selecting communication strategies in future campaigns. Incorporating digital elements, such as QR codes or short URLs, can provide respondents with access to additional information, videos or frequently asked questions.
Institutional trust emerged as a critical factor that influences the perceived effectiveness of both postcards and DOOH billboards. Participants with low or no organisational trust were significantly less likely to find the materials memorable, relevant or worth reading. This suggests that even well-designed content may fail to engage audiences who lack confidence in the organisation. This reinforces the importance of building and maintaining institutional trust as a foundation for successful public engagement.
Tailoring was not operationally viable for this campaign because of its limited geographic scope and small participant pool. Future campaigns should explore audience segmentation and targeted messaging strategies, where feasible.
However, audience segmentation and targeted messaging strategies are inherently limited in the context of social surveys. This is because individual-level information about sample respondents is not usually available before initial contact. As a result, postcard messaging can only be tailored at a broad level, rather than to specific subgroups. For example, they can be tailored by local area, index of multiple deprivation (IMD), or by information on demographic factors like ethnicity from the latest census and other data sources.
If postcards were targeted towards specific socio-demographic groups, it would likely be with the aim of increasing response rates among subgroups that are currently underrepresented. However, these groups generally make up only a small proportion of the overall sample. Any tailored messaging would need to be carefully designed to ensure it does not inadvertently reduce engagement or trust among other population groups.
By contrast, digital channels such as online advertising and DOOH billboards offer more flexibility to introduce variation in messaging imagery and tone. These channels can be geographically targeted and strategically timed. This allows for more nuanced approaches that reflect local context and that may resonate with different audiences, without the same level of risk associated with printed materials. However, DOOH billboards can only be targeted at district postcode level, which has limited accuracy and a high risk of wastage.
During our evaluation, operational limitations in data collection affected the data quality overall. This means it is not possible to distinguish which intervention led to change. We are also unable to fully determine which demographic groups were most affected by the campaign, though findings suggest that older people and groups who are considered easier to reach were more affected.
However, evidence from the contribution analysis suggests that the postcard did play a role in raising awareness of the ONS. It also suggests that the postcard played a role in reassuring recipients that the survey was legitimate and not a scam, and helped people understand the importance of their participation.
Recommendations
The initial trust pilot was a proof of concept; there were limitations to what could be achieved with the sample size and variety, timing, messaging, and channels. Based on the research findings from the ONS's pilot campaign and behavioural insights work, we have 10 recommendations relating to communications and research that could be taken forward to consolidate and build on the learning and existing best practice.
These recommendations reflect our finding that building trust requires an integrated, adaptive approach across all stages of the respondent journey rather than any single intervention.
To support the development of communications campaigns to build trust, we propose the following recommendations.
Recommendation 1 – simplify messaging
Keep messages simple and make calls to action easy to find. Long or complicated text lowers engagement, particularly among those with lower trust.
Ensuring the purpose of a survey is clear and understandable is important, though this is challenging when promoting multiple surveys. It is essential to explain why the survey matters in everyday terms, without assuming people already know the organisation or its surveys.
Recommendation 2 – use a multi-channel, phased approach
Test a multi-channel, phased approach, compared with using a single channel. A "prime, prompt, remind" sequence – where households receive advance notice, followed by broadcast advertising during fieldwork and then reminder materials – would create more contact points and would potentially encourage more engagement.
Recommendation 3 – provide resources to check organisational legitimacy
Make branding clear and easy to check: show the organisational branding prominently, provide clear routes for recipients to verify authenticity, and build understanding of what the ONS is and does. Further test including QR codes or URLs linking to dedicated trust webpages where people can confirm legitimacy and learn about data protection.
Recommendation 4 – address confidentiality and security concerns
Address scam concerns directly, but carefully. The most common trust barrier is fear of scams. Messaging about confidentiality and security should be framed positively, rather than defensively.
Recommendation 8 – pre-test materials with diverse groups
Test messages and materials thoroughly with diverse demographic groups. Different groups responded differently in the pilot. Testing should include groups that are underrepresented and people with different levels of trust. This will help identify problems early, before the campaign is fully implemented.
Recommendation 9 – design future pilots for useful evaluation
Design evaluation frameworks that can isolate the effects of different interventions. It was difficult to link results from the pilot to specific actions. Future campaigns should include randomisation or comparison areas, where possible, and ensure sufficient sample sizes to allow analysis of different population groups.
Recommendation 10 – consider external factors that influence trust
Continue to monitor external factors that influence trust. Local issues (like bin strikes) and cultural events may have affected how people felt and whether they were available to take part. Understanding these wider factors helps explain changes in response rates and can guide the timing of future communications.
Next steps
Despite encouraging results from our pilot trust campaign, we still face challenges in reframing the role of surveys for citizens, improving response rates, and targeting underrepresented groups. The second phase will support efforts in this area.
Strategic planning is in progress and is focused on, but not limited to:
clear objectives and research questions
robust methodology and rigorous design
engaging the right expertise
an emphasis on diligence and efficiency, given budget considerations
This is in line with plans for the Refreshed Citizen Relationship, described in Section 4: Social surveys of our Survey Improvement and Enhancement Plan for Economic Statistics. Our plans for 2026 and beyond include:
enhancing our communication strategy by using inclusivity research to engage underrepresented groups
trialling targeted campaigns and incentives
improving outreach to communities with low response rates, including those with English as a second language