1. Introduction
The census is vital for shaping policy that will determine the country’s future and for ensuring that decision-makers have access to the right information to plan and deliver the services used by us all. The census:
underpins local and national decisions on provision of education, housing and healthcare
informs fair distribution of funding to local areas
provides accurate national and local information on the diversity of the population, allowing public bodies to know whether they are meeting their multiple duties and to take action where necessary
ensures our population estimates are as accurate as possible – without which both economic and public policy “per head of the population” rates cannot be accurately calculated
The public have given census information for over 200 years. It is vital that the census sheds light on long-term trends while also reflecting the society in which we live today. It would not be sensible if our data still captured the occupation of lamp-lighter but did not now include social media analyst. Because the role of the census is to collect information on the condition of society, we need to keep pace with the society we live in, often collecting new information which previous generations would not have imagined.
As we reflect on how best to capture changes in society for 2021, we need to work with all interested parties to reach a common view on the information (or topics) that should be captured in the 2021 Census. The framework to help us reach this view is our assessment criteria of user need for the data, public acceptability of the topics and questions, space constraints and legal duties.
This report provides an update of our research and testing on questions and topics for the 2021 Census. For the vast majority of topics there is a consensus view – these are set out in Annex 1. In some areas further research work or additional information such as the views of stakeholders is required before we can finalise our recommendations to government.
The topics that have been the focus of our recent work appear at the front of this report (section 3). These include four "potential new topics" that we said we would investigate (armed forces community, gender identity, sexual orientation and volunteering) and two topics where more research was required (ethnic group and number of rooms or bedrooms).
Details of our position for all topics, which includes some of the other research we have conducted, are outlined in Annex 1. A separate testing annex is available which provides background information on the research described in section 3 – mainly the sampling methodology and wording of the questions tested.
For the remaining work, we will either need to undertake further research or engage further with stakeholders to assess both user need and commonality of views within different communities. This further work will provide the most rounded information to evaluate against our assessment criteria for new topics: user need for the data, public acceptability of the topics and questions, space constraints and legal obligations.
Our recommendations will inform a government white paper in 2018. The final approval of the topics and questions to be included in the 2021 Census via the Census Order and the Census Regulations is for Parliament. The Census Regulations in Wales also need approval from the Welsh Assembly.
1.1 Background
Our response to the 2021 Census topic consultation conducted in 2015 was published in May 2016 in The 2021 Census – Assessment of initial user requirements on content for England and Wales: Response to consultation. This provided an overview of the evaluation process we used to assess the responses and summarised the results of the evaluation. We set out our updated view on the topics to be included in the 2021 Census, including a summary of proposals for new topics, next steps and an overview of our plans.
This process was started when we invited views on The 2021 Census – Initial view on content for England and Wales between 4 June and 27 August 2015. The aim was to promote discussion and encourage the development of strong cases for topics to be included in the 2021 Census. The focus was on information required from the 2021 Census, not the detailed questions that should be asked on the questionnaire.
As in previous censuses, there will be separate censuses in Scotland and Northern Ireland and the three census offices are working to develop a set of questions that, wherever possible, deliver harmonised statistical outputs across the UK.
2. Evaluating needs and recommended content for 2021
In addition to any legal obligations on us to collect information, the criteria used for the evaluation of the 2021 topic consultation are shown in Table 1. The evaluation criteria, and the evaluation of each topic against the criteria, are described in The 2021 Census – Assessment of initial user requirements on content for England and Wales: Response to consultation.
Table 1: 2021 Evaluation Criteria
User requirement | Other consideration – impact on: | Operational requirement |
---|---|---|
Purpose | Data quality | Maximising coverage or population bases |
Small geographies or populations | Public acceptability | Coding of derived variables and adjustment for non-response |
Alternative sources | Respondent burden | Routing and validation |
Multivariate analysis | Financial concerns | |
Comparability beyond England and Wales | Questionnaire mode | |
Continuity with previous censuses | | |

Source: Office for National Statistics
The impact of overall respondent burden has been assessed within this set of criteria. There are design and layout constraints for the online census, including considerations of the layout of questions on different sizes of mobile devices.
Although the 2021 Census will be primarily online, consideration is being given to the design and space constraints associated with the need to produce a paper version of the 2021 Census questionnaire. Therefore, including any new questions may mean that some questions from the previous census need to be dropped for 2021, or that existing questions need to be simplified and shortened while still meeting user need.
Having used the evaluation criteria to prioritise topics for further consideration and research, we will next need to prioritise what is included on the questionnaire. This process will need to balance considerations of space, cost, public acceptability and the legal position.
For some topics, additional testing is required to develop a question that: meets user needs; is acceptable to the public and relevant communities; and is easy to answer. Remaining testing across topics includes:
considering question placement within the questionnaire
reviewing instructions and guidance
testing online validation and auto fills for write-in options
For new questions or reworded questions, we also need a question in Welsh for use in Wales. We have already conducted a process to develop questions in Welsh with recognised Welsh speakers. We have conducted focus groups with members of the public to review these questions. The next steps are to test the Welsh questions in cognitive in-depth interviews. Final recommendations for the Welsh question designs are for Welsh Assembly approval through the secondary legislation referred to earlier.
3. Developing and testing questions
This section sets out the areas we have been focusing our recent research and testing on. The following sections outline the research we have done on each of these. The first four are new topics we committed to investigating following the 2021 topic consultation.
3.1 Armed forces community
The armed forces community covers those who are serving, service leavers and their families. In our response to the 2021 Census topic consultation we proposed not to develop or test a question. Instead we outlined our intention to explore the use of administrative data held by the Ministry of Defence.
Since then we have clarified the user need and concluded that the administrative data currently available does not provide full coverage of those who previously served in the armed forces. Therefore, we published an update in October in which we announced our plans to include a question.
Reviewing the user need
We’ve worked with the Ministry of Defence (MoD), local authorities, clinical care groups, and charities to better understand the need for data and how best to meet that need. Through our discussions with users we’ve clarified, and gained further evidence of, the user need for information on those who have served in the UK Armed Forces, and their dependants.
Users told us that they need this data to support their commitments under the Armed Forces Covenant. Central and local government have both committed to the Covenant and it’s now part of the NHS charter. Those who sign up to the covenant commit to ensuring those who serve or who have served in the armed forces, and their families, are treated fairly. Support is provided in a number of areas including education and family well-being, housing, healthcare and employment.
Users need this data at local authority level and below so they can better target services to those most in need. They also told us that it is important they have this data for ex-armed forces personnel in the retired age groups of the population.
Assessing alternative sources
The Veteran Leavers Database (VLD) is a single source of electronic information for service personnel who have left the UK Armed Forces since 1975. This database contains records for approximately 1.9 million service leavers, sourced from a variety of legacy and current administrative systems held within the MoD. Records exist for both regulars and reserves who have served in the armed forces for at least one day. Their home address at the time of leaving the forces is recorded, but it's not updated over time.
We anonymously linked the VLD to the 2011 Census in our safe and secure research environment to see if it could provide sufficient information to meet users’ needs. Our assessment showed that the linked dataset has potential for producing estimates that meet user needs. However, there are gaps in the dataset that can only be filled by collecting this information by other means.
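To illustrate the general idea of linking pseudonymised records between two datasets, here is a minimal sketch in Python; the field names, the hashing approach and the toy data are assumptions for illustration only and do not describe the actual linkage methodology used in our secure research environment.

```python
# Illustrative sketch only: deterministic linkage of two pseudonymised datasets.
# The key fields, salt handling and toy records are assumptions; the real
# VLD-to-census linkage methodology is not described in this report.
import hashlib

def pseudonymise(record, salt="example-salt"):
    """Build a one-way hashed key from assumed identifying fields."""
    key = "|".join([record["surname"].lower(),
                    record["dob"],
                    record["postcode"].replace(" ", "")])
    return hashlib.sha256((salt + key).encode("utf-8")).hexdigest()

def link(census_records, vld_records):
    """Return census records whose hashed key also appears in the VLD."""
    vld_keys = {pseudonymise(r) for r in vld_records}
    return [r for r in census_records if pseudonymise(r) in vld_keys]

census = [{"surname": "Jones", "dob": "1970-01-01", "postcode": "CF10 1AA"}]
vld = [{"surname": "Jones", "dob": "1970-01-01", "postcode": "CF10 1AA"}]
print(len(link(census, vld)))  # 1 matched record in this toy example
```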
If we ask the question in the 2021 Census we can create a base from which statistics can be produced. Using this census base with survey and administrative data from the VLD would allow us to produce statistics on an ongoing basis. This is subject to ongoing supply of the administrative data.
We also considered using data from the Annual Population Survey (APS) which is a survey with an achieved sample size of approximately 320,000 individuals across the UK. Questions on the armed forces community have been asked since 2014. The data provided by the APS meets some user need at a higher geographic level for all ages. However, it is not designed to be reliable at local authority level and below. This is still the case for most areas even if years of data are pooled together. Therefore the APS cannot provide the low-level area data needed by users.
Developing a question
We’ve been developing and testing a service leavers question alongside this work. So far we have run two rounds of cognitive testing, with a focus on respondents’ understanding of our proposed questions, and a test that focused on public acceptability of a question on service leavers.
In the cognitive testing, we found some challenges in capturing those who have previously served, those who are currently serving and those who serve in the reserves. In our public acceptability testing we found that 88% of respondents said that they thought it was an acceptable question for the 2021 Census. Of the remaining respondents, 7% were undecided and 5% thought it was unacceptable; for most of these respondents this was because they were unsure of the purpose of the question or felt it was irrelevant. We are continuing testing to develop a question that doesn't over-count service leavers and is acceptable to the public.
3.2 Gender identity
The 2021 topic consultation highlighted a need for data on gender identity in order to understand inequality, inform and monitor policy development and allocate services for this population. The introduction of the Equality Act 2010 further strengthens the user requirement for data on those with the protected characteristic of gender reassignment.
We also identified a respondent need, with some members of the public reporting that they were unable to complete the current sex question accurately as it only offered the two categories of male or female. A major concern was not to damage the information already collected through the male or female sex question, which is an important variable. Our research was therefore focused on ensuring that we fully understood this issue.
We don’t currently collect this information on any social surveys, so we developed a Gender identity research and testing plan (May 2016) to inform our position on this topic. It outlined next steps including engaging with relevant stakeholders, learning from other National Statistical Agencies, and identifying alternative data collection options. Within the plan we also committed to undertaking a review of the transgender Data Position Paper which was published in 2009. In response to this we published a Gender identity update (Jan 2017) which detailed changes and progress around the topic of gender identity and covered our research, testing and findings to that date.
Engaging with stakeholders
We have continued to engage with users and held focus groups to understand this user need further and to develop a clear understanding of the different concepts. We held a stakeholder update meeting in June 2017 and two stakeholder workshops, one in August 2016 and one in September 2017.
The workshops presented a consistent data need for a transgender population count, including individuals of all ages, and a respondent need for being able to self-identify. The workshops also provided insight into how we could ask a gender identity question on the 2021 Census. This has shaped the testing we have been, and will be, conducting to ascertain whether we can devise questions that meet needs while ensuring that we can collect the vital information on male or female accurately.
Understanding concepts
Our research has focused on understanding three concepts:
Sex: male or female – this is the legal concept and a key research variable
Gender: male, female or other – this is about the respondent need to be able to self-identify and answer the census, as well as being able to estimate those who identify as non-binary within the transgender community
Transgender population: the need for a reliable estimate of the population identifying as transgender, which we define as those whose gender identity is different from the sex they were assigned at birth, including those who do not use the binary classification of male or female
We are aware that man and woman are sometimes used in the transgender community rather than male and female, but our research so far suggests that the use of male and female is acceptable.
Question design testing
We conducted qualitative research through focus groups and in-depth interviews. This research aimed to explore how both the transgender population and cisgender population (people whose gender identity is the same as the sex they were assigned at birth) interpret concepts around gender identity and sex. It also explored how they might answer questions relating to these topics. As part of this research, potential barriers to answering the questions and completing the census were explored including terminology and any privacy, security, burden and acceptability concerns.
Three question designs were used in the research:
version 1 – the 2011 Census "sex" question
version 2 – the 2011 Census "sex" question with the addition of "other"
version 3 – a two-step design with separate sex and gender identity questions
The full wording of these questions can be seen in the testing annex.
It was concluded that the three question designs, as currently presented, would not fully meet the requirement for a reliable estimate of the transgender population. In addition, the research found that further question development was required to ensure respondents could easily understand and provide an appropriate answer to any question(s) on sex and/or gender. The questions were redeveloped based upon these findings. These are in the process of being cognitively tested to consider how information on gender identity could be collected in addition to the information collected on sex.
Public acceptability testing
In 2016, we commissioned independent research into the public acceptability of asking about gender identity using one of the Equality and Human Rights Commission's recommended gender identity questions. The full wording of these questions can be seen in the testing annex.
The results showed that 80% of those in England and 75% of those in Wales considered it acceptable to ask a question on gender identity. About 5% in England and 10% in Wales stated that they would skip a gender identity question but continue to respond to the census. Around 1% said a gender identity question would lead them to stop the census altogether.
When asked about having to answer a gender identity question on behalf of other household members aged 16 and over, acceptability was still at 80% in England but decreased in Wales to 67%. Acceptability declined in England to 69% and Wales to 66% when participants were asked about answering such a question on behalf of other household members aged 15 or under.
Findings from the public acceptability testing suggest that there are no significant problems with asking a gender identity question on the 2021 Census, although acceptability is lower when the question is asked on behalf of those aged under 16.
Testing impact on response
We also commissioned research to explore data quality and how this differs according to different sex and gender questions asked in a variety of formats. The research tested three questionnaires using the three question versions described above: one with the 2011 Census sex question, one with the 2011 Census sex question with the addition of the category "other (write in)" and one with the 2011 Census sex question followed by a gender identity question. The full wording of these questions can be seen in the testing annex.
The research showed that there was no significant difference in overall response rates between the three alternatives. The data also showed that having a gender identity question did not affect the quality of the data collected on sex.
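As a worked illustration of how a difference in response rates between questionnaire versions might be assessed, the sketch below applies a standard two-proportion z-test; the sample sizes and counts are invented, and this is not necessarily the test used in the commissioned research.

```python
# Illustrative two-proportion z-test for a difference in response rates.
# Sample sizes and response counts are invented for the example.
from statistics import NormalDist
from math import sqrt

def response_rate_test(responded_a, sampled_a, responded_b, sampled_b):
    """Two-sided test of the difference in response rates (normal approximation)."""
    p_a = responded_a / sampled_a
    p_b = responded_b / sampled_b
    pooled = (responded_a + responded_b) / (sampled_a + sampled_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sampled_a + 1 / sampled_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a - p_b, p_value

diff, p = response_rate_test(5200, 10000, 5160, 10000)
print(f"difference = {diff:.1%}, p-value = {p:.2f}")
```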
Current assessment and further developments
As a result of this work, we are still considering whether and how to collect information on gender identity, alongside continuing to collect information on male or female. This is particularly complex because we must meet a respondent need on gender identity and a user need to measure the size of the transgender population, while ensuring an accurate estimate of the male and female population.
Our research so far gives us confidence that collecting gender doesn’t have a negative impact on collecting information on male and female. We also know that there is a strong information need for separate information on the transgender population.
Further testing is planned in order to refine the question design and inform our recommendation about the inclusion of such a question or questions in the 2021 Census. To be clear, we have never suggested that people would not be able to report themselves as male or female. We have and will continue to collect this vital information.
3.3 Sexual orientation
In our response to the 2021 Census topic consultation, we identified a clear need among data users for improved information on sexual orientation in relation to policy development, service provision and planning, equality monitoring, resource allocation and reflecting society.
We highlighted that the Equality Act 2010 makes it unlawful to discriminate on the grounds of sexual orientation in relation to the provision of goods and services, employment or vocational training. Furthermore, the Act introduces a public sector Equality Duty, which requires that public bodies consider all individuals when shaping policy, delivering services and interacting with their own employees. They must also have due regard to the need to eliminate discrimination, advance equality of opportunity and foster good relations between different people when carrying out their activities.
Despite a high user need, our assessment indicated that a sexual orientation question may have a high impact on data quality and public acceptability.
We have not previously included such a question on the census. As such, concerns were around:
impact on overall census response
quality and completeness of response to the sexual orientation question
difficulties in, and appropriateness of, statistically estimating (or imputing) answers for non-response to the question
ability for the question to be answered by proxy (on behalf of another household member)
To assess these concerns we developed a research and testing plan which was formed of three strands.
Inclusion of a question in the 2017 Test – a census test across England and Wales
A public acceptability survey in England and Wales
Development of statistics from ONS social surveys
2017 Test
We investigated the impact of a sexual orientation question on the overall census response rate and response to the question as part of the census test in England and Wales. Further details on the design of the test, the question wording and summary findings are available in the 2017 Test report.
For the 2017 Test we used the National Harmonised Standard sexual identity question (self-completion version). This question was modified based on recommendations from cognitive interviews that we conducted with members of the public in 2016.
Our modifications included making the question voluntary based on our view of potential legislative requirements at the time. We also added a write-in option so that individuals who did not identify as heterosexual, lesbian, gay or bisexual (LGB) could write in their answers. Another difference is that the census allows for proxy respondents so individuals could answer the question on behalf of other household members aged 16 or over. The full wording of these questions can be seen in the testing annex.
We found that:
the inclusion of a question on sexual orientation does not have a material effect on overall response – the difference in response was 0.4 percentage points, which is within the two percentage point criterion we set up front
the level of item non-response for the sexual orientation question was 8.4% overall, which is below our threshold of 10% – this was 4.8% for those who responded online and 20.2% for those who responded on paper; item non-response was also higher at older ages
there were only a small number of exits from the census questionnaire (where someone stops completing the questionnaire) at this question based on information from individuals that answered the question online
the agreement rate between the responses provided for the sexual orientation question in the 2017 Test and the Census Test Evaluation Survey (CTES) was 98.6% – this is above the 80% target agreement rate set prior to the test
the results produced distributions which were broadly comparable to the Annual Population Survey, and even closer when "don't know" or missing responses were excluded from the analysis
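To make the pass/fail logic behind these findings concrete, the following sketch checks the quoted results against the thresholds set before the test; the figures come from the list above, but the structure and names are illustrative only.

```python
# Illustrative check of 2017 Test results against the pre-set criteria quoted above.
CRITERIA = {
    "response_difference_pp": 2.0,   # maximum acceptable drop in overall response
    "item_non_response_pct": 10.0,   # maximum acceptable item non-response
    "agreement_rate_pct": 80.0,      # minimum acceptable test/CTES agreement
}

OBSERVED = {
    "response_difference_pp": 0.4,
    "item_non_response_pct": 8.4,
    "agreement_rate_pct": 98.6,
}

def assess(observed, criteria):
    """Return True/False for each measure against its criterion."""
    return {
        "response_difference_pp": observed["response_difference_pp"] <= criteria["response_difference_pp"],
        "item_non_response_pct": observed["item_non_response_pct"] <= criteria["item_non_response_pct"],
        "agreement_rate_pct": observed["agreement_rate_pct"] >= criteria["agreement_rate_pct"],
    }

for measure, passed in assess(OBSERVED, CRITERIA).items():
    print(measure, "meets criterion" if passed else "does not meet criterion")
```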
Public acceptability testing
In 2016, we commissioned independent public acceptability testing of the modified harmonised question on sexual orientation in households in England and Wales. Respondents were shown a copy of the question and asked about their attitude towards it.
The test found that:
70% of respondents in England and Wales found it acceptable to include a question on sexual orientation on the 2021 Census
only 1% would stop completing the census form altogether if the sexual orientation question was included in the 2021 Census as a voluntary question
less than 1% of the public in England and Wales would provide an inaccurate answer or request an individual form
The testing found that the addition of a "prefer not to say" response option would increase the acceptability of the question. Of those who felt the inclusion of the question was unacceptable, 25% felt that the addition of a "prefer not to say" response option made the question acceptable. These figures record people's attitudes rather than necessarily how they would behave. However, the 1% who said they would stop completing the census is broadly in line with the overall drop in response (0.4 percentage points) seen in the 2017 Test.
Alternative sources
We have also examined whether we can meet the user need for lesbian, gay or bisexual (LGB) estimates at sub-regional level by using other data. We have published a series of experimental research estimates based on a three-year Annual Population Survey pooled dataset. Although this enabled estimates of sexual identity for the UK, constituent countries and English regions, it is not possible to produce robust estimates for all local authorities. There is an important user need for such estimates, to help local authorities meet their public sector equality duties.
In terms of administrative data, we have only found sexual orientation to be collected by the Higher Education Statistics Agency (HESA) and the Care Quality Commission (CQC). As these data are collected to support services for specific groups (students in higher education and those living in care homes), the population coverage is too limited to meet census purposes.
Communal establishments
We didn’t include communal establishments (managed residential accommodation) in the public acceptability survey or the 2017 Test. Therefore, we conducted a literature review to assess the impact of asking sensitive questions in these contexts followed by fieldwork in care homes and nursing homes.
The literature findings indicate that data quality obtained from a census sexual orientation question, asked of respondents in communal establishments, may be of a low standard. This is because of respondents' concerns about discrimination or harassment and the low levels of privacy in some communal establishments. However, the qualitative interviews in care homes indicate that the overall quality of census data would not be impaired.
Potential steps to mitigate the risks identified from the study are being developed. This includes guidance for care home managers and details for residents on the purpose and use of the information.
Current assessment and further developments
Under the Census Act 1920, it's not possible to ask questions on a voluntary basis, with the exception of the question on religion.
Based on the research findings that a "prefer not to say" response option would increase public acceptability, we are testing a "prefer not to say" option as the best way to make this, in effect, a voluntary question.
So far the evidence suggests that we could develop a question that provides information of sufficient quality to meet user need without damaging overall quality in the census. We have identified some further research which may improve quality further and will be undertaking further engagement with key stakeholders.
These last activities, alongside space constraints on a paper questionnaire, will inform our decision on whether to recommend inclusion of a question on sexual orientation in the 2021 Census.
3.4 Volunteering
In the 2021 Census topic consultation, users from central government, local government and charities responded that they needed information on volunteering to: understand the contribution of volunteers; reveal the potential for economic growth; target policies, especially for different ages, genders and ethnicities; and understand community resilience, social capital and wellbeing. We said we would do further work to clarify the user need and test a question.
Developing a question for testing
After further engagement with users of volunteering information, we found that the key areas of interest were the time spent volunteering and how often a person volunteers, so we designed questions to meet that need. The user need for time spent volunteering was to allow a measure of the contribution to the economy; the user need for frequency of volunteering was to measure participation by volunteers.
We developed a question to test with respondents who volunteer and found that they had difficulty understanding the wording of the question and the instructions. This posed quality issues as respondents struggled to provide answers. We also found that respondents found it difficult to select an answer because they had to calculate the number of hours they volunteered, which was even more difficult when they didn't volunteer on a regular basis.
Alongside this testing, we came to understand that the contribution of volunteering to the economy can be estimated through household surveys and is not required at a small area level. As a result, we changed the question wording to collect data on the frequency of volunteering, collected as instances per week rather than hours per week. While this question performed well, respondents still had difficulty selecting a response option.
We included the latter question on volunteering in the 2017 Test and the follow-up Census Test Evaluation Survey (CTES). Results showed an overall agreement rate, between the test and evaluation survey, of 78.3%, which is considerably lower than for other topics. However, the percentage of people not answering this question was low, at 3.4%, which suggests that people were happy to answer it. Furthermore, the drop-off rate of 1.3% from the test suggests that the volunteering question didn't stand out compared with other questions. The drop-off rate is where a respondent completing the questionnaire online exits the questionnaire at that question during the session.
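For illustration of how the quality measures quoted above might be computed from paired responses, here is a minimal sketch; the toy responses and field handling are invented and do not reflect the actual processing of the 2017 Test or CTES data.

```python
# Illustrative calculation of agreement rate and item non-response for one question.
# Paired responses are invented; None marks a missing (unanswered) value.
test_responses = ["weekly", "monthly", None, "never", "weekly"]
ctes_responses = ["weekly", "weekly", "never", "never", "weekly"]

# Item non-response: share of test respondents who left the question blank.
answered = [t for t in test_responses if t is not None]
item_non_response = 1 - len(answered) / len(test_responses)

# Agreement rate: share of paired answers that match, where both were given.
pairs = [(t, c) for t, c in zip(test_responses, ctes_responses)
         if t is not None and c is not None]
agreement_rate = sum(t == c for t, c in pairs) / len(pairs)

print(f"item non-response: {item_non_response:.1%}")  # 20.0% in this toy example
print(f"agreement rate:    {agreement_rate:.1%}")     # 75.0% in this toy example
```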
Current assessment
We’ve designed two different questions on volunteering. In both rounds respondents had difficulty matching their volunteering activities to a response option. Results of the Census Test Evaluation Survey suggest challenges to collect information of sufficient quality to meet the user need. Alongside this, our understanding of user needs for other topics has grown. And so to manage respondent burden and meet space constraints on the paper questionnaire, we intend to recommend not collecting information on volunteering in the 2021 Census.
3.5 Ethnic group
The 2021 Census topic consultation confirmed a high user need for ethnic group data. Ethnic group is a protected characteristic under the Equality Act 2010. The census data provide information to enable public bodies to meet their statutory obligations under this Act. Data users, including central government and devolved administrations, noted that the data are used for resource allocation, service planning, policy development and equality monitoring.
In response to the 2021 topic consultation, we planned to conduct research into the need for additional response categories over and above those in the 2011 Census question. This resulted in a programme of work to:
explore whether an alternative question design could better meet user needs for information, for example a two-stage 2011-style question design for online completion
evaluate which additional response options (if any) are required, using a tool to prioritise requests for additional response options
Reviewing the question design
We have conducted a number of tests to review the ethnicity question design. This has included exploring whether an alternative question design could better meet user needs for information and refining a two-stage 2011-style question design for online completion.
We commissioned research to carry out face-to-face interviews with participants. This research included looking at whether we could have a purely open free-text box as a complete write-in option. The wording and format of this can be seen in the testing annex. The open free-text box was designed for respondents to type or write their answer (with search-as-you-type suggestions for those answering the question online).
Findings indicated that a question designed with tick-box response options was more effective than an open free-text box question. Therefore we stopped development of a purely open free-text, search-as-you-type question.
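For illustration of what search-as-you-type suggestions against a code list could look like behind the scenes, here is a minimal prefix-and-substring matcher; the entries and function names are assumptions and do not reflect the actual online census implementation.

```python
# Illustrative search-as-you-type suggestion lookup for a write-in answer.
# The code list here is a small invented sample, not the census classification.
CODE_LIST = [
    "Bangladeshi", "Chinese", "Indian", "Irish", "Kashmiri",
    "Korean", "Pakistani", "Roma", "Sikh", "Somali", "Welsh",
]

def suggest(typed, code_list=CODE_LIST, limit=5):
    """Return up to `limit` suggestions, preferring prefix matches over substring matches."""
    typed = typed.strip().lower()
    if not typed:
        return []
    prefix = [c for c in code_list if c.lower().startswith(typed)]
    substring = [c for c in code_list if typed in c.lower() and c not in prefix]
    return (prefix + substring)[:limit]

print(suggest("k"))    # ['Kashmiri', 'Korean', 'Pakistani', 'Sikh']
print(suggest("som"))  # ['Somali']
```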
We then commissioned research to conduct an online survey to test different online question designs. This research included three versions of the ethnic group question:
version 1 – a two-stage tick-box question (similar to the 2011 question)
version 2 – a two-stage tick-box question with an additional Sikh tick box (given the strong user need demonstrated)
version 3 – an alternative two-stage question, with the first stage followed by a search-as-you-type open question at the second stage
The full wording of these questions can be seen in the testing annex. This research aimed to provide evidence on the success of an alternative question, to assess any quality impacts of adding a question, and to provide evidence of the need for a Sikh tick box.
This research was a voluntary survey conducted in Hounslow and Wolverhampton (two local authorities with high Sikh populations). Further details of this research can be found in the testing annex. Results showed a significantly lower response to the second (open) stage of the version 3 question. As a result of these findings we stopped the development of a two-stage hybrid alternative question (version 3).
This research also found no indication that the religious affiliation and ethnic group questions are capturing different Sikh populations. All respondents who stated they were ethnically Sikh also stated their religious affiliation was Sikh. This is in line with findings from the 2011 Census, in which only 1.6% of those who recorded themselves as ethnically Sikh recorded a religious affiliation other than Sikh.
We conducted public acceptability research to assess the public's opinion on the acceptability of the terms used in the 2011 Census. The public acceptability research was carried out through the ONS Opinions and Lifestyle Survey (Omnibus). This research found that the majority of the sample was comfortable with the higher-level categories and lower-level terms tested.
Gathering needs for new tick boxes and community engagement
A stakeholder follow-up survey was undertaken to gain an understanding of the user need for additional or new tick boxes in the ethnic group question. This survey identified a need for greater granularity of data, and requests for 55 additional tick boxes were received. Space constraints on the online and paper questionnaires mean that there is limited space for new response categories.
Therefore, we conducted a prioritisation evaluation to consider the strength of need behind these requests. Once the strength of user need evaluation was completed, the following tick boxes were taken forward for consideration:
Gypsy
Irish Traveller
Sikh
Somali
Jewish
Roma
Korean
Kashmiri
We then evaluated these requests further against additional criteria including: the availability of alternative data sources, data quality, and comparability. After this evaluation we have identified the following four groups where we need to undertake further work before we can decide whether to recommend any new additional categories:
Jewish
Roma
Sikh
Somali
In order to finalise our views on the ethnic group categories, we need to: engage further with stakeholders to assess commonality of views within different communities; undertake further research to assess whether the inclusion of new categories would collect information of sufficient quality to meet the user need; and ensure that our conclusions are compliant with our legal obligations.
Finalising our recommendation
We now need to finalise our recommendations for which tick boxes to include on the 2021 Census ethnic group question. We have established an Ethnic Group Assurance Panel to support our process. This group consists of data users and data collectors from across government and ethnic group experts. Our ongoing research is reviewed by this group to provide external assurance that research meets the user need. The evaluation tool used to prioritise tick boxes was also shared with other key stakeholders and went through a number of quality assurance processes.
In developing the ethnic group question, we have met with representatives from a number of communities. To date we have engaged most with the Sikh community. Alongside this report we have published a summary of a meeting held on 23 October 2017 and a SlideShare presenting the slides from the meeting. We have also been engaging with the Jewish community. We will engage further with some communities we have yet to engage fully with, for example Roma.
Our recommendation for the 2021 Census will be reviewed by the Ethnic Group Assurance Panel.
3.6 Number of rooms and bedrooms
In the 2021 Census topic consultation, users told us that data on number of rooms and bedrooms is used to:
assess changes in overcrowding
assess the number of households living in unsuitable accommodation
tackle deprivation
develop appropriate housing policies
plan future housing provision
allocate resources
In our response to the 2021 Census topic consultation we outlined our intention to investigate the use of Valuation Office Agency data to provide information on number of rooms and bedrooms. In our response we also stated that we did not believe it was appropriate to continue to ask two questions designed to meet a single information need if there is not a clear requirement to do so. This is in the context of minimising respondent burden.
Consultation on using Valuation Office Agency data
In June 2017, we published our assessment of estimating number of rooms and bedrooms using Valuation Office Agency (VOA) administrative data. This research looked at VOA data as a potential alternative to collecting the number of rooms and bedrooms in the 2021 Census. We held a public consultation from 28 June to 25 September 2017, inviting users to respond to this publication. Our response to that consultation has been published alongside this report.
There were two key conclusions from the responses we received. Firstly, in the context of asking one question, number of bedrooms was more appropriate than number of rooms. This is because:
more consultation respondents use number of bedrooms data than number of rooms data
the requirement for number of rooms data is to measure under- and over-occupancy; this calculation can be done from number of bedrooms (a simplified sketch follows below), and no evidence has been provided of the need for two questions in the census
we know that the data quality for a number of bedrooms question is higher than for a number of rooms question
Given these points and our belief that it is only appropriate to ask one question, we intend to recommend removing the number of rooms question.
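The under- and over-occupancy calculation referred to in the list above compares the number of bedrooms a household has with the number it is assessed to need. The sketch below illustrates that idea in simplified form; the "bedrooms required" rules shown are a rough approximation for illustration only, not the official bedroom standard used in census outputs.

```python
# Simplified illustration of an occupancy rating derived from number of bedrooms.
# The "bedrooms required" rules here are a rough approximation for illustration only;
# the official bedroom standard used in census outputs is more detailed.
def bedrooms_required(couples, other_adults, children):
    """Rough approximation: one bedroom per couple, one per other adult,
    and one per pair of children (rounded up)."""
    return couples + other_adults + (children + 1) // 2

def occupancy_rating(bedrooms_available, couples, other_adults, children):
    """Positive = spare bedrooms, zero = meets requirement, negative = overcrowded."""
    return bedrooms_available - bedrooms_required(couples, other_adults, children)

# Example: a couple with three children in a two-bedroom home.
print(occupancy_rating(bedrooms_available=2, couples=1, other_adults=0, children=3))  # -1 (overcrowded)
```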
Secondly, respondents highlighted some quality concerns they had with the VOA data. These included:
the differences between VOA and 2011 Census for occupancy rating (bedrooms)
how frequently VOA records are updated
Respondents were also unsure how changing to an address-based measure would affect data use, compared with the census household-based measure.
We have conducted further research to understand these issues in the VOA data in more detail. This research has concluded that at present these quality concerns would impact data use. Therefore, we intend to recommend asking number of bedrooms on the 2021 Census.
We will continue research into understanding the Valuation Office Agency (VOA) data and how it can be used to enhance the 2021 Census outputs and in an Administrative Data Census. This will include research to understand the quality of the number of rooms data and considering how the VOA size of property variable could be used.
4. Annex
Detailed information about the user needs for these topics can be found in our response to the 2021 topic consultation.
Topics we intend to collect
We intend to recommend the following topics to be included in the 2021 Census. These topics were collected in the 2011 Census and were in our initial view to collect. They will be included in end-to-end testing to finalise ordering and routing.
Age
Sex
Marital and legal partnership status
Household and family relationships
Type and self-containment of accommodation
Tenure and landlord (if renting)
Type of central heating
Number of cars or vans
National identity
Welsh language
Main language used
English language proficiency
Religion
Long-term international migration
Short-term international migration (intention to stay)
Internal migration (address one year ago)
Citizenship
Qualifications
General health
Long-term health problem or disability
Amount of unpaid care provided
Economic activity and hours worked
Whether or not a respondent worked in the last year
Supervisory status (subject to further research)
Occupation
National Statistics socio-economic classification (NS-SEC) – derived from collected information
Industry
Method of transport to place of work
Address of place of work
Additional research and testing on topics we plan to collect
Marital and legal partnership status
In light of the Marriage (Same Sex Couples) Act 2013, we have been reviewing the response options in this question. We contacted a number of stakeholders to seek views on suggested question designs. Feedback received has helped to develop the current question design under consideration.
A suite of questions was included in the ONS Omnibus survey in October 2017. These questions were designed to provide insight into the public acceptability of the proposed new question design. The results of this public acceptability testing are currently being analysed. Work is ongoing to assess user requirements and develop the final question design.
Household and family relationships
Initial research has focused on the optimal ordering of response options to maximise data quality. An online card-sorting activity was carried out in August 2016 with over 500 respondents, who were asked to place response options in the order they felt most appropriate. Focus groups with students have been carried out to provide more in-depth insight in this area.
Type of central heating
We’re exploring the option of adding an additional response option to capture renewable heating systems, as well as reconsidering the categorisation of solid fuel types splitting wood from coal, to meet additional user needs. The question will be tested in 2018 to evaluate how the response options are understood by respondents and whether they are able to select the appropriate answer.
Main language used
We reviewed the need for additional response options. We conducted cognitive testing in Wales to test separating English and Welsh tick boxes for the question in Wales. We have decided not to split these tick boxes.
Qualifications
We are reviewing options to shorten or simplify the question. We have focused on the design of the online version of the question. As we're not limited by space constraints online, we have taken inspiration from the Canadian census and looked to split the question into separate sections based on qualification level. For example, GCSEs, A levels and degrees will each be asked about in their own questions on separate pages. Testing is ongoing to see if splitting the question makes it more manageable for respondents.
Economic activity
A new set of guidelines for collecting information on the economic status of respondents was released in 2013 by the International Labour Organization. We're working to make sure our questions collect sufficient information to meet these guidelines. Testing is ongoing to explore options for shortening the labour market questions and simplifying the routing.
Year last worked
We have been exploring the use of Pay As You Earn data from HMRC to estimate whether someone was working in the 2010 to 2011 tax year. Early analysis suggests that we can produce estimates for some of the population, but there are notable gaps in the data currently available to ONS – principally for those who are self-employed. We are pursuing access to administrative data to fill these gaps. We now intend to recommend collecting information on whether or not a respondent worked in the last year.
Supervisory status
We’ve been working with academics to explore options for deriving NS-SEC that still meet users’ need for continuity. One of these options is a census specific NS-SEC matrix. Academics are using the Labour Force Survey and Standard Occupational Classification 2010 data to estimate the impact of moving to a census specific matrix. Their early findings suggest that a census specific matrix would correctly derive 92.6% of the 13 operational classes. This research is ongoing.
Research on topics we plan not to collect
Income
Our plans following the topic consultation were to explore using administrative data to meet the need for income information. We have published two research outputs on income. In response to user feedback, this year’s publication includes both individual and household gross income distributions at lower layer super output area (LSOA) level for the tax year ending 2016.
At this stage, the Research Outputs are limited to income from the Pay As You Earn (PAYE) and benefits systems (which include tax credits); therefore, a number of components of income are missing, for example, income from self-employment and investments taxed via Self Assessment. Although it's currently presented using an administrative data population base, the methodology could be used in combination with the 2021 Census population base to produce census outputs by income.
Despite the limitations of the analysis, we’re encouraged to see that results broadly reflect the patterns we expect; for example, a higher percentage of both individuals and households fall within the higher income bands in London and the south east.
We’re pursuing access to additional administrative data on income – particularly income processed through the Self Assessment system – to help address the current limitations.