1. Output information
- Accredited official statistic: No
- Survey name: Management and Expectations Survey (MES)
- Data collection: sample of around 53,000 businesses
- Frequency: Every three years. The 2020 wave of the survey asked respondents about their management practices in both 2019 and 2020. As a result, the survey has three waves but corresponds to four years of data.
- How it is compiled: In 2023, the sample consists of all IDBR firms with 250 or more employees, respondents of previous MES waves (MES 2017 and MES 2020), respondents of the latest Annual Business Survey (ABS 2022) and a random stratified sample from the IDBR of firms with 10 or more employees. In 2020, all firms in industry and geography groupings (strata) with fewer than 25 firms were also selected, but this rule was dropped in 2023.
- Geographic coverage: UK in 2023; Great Britain in 2017 and 2020
- Industry coverage: The survey covers the non-financial market economy, the same as the ABS. It excludes firms in Sections A: Agriculture, Forestry and Fishing; K: Financial and Insurance Activities; O: Public Administration and Defence and Compulsory Social Security; T: Activities of Households; and U: Activities of Extraterritorial Organisations and Bodies.
- Related publications: Management practices in the UK.
2. About this Quality and Methodology Information report
This quality and methodology information report contains information on the quality characteristics of the data (including the European Statistical System’s five dimensions of quality) as well as the methods used to create it.
The information in this report will help you to:
- understand the strengths and limitations of the data
- learn about existing uses and users of the data
- understand the methods used to create the data
- decide suitable uses for the data
- reduce the risk of misusing the data
3. Important points
- The Management and Expectations Survey (MES) provides information on firms' structured management practices: how firms respond to problems, their employment practices, and how they use key performance indicators (KPIs) and targets.
- It also asks questions about firm characteristics such as employment, turnover, capital expenditure and intermediate consumption; the 2020 wave included questions around supply chain issues and homeworking during the coronavirus (COVID-19) pandemic while the 2023 wave included questions on learning and development and technology adoption.
- The MES takes place every three years and is a voluntary survey; the results from the survey are considered official statistics in development.
- The survey includes respondents of the latest wave of the Annual Business Survey (ABS), previous MES respondents, as well as firms randomly sampled from the Inter-Departmental Business Register (IDBR).
- The 2017 and 2020 sample covered firms in Great Britain, whereas Northern Ireland was included in the 2023 wave, making the sample UK-wide; the 2020 survey asked respondents about their management practices in both 2020 and 2019, so while the survey has three waves it provides four years of data.
- The survey is used across government and academia to understand the role of management practices on growth; the 2020 wave was funded by the Economic and Social Research Council and the 2023 wave was funded by HM Treasury’s Economic Data Innovation Fund (see MES-survey.org for more information).
- MES results are published in a summary article or bulletin accompanied by a data pack with summaries of the main variables for the entire sample, as well as by the main industries and geographies.
4. Quality summary
Overview
The Management and Expectations Survey (MES) primarily collects data about management practices in firms: how firms respond to problems, their employment practices, and how they use key performance indicators (KPIs) and targets. It also asks questions about firm characteristics such as employment, turnover, capital expenditure and intermediate consumption in the previous calendar year as well as firms’ expectations about these in the next year. The 2020 wave of the MES included questions around supply chain issues and homeworking during the coronavirus (COVID-19) pandemic. The latest wave of the MES, conducted across the UK from November 2023 to March 2024, asked firms about their approaches to learning and training and their adoption of new technologies such as artificial intelligence (AI), robotics, specialised software or equipment and cloud computing systems and applications.
Previous waves of the survey – covering Great Britain only – were conducted in 2017 and 2020. In 2017, the survey asked respondents about their management practices in 2016. In 2020, because the survey was conducted during the pandemic, it asked respondents about their management practices in both 2019 and 2020, both to mitigate the risk that survey scores would only reflect the exceptional circumstances firms were facing and to understand how firms were responding to the pandemic.
The MES samples firms with 10 or more employees, excluding firms in agriculture, financial services, and the public sector. In 2023, 53,433 firms were sampled, with an achieved response rate of 26.9%. This was an increase from MES 2020, which sampled 50,714 firms and achieved a response rate of 24%. MES 2017 went out to 24,998 firms and achieved a response rate of 38.7%.
Uses and users
The MES was developed by us, at the Office for National Statistics (ONS), in partnership with the Economic Statistics Centre of Excellence (ESCoE). As a result, the main users of this dataset are academics and researchers, to whom we supply MES microdata through the Secure Research Service. Additionally, the MES has been used by other government departments, think tanks, and the media to inform policy debate around issues such as the productivity puzzle and the role of management interventions in improving firm growth. Important policy users of the MES include:
- the Department for Business and Trade
- the Department for Science, Innovation and Technology
- the Competition and Markets Authority
- the Bank of England
- think tank analysts
- ESCoE and the wider academic community
The main use of the MES within the ONS is to measure management practices of businesses in the UK. Using the survey responses, we generate a management practice score for each responding firm. The management practice scores developed capture four dimensions of management:
- continuous improvement, or how businesses respond to problems
- the use of KPIs
- the use of targets
- employment practices relating to promotion, training and employee underperformance
Management practice scores range from 0 to 1. Firms score 0 if they do not respond to ongoing problems, base promotion decisions on factors other than merit, and do not track performance or set targets. Conversely, to score 1, firms need to continuously review their processes with the aim of minimising future challenges, carry out regular performance reviews, train employees, and base hiring and promotion decisions on merit.
The second use of the MES within the ONS is as an input in research aiming to understand productivity and growth of UK firms. Management practices are associated with higher productivity and resilience for firms. A more extensive review of the literature can be found in our Management practices in Great Britain: 2016 to 2020 article.
Strengths and limitations
The main strengths of the survey are as follows.
MES is the most recent large business survey that asks about management practices in the UK. The Workplace Employment Relations Survey also asked about management practices, but its latest wave was in 2011. Management questions are included in other surveys, such as the Decision Maker Panel, but these surveys are smaller in scale.
It is designed with data linkage to important business microdata in mind. In the MES sample we include respondents of the most recent Annual Business Survey (ABS) which allows for variables from the MES dataset to be used in wider analyses, together with financial information about firms such as their capital, GVA or profits. It is also designed to allow for longitudinal analysis, as all respondents of previous MES waves are included in the sample.
It is developed jointly with stakeholders to meet data needs. The questionnaire is developed with the input of customers and stakeholders who sit on our advisory panel. The questionnaire has changed to meet user needs, asking about supply chain issues and homeworking in 2020 and AI and technology adoption in 2023.
Robust methods are adopted for the survey's sampling and weighting strategies to limit the amount of bias (for example, because of non-response and sampling errors).
The main limitations of the survey are as follows.
These data are considered official statistics in development.
The survey is voluntary, so response rates are lower than for mandatory ONS business surveys; the response achieved supports reliable high-level estimates but means more detailed breakdowns may not always be possible. The lower response rates also limit the sample available for longitudinal analysis.
Recent improvements
There are two sets of recent changes to the survey and how we present its results: to how we calculate design weights and to how we calculate management practice scores.
In our most recent bulletin, we changed how we calculate design weights for the respondents to Wave 2 (covering 2019 and 2020) and Wave 3 (covering 2023) of the survey. In both waves, we sampled all live firms which have responded to previous waves of the MES or to the most recent wave of the ABS. These firms are placed on a reference list, and the remainder of the sample is drawn at random from the Inter-Departmental Business Register (IDBR).
In our 2021 article, Management practices in Great Britain, weights were designed so that firms which were sampled from the MES and the ABS were given a design weight of 1 and the firms taken from the IDBR were given weights of greater than 1 to ensure that the results were representative of the whole population of businesses with 10 or more employees.
However, the use of a large reference list to generate these samples, while necessary to make longitudinal analysis possible, also means that in certain strata of our sample all firms are taken from the reference list and none are taken randomly from the IDBR. This meant that the firms in these strata represented only themselves and were not representative of the actual business population of those strata.
To resolve this, we have taken a new approach to calculating design weights for firms in Waves 2 and 3 of the MES in our 2024 bulletin. We make use of the fact that many firms entered our sample because they responded to the ABS and give these firms the design weights given to them in the ABS, but scale these weights so that the sum of weights for each stratum still equals the actual number of firms in that stratum according to the IDBR. Further detail of this weighting method is given in Section 6: Methods used to produce the data.
We also implemented corrections to the calculation of management practice scores, which changed the scores published in the 2021 article. During the processing of the 2023 survey results, we identified an error in how Wave 2 respondents' answers were coded for questions 12a and 12b ("In 2019 (12a) / 2020 (12b) what were performance bonuses for managers usually based on?") and questions 12c and 12d ("In 2019 (12c) / 2020 (12d) what were performance bonuses for non-managers usually based on?"). These questions contribute to the total management practice score given to each firm, so correcting this error changed our estimates of average management practice scores in 2019 and 2020. The questions were originally coded in reverse, giving a score of 0 to firms that should have scored a 6, a 1 to firms that should have scored a 5, and so on. Correcting this lowered the scores of firms that had previously been uplifted by mistakenly scoring highly on these two questions, increasing the number of firms with lower management scores.
After the initial publication of the 2024 bulletin, we made further corrections to the estimates of management practice scores from both Waves 2 and 3, changing our estimates of average management practice scores in 2019, 2020 and 2023. This error occurred in the calculation of scores for firms that responded that they did not use KPIs or targets. These firms were originally given missing scores on the follow-up KPI and targets questions rather than zero scores, which again resulted in firms with lower management practices scoring higher than their true score. The impact of this correction on aggregate scores is also described in the correction notice above the 2023 bulletin.
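To make the two corrections concrete, the following is a minimal illustrative sketch in Python, assuming hypothetical column names and that the bonus questions were captured as raw codes from 0 to 6; it is not the production processing code.

```python
import numpy as np
import pandas as pd

# Hypothetical extract of Wave 2 responses; column names are illustrative only.
responses = pd.DataFrame({
    "bonus_managers_raw": [0, 2, 6],         # question 12a/12b raw codes, 0 to 6
    "uses_kpis": ["No", "Yes", "Yes"],
    "kpi_review_score": [np.nan, 0.5, 1.0],  # follow-up KPI question score
})

# Correction 1: the bonus questions were coded in reverse, so flip the raw code
# before rescaling it to a 0-to-1 question score.
responses["bonus_managers_score"] = (6 - responses["bonus_managers_raw"]) / 6

# Correction 2: firms reporting no KPIs (or targets) should score zero on the
# routed follow-up questions rather than being left as missing.
responses.loc[responses["uses_kpis"] == "No", "kpi_review_score"] = 0.0

print(responses)
```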
The sum of these changes for 2020 and 2023 can be seen in Figures 1 and 2, which show the distribution of management practice scores under different combinations of weighting and scoring methods. The weighting methods are as follows:
- old weights: design weights used in our 2021 article, Management practices in Great Britain
- new weights: design weights used in our 2024 bulletin, Management practices in the UK
The scoring methods are as follows:
- old scoring: uncorrected scores published in our 2021 article, Management practices in Great Britain
- superseded scoring: uncorrected scores published in the superseded version of our 2024 bulletin, Management practices in the UK; in this version, the questions in the targets section (12a, b, c and d) that required rescoring have been rescored, but firms are still given missing values instead of zeros when saying they have no targets or KPIs
- corrected scoring: corrected scores published in the current version of our 2024 bulletin, Management practices in the UK; for 2020, this scoring includes both the correction of the targets question scores and the correction of the KPI and target scores to zero rather than missing for firms who report no KPIs or targets (only the latter correction affected 2023 MES scores)
Figure 1: The distribution of management practice scores has shifted to the left because of improvements and corrections to our processing
Distribution of overall management practice scores under different weighting and scoring methods, Great Britain, 2020
Source: Management and Expectations Survey from the Office for National Statistics
Figure 2: The distribution of overall management practice scores under different scoring methods for the UK in 2023
Distribution of overall management practice scores under different scoring methods, UK, 2023
Source: Management and Expectations Survey from the Office for National Statistics
5. Quality characteristics of the data
This section describes the quality characteristics of the data and identifies issues that should be considered when using the statistics.
Relevance
Research has found that management practices are associated with measures of business success. A considerable body of research finds a strong and positive relationship between management practice scores and productivity, profitability, resilience and survival, with evidence that part of this relationship is causal. See the World Management Survey at 18 report (PDF, 1.5MB) and the Does management matter? article from The Quarterly Journal of Economics for more information.
While management practices seemed to be an important influence on firm outcomes, before 2016 there was a data gap in measuring management practices in the UK. The Office for National Statistics (ONS) and the Economic Statistics Centre of Excellence (ESCoE) jointly bid for funding to create the UK Management and Expectations Survey (MES), designed to produce data that can be compared with the data collected on management practices internationally (see the Coherence and comparability section).
Since then, the ONS, other government departments, and the academic community have used the data to understand the role of management practices on growth. MES scores of UK firms are positively related to labour productivity, investment in R&D and innovation, and firms' ability to adopt homeworking during the coronavirus (COVID-19) pandemic. For more information, see our bulletin on Management practices, and our corresponding articles on innovation and homeworking. Because of the relevance of the survey to policy and academic research, a third wave was funded in 2023 through HM Treasury’s Economic Data Innovation Fund.
More on the research carried out with the survey can be found at MES-survey.org.
Accuracy and reliability
The total error in a survey is the difference between the estimate derived from the data collected and the true (unknown) value for the population. The total error consists of the sampling and non-sampling error combined. Sampling error is the error that arises because the estimate is based on a survey rather than a census of the population. The results obtained for a sample may vary from the true values for the population, but in a series of random samples over several repeats of the survey, this error would be expected to be zero on average. Non-sampling errors cover all errors unrelated to the sampling methodology. These can be difficult to quantify and relate to errors in coverage, measurement, processing, and non-response.
Sampling error
The MES is a sample survey. Therefore, estimates are subject to sampling variability. The size of the sampling variability is dependent upon several factors:
- the size of the sample and population: a larger sample in relation to the population is likely to capture more of the variation and therefore achieve a closer estimate to the population value
- sample design: the proportion of the survey that is randomly sampled and the stratification method; for example, in 2020 we chose to stratify the manufacturing sector in more detail, capturing greater variation in that sector, while a smaller sample can be taken from sectors that are more homogeneous
Standard errors are used to give an indication of the amount that a given estimate may deviate from the true population value. Standard errors have been used to calculate 95% confidence intervals to report whether a change in the management score over time is statistically significant.
There are various ways to construct standard errors to reflect the sample design and our assumptions about the resulting sample. For more detail regarding standard errors in MES 2023, see the discussion in Section 10 of the Management practices in the UK bulletin.
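For reference, the standard construction of a 95% confidence interval from an estimate and its standard error (not specific to the MES) is:

```latex
\[
  \mathrm{CI}_{95\%} = \hat{\theta} \pm 1.96 \times \mathrm{SE}(\hat{\theta})
\]
```

where the estimate is a weighted quantity such as a mean management practice score; a change between waves is then reported as statistically significant when the interval for the difference excludes zero.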
Non-sampling error
Non-sampling errors cover all errors unrelated to sampling methodology. Sources of non-sampling error include:
- response errors caused by, for example, the respondent misunderstanding the question or responding to the question inaccurately because of things like recall bias
- non-response
To minimise non-sampling error because of response errors, the team ensures the wording of the questions is as clear and simple as possible and carries out questionnaire testing with businesses prior to the questionnaire being used for the survey. We also follow up with businesses that have responded to questions in ways that are inconsistent between questions or provided values that may seem implausible, to validate their responses and make sure they are correct.
Sampling error would, on average, amount to zero when the survey sample is randomly drawn, since random draws ensure the firms selected are not systematically different, in their characteristics and (in this instance) underlying management quality, from those not selected. However, non-response means that, despite random sampling, the resulting group of responses may have systematic differences in their characteristics if the probability of non-response is correlated with the outcome of interest, for instance, if better managed firms are more likely to respond.
We try to account for non-response using weights. Weights are adjusted for non-response by assuming that non-response is random within the same cell (for instance, within the same size band, industry division and International Territory Level 1 (ITL1) region). The adjustment allows responding firms within the cell to represent non-responding firms by increasing the weight of the responding firms in proportion to non-response. More information on the stratification and weighting methodology is given in the “Weighting”, “Coverage” and “Sampling frame” sections.
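A sketch of the generic form of this adjustment, assuming non-response is random within each cell (Section 6 describes the weighted response rate actually used for the a-weights), is:

```latex
\[
  w_i^{\mathrm{adjusted}} = \frac{w_i^{\mathrm{design}}}{r_c},
  \qquad
  r_c = \frac{\text{number of responding firms in cell } c}{\text{number of sampled firms in cell } c}
\]
```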
Non-sampling errors could also arise in the MES as a result of the position within the firm of the individual who responds to the questionnaire. Because we do not specify who within the business should complete the questionnaire, some information might carry a higher risk of error than others, depending on the respondent's seniority or function in the business.
Coherence and comparability
There are a few differences between the three waves of the MES. While we use the same concept and pillars of management practices, the exact scoring schedule (the exact set of questions underlying each pillar) changed slightly in 2020. The old scoring schedule can be found in our Management practices and productivity in British production and services industries methodology. The new scoring schedule has been included in Section 7: Other information. MES was first launched in 2017 following up on the Management Practices Survey (MPS), piloted in 2016, which collected data on management practices of firms in the manufacturing sector for the reporting period 2015. The MES is designed to produce data which can be compared with the data collected in the MPS, MES 2017 and 2020, the US Census Bureau’s Management and Organizational Practices Survey (USMOPS) and the German Management and Organizational Practices Survey (GMOPS). More information on the development of the management scoring framework can be found in the “Management practice score” section.
The 2017 and 2020 surveys covered Great Britain, and Northern Ireland was added to the 2023 sample. The same questions have been used to compare management scores in our publication to ensure comparability across time.
Accessibility and clarity
For each wave, accessible data packs are produced in Excel format alongside the statistical bulletin or article. See our Management practice scores and distributions by firm characteristics dataset for the files. Each workbook contains estimates from the data, such as the average, median, 10th and 90th percentile management practices score broken down by industry, employment size band, region and category of the management practice score.
Anonymised data from the MES are sent to our Secure Research Service (SRS) and the Integrated Data Service (IDS) to allow users to access the microdata. Open text responses are edited to ensure that no specific business can be identified.
Only researchers accredited under the Digital Economy Act can access data in the SRS. You can apply for accreditation through the Research Accreditation Service (RAS). You need to have relevant academic or work experience and must successfully attend and complete the assessed Safe Researcher Training. To conduct analysis with microdata from the SRS, a project application must be submitted to the Research Accreditation Panel (RAP). To access the SRS, you must also work for an organisation with an assured organisational agreement in place.
For information regarding conditions of access to data, please refer to our Terms and conditions (for data on the website), Freedom of information (FOI), and our Accessibility statement.
Timeliness and punctuality
The MES collects data every three years. Since 2020, this has been through an online questionnaire; previous waves were collected using paper forms. Every round of the survey is referred to as a “Wave”. Results from each wave are published some months after the survey closes. The 2020 wave of the MES collected data for both 2019 and 2020. Responses for the 2023 wave referred to the 2023 calendar year; the survey was sent out on 1 December 2023 and closed on 8 March 2024. The bulletin with the main results was published on 13 May 2024, as planned.
In the 2020 wave, feedback was given to participants about their management practices as well.
Concepts and definitions
The business unit to which questionnaires are sent is called the reporting unit. This can cover the enterprise as a whole or parts of a business enterprise identified by a group of local units. More detail on firm demography and how firms are mapped onto the IDBR can be found in the Longitudinal Business Database user guide.
The industry classifications used for the MES are set out by the UK Standard Industrial Classification.
Geography
The MES did not include Northern Ireland in 2017 and 2020. Firms in Northern Ireland were included in 2023.
Why you can trust our data
We are the UK's largest independent producer of statistics and the recognised national statistical institute of the UK. The Data Strategy, available on our website, explains how data are collected, secured and used in the publication of statistics. We treat the data that we hold with respect, keeping it secure and confidential, and we use statistical methods that are professional, ethical, and transparent. You can find out more about our Data policies on our website.
6. Methods used to produce the data
Coverage
The Management and Expectations Survey (MES) is a voluntary business survey of businesses with 10 or more employees, covering both the production and services sectors in the non-financial market economy. This includes all UK Standard Industrial Classification sections excluding Sections A (agriculture, forestry and fishing), K (financial and insurance activities), O (public administration and defence), T (activities of households as employers) and U (activities of extraterritorial organisations and bodies). Firms in the public sector are also excluded, including the public provision of education and health. The MES covers firms in the following sectors:
- non-manufacturing production, which includes mining and quarrying, energy generation and supply, and water and waste management
- manufacturing
- construction, which includes civil engineering, housebuilding, property development, and specialised construction trades such as plumbers, electricians and plasterers
- distribution, hotels and restaurants which includes retail, wholesale and motor trades, and accommodation and food services
- transport, storage, and communication which includes transportation and storage services, and information and communication services
- business services, which includes professional, scientific and technical activities and administrative and support service activities
- real estate
- other services, which includes private provision of education and health, entertainment services and any other services not mentioned by the list
The MES 2023 sample was restricted to businesses with employment of 10 or more, for consistency with previous waves of the survey and with the pilot Management Practices Survey (MPS). Both were based on the US Census Bureau’s Management and Organizational Practices Survey (USMOPS) and adapted to the UK. The main reason for focusing on firms with at least 10 employees was to limit the survey burden on microbusinesses, together with some evidence that the personnel management questions would not be as applicable or reliable for microbusinesses as they are for larger businesses. The sample is restricted to the private sector and excludes agriculture, finance and public administration so it aligns with the Annual Business Survey (ABS) business population. It is also restricted because the validity of these questions in capturing management practices outside of the private non-financial economy has not been tested. In 2023, we ran a separate Public Sector Management Practices Survey, adapting the MES to understand management practices in the public sector.
Sampling frame
The latest MES sample followed a mixed methods sampling strategy. We first create a stratified sampling frame from the Inter-Departmental Business Register (IDBR) so that the business population is divided into groups. In 2023, we stratified by five employment size groups (10 to 19, 20 to 49, 50 to 99, 100 to 249, 250 or more), 59 industry groups (detailed in Section 7: Other information) and all 12 International Territory Level 1 (ITL1) regions in the United Kingdom (the nine English regions, Scotland, Wales, and Northern Ireland). This gives us 3,374 unique combinations of employment size bands, industry groupings and regions, which we call “cells".
We then fill out this sample from the IDBR in a specific order. First, we take firms from “census” cells; these are cells from which we sample every firm in that cell. In the MES, this is every firm with 250 or more employees. Second, we take firms from our reference list. In 2023, firms entered the reference list if they had responded to either of the previous waves of the MES (MES 2017, 2020) or if they responded to the 2022 ABS. This was done to maximise the number of firms available for longitudinal analysis and allow data linkage to the ABS, which holds an extensive list of important financial information on firms.
Lastly, we draw a random sample and add these to the firms from the census and reference list so that each cell has at least nine firms or 2% of the total number of firms in that cell on the IDBR, whichever was larger. Where cells included many firms on the reference list, this meant that no firms were drawn at random from the IDBR.
The sample design holds only for the Great Britain portion of the MES sample. There were no Northern Irish firms in the reference list since previous waves of the MES and the ABS covered Great Britain only. This meant that all Northern Irish firms were drawn at random from the IDBR. For Northern Irish firms, we took 20 firms from each cell or 48% of the total number of firms in that cell, whichever was larger.
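As an illustration of these allocation rules, the following sketch computes the per-cell sample target; the function name, the rounding and the assumption that the random draw simply tops each cell up to this target are ours, not the ONS implementation.

```python
import math

def cell_sample_target(n_firms_in_cell: int, northern_ireland: bool = False) -> int:
    """Target number of sampled firms for a cell (size band by industry by region).

    Great Britain cells: at least 9 firms or 2% of the cell, whichever is larger.
    Northern Ireland cells: at least 20 firms or 48% of the cell, whichever is larger.
    """
    if northern_ireland:
        target = max(20, math.ceil(0.48 * n_firms_in_cell))
    else:
        target = max(9, math.ceil(0.02 * n_firms_in_cell))
    return min(target, n_firms_in_cell)  # cannot sample more firms than exist

# Example: a cell with 1,200 firms on the IDBR.
print(cell_sample_target(1_200))                         # 24 (2% of 1,200)
print(cell_sample_target(1_200, northern_ireland=True))  # 576 (48% of 1,200)
```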
Weighting
Design weights are applied to estimates of all variables from the MES published in our bulletin, Management practices in the UK to account for a non-random sampling strategy and differential response rates between different cells. Design weights sum to the total number of firms in the population of interest, which for the MES is described in the “Coverage” section and is equal to 287,690 firms for the 2023 wave.
Typically, when a survey follows a mixed methods survey design (for example, when it combines a stratified sample design and a reference list of firms selected into the sample), then the firms on the reference list are given a design weight of 1 while firms that were drawn at random are given a design weight as follows:
Where the selection probability for randomly drawn firms is given by the following equation:
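The published equations are not reproduced here; as a hedged sketch of the standard construction for a cell (stratum) h, assuming reference-list and census firms receive a weight of 1 and the selection probability is defined within the cell:

```latex
\[
  w_{i,h} =
  \begin{cases}
    1 & \text{if firm } i \text{ is on the reference list or in a census cell} \\
    \dfrac{1}{\pi_h} & \text{if firm } i \text{ was drawn at random from the IDBR}
  \end{cases}
  \qquad
  \pi_h = \frac{n_h^{\mathrm{random}}}{N_h - R_h}
\]
```

Here N_h is the number of firms in cell h on the IDBR, R_h the number of reference-list and census firms in that cell, and n_h^random the number drawn at random. Whether the denominator nets off the reference-list firms is one of the corrections discussed at the end of this section; the 2021 weights did not make that adjustment.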
This approach is the one taken to generate weighted estimates in the article Management practices in Great Britain, 2016 to 2020.
However, in the 2023 wave there were many cells where no firms were drawn at random, because the reference list grows as the number of previous MES respondents grows. This meant that the sum of design weights for such a cell would equal the number of firms from the reference list and not the total number of firms in that cell according to the IDBR.
With only reference list firms in a cell, we cannot carry out statistical inference to understand how the scores of these firms relate to the population score as the reference list firms are assumed to be unrepresentative of other firms in their cells.
Longitudinal weights could be developed to reflect the probability that these firms are in our sample and to capture their representativeness against the relevant population. Longitudinal weights have not been developed for this survey but will be considered for future MES waves as the longitudinal sample of firms grows over time. In the absence of longitudinal weights, there are two options available: either we assume there is no self-selection of firms with particular characteristics into the longitudinal sample, treat these firms as representative as a firm drawn from the IDBR, and give them design weights; or we take the more conservative approach of assuming they are unrepresentative and give them a weight of 1. Given that the MES is a voluntary survey with a 25% response rate carried out every three years, it would be a strong assumption that these firms are not systematically different from a randomly drawn firm from the population in their cell. Moreover, response rate analysis carried out on the 2020 MES wave showed that repeated respondents from the 2017 wave were likely to be better managed than the average firm in the 2020 sample.
Without longitudinal weights, assigning previous MES respondents and firms from the ABS a weight of 1 means we have some cells that are unrepresentative of the business population in that cell. To address this issue, we explore five options:
- option one: do nothing, but accept that we cannot consistently estimate population values from the sample
- option two: use (adjusted) ABS weights for cells without any randomly drawn firms, but otherwise continue to give reference list firms from the ABS a weight of 1
- option three: use (adjusted) ABS weights for all ABS firms
- option four: treat firms from the ABS as if they were randomly drawn from the IDBR
- option five: treat all firms in the sample as if they were randomly drawn from the IDBR (that is, give all firms design weights)
In Figures 3 and 4, we show the distribution of MES scores under these five options.
Figure 3: The effect of different weighting options on the distribution of the 2020 Management and Expectations Survey scores
Distribution of overall management practice scores under different weighting methods, Great Britain, 2020
Source: Management and Expectations Survey from the Office for National Statistics
Figure 4: The distribution of overall management practice scores under different weighting methods for the UK in 2023
Distribution of overall management practice scores under different weighting methods, United Kingdom, 2023
Source: Management and Expectations Survey from the Office for National Statistics
We choose option three, resolving this issue by bringing in the weights used in the 2022 ABS for firms that entered the reference list by virtue of responding to the 2022 ABS. This is because the selection probability of firms entering the MES sample through the ABS is implicitly included in the calculation of their ABS weight: the ABS weight is the inverse of the probability that a firm was selected into the ABS sample and chose to respond to the ABS. Since the ABS is a mandatory survey with high response rates, we are confident a firm’s ABS weight is a good approximation of its representativeness in the population.
However, the ABS and MES use different sampling frames to generate their samples, meaning that we could not use the ABS design weights on a one-to-one basis and had to scale the design weights of the firms in our sample so that they sum to the total number of firms in the cell according to the IDBR. As such, we applied an adjustment factor to all ABS-origin firms in the sample and to those randomly drawn from the IDBR, of the type:
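The adjustment factor itself is not reproduced here; one plausible form consistent with the description above, assuming census and previous-MES firms in cell h keep a weight of 1, is:

```latex
\[
  a_{i,h} = f_h \, w_{i,h},
  \qquad
  f_h = \frac{N_h - m_h}{\sum_{j \in S_h} w_{j,h}}
\]
```

where w_{i,h} is the firm's ABS design weight (for ABS-origin firms) or IDBR design weight (for randomly drawn firms), S_h is the set of ABS-origin and randomly drawn firms sampled in cell h, N_h is the number of firms in the cell according to the IDBR, and m_h is the number of sampled firms in the cell retaining a weight of 1, so that the weights in each cell sum to N_h.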
The use of an adjustment factor is a non-standard approach to developing weights. However, it is similar to the use of calibration weights (g-weights) in other business surveys, including the ABS. G-weights are factors that adjust design weights upwards or downwards to account for a population characteristic of a cell (for instance, employment or turnover) that may not be well represented by design weights alone, for example, where sampled firms have smaller employment than the average firm in the population of that cell. Similarly, we calibrate the a-weights to account for firms in the sample that did not exist in the ABS sample for which the weights were originally designed.
Following the generation of these adjusted ABS design weights (a-weights), we then account for non-response bias by dividing the a-weight of each firm by the response rate for that cell. Given that different firms in each cell can have very different a-weights from one another, we calculate a weighted response rate as follows:
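The published formula is not reproduced here; a weighted response rate of the kind described, using the a-weights, would take the form:

```latex
\[
  r_h = \frac{\sum_{i \in \text{respondents in } h} a_{i,h}}{\sum_{i \in \text{sampled firms in } h} a_{i,h}},
  \qquad
  w_{i,h}^{\mathrm{final}} = \frac{a_{i,h}}{r_h}
\]
```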
For the weights used in the 2021 published article, the response rate adjustment used unweighted rather than weighted response rates, and the design weights for randomly sampled firms did not take account of the number of reference list firms in the cell. Both of these corrections have been implemented in option one, which is otherwise identical to the 2021 weighting approach.
Management practice score
The MES questionnaire consists of 16 categorical questions on quantitative and qualitative aspects of a business’ management practices. Each question is accompanied by a list of options, from which respondents choose the option closest to the practices within their firm. For each question, each option is awarded a score on a scale of 0 to 1, where 0 represents the least and 1 the most structured management practice. An overall management score is derived as a simple average of a firm’s scores on all individual questions.
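Expressed as a formula, the overall score for firm f is the simple average of its 16 question scores:

```latex
\[
  \mathrm{score}_f = \frac{1}{16} \sum_{q=1}^{16} s_{q,f},
  \qquad
  s_{q,f} \in [0, 1]
\]
```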
Several pieces of research have been instrumental in shaping the questions and the scoring framework used for the UK MES. The 2007 publication, Measuring and Explaining Management Practices Across Firms and Countries by Nick Bloom and John Van Reenen, was the first attempt to capture management quality through a survey. An evolution of the management framework in subsequent research is described by Daniela Scur, Raffaella Sadun, John Van Reenen, Renata Lemos and Nicholas Bloom in the article, The World Management Survey at 18: lessons and the way forward. The questionnaire for the UK was developed to be comparable to the US and German Management and Organizational Practices Survey (MOPS). It was created in collaboration with an expert advisory group of academics including:
- Nick Bloom
- John Van Reenen
- Rebecca Riley
- Paul Mizen
- John Forth
- Alan Felstead
We also worked with other government and policy stakeholders including:
- bethebusiness
- the Department for Business, Energy and Industrial Strategy
- HM Treasury
All questions in the questionnaire are mandatory; however, where respondents answer that they do not use certain management practices (for instance, key performance indicators (KPIs)), our online questionnaire routes them away from follow-up questions on that same management practice.
A score of zero is awarded for questions that were skipped because of the response given to a prior leading question, as sketched below. Details of the scoring schedule are included in Section 7: Other information.
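The following is a minimal sketch of how an overall score could be assembled under this rule, with invented question identifiers and only a subset of the 16 scored questions shown; routed (skipped) questions count as zero, consistent with the corrected scoring.

```python
import numpy as np

def overall_management_score(question_scores: dict) -> float:
    """Simple average of 0-to-1 question scores.

    A value of None marks a question the respondent was routed away from
    (for example, KPI follow-ups for a firm reporting no KPIs); these
    questions count as zero.
    """
    scores = [0.0 if s is None else float(s) for s in question_scores.values()]
    return float(np.mean(scores))

# Hypothetical respondent with no KPIs, so the KPI review questions are skipped.
example = {
    "continuous_improvement": 2 / 3,
    "kpi_count": 0.0,
    "kpi_review_managers": None,
    "kpi_review_non_managers": None,
    "target_timeframe": 1 / 3,
    "promotion_managers": 1.0,
}
print(round(overall_management_score(example), 3))  # 0.333
```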
Quality assurance
The data are validated and cleaned, variables are derived, and weights are applied. As the survey collects information on a sample of the population, the data are weighted to enable us to make inferences from this sample to the entire population.
7. Other information
This section presents the scoring schedule for the Management and Expectations Survey (MES) and the industry stratification for the 2020 and 2023 waves of the survey.
The following sets out the survey questions in Section B of the questionnaire – continuous improvement – listing their responses and corresponding scores.
12. In 2023, in general what is the most common response to problems faced within this business?
- We resolve the problems but do not take further action – 1 out of 3
- We resolve the problems and take action to try to ensure they do not happen again – 2 out of 3
- We resolve the problems and have a continuous improvement process to anticipate similar problems in advance – 1
- No action is taken – 0
The following sets out the survey questions in Section C – key performance indicators – listing their responses and corresponding scores.
14a. In 2023, how many key performance indicators (KPIs) does this business monitor?
- 1 to 2 key performance indicators – 1 out of 3
- 3 to 9 key performance indicators – 2 out of 3
- 10 or more key performance indicators – 1
- No key performance indicators – 0
14b. In 2023, how frequently is progress against the key performance indicators (KPIs) reviewed by managers?
- Annually – 1 out of 6
- Quarterly – 1 out of 3
- Monthly – 1 out of 2
- Weekly – 2 out of 3
- Daily – 5 out of 6
- Hourly or more frequently – 1
- Never – 0
14c. In 2023, how frequently is progress against the key performance indicators (KPIs) reviewed by non-managers?
- Annually – 1 out of 6
- Quarterly – 1 out of 3
- Monthly – 1 out of 2
- Weekly – 2 out of 3
- Daily – 5 out of 6
- Hourly or more frequently – 1
- Never – 0
The following sets out the survey questions in Section D – targets – listing their responses and corresponding scores.
16a. In 2023, which of the following best describes the main timeframes for achieving targets within this business?
- Main time frame is less than one year – 1 out of 3
- Main time frame is one year or more – 2 out of 3
- Combination of time frames of less than and more than one year – 1
- There are no targets – 0
16b. In 2023, how easy or difficult is it to achieve these targets?
- Very easy (possible to achieve without much effort) – 0
- Quite easy (possible to achieve with some effort) – 1 out of 2
- Neither easy nor difficult (possible to achieve with normal effort) – 3 out of 4
- Quite difficult (possible to achieve with more than normal effort) – 1
- Very difficult (possible to achieve with extraordinary effort) – 1 out of 4
16c. In 2023, approximately what proportion of managers are aware of these targets?
- All – 1
- Most – 2 out of 3
- Some – 1 out of 3
- None – 0
16d. In 2023, approximately what proportion of non-managers are aware of these targets?
- All – 1
- Most – 2 out of 3
- Some – 1 out of 3
- None – 0
17. In 2023, what are performance bonuses for managers usually based on within this business?
- Their own performance as measured by targets – 1
- Their team’s or shift’s performance as measured by targets – 4 out of 5
- Their site’s performance as measured by targets – 3 out of 5
- The business’ performance as measured by targets – 2 out of 5
- Performance bonuses were not related to targets – 1 out of 5
- No performance bonuses – 0
18. In 2023, what are performance bonuses for non-managers usually based on within this business?
- Their own performance as measured by targets – 1
- Their team’s or shift’s performance as measured by targets – 4 out of 5
- Their site’s performance as measured by targets – 3 out of 5
- The business’ performance as measured by targets – 2 out of 5
- Performance bonuses were not related to targets – 1 out of 5
- No performance bonuses – 0
The following sets out the questions in Section E – employment practices – listing their responses and corresponding scores.
21. In 2023, how are managers usually promoted within this business?
- Based solely on performance or ability – 1
- Based partly on performance or ability and partly on other factors; for example, partly based on length of service or business restructuring – 2 out of 3
- Based mainly on factors other than performance and ability; for example, length of service or business restructuring – 1 out of 3
- No managers are promoted – 0
22. In 2023, how are non-managers usually promoted within this business?
- Based solely on performance or ability – 1
- Based partly on performance or ability and partly on other factors; for example, partly based on length of service or business restructuring – 2 out of 3
- Based mainly on factors other than performance and ability; for example, length of service or business restructuring – 1 out of 3
- No non-managers are promoted – 0
25. In 2023, on average how many days training and development do managers undertake in a year?
- Less than a day – 0
- 1 day – 1 out of 4
- 2 to 4 days – 1 out of 2
- 5 to 10 days – 3 out of 4
- More than 10 days – 1
27. In 2023, which of the following best describes the timeframe in which action is taken to address under-performance among managers within this business?
- Within six months of identifying under-performance – 1
- After six months of identifying under-performance – 1 out of 2
- No action was taken to address under-performance – 0
- There is no under-performance – 0
The following sets out the industry stratification for the 2020 and 2023 MES, with their corresponding industry-strata codes.
- Section B: mining and quarrying – B
- Division 10: manufacture of food products – CA
- Division 11: manufacture of beverages – CA
- Division 12: manufacture of tobacco products – CA
- Division 13: manufacture of textiles – CB
- Division 14: manufacture of wearing apparel – CB
- Division 15: manufacture of leather and related products – CB
- Division 16: manufacture of wood and of products of wood and cork, except furniture; manufacture of articles of straw and plaiting materials – CC
- Division 17: manufacture of paper and paper products – CC
- Division 18: printing and reproduction of recorded media – CC
- Division 19: manufacture of coke and refined petroleum products – CD
- Division 20: manufacture of chemicals and chemical products – CE
- Division 21: manufacture of basic pharmaceutical products – CF
- Division 22: manufacture of rubber and plastic products – CG
- Division 23: manufacture of other non-metallic mineral products – CG
- Division 24: manufacture of basic metals – CH
- Division 25: manufacture of fabricated metal products, except machinery and equipment – CH
- Division 26: manufacture of computer, electronic and optical products – CI
- Division 27: manufacture of electrical equipment – CJ
- Division 28: manufacture of machinery and equipment n.e.c – CK
- Division 29: manufacture of motor vehicles, trailers and semi-trailers – CL
- Division 30: manufacture of other transport equipment – CL
- Division 31: manufacture of furniture – CM
- Division 32: other manufacturing – CM
- Division 33: repair and installation of machinery and equipment – CM
- Section D: electricity, gas, steam and air conditioning supply – D
- Section E: water supply, sewerage, waste management and remediation activities – E
- Division 41: construction of buildings – F41
- Division 42: civil engineering – F42
- Division 43: specialised construction activities – F43
- Division 45: wholesale and retail trade and repair of motor vehicles and motorcycles – G45
- Division 46: wholesale trade, except of motor vehicles and motorcycles – G46
- Group 47.1: retail sale in non-specialised stores – G47NONSPECIALIST
- Group 47.2: retail sale of food, beverages and tobacco in specialised stores – G47FOODSPECIALIST
- Group 47.3: retail sale of automotive fuel in specialised stores – G47OTHERSPECIALIST
- Group 47.4: retail sale of information and communication equipment in specialised stores – G47OTHERSPECIALIST
- Group 47.5: retail sale of other household equipment in specialised stores – G47OTHERSPECIALIST
- Group 47.6: retail sale of cultural and recreation goods in specialised stores – G47OTHERSPECIALIST
- Group 47.7: retail sale of other goods in specialised stores – G47OTHERSPECIALIST
- Group 47.8: retail sale via stalls and markets – G47OTHERSPECIALIST
- Group 47.9: retail trade not in stores, stalls or markets – G47ONLINE
- Division 49: land transport and transport via pipelines – H49
- Division 50: water transport – H50
- Division 51: air transport – H51
- Division 52: Warehousing and support activities for transportation – H52
- Division 53: postal and courier activities – H53
- Division 55: accommodation – I55
- Group 56.1: restaurants and mobile food service activities – I56FOOD
- Group 56.2: event catering and other food service activities – I56FOOD
- Group 56.3: beverage serving activities – I56DRINKS
- Division 58: publishing activities – J58
- Division 59: motion picture, video and television programme production, sound recording and music publishing activities – J5960
- Division 60: programming and broadcasting activities – J5960
- Division 61: telecommunications – J61
- Division 62: computer programming, consultancy and related activities – J6263
- Division 63: information service activities – J6263
- Section L: real estate activities – L
- Division 69: legal and accounting activities – M69
- Division 70: activities of head offices; management consultancy activities – M70
- Division 71: architectural and engineering activities; technical testing and analysis – M71
- Division 72: scientific research and development – M72
- Division 73: advertising and market research – M73
- Division 74: other professional, scientific and technical activities – M74
- Division 75: veterinary activities – M75
- Division 77: rental and leasing activities – N77
- Division 78: employment activities – N78
- Division 79: travel agency, tour operator and other reservation service and related activities – N79
- Division 80: security and investigation activities – N80
- Division 81: services to buildings and landscape activities – N81
- Division 82: office administrative, office support and other business support activities – N82
- Section P: education – P
- Division 86: human health activities – Q86
- Division 87: residential care activities – Q87
- Division 88: social work activities without accommodation – Q88
- Division 90: creative, arts and entertainment activities – R9092
- Division 91: libraries, archives, museums and other cultural activities – R9092
- Division 92: gambling and betting activities – R9092
- Division 93: sports activities and amusement and recreation activities – R93
- Division 94: activities of membership organisations – S94
- Division 95: repair of computers and personal and household goods – S9596
- Division 96: other personal service activities – S9596
9. Cite this methodology
Office for National Statistics (ONS), released 6 September 2024, ONS website, quality and methodology information report, Management practices in the UK QMI