
Management and Organizational Practices Survey (MOPS)


Methodological Documentation for the 2010 Management and Organizational Practices Survey

The Census Bureau’s 2010 Management and Organizational Practices Survey (MOPS) is a supplement to the 2010 Annual Survey of Manufactures (ASM). For information on the ASM, see www.census.gov/manufacturing/asm/index.html. The MOPS was developed jointly with a team of academic experts and was partially funded by the National Science Foundation. Conducting the MOPS as a supplement to the ASM maximizes the analytic utility of the data and minimizes the additional respondent burden required to achieve the measurement goals of the survey. These goals were to 1) describe the prevalence and use of structured management practices in U.S. industry and 2) permit analyses of the relationship between these practices and key economic outcomes, such as productivity and employment growth. Like previous supplements to the ASM (e.g., the 1999 Computer Network Use Supplement), the MOPS enhances the information content of the base ASM. The ASM collects detailed information on many inputs used in manufacturing production, such as labor, capital, energy, and materials, as well as the outputs from this production. The MOPS provides information about other important components in these production processes (the management and organizational practices) and thus enhances our understanding of business dynamics.

Below we summarize various methodological aspects of the 2010 MOPS, including the development of the survey questionnaire, sample design, survey response, and derivation of establishment management scores and published estimates.

1 Survey Questionnaire

The original survey questionnaire design was based in part on an international survey tool used by the World Bank, as discussed in Bloom, Schweiger and Van Reenen (2012). The survey tool was adapted to the United States through several months of development and testing by the Census Bureau. The 2010 MOPS questionnaire comprised 36 questions (30 of which were multiple-choice) about the establishment and took about 20 to 30 minutes to complete. The questions were split into three sections: management practices (16 questions), organization (13 questions), and background characteristics (7 questions). For each question, respondents were asked to report their response for 2010, as well as a response based on recall for 2005. The survey questionnaire is available at bhs.econ.census.gov/bhs/mops/form.html.

1.1 Management Practices

The management practices section covered three main areas: monitoring, targets, and incentives, based on Bloom and Van Reenen (2007), which itself drew in part on the principles of continuous monitoring, evaluation, and improvement from Lean manufacturing (e.g., Womack, Jones and Roos 1991). The monitoring questions asked establishments about their collection and use of information to monitor and improve the production process, such as how frequently performance indicators were tracked at the establishment, with options ranging from “never” to “hourly or more frequently”. The targets questions asked about the design, integration, and achievability of production targets, such as the time frame of production targets, with options ranging from “no production targets” to “combination of short-term and long-term production targets”. Finally, the incentives questions asked about non-managerial and managerial bonuses, promotion, and reassignment/dismissal practices, such as how managers were promoted at the establishment, with answers ranging from “mainly on factors other than performance and ability (for example, tenure or family connections)” to “solely on performance and ability”.

1.2 Organization

The organization section covered questions on the decentralization of decision-making power from the headquarters to the plant manager, based on Bloom, Sadun and Van Reenen (2012) and Bresnahan, Brynjolfsson and Hitt (2002). These questions asked, for example, where decisions were made on pay increases, with options ranging from “only at headquarters” to “only at this establishment”. A second set of questions asked about the plant manager's span of control and reporting levels, based on Bloom, Garicano, Sadun and Van Reenen (2011), such as how many employees reported directly to the plant manager. A final set of questions asked about data use in decision making, based on Brynjolfsson, Hitt and Kim (2011), such as the extent to which data were used in decision making at the establishment, with response options ranging from “decision making does not use data” to “decision making relies entirely on data”. In addition, one question asked how managers learned about management practices, with answers covering a variety of sources (“Consultants”, “Competitors”, etc.).

1.3 Background Characteristics

This section asked a range of questions about the number of managers and non-managers at the establishment, the share of each group that held a bachelor's degree, the share of employees in a union, and the seniority and tenure of the respondent.

2 Sample Design and Survey Response

The sample for the 2010 MOPS consisted of the approximately 50,000 establishments in the 2010 Annual Survey of Manufactures (ASM) mailout sample. The mailout sample for the ASM is redesigned at 5-year intervals, beginning with the second survey year after each Economic Census. For the 2009 survey year, a new probability sample was selected from a frame of approximately 117,000 manufacturing establishments of multi-location companies and large single-establishment companies in the 2007 Economic Census, which surveys establishments with paid employees located in the United States. Using the Census Bureau’s Business Register, the mailout sample was supplemented annually with new establishments that had paid employees, were located in the United States, and entered business between 2008 and 2010. For more information on the ASM sample design, see www.census.gov/manufacturing/asm/how_the_data_are_collected/index.html.

To reduce cost and response burden, small- and medium-sized single-establishment companies identified in the Manufacturing component of the 2007 Economic Census were not mailed ASM questionnaires. For the 2010 ASM, annual payroll data for these approximately 144,000 companies were estimated based on administrative information from the Internal Revenue Service and the Social Security Administration, and other data items were estimated using industry averages. To produce estimates from the 2010 ASM, the estimated data for these small- and medium-sized single-establishment companies were combined with the estimates from the 2010 ASM mailout sample. Because the sample for the 2010 MOPS consisted only of the establishments in the 2010 ASM mailout sample, these small- and medium-sized single-establishment companies were not represented in the 2010 MOPS estimates.

The 2010 MOPS questionnaire was sent by mail and electronically to the ASM mailout sample establishments, typically to the accounting, plant, or human-resource manager. Most respondents (58.4%) completed the survey electronically; the remainder (41.6%) completed it on paper. Establishments that had not responded within three months received up to three follow-up telephone calls. After the follow-up calls were made, no attempt was made to impute data for item or unit nonresponse.

Of the approximately 50,000 establishments that were sent a MOPS survey questionnaire, over 30,000 establishments responded with information that could be used for subsequent analysis. To be included in the calculation of the estimates in the Census Bureau press release, a given establishment record must have met specified tabulation criteria: it must have had at least 11 non-missing responses to the 16 MOPS management questions; matched successfully to the ASM database and been included in the 2010 ASM tabulations; matched successfully to the Longitudinal Business Database; and had positive value added, positive employment, and positive imputed capital stock for 2010. The last criterion, based on these three 2010 variables, was waived when estimates for both 2005 and 2010 were calculated and compared in the press release; a sketch of these filters appears below. For information on the Longitudinal Business Database, see https://www.census.gov/ces/dataproducts/datasets/lbd.html.
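As a rough illustration, these tabulation filters could be expressed as follows. This is a minimal sketch, not the Census Bureau's actual processing code; the column names (n_mgmt_responses, in_asm_tab, and so on) are hypothetical.

```python
import pandas as pd

def tabulation_eligible(df: pd.DataFrame, comparing_2005_and_2010: bool = False) -> pd.Series:
    """Boolean mask of establishment records meeting the tabulation criteria.

    Column names are illustrative placeholders, not actual Census variables.
    """
    eligible = (
        (df["n_mgmt_responses"] >= 11)  # at least 11 of the 16 management questions answered
        & df["in_asm_tab"]              # matched to the ASM database and in the 2010 ASM tabulations
        & df["in_lbd"]                  # matched to the Longitudinal Business Database
    )
    if not comparing_2005_and_2010:
        # The 2010 positivity screen is waived when 2005 and 2010
        # estimates are compared in the press release.
        eligible &= (
            (df["value_added_2010"] > 0)
            & (df["employment_2010"] > 0)
            & (df["capital_stock_2010"] > 0)
        )
    return eligible
```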

3 Derivation of Establishment Management Scores and Published Estimates

The management score for each establishment was derived in two steps. First, the response to each of the 16 management questions was normalized on a 0-1 scale. The response associated with the most structured management practice was normalized to 1, and the response associated with the least structured practice was normalized to 0. We define more structured management practices as those that are more specific, proactive, frequent, and explicit. For example, when asking “...when was an under-performing non-manager reassigned or dismissed?”, the response “Within 6 months of identifying non-manager under-performance” was ranked 1 and the response “Rarely or never” was ranked 0. As another example, when asking “…what best describes what happened at this establishment when a problem in the production process arose?”, the response “We fixed it and took action to make sure that it did not happen again, and had a continuous improvement process to anticipate problems like these in advance” was ranked 1 and the response “No action was taken” was ranked 0.

If a question had three categories, the “in between” category was assigned the value 0.5. Similarly, for four categories, the “in between” categories were assigned 1/3 and 2/3, and so on. In general, for a question with k ordered response categories, the i-th category (counting from 0, least to most structured) was assigned the value i/(k-1). For multiple-choice questions that allowed the selection of more than one answer per year, we used the average of the normalized answers as the score for that question. If a question did not allow the selection of more than one answer but more than one box was selected, we treated the observation as missing. Second, the management score for a given establishment was calculated as the unweighted average of the normalized responses to the 16 management questions; a sketch of this scoring appears below. In robustness tests, we also evaluated another way to average across the 16 individual scores: a management z-score, which normalized each question to have a mean of 0 and a standard deviation of 1 and then averaged across these z-scores. All our results were extremely similar because the z-scores were highly correlated with our management scores.
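The following sketch illustrates the two-step scoring under the assumption that each answer is coded as an ordered category index; it is an illustration of the method described above, not the Census Bureau's code.

```python
def normalize_response(selected, n_categories):
    """Step 1: map a question's answer(s) onto the 0-1 scale.

    Categories are indexed 0 (least structured) to n_categories - 1 (most
    structured), so category i maps to i / (n_categories - 1). Questions
    that allow multiple selections are scored as the average of the
    selected categories' normalized values.
    """
    if not selected:
        return None  # treat as missing
    return sum(i / (n_categories - 1) for i in selected) / len(selected)

def management_score(question_scores):
    """Step 2: unweighted average of the non-missing normalized scores."""
    valid = [s for s in question_scores if s is not None]
    return sum(valid) / len(valid)

def z_scores(values):
    """Alternative aggregation from the robustness tests: standardize a
    question's scores across establishments to mean 0 and sd 1."""
    mean = sum(values) / len(values)
    sd = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return [(v - mean) / sd for v in values]

# Example: a 4-category question answered with its third category scores
# 2/3, and the middle category of a 3-category question scores 0.5.
print(management_score([normalize_response([2], 4), normalize_response([1], 3)]))  # ~0.583
```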

Using establishment records that met the tabulation criteria described in the last section, the estimates published in the Census Bureau press release were based on weighted data from survey respondents, in which the ASM sample weights were used in weighted averages of establishment management scores. Both the numerator and denominator of a given weighted average were based on accumulations of weighted data. When calculating estimates displayed in a frequency histogram, the frequency weights were the ASM sample weights multiplied by 100, because the ASM weights were the inverse of the selection probabilities and were stored as real numbers greater than or equal to 1, instead of integers.
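A minimal sketch of these calculations follows, assuming `scores` holds establishment management scores and `weights` the ASM sample weights (inverse selection probabilities, real numbers greater than or equal to 1); rounding the histogram weights to integers is our assumption, since the source states only that the weights were multiplied by 100.

```python
def weighted_average_score(scores, weights):
    """Weighted average of management scores: both the numerator and the
    denominator are accumulations of weighted data."""
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)

def histogram_frequency_weights(weights):
    """Frequency weights for the histogram: ASM sample weights times 100,
    rounded here to obtain whole-number frequencies (our assumption)."""
    return [round(w * 100) for w in weights]

# Example with two establishments.
print(weighted_average_score([0.6, 0.4], [1.5, 3.0]))  # ~0.467
print(histogram_frequency_weights([1.5, 3.0]))         # [150, 300]
```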

3.1 Sampling Error

The estimates developed from the sample are likely to differ from the results of a complete canvassing of all eligible establishments in the population. The particular sample selected for the 2010 ASM mailout and used for the 2010 MOPS is one of many probability samples that could have been selected under identical circumstances. Each of the possible samples would yield a different set of results. The derived standard errors are measures of the variation of all the possible sample estimates around the result from a complete enumeration. Estimates with low standard errors are typically more accurate than those associated with high standard errors.

For the published statistics in the Census Bureau press release, estimates of the standard errors are computed from the sample data. They are represented in the form of 90% margins of error, each of which is the standard error multiplied by 1.645. By adding and subtracting the 90% margin of error from its associated estimate, the margin of error may be used to define a 90% confidence interval, or range that would include the result from a complete enumeration for 90% of all the possible samples. For example, suppose an average management score is shown as 0.52 with an associated margin of error of 0.02. Then, the 90% confidence interval for estimating the average management score based on a complete enumeration is 0.50 to 0.54.
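The worked example above can be reproduced directly; the functions below simply restate the arithmetic from the text.

```python
def margin_of_error_90(standard_error):
    """90% margin of error: the standard error multiplied by 1.645."""
    return 1.645 * standard_error

def confidence_interval_90(estimate, margin_of_error):
    """Range that would include the complete-enumeration result for 90%
    of all possible samples."""
    return (estimate - margin_of_error, estimate + margin_of_error)

# The example from the text: an average management score of 0.52 with an
# associated margin of error of 0.02.
print(confidence_interval_90(0.52, 0.02))  # approximately (0.50, 0.54)
```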

3.2 Non-sampling Error

In addition to the sampling errors, the estimates are subject to various response and operational errors: errors of collection, reporting, coding, transcription, non-response, etc. These non-sampling, or operational, errors also would occur if a complete canvass were to be conducted under the same conditions as the survey. Explicit measures of their effects generally are not available. However, it is believed that most of the important operational errors were detected and corrected during the review of the data for reasonableness and consistency.

3.3 Disclosure Avoidance Procedures

Title 13 of the United States Code authorizes the Census Bureau to conduct censuses and surveys. Section 9 of the same Title requires that any information collected from the public under the authority of Title 13 be maintained as confidential. Section 214 of Title 13 and Sections 3559 and 3571 of Title 18 of the United States Code provide for the imposition of penalties of up to five years in prison and up to $250,000 in fines for wrongful disclosure of confidential census information. In accordance with Title 13, no estimates are published that would disclose the operations of an individual company.

The Census Bureau's internal Disclosure Review Board sets the confidentiality rules for all data releases. A checklist approach is used to ensure that all potential risks to the confidentiality of the data are considered and addressed.


Bibliography

Bloom, N., Garicano, L., Sadun, R. and Van Reenen, J. (2011), “The Distinct Effects of Information Technology and Communication Technology on Firm Organization”, Centre for Economic Performance Discussion Paper No. 927.

Bloom, N., Sadun, R. and Van Reenen, J. (2012), “The organization of firms across countries”, Quarterly Journal of Economics, forthcoming.

Bloom, N., Schweiger, H. and Van Reenen, J. (2012), “The land that lean manufacturing forgot? Management practices in transition countries”, Economics of Transition 20(4), 569-785.

Bloom, N. and Van Reenen, J. (2007), “Measuring and explaining management practices across firms and countries”, Quarterly Journal of Economics 122(4), 1351-1408.

Bresnahan, T., Brynjolfsson, E. and Hitt, L. (2002), “Information technology, workplace organization and the demand for skilled labor: firm-level evidence”, Quarterly Journal of Economics 117(2), 339-376.

Brynjolfsson, E., Hitt, L. and Kim, H. (2011), “Strength in Numbers: How Does Data-Driven Decision-making Affect Firm Performance?”, MIT mimeo, April 2011.

Womack, J., Jones, D. and Roos, D. (1991), The Machine That Changed the World, New York: Simon and Schuster.

 
