
Statistical Quality Standard A2: Developing Data Collection Instruments and Supporting Materials

Purpose: The purpose of this standard is to ensure that data collection instruments and supporting materials are designed to promote the collection of high-quality data from respondents.

Scope: The Census Bureau’s statistical quality standards apply to all information products released by the Census Bureau and the activities that generate those products, including products released to the public, sponsors, joint partners, or other customers. All Census Bureau employees and Special Sworn Status individuals must comply with these standards; this includes contractors and other individuals who receive Census Bureau funding to develop and release Census Bureau information products.

In particular, this standard applies to the development or redesign of data collection instruments and supporting materials. The types of data collection instruments and supporting materials covered by this standard include:

  • Paper and electronic instruments (e.g., CATI, CAPI, Web, and touch tone data entry).
  • Self-administered and interviewer-administered instruments.
  • Instruments administered by telephone or in person.
  • Respondent letters, aids, and instructions.
  • Mapping and listing instruments used for operations, such as address canvassing, group quarters frame development, and the Local Update of Census Addresses (LUCA).

In addition to the global exclusions listed in the Preface, this standard does not apply to:

  • Data collection instruments and supporting materials where the Census Bureau does not have control over the content or format, such as the paper and electronic instruments used for collecting import and export merchandise trade data.

Key Terms: Behavior coding, CAPI, CATI, cognitive interviews, data collection instrument, field test, focus group, graphical user interface (GUI), imputation, integration testing, methodological expert review, nonresponse, pretesting, questionnaire, record linkage, respondent burden, respondent debriefing, split panel test, and usability testing.

Requirement A2-1: Throughout all processes associated with data collection, unauthorized release of protected information or administratively restricted information must be prevented by following federal laws (e.g., Title 13, Title 15, and Title 26), Census Bureau policies (e.g., Data Stewardship Policies), and additional provisions governing the use of the data (e.g., as may be specified in a memorandum of understanding or data-use agreement). (See Statistical Quality Standard S1, Protecting Confidentiality.)

Requirement A2-2: A plan must be produced that addresses:

  1. Program requirements for the data collection instrument and the graphical user interface (GUI), if applicable (e.g., data collection mode, content, constraints, and legal requirements).
  2. Supporting materials needed for the data collection (e.g., brochures, flashcards, and advance letters).
  3. Pretesting of the data collection instrument and supporting materials.
  4. Verification and testing to ensure the proper functioning of the data collection instrument and supporting materials.

Notes:

    1. Statistical Quality Standard A1, Planning a Data Program, addresses overall planning requirements, including the development of schedules and costs.
    2. See the Guidelines for Designing Questionnaires for Administration in Different Modes and the Economic Directorate Guidelines on Questionnaire Design for guidance on designing data collection instruments.
    3. Data Stewardship Policy DS016, Respondent Identification Policy, contains policy requirements for data collection operations in which household respondents provide information.

Requirement A2-3: Data collection instruments and supporting materials must be developed and tested in a manner that balances (within the constraints of budget, resources, and time) data quality and respondent burden.

Sub-Requirement A2-3.1: Specifications for data collection instruments and supporting materials, based on program requirements, must be developed and implemented.

Examples of topics that specifications might address include:

  • Requirements for programming the instrument to work efficiently. For example:
    • Built-in edits or range checks for electronic data collection instruments (e.g., edits for numeric data that must be within a pre-specified range).
    • Compliance with the CATI/CAPI Screen Standards for GUI (Windows-based) Instruments and Function Key Standards for GUI Instruments. (See the Technologies Management Office’s Authoring Standards Blaise Standards for Windows Surveys).
    • Input and output files for data collection instruments.
  • Segmented boxes for paper data collection instruments to facilitate scanning.
  • Paper size, color, thickness, and formatting to ensure compatibility with data capture and processing systems for paper data collection instruments.
  • Frequently Asked Questions about the data collection.
  • Supporting materials, such as Help materials and instructions.

Note: The Census Bureau Guideline Presentation of Data Edits to Respondents in Electronic Self-Administered Surveys presents recommendations for designing editing functionality, presentation, and wording in both demographic and economic self-administered electronic surveys.
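The built-in edits and range checks described above can be sketched as follows. This is a minimal illustration in Python, not Blaise or any Census Bureau system; all item names, ranges, and messages are hypothetical, and production instruments would implement such edits in their own authoring environment.

```python
# Minimal sketch of a built-in range-check edit for an electronic
# data collection instrument. All names and ranges are hypothetical.

def check_range(item_name, value, low, high):
    """Return None if the response passes the edit, otherwise an
    edit message that could be presented back to the respondent."""
    try:
        number = float(value)
    except (TypeError, ValueError):
        return f"{item_name}: please enter a number."
    if not (low <= number <= high):
        return (f"{item_name}: the value {value} is outside the "
                f"expected range {low}-{high}. Please verify.")
    return None  # response passes the edit

# Example: hours worked last week must fall within a pre-specified
# range of 0 to 99 (a hypothetical edit specification).
print(check_range("HoursWorked", "168", 0, 99))  # fails the edit
print(check_range("HoursWorked", "40", 0, 99))   # passes (None)
```

Wording the edit message as a request to verify, rather than an error, reflects the recommendation above that edits be presented to respondents in a way that invites correction.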

Sub-Requirement A2-3.2: Data collection instruments and supporting materials must clearly state the following required notifications to respondents:

  1. The reasons for collecting the information.
  2. A statement on how the data will be used.
  3. An indication of whether responses are mandatory (citing authority) or voluntary.
  4. A statement on the nature and extent of confidentiality protection to be provided, citing authority.
  5. An estimate of the average respondent burden associated with providing the information.
  6. A statement requesting that the public direct comments concerning the burden estimate and suggestions for reducing this burden to the appropriate Census Bureau contact.
  7. The OMB control number and expiration date for the data collection.
  8. A statement that the Census Bureau may not conduct, and a person is not required to respond to, a data collection request unless it displays a currently valid OMB control number.

Sub-Requirement A2-3.3: Data collection instruments and supporting materials must be pretested with respondents to identify problems (e.g., problems related to content, order/context effects, skip instructions, formatting, navigation, and edits) and then refined, prior to implementation, based on the pretesting results.

Note: On rare occasions, cost or schedule constraints may make it infeasible to perform complete pretesting. In such cases, subject matter and cognitive experts must discuss the need for and feasibility of pretesting. The program manager must document any decisions regarding such pretesting, including the reasons for the decision. If no acceptable options for pretesting can be identified, the program manager must apply for a waiver. (See the Waiver Procedure for the procedures on obtaining a waiver.)

  1. Pretesting must be performed when:
    1. A new data collection instrument is developed.
    2. Questions are revised because the data are shown to be of poor quality (e.g., unit or item response rates are unacceptably low, measures of reliability or validity are unacceptably low, or benchmarking reveals unacceptable differences from accepted estimates of similar characteristics).
    3. Review by cognitive experts reveals that adding pretested questions to an existing instrument may cause potential context effects.
    4. An existing data collection instrument has substantive modifications (e.g., existing questions are revised or new questions added).

    Note: Pretesting is not required for questions that performed adequately in another survey.

  2. Pretesting must involve respondents or data providers who are in scope for the data collection. It must verify that the questions:
    1. Can be understood and answered by potential respondents.
    2. Can be administered properly by interviewers (if interviewer-administered).
    3. Are not unduly sensitive and do not cause undue burden.
    Examples of issues to verify during pretesting:
    • The sequence of questions and skip patterns is logical and easy to follow.
    • The wording is concise, clear, and unambiguous.
    • Fonts (style and size), colors, and other visual design elements promote readability and comprehension.

  3. One or more of the following pretesting methods must be used:
    1. Cognitive interviews.
    2. Focus groups, but only if the focus group completes a self-administered instrument and discusses it afterwards.
    3. Usability techniques, but only if they are focused on the respondent’s understanding of the questionnaire.
    4. Behavior coding of respondent/interviewer interactions.
    5. Respondent debriefings in conjunction with a field test or actual data collection.
    6. Split panel tests.

    Notes:
    1. Methodological expert reviews generally do not satisfy this pretesting requirement. However, if a program is under extreme budget, resource, or time constraints, the program manager may request cognitive experts in the Center for Statistical Research and Methodology or on the Response Improvement Research Staff to conduct such a review. The results of this expert review must be documented in a written report. If the cognitive experts do not agree that an expert review would satisfy this requirement, the program manager must apply for a waiver.
    2. Multiple pretesting methods should be used as budget, resources, and time permit to provide a thorough evaluation of the data collection instrument and to document that the data collection instrument “works” as expected. In addition, other techniques used in combination with the pretesting methods listed above may be useful in developing data collection instruments. (See Appendix A2, Questionnaire Testing and Evaluation Methods for Censuses and Surveys, for descriptions of the various pretesting methods available.)
  4. When surveys or censuses are administered using multiple modes and meaningful changes to questions are made to accommodate the mode differences, all versions must be pretested.

    Meaningful changes to questions to accommodate mode differences include changes to the presentation of the question or response format to reflect mode-specific functional constraints or advantages. In these cases, the proposed wording of each version must be pretested to ensure consistent interpretation of the intent of the question across modes, despite structural format or presentation differences. As long as the proposed wording of each version is pretested, testing of the mode (e.g., paper versus electronic) is not required, although it may be advisable.

  5. Data collection instruments in languages other than English must be pretested in the languages that will be used to collect data during production. Pretesting supporting materials in these languages is not required, but is recommended.

    Note: The Census Bureau Guideline Language Translation of Data Collection Instruments and Supporting Materials provides guidance on translating data collection instruments and supporting materials from English to another language.

Sub-Requirement A2-3.4: Data collection instruments and supporting materials must be verified and tested to ensure that they function as intended.

Examples of verification and testing activities include:

  • Verifying that the data collection instrument’s specifications and supporting materials reflect the sponsor’s requirements (e.g., conducting walk-throughs to verify the appropriateness of specifications).
  • Verifying that the data collection instrument and supporting materials meet all specifications (e.g., verifying correctness of skip patterns, wording, instrument fills, and instrument edits).
  • Conducting integration testing using mock input files with realistic scenarios to test all parts of the data collection instrument together (e.g., front, middle, and back modules).
  • Conducting usability testing to discover and eliminate barriers that keep respondents from completing the data collection instrument accurately and efficiently.
  • Conducting output tests to compare the output of the data collection instrument under development with that of its predecessor (if the data collection has been done with a similar instrument in the past).
  • Verifying that user interfaces work according to specifications.
  • Verifying that user interfaces for electronic forms adhere to IT Standard 15.0.2, Web Development Requirements and Guidelines, and any other guidance applicable to the program.
  • Verifying that Web-based data collection instruments comply with requirements of Section 508 of the U.S. Rehabilitation Act.
  • Verifying that paper data collection instruments are compatible with the program’s data capture and processing systems.

    Note: The Census Bureau Guideline Computer Assisted Personal Interviewing reflects recommended practices for ensuring the quality of CAPI.
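The integration and output tests listed above can be illustrated with a small sketch: run mock input cases through a new instrument and its predecessor, and report any differences in the output records. This is a hypothetical Python illustration only; the instrument functions, record layout, and the top-coding change are invented for the example, and real instrument testing would use the program's actual input and output file formats.

```python
# Hypothetical sketch of an output test: compare the output of a data
# collection instrument under development with that of its predecessor
# using mock input cases. All functions and fields are illustrative.

def run_output_test(cases, new_instrument, old_instrument):
    """Return a list of (input case, old output, new output) for every
    case where the two instruments' output records differ."""
    differences = []
    for case in cases:
        new_out = new_instrument(case)
        old_out = old_instrument(case)
        if new_out != old_out:
            differences.append((case, old_out, new_out))
    return differences

# Mock instruments: each maps an input record to an output record.
def old_instrument(case):
    return {"age": case["age"], "employed": case["employed"]}

def new_instrument(case):
    # An intentional change in the new version: age is top-coded at 99.
    return {"age": min(case["age"], 99), "employed": case["employed"]}

# Mock input file with realistic scenarios, including an edge case.
mock_cases = [{"age": 34, "employed": "yes"},
              {"age": 104, "employed": "no"}]

diffs = run_output_test(mock_cases, new_instrument, old_instrument)
for case, old_out, new_out in diffs:
    print(f"input={case} old={old_out} new={new_out}")
```

In practice each reported difference would be reviewed to confirm it reflects an intended specification change (as the top-coding does here) rather than a defect in the new instrument.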

Requirement A2-4: Documentation needed to replicate and evaluate the development of data collection instruments and supporting materials must be produced. The documentation must be retained, consistent with applicable policies and data-use agreements, and must be made available to Census Bureau employees who need it to carry out their work. (See Statistical Quality Standard S2, Managing Data and Documents.)

Examples of documentation include:

  • Plans for the development and testing of the data collection instrument and supporting materials.
  • Specifications for the data collection instruments and supporting materials.
  • Results of questionnaire development research (e.g., pretesting results, expert review reports, and site visit reports).
  • Input files used to test the final production instrument and reports of testing results.
  • Computer source code for the production data collection instrument along with information on the version of software used to develop the instrument.
  • Quality measures and evaluation results. (See Statistical Quality Standard D3, Producing Measures and Indicators of Nonsampling Error.)

Notes:

  1. The documentation must be released on request to external users, unless the information is subject to legal protections or administrative restrictions that would preclude its release. (See Data Stewardship Policy DS007, Information Security Management Program.)
  2. Statistical Quality Standard F2, Providing Documentation to Support Transparency in Information Products, contains specific requirements about documentation that must be readily accessible to the public to ensure transparency of information products released by the Census Bureau.

