During the past ten years, the use of questionnaire pretesting prior to implementation has increased in an effort to improve data quality. Various questionnaire evaluation techniques have been evaluated, and their associated strengths and weaknesses have been identified (DeMaio et al., 1993; Esposito et al., 1992; Oksenberg et al., 1991; Presser and Blair, 1994). Some limited research has examined the effectiveness of cognitive interviews in actually reducing questionnaire problems (Willis, 1996; Lessler et al., 1989). The objective of our research is to determine how well various question pretesting methods predict the types of problems that will actually be experienced in the field, and to what extent laboratory testing contributes to improved questions. In this research, multiple researchers at three research organizations conducted expert reviews, cognitive appraisals, and cognitive interviews on three survey instruments. A classification scheme was developed to code the problems identified through all three methods. The questions identified as most problematic were revised, and both the original and the revised questions were tested in an omnibus CATI/RDD survey conducted by the U.S. Census Bureau. The field results are being evaluated using independent outcome quality measures. Comparing the pretesting results with the field study results will determine how well the various pretesting methods identified the types of problems that surfaced during field testing. Comparing the independent outcome quality measures from the field study for the original and revised question wordings will tell us whether the revised wording actually improved data quality.