Despite some initial work (e.g., Presser and Blair; Cannell et al.), few objective guidelines currently exist to suggest how many cases are needed when using behavior coding as a pretest technique. In practice, time and budget constraints often combine to reduce the number of cases that are ultimately behavior coded. This paper attempts to establish a practical number of cases that maximizes the knowledge gained from behavior coding while holding dollar and time costs to a minimum. It also examines these issues for questionnaire items categorized by expected level of respondent and interviewer difficulty. The results may be useful to survey organizations wishing to conduct efficient behavior coding pretests that identify questionnaire design problems before a survey is fielded on a large scale. The paper first describes the impetus for the research and briefly reviews the relevant literature. Second, we discuss our methods and analytic procedures. Next, we present our findings, which suggest that a relatively small number of cases may typically provide questionnaire designers with the same information - leading to identical conclusions - that they would have obtained with a much larger sample at much greater cost and effort. Finally, we discuss the implications of these findings and suggest next steps.