All American Community Survey (ACS) estimates are currently released with their associated margin of error (MOE). However, many users do not think that the MOE provides enough information about an estimate's reliability. We conducted usability testing to examine whether adding reliability indicators to new prototypes of ACS data tables helped people use the tables with greater accuracy, efficiency, and satisfaction. All three prototype reliability indicators were based on the coefficient of variation (CV), defined as the standard error of an estimate divided by the estimate itself.
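As a concrete illustration of the definition above, the sketch below derives a CV from a published ACS estimate and its MOE. The division by 1.645 reflects the 90-percent confidence level at which ACS MOEs are published; the specific estimate and MOE values are hypothetical.

```python
# Hypothetical example: deriving a coefficient of variation (CV)
# from an ACS estimate and its published margin of error (MOE).
# ACS MOEs are published at the 90% confidence level, so the
# standard error is MOE / 1.645.

def coefficient_of_variation(estimate: float, moe: float) -> float:
    """Return CV = standard error / estimate."""
    standard_error = moe / 1.645
    return standard_error / estimate

# Hypothetical values: an estimate of 2,400 with an MOE of +/- 1,300.
cv = coefficient_of_variation(2400, 1300)
print(f"CV = {cv:.2f}")  # CV = 0.33
```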
Each prototype table with a reliability indicator tested in this study added a color-coded "Reliability" column, along with a legend explaining the color codes, to give users guidance on whether the ratio of error to the estimate itself might be considered unacceptably high.
Each prototype table was defined by the number of "levels" that its Data Reliability Indicator used to label the estimates:

- Two-level indicator: blank (CV <= 0.30) and "use caution" (yellow; CV > 0.30).
- Three-level indicator: "good" (green; CV <= 0.30), "fair" (yellow; 0.30 < CV <= 0.61), and "poor" (red; CV > 0.61).
- Four-level indicator: "excellent" (green; CV <= 0.10), "good" (yellow; 0.10 < CV <= 0.30), "fair" (orange; 0.30 < CV <= 0.61), and "poor" (red; CV > 0.61).

Nine participants completed usability tasks using the prototypes, and three participants completed the tasks using baseline versions of the current ACS data tables without the indicators. Full versions of the prototype tables can be found in Appendix A.
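A minimal sketch of how the three indicator schemes map a CV onto their labels, following the thresholds listed above. The function names and label strings are illustrative; the study's actual implementation is not shown here.

```python
# Sketch of the CV-to-label mapping for each prototype indicator,
# using the thresholds described above. Names are illustrative.

def two_level(cv: float) -> str:
    # Blank when CV <= 0.30; flagged otherwise.
    return "use caution (yellow)" if cv > 0.30 else ""

def three_level(cv: float) -> str:
    if cv <= 0.30:
        return "good (green)"
    if cv <= 0.61:
        return "fair (yellow)"
    return "poor (red)"

def four_level(cv: float) -> str:
    if cv <= 0.10:
        return "excellent (green)"
    if cv <= 0.30:
        return "good (yellow)"
    if cv <= 0.61:
        return "fair (orange)"
    return "poor (red)"

# Compare how the same CV is labeled under each scheme.
for cv in (0.05, 0.25, 0.45, 0.70):
    print(cv, two_level(cv), three_level(cv), four_level(cv))
```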
The results showed that the four-level indicator was associated with the highest accuracy and satisfaction scores; eight of the twelve participants preferred the four-level indicator overall. Participants completed the tasks more efficiently with the prototypes than with the baseline tables, and they expressed a strong preference for all of the prototypes over the baseline tables.