Citizenship Status (Person Question 8)
The primary objectives of the 2006 ACS Content Test work on the U.S. citizenship status question were to correct inaccuracies in the current questionnaire item and to meet the requirements of congressionally funded initiatives. Collecting year of naturalization aids in verifying the accuracy of reported naturalization rates, which some research had shown to be overstated by the ACS citizenship status question prior to the 2006 ACS Content Test. The updated question also serves as the first official benchmark for comparison with Office of Immigration Statistics (OIS) administrative records on naturalization. The additional detail also assists in editing U.S. citizenship status. The control version replicated the 2006 ACS question. The test version modified the 2006 question by including a write-in field for naturalized, foreign-born respondents to provide their year of naturalization. In addition, the test version replaced the response option "Yes, born abroad of American parent or parents" with "Yes, born abroad of U.S. citizen parent or parents."
The 2006 ACS Content Test findings showed no significant differences in overall question item nonresponse rates between the control and test versions. Approximately 10 percent of respondents who indicated being a naturalized U.S. citizen did not report a year of naturalization. The overall distribution of U.S. citizenship status was not significantly different between the control and test panels.
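For readers unfamiliar with the metric, an item nonresponse rate is simply the share of eligible respondents who left a questionnaire item blank. A minimal sketch in Python; the function name and the sample data below are invented for illustration:

```python
def item_nonresponse_rate(responses):
    """Share of eligible respondents whose answer is missing (None)."""
    if not responses:
        return 0.0
    missing = sum(1 for answer in responses if answer is None)
    return missing / len(responses)

# Ten hypothetical naturalized citizens asked for their year of
# naturalization; one leaves the item blank, giving a 10 percent
# item nonresponse rate.
years = [1998, 2003, None, 1975, 1989, 2001, 1964, 1992, 2005, 1980]
print(item_nonresponse_rate(years))  # → 0.1
```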
School Enrollment and Type of School (Person Questions 10a-10b)
The primary objectives of the 2006 ACS Content Test work on school enrollment were to improve the data collected and to provide information on school enrollment of the U.S. population ages 3 and older. The 2006 ACS Content Test compared two versions of the school enrollment question. The control version replicated the 2006 ACS question and response categories. The test version modified the school enrollment question by substituting "school" for "regular school" and adding "home school" in the instructions and response categories. The test version also modified the response categories by adding the qualifier "beyond a bachelor's degree" to the "graduate or professional school" category and including a write-in field for enrollment in grades 1 through 12.
The 2006 ACS Content Test findings showed no significant differences between the control and test versions in the distributions of grade of enrollment, private school or home school enrollment, or enrollment of vocational, technical, or business students. The test version did not significantly increase item nonresponse rates for the school-age population. However, the test version showed a higher item nonresponse rate for grade level among the population age 3 and above, and a lower age-grade consistency for the grades 1-4 category. Despite these differences, both versions had low nonresponse rates overall and high age-grade consistency for the grade school categories. Because the test version performed as well as the original question in almost all respects and provided additional information by partitioning level of enrollment into single grades, the test version was adopted.
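Age-grade consistency checks whether a reported grade of enrollment is plausible given a person's age. A hedged sketch of one such check; the "grade plus five" rule and the one-year slack below are illustrative assumptions, not the official ACS edit rules:

```python
def age_grade_consistent(age, grade, slack=1):
    """Grade g is typically entered around age g + 5; allow +/- slack years.

    Both the g + 5 rule and the default one-year slack are illustrative
    assumptions for this sketch, not the Census Bureau's actual edit rules.
    """
    typical_age = grade + 5
    return abs(age - typical_age) <= slack

print(age_grade_consistent(age=7, grade=2))   # → True  (typical age for grade 2)
print(age_grade_consistent(age=12, grade=2))  # → False (implausibly old for grade 2)
```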
Educational Attainment (Person Question 11)
The primary objectives of the 2006 ACS Content Test work on educational attainment were to improve the clarity of the question for respondents and to improve detail on grades of schooling and secondary credentials. The 2006 ACS Content Test compared two versions of the educational attainment question. The control version replicated the 2006 ACS question and response categories. The test version modified the educational attainment response categories by including a series of five headings, separating kindergarten from nursery school, and including a write-in field for grades 1 through 11. The test version also included separate categories for type of high school completion (high school diploma versus GED and other alternative credentials). The test version also specified categorization of some college based on credits, rather than years, and added the qualifier "beyond a bachelor's degree" to the professional degree category.
The 2006 ACS Content Test findings showed no significant differences in overall question item nonresponse rates between the test and control versions. However, the distribution of educational attainment differed between the test and control versions for several attainment categories. For two groups, persons age 3 and above and persons age 18 and above, the test version showed a decrease in the percentage of people in the following categories: 7th or 8th grade; 12th grade, no diploma; and high school graduate. The test version showed an increase in the percentage of people with more than 1 year of college but no degree and those with a bachelor's degree. In the age 3 and above distribution, the test version showed a higher proportion in nursery school to 4th grade and a lower proportion in 11th grade. In the age 18 and above distribution, the test version had a higher proportion of respondents reporting no schooling completed.
Residence 1 Year Ago (Person Questions 14a-14b)
The primary objectives of the 2006 ACS Content Test work on residence 1 year ago (migration) were to collect complete and appropriate address information for recent movers within the United States and to collect complete and appropriate previous residence information for movers to the United States from Puerto Rico. The 2006 ACS Content Test compared two versions of the residence 1 year ago (migration) question set. The control version replicated the 2006 ACS question. The test version modified the migration question by collecting the address (structure number and street name) and geographic information down to the place level within Puerto Rico for persons living in the United States at the time of the survey whose previous residence was in Puerto Rico.
The 2006 ACS Content Test findings showed item nonresponse rates for place (city, town, or post office), county, and state names increased for the modified question while the item nonresponse rate for zip code decreased. Even with the higher item nonresponse rates, there was no significant difference in the number of addresses that were codeable at the state, county, or place level.
Disability (Person Questions 16a-18)
There were two primary objectives of the 2006 ACS Content Test work on the disability status questions. The first was to better identify specific portions of the population of people with disabilities. The second was to improve the estimate of the population of persons with disabilities, defined by a person's risk of participation limitation when he or she has a functional limitation or impairment. The test version made a number of adjustments to the disability status questions.
After obtaining favorable results from the test version of the disability status questions, the revised set was included in the 2008 ACS. The 2008 questions on disability represent a conceptual and empirical break from earlier years of the ACS. Hence, the Census Bureau does not recommend any comparisons to disability data from the 2007 ACS and earlier.
Employment Status (Person Questions 28a-28b, 34b, 35)
The primary objective of the 2006 ACS Content Test work on the employment status series, specifically, the questions about worked last week, temporarily absent, and looking for work, was to improve the measurement of employment status by addressing several limitations that previous research suggested were present in the ACS question wording prior to 2008. Analysis of employment data from the ACS and Census 2000 revealed that employment levels were underestimated and unemployment levels were overestimated relative to benchmark data from the Current Population Survey (CPS) or from the Local Area Unemployment Statistics (LAUS) program at the Bureau of Labor Statistics.
Three of the ACS employment status questions were modified for the test panel. The worked last week and temporarily absent questions are key components in the measurement of employed people and people who are not in the labor force, while the looking for work question is a component in the measurement of unemployed people. These changes were done with an overall goal of increasing the estimate of employed people, reducing the estimate of unemployed people, and reducing response inconsistencies in the individual categories of the employment status concept.
The 2006 ACS Content Test compared two versions of the employment status series question set. The control version replicated the 2006 ACS questions. The test version modified the wording of the worked last week, temporarily absent, and looking for work questions.
The 2006 ACS Content Test findings showed that, overall, the test questions produced a higher estimate of employed people than the control. While the test questions did not produce a lower estimate of unemployed people, the overall unemployment rate was lower for the test panel. The unemployment rate is the more useful measure because it excludes people who are not in the labor force. Empirically, the test version of the questions performed better than the existing ACS questions in producing more favorable estimates, although respondents still answered the employment series of questions inconsistently.
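The distinction drawn above between the unemployment level and the unemployment rate can be made concrete with a small calculation. The counts below are invented for illustration; they show how a higher employed estimate lowers the rate even when the unemployed estimate is unchanged:

```python
def unemployment_rate(employed, unemployed):
    """Unemployed as a share of the labor force (employed + unemployed).

    People not in the labor force are excluded from the denominator entirely,
    which is why the rate can fall while the unemployed count stays the same.
    """
    return unemployed / (employed + unemployed)

# Invented counts: the test panel estimates more employed people while the
# unemployed estimate is unchanged, so the unemployment rate still falls.
control = unemployment_rate(employed=900, unemployed=60)
test = unemployment_rate(employed=950, unemployed=60)
print(round(control, 4))  # → 0.0625
print(round(test, 4))     # → 0.0594
```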
Weeks Worked in the Past 12 Months (Person Questions 38a-38b)
The primary objective of the 2006 ACS Content Test work on the weeks worked question was to produce a higher estimate of year-round workers, bringing the ACS estimate closer to the CPS estimate, while not adversely impacting item nonresponse rates and data quality. One of the main purposes of the weeks worked question is to establish a framework for the earnings data collected in the ACS. Classifying workers as "full-time, year-round" is key to the presentation and analysis of earnings data.
The 2006 ACS Content Test compared two versions of the weeks worked question set. The control version replicated the 2006 ACS question. The test version modified the open-ended weeks worked question by splitting it into two parts. Part (a) asked whether the respondent worked 50 or more weeks in the past 12 months (the past 52 weeks). Part (b) asked respondents who answered 'no' to part (a) to choose from six categories of how many weeks they did work, even for a few hours, including paid vacation, paid sick leave, and military service.
The 2006 ACS Content Test findings showed that the percentage of people 16 years and older who were year-round workers (50-52 weeks) was higher in the test panel than in the control panel at the national level. There was no significant difference in item nonresponse rates between the test and control panel.
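As a rough sketch of how the "full-time, year-round" classification mentioned above might be applied: the year-round test follows directly from part (a) of the test question (50 or more weeks), while the 35-hours-per-week full-time threshold is an assumption made here for illustration, not taken from this document:

```python
def is_full_time_year_round(weeks_worked, usual_hours_per_week):
    """Classify a worker as full-time, year-round.

    Year-round: 50-52 weeks worked in the past 12 months, as captured by
    part (a) of the test question. Full-time: 35 or more usual hours per
    week, a common threshold assumed here for illustration.
    """
    return weeks_worked >= 50 and usual_hours_per_week >= 35

print(is_full_time_year_round(52, 40))  # → True
print(is_full_time_year_round(30, 40))  # → False (not year-round)
print(is_full_time_year_round(52, 20))  # → False (not full-time)
```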
Health Insurance Coverage (Person Questions 15a-15h)
The primary objective of the 2006 ACS Content Test work on health insurance coverage was to determine how to collect data about health insurance coverage for all members of a household. Information on health insurance coverage was not collected in the ACS prior to 2008. Therefore, the limitations are not fully known and no comparisons can be made to previous ACS data.
However, evaluation of the 2006 ACS Content Test data demonstrated the viability of asking questions on health insurance coverage in the ACS.
Marital History (Person Questions 20-22)
The primary objective of the 2006 ACS Content Test work on marital history was to produce annual estimates of the number of people who marry and divorce, the number of times people have been married, and the duration of their current marriage. The motivation for these questions was to make the ACS the primary federal vehicle for collecting marital data, replacing the discontinued marriage and divorce registration area that had previously provided this information to the Department of Health and Human Services on an annual basis.
Beginning in 2008, people 15 years and over who were ever married (currently married, widowed, separated, or divorced) were asked if they had been married, widowed, or divorced in the past 12 months. They were also asked how many times (once, two times, three or more times) they had been married, and the year of their last marriage.
Because marital history is a new series of questions on the 2008 ACS, its limitations are not fully known. However, evaluation of the 2006 ACS Content Test data demonstrated the viability of asking marital history questions in the ACS.
Service-Connected Disability (Person Questions 27a-27b)
The objective for including the topic of military service-connected disability rating in the 2006 ACS Content Test was to test whether the ACS could provide useful estimates of veterans by the disability-rating categories of the Department of Veterans Affairs (VA). The VA needs data on the distribution of the veteran population by level of disability rating in order to estimate the demand for their health care services.
Information on service-connected disability was not collected in the ACS prior to 2008. Therefore, the limitations are not fully known and no comparisons can be made to previous ACS data. However, evaluation of the 2006 ACS Content Test data demonstrated the viability of asking service-connected disability questions in the ACS.