2020 Census Operational Quality Metrics: Item Nonresponse Rates


Earlier this month, the Census Bureau released the 2020 Census Redistricting Data (Public Law 94-171) Summary File. These are the first detailed results from the 2020 Census that include demographic characteristics and population counts for numerous areas. Among many other uses, states may choose to use the redistricting data on race, Hispanic origin, and the voting-age population to redraw the boundaries of their congressional and state legislative districts.

As data users dive into these files, it is important to consider the quality of the data. This year, we have provided information on the 2020 Census in the form of operational quality metrics, which look at various factors such as how we obtained a response for each address. The operational quality metrics have been released for the nation and for each state.

Today we are releasing another indicator – the 2020 Census item nonresponse rates for the population count, age or date of birth, race, and Hispanic origin questions – characteristics present in the redistricting data.

What Is Item Nonresponse and Why Does It Matter?

There are two basic types of nonresponse in censuses and surveys.

  • Unit nonresponse occurs when there is no response at all from people living at a particular address.
  • Item nonresponse occurs when a respondent provides some information but does not respond to all the survey or census questions.

We calculate item nonresponse rates before we use the statistical techniques of editing and characteristic imputation to fill in the missing information. When calculating the rate, we do not take into consideration the validity of a response. For example, if a respondent checks multiple boxes for the relationship question, we edit that response in post-processing but we do not consider it a missing response. 
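
To make the calculation concrete, here is a minimal sketch, in Python, of how an item nonresponse rate could be computed before any editing or imputation. The records, field names and notion of "missing" are illustrative assumptions, not the Census Bureau's production processing; the point is simply that any answer provided counts as a response, even one that will later be edited.

    # A minimal sketch (not Census Bureau production code) of computing an
    # item nonresponse rate before editing and imputation. Field names and
    # the "missing" convention here are illustrative assumptions.

    def item_nonresponse_rate(records, item):
        """Percent of responding households that left `item` blank.

        Any value provided counts as a response, even one that would later
        be edited in post-processing (e.g., multiple boxes checked).
        """
        missing = sum(1 for r in records if not r.get(item))
        return 100.0 * missing / len(records)

    # Hypothetical responses: each dict is one responding household.
    responses = [
        {"pop_count": "3", "age": "1985-02-14", "race": "White", "hispanic": "Not Hispanic"},
        {"pop_count": "2", "age": "",           "race": "Asian", "hispanic": "Not Hispanic"},
        {"pop_count": "1", "age": "1990-07-01", "race": "",      "hispanic": ""},
    ]

    for item in ("pop_count", "age", "race", "hispanic"):
        print(f"{item}: {item_nonresponse_rate(responses, item):.1f}% missing")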

What Item Nonresponse Rates Are Available Today?

The metrics are available in a downloadable table. The numbers from today’s release (as well as all previously released operational quality metrics) are also available on the 2020 Census Data Quality webpage.

The 2020 Census item nonresponse rates for each of the redistricting data items (population count, age or date of birth, race, and Hispanic origin) are available for the nation and for each state. Corresponding item nonresponse rates from the 2010 Census are also provided.

In addition, we know from past research that item nonresponse differs by response mode, so we have also calculated the item nonresponse rates for these questions by census operation:

  • Self-Response — Households responded on their own. We provide rates for overall self-response and broken down by online, paper and telephone responses.
  • Nonresponse Followup (NRFU) — A census taker collected a response from a household member or a proxy respondent, such as a neighbor, building manager or landlord. The 2020 Census also used high-quality administrative records, data the government already has about the household, to enumerate some households that did not respond. We provide rates for NRFU responses overall, and for enumerations by household member, proxy respondent and administrative records.
  • Other housing unit operations — This category primarily includes smaller operations, such as Update Enumerate, which counted people living in remote parts of the country.
  • Group quarters — This operation counted people in a group living arrangement, such as nursing facilities, military barracks and college dormitories.

Lastly, for each of these census operations we are reporting the percentage of responses that provided only a population count (and no other data).
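
As a rough illustration of how these breakdowns fit together, the sketch below tabulates item nonresponse by response mode and flags records that supplied only a population count. The mode labels and fields are hypothetical assumptions for illustration, not actual 2020 Census operation codes or processing logic.

    # Illustrative only: tabulate item nonresponse by response mode and the
    # share of responses that provided only a population count.
    from collections import defaultdict

    DEMOGRAPHIC_ITEMS = ("age", "race", "hispanic")

    def rates_by_mode(records):
        totals = defaultdict(int)                        # responses per mode
        missing = defaultdict(lambda: defaultdict(int))  # missing counts per mode and item
        count_only = defaultdict(int)                    # population count only, no other data
        for r in records:
            mode = r["mode"]                             # e.g. "internet", "paper", "nrfu_proxy"
            totals[mode] += 1
            blank = [item for item in DEMOGRAPHIC_ITEMS if not r.get(item)]
            for item in blank:
                missing[mode][item] += 1
            if r.get("pop_count") and len(blank) == len(DEMOGRAPHIC_ITEMS):
                count_only[mode] += 1
        for mode, n in sorted(totals.items()):
            rates = {item: round(100.0 * missing[mode][item] / n, 1) for item in DEMOGRAPHIC_ITEMS}
            print(mode, rates, f"count-only: {100.0 * count_only[mode] / n:.1f}%")

    # Example usage with made-up records:
    rates_by_mode([
        {"mode": "internet", "pop_count": "2", "age": "1970-01-01", "race": "White", "hispanic": "No"},
        {"mode": "nrfu_proxy", "pop_count": "4", "age": "", "race": "", "hispanic": ""},
    ])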

Key Highlights

Here are a few takeaways:

  • As our acting director, Ron Jarmin, mentioned in a recent blog, item nonresponse rates for most questions were higher in the 2020 Census than in previous censuses. Those rates were calculated using a preliminary data file that still contained duplicate responses. For this release, we calculated the rates from an unduplicated file, and we see that the 2020 item nonresponse rates, although still higher than 2010 Census rates for most questions, were lower than we had initially thought. It is normal and expected for the Census Bureau to refine initial estimates with further work, and we are pleased that this effort reveals a more nuanced picture of item nonresponse.
  • Our calculations show that most missing-data rates were low overall. Among all occupied households in the nation, 2020 item nonresponse rates ranged from 0.52% for population count to 5.95% for age or date of birth. In comparison, the 2010 national rates ranged from 1.43% for population count to 3.99% for Hispanic origin.
  • Item nonresponse rates for most questions were lowest for households that self-responded, whether online, by phone or by mail. This is especially true for internet and phone respondents, likely in part because the online questionnaire (which the phone representatives also used to capture responses) reminded people to provide a response if they tried to skip a question. Among internet respondents, the U.S. total item nonresponse ranged from 1.37% for age or date of birth to 2.19% for race.
  • Among households enumerated by a census taker in our NRFU operation, item nonresponse rates for the demographic questions (age, Hispanic origin, race) were highest when a proxy respondent, such as a neighbor or landlord, provided the data. This is not surprising, as people who don't live at a residence wouldn't be expected to have full knowledge of their neighbors' demographics, such as age or race. Further, the results confirm what we had suspected: administrative records provide more reliable data than proxy respondents, as reflected in their lower item nonresponse rates. For example, among NRFU interviews with a household member, about 8.71% were missing race nationally, compared with 41.22% among proxy respondents. By comparison, we were missing only 18.10% of these demographic items for people enumerated through administrative records.
  • Group quarters tended to have relatively high item nonresponse rates across question items and across states. For instance, nationally about 17.81% of people living in group quarters were missing age or date of birth. We have talked before about the various challenges, particularly from the COVID-19 pandemic, that impacted our group quarters operations.
  • Some states consistently had higher item nonresponse rates for all items. There are some research-driven explanations for this. For instance, states with higher Hispanic or Latino populations tended to have higher item nonresponse to the race question because those respondents are less likely to identify within the existing race categories.
  • It’s important to be mindful of mode effects when comparing 2020 Census and 2010 Census item nonresponse rates. About 80% of households that self-responded in 2020 responded online, but there was no online option in 2010. This means, of course, that we don’t have a comparable item nonresponse rate for online responses between the two decades. It also means we should use caution when comparing the rates for paper responses. About 18% of households that self-responded in 2020 returned a paper form in the mail, while nearly all self-responses in 2010 were on paper. Those two groups are drastically different in size and, based on prior research, we expect them to differ significantly in their demographic and social characteristics as well. The multiple self-response modes available in 2020 and the different timing of when the paper forms were mailed likely also affect direct comparability.

Putting this all together, we know there is a range in the magnitude of item nonresponse rates across data collection operations. Some of the results (for instance, with NRFU proxy enumerations or from group quarters) are not particularly surprising but point to areas that we are studying closely as we design the 2030 Census. But the vast majority of the nation’s households provided high-quality data and answered most of the census questions. 

Where Can I Find More Information About Item Nonresponse?

As part of the 2020 Census Evaluations and Experiments (EAE) program, we’re set to release the Item Nonresponse and Imputation Assessment report next summer. The report will contain comprehensive results on item nonresponse, including metrics for all data items available today plus household tenure (owned or rented), sex and relationship, as well as detailed imputation and substitution rates.

What Other Quality Indicators Will Be Available?

Throughout this year, we have been sharing as much information as we can about the 2020 Census results. As a reminder, item nonresponse is just one of several ways we are measuring the quality of the 2020 Census.

The Census Bureau is committed to sharing what we know, when we know it, to help the nation understand the quality of the 2020 Census results. We will continue to update the 2020 Census Data Quality webpage as new information becomes available. 

 
