
Examining Operational Quality Metrics

April 26, 2021
Michael Bentley, Assistant Division Chief for Census Statistical Support, Decennial Statistical Studies Division

The first release of results from the 2020 Census is an important milestone in the once-a-decade count of the nation’s population. Over the next decade, lawmakers, business owners, researchers and many others will use the data to make important decisions about their communities.

To make those decisions, people using the data need to feel confident in the data’s quality and accuracy. So, how do you gauge the quality of the census?

The Census Bureau is taking a multifaceted approach to studying the quality of the 2020 Census to produce a more complete and informative picture. One way we are doing this, as we described in the previous Introduction to Quality Indicators: Operational Metrics blog, is by providing statistics on other aspects of the census, including operational quality metrics.

What Are the Operational Quality Metrics?

Today’s initial release of the 2020 Census operational quality metrics provides important information about the quality of the census by looking at how we obtained a response for each address.

We do this, in part, by examining the final status of addresses in the census and how that status was determined. While this is just one piece of information about the quality of the census, it offers insight into how we collected responses from the different census operations.

For example, if an address was resolved as “self-response occupied,” that means that a household member living at that address responded online, by phone or by mail on their own. Self-responses are preferable because they are quick, efficient, high quality, and don’t typically require further action or follow-up.

These metrics also provide more details on how we resolved addresses in the Nonresponse Followup operation. During this operation, census takers knocked on the doors of addresses that did not self-respond to the census to try to get a response or to verify whether the address was vacant.

Because of the various operational challenges we faced from the COVID-19 pandemic and other major events, it is particularly important to study and understand various metrics about Nonresponse Followup to get a better picture of our success in adapting to these challenges.

Viewing the Quality Metrics

The operational quality metrics are available in an interactive dashboard. The dashboard makes it easy to compare metrics between the nation and the states.

We are also providing:

It’s important to note that we expect to see some differences across states and from one census to the next. Different doesn’t necessarily imply “better” or “worse.”

Many differences are a result of changes to the way we conducted the 2020 Census as compared with the 2010 Census (for example, adding an internet response option or using administrative records to enumerate some households in Nonresponse Followup in 2020). A difference is just another data point as we consider the breadth of quality assessments in the works for the 2020 Census.

Summary of Key Results

Today’s release contains a wide array of summary statistics and related information on operational quality metrics.

Some of the highlights include:

  • 65.28% of census addresses were resolved by self-response, where a household member responded online, by phone or by mail. (This percentage is different from a similar rate depicted on the Self-Response Rates Map. Communities should continue to use the Self-Response Rates Map to gauge participation and the success of efforts to motivate response. The metric released today shows the breakdown of where all the final census numbers came from, with the largest portion coming from self-response. The differences are further explained in FAQs about today’s release.) Almost all of these (64.28% of census addresses) were occupied households. This is higher than the proportion resolved by self-response in the 2010 Census (61.05%), likely in part because of improvements and innovations in the self-response operational design and an extended self-response period.
  • The majority (79.74%) of households that self-responded did so online, while 18.13% returned a paper form in the mail and 2.13% responded by phone. The 2020 Census was the first census with the internet as the primary way to respond, and many households took advantage of this option.
  • Among those that were identified as occupied households in Nonresponse Followup, 55.48% were enumerated with a household member; 26.07% were resolved with a proxy respondent, such as a neighbor, building manager or landlord; and 18.44% were enumerated using high-quality administrative records. The use of administrative records was new to the 2020 Census, but the proxy rate is comparable to the 2010 Census, in which 24.71% of occupied households in Nonresponse Followup were enumerated by a neighbor or other knowledgeable person. The proportion enumerated by a household member was 74.88% in 2010.
  • 5.94% of occupied households in Nonresponse Followup, excluding those enumerated by administrative records, provided only the population count and no other characteristics such as age, sex, race, Hispanic origin or household tenure. This is slightly lower than the comparable 6.78% that provided only the population count in the 2010 Census. Additionally, the distribution of these households changed between 2010 and 2020. In 2020, a higher share of the “population count only” responses came from interviews with a household member instead of a proxy respondent.
  • 0.23% of addresses were unresolved after data collection. This means that we did not have enough information about a particular address after we concluded our data collection operations and therefore needed to use imputation to assign a status and population count. (Imputation is a statistical technique that fills in missing information using other available information. We talked more about it in the recent How We Complete the Census When Households or Group Quarters Don’t Respond blog.) This compares to 0.38% of addresses being unresolved after the 2010 Census. In addition, 0.71% of addresses were unresolved as a result of person unduplication during census data processing. (These were new procedures for the 2020 Census, so comparable metrics from 2010 are not available. We discuss the procedures in the recent How We Unduplicated Responses in the 2020 Census blog.) Combining the two rates for the 2020 Census — imputation needed after the completion of data collection operations and imputation needed after unduplication procedures — the total count imputation rate was 0.93%.
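The combined imputation rate above may look off at first glance: the rounded components (0.23% and 0.71%) sum to 0.94%, yet the published total is 0.93%. This is normal rounding behavior, since individually rounded rates can sum to something other than the rounded sum of the underlying values. A minimal sketch of the effect (the unrounded values below are hypothetical, chosen only to illustrate the rounding, not actual Census Bureau figures):

```python
# Hypothetical unrounded rates, for illustration only.
# The Census Bureau publishes only the rounded figures.
after_collection = 0.2262      # rounds to 0.23 (%)
after_unduplication = 0.7062   # rounds to 0.71 (%)

total = after_collection + after_unduplication  # 0.9324

# Each component rounds up, but their sum rounds down:
print(round(after_collection, 2))     # 0.23
print(round(after_unduplication, 2))  # 0.71
print(round(total, 2))                # 0.93, not 0.23 + 0.71 = 0.94
```

In other words, summing first and then rounding (as in the published 0.93% total) is more accurate than summing the already-rounded components.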

While no single number can definitively quantify the quality of the census, examining these metrics and comparing them across geographies and with past censuses can shed light on how our operations and processes affect our ability to accurately count our nation’s population.

Other Ways We Are Measuring Quality

Next month, we plan to provide additional operational metrics on topics such as average household size and percentage of single-person housing units and two-person housing units. These will provide more detailed information about the results from census operations and shed more light on the quality of the census.

Providing operational quality metrics is just one example of how the Census Bureau is striving to give the public insight into the efforts we have made to ensure that the 2020 Census counts are the highest quality possible and fit for their many uses.

Other ways that we are measuring the quality include:

As they become available, we look forward to sharing these additional results with the public on our 2020 Census Data Quality webpage.
