
Research Reports


General Methods and Algorithms for Modeling and Imputing Discrete Data under a Variety of Constraints

William E. Winkler

KEY WORDS: Data Quality, Loglinear Model Fit, Missing Data, Convex Constraints

ABSTRACT

Loglinear modeling methods have become quite straightforward to apply to discrete data X. The models for missing data involve minor extensions of hot-deck methods (Little and Rubin 2002). Edits are structural zeros that forbid certain patterns. Winkler (2003) provided the theory connecting editing with imputation. In this paper, we give methods and algorithms for modeling/edit/imputation under linear and convex constraints. The methods can be used for statistical matching (D'Orazio, Di Zio, and Scanu 2006), for edit/imputation in which models are also subject to external constraints such as those from benchmark data, and for creating synthetic data with significantly reduced risk of re-identification (Winkler 2007).
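
To make the modeling/edit/imputation idea concrete, the sketch below is a minimal illustration, not the report's algorithm: it fits a two-way loglinear (quasi-independence) model by iterative proportional fitting with one structural zero representing an edit, then imputes a missing value from the fitted conditional distribution. The table dimensions, margins, and forbidden cell are hypothetical.

```python
# Minimal sketch (illustrative assumptions, not data from the report):
# fit a two-way loglinear model with a structural zero ("edit") by iterative
# proportional fitting (IPF), then impute a missing value from the fitted model.
import numpy as np

rng = np.random.default_rng(0)

# Observed margins for two discrete variables X (3 levels) and Y (2 levels).
row_margin = np.array([50.0, 30.0, 20.0])   # counts by level of X
col_margin = np.array([60.0, 40.0])         # counts by level of Y

# Edit: the pattern (X=2, Y=0) is forbidden, i.e. a structural zero.
allowed = np.ones((3, 2))
allowed[2, 0] = 0.0

# IPF: start from the allowed-cell indicator and rescale rows and columns in
# turn so the fitted table matches both margins while forbidden cells stay 0.
fitted = allowed.copy()
for _ in range(200):
    fitted *= (row_margin / fitted.sum(axis=1))[:, None]
    fitted *= (col_margin / fitted.sum(axis=0))[None, :]

probs = fitted / fitted.sum()   # fitted joint probabilities P(X, Y)

# Impute Y for a record with X observed (X = 2) and Y missing:
# draw from the fitted conditional P(Y | X = 2), which already respects the edit.
x_obs = 2
cond = probs[x_obs] / probs[x_obs].sum()
y_imputed = rng.choice(len(cond), p=cond)
print("fitted table:\n", np.round(fitted, 2))
print("P(Y | X=2):", np.round(cond, 3), "-> imputed Y =", y_imputed)
```

Because forbidden cells are held at zero throughout the fitting, any value drawn from the fitted conditional automatically satisfies the edit, which is the basic connection between editing and imputation described in the abstract.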

CITATION: Winkler, W.E. (2008). General Methods and Algorithms for Modeling and Imputing Discrete Data under a Variety of Constraints. Statistical Research Division Research Report Series (Statistics #2008-08). U.S. Census Bureau.

Source: U.S. Census Bureau, Statistical Research Division

Created: October 3, 2008
Last revised: October 3, 2008

