ASA Connect

Usefulness of 2020 Household Pulse Survey

  • 1.  Usefulness of 2020 Household Pulse Survey

    Posted 07-23-2020 11:47
    In trying to work with data from the 2020 Household Pulse Survey (HPS) conducted by the US Census Bureau during the pandemic, I notice three problems.  The survey was designed to produce weekly estimates during the Covid-19 pandemic for the 15 largest MSAs, the states (plus DC), and the nation.  The content is developed through questions about employment status, spending patterns, food security, housing, physical and mental health, access to health care, and educational disruption (survey documentation and results are on the Internet; google "2020 Household Pulse Survey").

    (1) The federal government did not get the survey in place quickly enough to permit development of baseline data.  Has anyone working with these data found a way to normalize the weekly responses to survey questions against pre-pandemic "normal" values?
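
    For concreteness, here is a minimal sketch of one crude workaround, assuming a pre-pandemic benchmark for a comparable measure can be pulled from another source (e.g., CPS or ACS).  The states, column names, and baseline values below are hypothetical and are not part of the HPS itself.

        # Sketch only: index weekly HPS estimates against an external
        # pre-pandemic baseline for the same measure and geography.
        # All values and column names here are hypothetical.
        import pandas as pd

        def index_to_baseline(weekly, baseline, measure_col):
            """Express each weekly estimate as a ratio to an assumed
            pre-pandemic baseline value for the same state."""
            out = weekly.copy()
            out["baseline"] = out["state"].map(baseline)
            out["index_vs_baseline"] = out[measure_col] / out["baseline"]
            return out

        # Hypothetical weekly estimates (percent of households reporting
        # difficulty paying usual expenses) and assumed baseline values.
        weekly = pd.DataFrame({
            "state": ["OR", "OR", "WA"],
            "week": [1, 2, 1],
            "pct_difficulty": [24.2, 26.0, 22.5],
        })
        baseline = {"OR": 12.0, "WA": 11.0}  # assumed, taken from another survey
        print(index_to_baseline(weekly, baseline, "pct_difficulty"))

    Whether such an index means anything depends, of course, on how comparable the external measure really is.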

    (2) The form of the survey questions precludes use in analysis.  For example, while 56.1% of responding households for a state report using regular income sources like those used before the pandemic and 24.2% report using credit cards or loans to pay bills, these response categories overlap (they can represent the same respondents).  Since there is apparently no follow-up question to report dollar amounts or to apportion percentages across response categories, one respondent could be paying 90% from normal sources and 10% from credit cards, while another could be paying 90% from credit cards and 10% from normal sources.  Has anyone found a way to use these data in independent analysis to assess effects?
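
    To illustrate, the toy calculation below (respondents and dollar shares are invented) shows that two households with very different funding mixes produce identical check-all-that-apply marginals, which is why the published percentages cannot be apportioned into dollar effects.

        # Toy illustration: check-all-that-apply flags hide the dollar split.
        # The respondents and shares are invented for illustration only.
        respondents = [
            {"id": "A", "regular_share": 0.90, "credit_share": 0.10},
            {"id": "B", "regular_share": 0.10, "credit_share": 0.90},
        ]

        # What the survey actually records: a yes/no flag per source.
        flags = [
            {"regular": r["regular_share"] > 0, "credit": r["credit_share"] > 0}
            for r in respondents
        ]

        pct_regular = 100 * sum(f["regular"] for f in flags) / len(flags)
        pct_credit = 100 * sum(f["credit"] for f in flags) / len(flags)
        print(pct_regular, pct_credit)  # 100.0 100.0, regardless of the mix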

    (3) Non-sampling survey error appears huge.  The weekly surveys are conducted online only, which introduces a major coverage bias.  The week-one sample size was 1,867,000 and 74,500 households responded, so the response rate is about 3.8%.  To put this directly, if I had a representative random sample of 100 households and only 3 responded, I would not presume the data were in any way grounded in reality, because of the huge completion error/non-response bias.  If I apply an absolute-error calculation of the kind used in organizational analysis, where in an organization of 100 members you might reach 97 and miss 3, the results are meaningless.
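
    As a rough sketch of how little a 3-4% response rate constrains an estimate, the worst-case bound below treats the nonrespondents' true value as unknown; the 24.2% figure is reused from item (2) purely for illustration.

        # Response-rate arithmetic using the figures quoted above, plus a
        # crude worst-case bound on nonresponse bias for a proportion.
        sampled = 1_867_000
        responded = 74_500
        r = responded / sampled
        print(f"response rate: {r:.1%}")  # roughly 4% with these rounded figures,
                                          # close to the 3.8% cited above

        # If respondents report proportion p but nonrespondents could report
        # anything in [0, 1], the full-sample proportion is only known to lie in
        # [p * r, p * r + (1 - r)].
        p_observed = 0.242  # share using credit cards/loans, reused from item (2)
        lower = p_observed * r
        upper = p_observed * r + (1 - r)
        print(f"possible full-sample range: [{lower:.1%}, {upper:.1%}]")

    Without strong (and untestable) assumptions that respondents resemble nonrespondents, the observed percentage pins down almost nothing.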

    My analytic intent is conventional, structured economic analysis in which logic applies and quantitative effects can be developed through economic modeling.  With no baseline, non-mutually-exclusive response categories, no follow-up questions to enable quantitative analysis of pandemic effects, and coverage and nonresponse bias large enough to invalidate the survey results, it appears the Pulse Surveys are useless for this application (though they may be relevant for some kind of non-analytic, impressionistic appreciation in a liberal arts sense).  Does anyone see a way to use these data in reasoned, structured analysis?

    ------------------------------
    Hugh Peach
    President
    H. Gil Peach & Associates, LLC
    ------------------------------