Commission on Evidence-Based Policymaking: Public Hearing

By Amy Nussbaum posted 10-27-2016 14:35

  

Last Friday, members of the Commission on Evidence-Based Policymaking gathered in the Rayburn House Office Building for a public hearing. Katharine Abraham, chair of the commission, presided, with co-chair Ron Haskins and commissioner Allison B. Orris also present (other commissioners may have tuned in via webcast). The hearing gave members of the public an opportunity to inform the commission about their concerns. Additional hearings, set for early 2017, will be held in Chicago and on the West Coast.

Clyde Tucker, former chair of ASA’s Scientific and Public Affairs Advisory Committee (SPAAC), read remarks to the commissioners covering the stature and autonomy of the federal statistical system; data sharing leading to data synchronization; privacy and confidentiality concerns that may present barriers to the release of data needed for evidence-based policymaking; nurturing evidence-based policymaking capacity across the federal government; and the nature of statistical evidence (read the remarks here or watch the video here, starting at 31:10). The Science Policy Department would like to thank Clyde for taking the time to present these important comments, as well as the SPAAC for its continued involvement, support, and guidance.

Other comments specifically mentioning statistics included those of an American Evaluation Association (AEA) representative, who spoke on the importance of evaluation in government programs. Many government agencies lack a formal evaluation structure, and the AEA would like to see opportunities to grow this capacity. The AEA also prioritizes rigorous impact evaluations that use the most suitable statistical methods for a given situation and policy decisions based on the entire body of evidence rather than a single randomized controlled trial. After these remarks, there was a period of discussion on the use of RCTs in policymaking. Many organizations emphasize RCTs above all other types of statistical evidence and tout them as a “gold standard,” even though they can be extremely costly and, in some situations, unethical. Both the ASA and the AEA would like to see increased guidance on when other kinds of statistical evidence are appropriate.

Other organizations present included the Institute for Higher Education Policy, the Workforce Data Quality Campaign, Veterans Education Success, American Institutes for Research, New America, The Education Trust, Association of Public and Land-grant Universities, Young Invincibles, and the American Principles Project. For the most part, these groups were united in a desire to make educational data more accessible. In addition, many would like to see these data made available in conjunction with additional data sources. For example, combining education and earnings data could give more information on which schools and subjects yield the highest return on investment for different professions, while analyzing education and military data would allow the estimation of graduation rates for veterans attending college under the GI Bill. In some cases, collecting the right data to answer these questions is not only technically challenging, it is illegal. Although many of these remarks were quite optimistic, privacy does remain a concern when collecting student-level data.

Speakers from the National Prevention Science Coalition to Improve Lives, Booz Allen Hamilton, and Public Performance Improvement Researchers were on hand to offer a perspective from the private sector. They encouraged the commission to help break down data silos and help data users connect to emotionally relevant issues. Interestingly, they also spoke of the need to collect data on data usage—who is using what and which questions they are answering.

Finally, members of Results for America, The National Campaign to Prevent Teen and Unplanned Pregnancy, and the Pew-MacArthur Results First Initiative gave their perspectives on evidence-based policymaking in action. All three have experience in the “Moneyball for Government” movement—that is, building evidence about practices, programs, and policies; investing limited dollars in program evaluation; and directing funds away from practices that are consistently ineffective. They highlighted the importance of using local results to inform policy and cautioned evaluators to be prepared for null results.

The ASA is pleased to have had the opportunity to speak at the hearing and will continue to keep members updated on the progress and actions of the commission. Keep checking for updates, and leave comments below or email nussbaum@amstat.org.
