ASA Connect

  • 1.  National Academies Committee Meeting on Reproducibility and Replicability in Science, December 12-13

    Posted 12-07-2017 15:12
    Dear All, 

    The agenda for the National Academies meeting of the Committee on Reproducibility and Replicability in Science, December 12-13, is now posted: http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_183509.pdf

    Let me also take the opportunity to highlight Ron Wasserstein's recent blog post on the topic: Addressing Reproducible Research: How the Statistical Community Can Help (and Has Been Helping). Please read it to see what the ASA, its committees, and its members have been doing to address reproducibility, and then contribute your ideas for what we should be doing.

    Thank you,
    Steve

    ------------------------------
    Steve Pierson
    Director of Science Policy
    American Statistical Association
    ------------------------------


  • 2.  RE: National Academies Committee Meeting on Reproducibility and Replicability in Science, December 12-13

    Posted 12-08-2017 12:37
    12/8/17    
    Dear Dr. Steve Pierson & ASA Statisticians,
          In connection with the problems in reproducibility of scientific research to which Dr. Pierson refers in his ASA Connect post of 12/7/17, my suggestion of "results blind publishing" may be relevant. I commented on this at this ASA Connect site in April but have since published a formal article on the subject, for anyone interested. It appears in Basic & Applied Social Psychology (Editor: Dr. David Trafimow), the journal now well known for having banned, about two years ago, the reporting of null hypothesis significance test results. My article is followed by commentary from four leading researchers, after which my rejoinder is printed.
          Abstract & links to the articles are pasted below: 
     
    Abstract  
         Problems in science publishing involving publication bias, null hypothesis significance testing (NHST), and irreproducibility of reported results have been widely cited. Numerous attempts to ameliorate these problems have included statistical methods to assess and correct for publication bias, and the recommendation or development of statistical methodologies to replace NHST; some journals have even instituted a policy of banning manuscripts reporting use of NHST. In an effort to mitigate these problems, a policy of "results blind evaluation" of manuscripts submitted to journals is recommended, in which results reported in manuscripts are given no weight in the decision as to the suitability of the manuscript for publication. Weight would be given exclusively to (a) the judged importance of the research question addressed in the study, typically conveyed in the Introduction section of the manuscript, and (b) the quality of the methodology of the study, including appropriateness of data analysis methods, as reported in the Methods section. As a practical method of implementing such a policy, a two-stage process is suggested whereby the editor initially distributes only the Introduction and Methods sections of a submitted manuscript to reviewers for evaluation, and a provisional decision regarding acceptance or rejection for publication is made. A second stage of review follows in which the complete manuscript is distributed for review, but only if the decision of the first stage is for acceptance with no more than minor revision.
     
     
    Reference 
    Locascio, J. J. (2017). Results blind science publishing (pp. 239-246) and Rejoinder to responses to "Results blind publishing" (pp. 258-261). Basic and Applied Social Psychology, 39(5).
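
    As a toy illustration of the two-stage process described in the abstract above (my own sketch, not from the article; the function and decision names are hypothetical), the key property is that the Stage 1 decision can depend only on the Introduction and Methods sections:

    ```python
    def results_blind_review(manuscript, review_intro_methods, review_full):
        # Stage 1: reviewers see ONLY the Introduction and Methods, so the
        # provisional decision cannot be influenced by the reported results.
        decision = review_intro_methods(manuscript["introduction"],
                                        manuscript["methods"])
        if decision not in ("accept", "accept_minor_revision"):
            return "reject"
        # Stage 2: the complete manuscript is distributed for review, but
        # only after provisional acceptance with no more than minor revision.
        return review_full(manuscript)
    ```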
     
     
    Thank you for your interest. 
     
    Joseph J. Locascio, Ph.D.,
    Assistant Professor of Neurology,
    Harvard Medical School,
    & Bio-Statistician,
    Memory and Movement Disorders Units, 
    Massachusetts Alzheimer's Disease Research Center,
    Neurology Dept., Massachusetts General Hospital (MGH),
    Boston, Massachusetts 
    Phone: (617) 724-7192
    Email: JLocascio@partners.org        
     

               
    "The information transmitted in this email is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. Any review, retransmission, dissemination or other use of or taking of any action in reliance upon this information by persons or entities other than the intended recipient is prohibited. If you received this email in error, please contact the sender and delete the material from any computer."
     
     
     

    The information in this e-mail is intended only for the person to whom it is
    addressed. If you believe this e-mail was sent to you in error and the e-mail
    contains patient information, please contact the Partners Compliance HelpLine at
    http://www.partners.org/complianceline . If the e-mail was sent to you in error
    but does not contain patient information, please contact the sender and properly
    dispose of the e-mail.






  • 3.  RE: National Academies Committee Meeting on Reproducibility and Replicability in Science, December 12-13

    Posted 12-11-2017 09:22
    All,

    I'm glad Ron is a featured speaker at the National Academies meeting this week on reproducibility/replicability, but I'm unsure what the accepted standards and practices in this area are. I would appreciate any postings here of key references in the statistics literature on reproducibility/replicability, particularly any that set out accepted standards and practices. Thanks.

    David Williamson

    ------------------------------
    G. David Williamson
    Centers for Disease Control and Prevention
    ------------------------------



  • 4.  RE: National Academies Committee Meeting on Reproducibility and Replicability in Science, December 12-13

    Posted 12-12-2017 14:11
    Here's a very good paper on programming standards you should follow to improve your reproducibility. I like it because the standards are pragmatic and easy to implement.

    Wilson G, Bryan J, Cranston K, Kitzes J, Nederbragt L, Teal TK (2017) Good enough practices in scientific computing. PLoS Comput Biol 13(6): e1005510. Available at https://doi.org/10.1371/journal.pcbi.1005510.
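
    To make a couple of the paper's recommendations concrete, here is a minimal sketch (my own illustration, not taken from the paper; the file name and seed are made up) of two habits in its spirit: fix and record the random seed, and write derived results out by script instead of copying numbers by hand.

    ```python
    import csv
    import random

    SEED = 20171212  # fixed and recorded so the run can be repeated exactly
    random.seed(SEED)

    # A stand-in "analysis": draw a sample and summarize it.
    sample = [random.gauss(0.0, 1.0) for _ in range(100)]
    mean = sum(sample) / len(sample)

    # Save the summary together with the seed, so the output file records
    # its own provenance rather than being transcribed by hand.
    with open("results_summary.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["seed", "n", "mean"])
        writer.writerow([SEED, len(sample), round(mean, 4)])
    ```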

    --
    Steve Simon, mail@pmean.com
    I'm blogging now! blog.pmean.com




  • 5.  RE: National Academies Committee Meeting on Reproducibility and Replicability in Science, December 12-13

    Posted 12-11-2017 12:20
    Will there be a discussion on the need for academic researchers to learn about Design of Experiments? Getting academic scientists to acknowledge that designs like Definitive Screening Designs exist would be a major step in the right direction.

    Keep in mind that most academic scientists and engineers "know," often with "great certainty," that "one simply cannot change more than one factor at a time during an experiment, because statistics doesn't allow it." If that were true, I would demand a refund of my tuition for the 15 graduate-level stats classes I took that used multiple regression techniques. I should probably also demand a refund for all those math classes that covered systems of linear equations, partial differential equations, etc.
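
    To make that concrete, here is a minimal sketch (my own illustration; the effect sizes are made up, and numpy is assumed to be available) of a replicated 2x2 factorial design in which both factors change between runs, and ordinary multiple regression recovers both effects at once:

    ```python
    import numpy as np

    # Coded levels (-1/+1) for a replicated 2x2 factorial design: both
    # factors are varied, often at the same time, across the eight runs.
    A = np.array([-1, -1, 1, 1, -1, -1, 1, 1], dtype=float)
    B = np.array([-1, 1, -1, 1, -1, 1, -1, 1], dtype=float)

    rng = np.random.default_rng(1)
    # Hypothetical response: intercept 10, effect of A = 3, effect of B = -2.
    y = 10 + 3 * A - 2 * B + rng.normal(0.0, 0.5, size=A.size)

    # Least squares separates the two effects from the same set of runs.
    X = np.column_stack([np.ones_like(A), A, B])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(coef)  # roughly [10, 3, -2]
    ```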

    Perhaps ASA and the National Academies can come up with some videos or Web pages that explain basic DOE concepts and their applications to science research.

    ------------------------------
    Andrew Ekstrom

    Statistician, Chemist, HPC Abuser ;-)
    ------------------------------



  • 6.  RE: National Academies Committee Meeting on Reproducibility and Replicability in Science, December 12-13

    Posted 12-12-2017 04:18

    Very happy to see the attention this important subject is getting today! Earlier this year, the ASA's statement on Reproducible Research established guidelines for agencies supporting research, helping them understand what to look for in accepting papers and conference presentations. In this way, the ASA is playing an important role in supporting research in all areas where statistics is used.

    These actions, in which the ASA and its members act as partners consulting on statistical best practices, are vital for research in all areas. One way we can further this important work is to present the ASA recommendations on reproducible research in our own particular areas of study. As one example, I had the opportunity to present the recommendations as a paper at a counter-terrorism conference this year. Positioned as a presentation on how to avoid getting your paper retracted, it was well received by conference attendees.

    There are hundreds of conferences in medicine, engineering, finance, and so many other areas that would benefit from a presentation on the ASA recommendations. This meeting from the National Academies can serve as a starting place, encouraging each of us to present the reproducible research recommendations where we work and to teach them to our students, helping them become effective communicators of statistical best practices.



    ------------------------------
    David Corliss
    Analytics Architect / Predictive Analytics
    Ford Motor Company
    ------------------------------



  • 7.  RE: National Academies Committee Meeting on Reproducibility and Replicability in Science, December 12-13

    Posted 12-22-2017 15:25
    I'm sharing a summary of the December 12-13 meeting from the American Institute of Physics policy team: https://www.aip.org/fyi/2017/national-academies-launches-study-research-reproducibility-and-replicability
    Happy Holidays!
    Steve

    ------------------------------
    Steve Pierson
    Director of Science Policy
    American Statistical Association
    ------------------------------