ASA Connect

Interpretable and Explainable AI and Machine Learning: Symposium on 6/21 (free online event)

    Posted 16 days ago
    Advances in artificial intelligence could improve how we make decisions for a broad array of applications, such as healthcare or urban planning. However, practitioners need to understand and trust how complex machine learning systems make choices before these new tools can be more widely adopted.
     
    Please join the National Academies for a symposium on Interpretable and Explainable AI and Machine Learning on Tuesday, June 21 from 1-4pm ET. During the symposium, expert speakers will discuss the possibilities and challenges of interpretable machine learning across a variety of applications, including cognitive science, healthcare, and policy. The symposium will conclude with a moderated panel on the future landscape of interpretable and ethical machine learning.
     
    Featured Talks:
    • Dr. Been Kim (Google Brain) will explore how to translate artificial intelligence for humans.
    • Dr. Gari Clifford (Emory University) will discuss the use of interpretable machine learning in healthcare and its impact on bias and ethics.
    • Dr. Christian Lebiere (Carnegie Mellon University) will present on cognitive models at the interface of humans and machines.
    • Patrick Hall (bnh.ai and The George Washington University) will share proposed guidelines for the responsible use of explainable machine learning.
     
    The event will be livestreamed and open to a limited number of in-person participants in Washington, DC.
     
    Register for the event at http://interpretable-ai.eventbrite.com

    ------------------------------
    Nicholas Horton
    Beitzel Professor of Technology and Society (Statistics and Data Science)
    Northampton, MA United States
    ------------------------------