NIST Publishes Proposed Principles for “Explainable” AI Systems
Thursday, August 20, 2020

On August 18, 2020, the U.S. National Institute of Standards and Technology (“NIST”) published a draft report, Four Principles of Explainable Artificial Intelligence (Draft NISTIR 8312 or the “Draft Report”), which sets forth four proposed principles regarding the “explainability” of decisions made by Artificial Intelligence (“AI”) systems.

Explainability refers to the idea that the reasons behind an AI system's output should be understandable. According to the NIST press release, AI must be explainable to society in order to enable understanding of, trust in, and adoption of new AI technologies and the decisions and guidance they produce.

The proposed principles are as follows (a brief illustrative sketch appears after the list):

  • Explanation: AI systems should deliver accompanying evidence or reasons for all outputs.
  • Meaningful: Systems should provide explanations that are understandable to individual users.
  • Explanation Accuracy: The explanation should correctly reflect the system’s process for generating the output.
  • Knowledge Limits: The system only operates under conditions for which it was designed or when the system reaches a sufficient level of confidence in its output.
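
To make the principles concrete, the following is a minimal, hypothetical Python sketch (not drawn from the Draft Report) of a decision system that returns supporting evidence with each output (the "Explanation" principle) and declines to answer when its confidence falls below a designed threshold (the "Knowledge Limits" principle). The feature names, weights, and threshold are illustrative assumptions, not anything specified by NIST.

```python
# A minimal, hypothetical sketch of a decision system that pairs each output
# with evidence and abstains outside its designed confidence range.
# All weights, features, and thresholds are illustrative assumptions.
import math

# Illustrative hand-set weights for a toy credit-style decision model.
WEIGHTS = {"income": 0.8, "debt_ratio": -1.2, "late_payments": -0.6}
BIAS = 0.1
CONFIDENCE_FLOOR = 0.7  # "Knowledge Limits": below this, decline to decide.

def decide(features: dict) -> dict:
    """Return a decision plus the evidence behind it ("Explanation" principle)."""
    contributions = {name: WEIGHTS[name] * features[name] for name in WEIGHTS}
    score = BIAS + sum(contributions.values())
    probability = 1.0 / (1.0 + math.exp(-score))
    confidence = max(probability, 1.0 - probability)

    if confidence < CONFIDENCE_FLOOR:
        # Outside the system's designed operating conditions: abstain.
        return {"decision": "no decision",
                "reason": "confidence below designed limit",
                "confidence": round(confidence, 3)}

    return {
        "decision": "approve" if probability >= 0.5 else "deny",
        "confidence": round(confidence, 3),
        # Per-feature evidence, ordered by how strongly each pushed the outcome.
        "evidence": sorted(contributions.items(),
                           key=lambda kv: abs(kv[1]), reverse=True),
    }

print(decide({"income": 2.0, "debt_ratio": 0.4, "late_payments": 0.0}))
print(decide({"income": 0.2, "debt_ratio": 0.3, "late_payments": 0.5}))
```

Run as written, the first call approves and lists the per-feature contributions that drove the decision, while the second call abstains because its confidence falls under the floor.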

The Draft Report is part of a broader NIST effort to develop trustworthy AI systems. In publishing the Draft Report, NIST indicated that it hopes to start a conversation about the expectations to which decision-making devices should be held. NIST is accepting comments on the Draft Report until October 15, 2020.
