USA: NIST seeks comment on publication for developing trustworthy AI

The National Institute of Standards and Technology ('NIST') released, on 22 June 2021, Draft NIST Special Publication 1270, 'A Proposal for Identifying and Managing Bias in Artificial Intelligence' ('AI'), which forms part of NIST's effort to support the development of trustworthy and responsible AI. In particular, the draft notes that NIST has identified the technical characteristics needed to cultivate trust in AI systems as accuracy, explainability and interpretability, privacy, reliability, robustness, safety, security, and the mitigation of harmful biases. With specific reference to bias, the draft highlights that it focuses on biases that can lead to harmful societal outcomes. Furthermore, the draft identifies common problems for AI stakeholders across the AI lifecycle, including:

  • problem formulation and decision making;
  • assumptions on operational settings;
  • overselling tool capabilities and performance;
  • optimisation over context; and
  • intended context vs. actual context.

To address these problems, the draft recommends increased deployment monitoring and auditing, the use of standards and guides for the evaluation of bias, and the application of bias reduction techniques. In addition, the draft outlines that NIST plans to develop a framework for trustworthy and responsible AI with the participation of a broad set of stakeholders, to ensure that standards and practices reflect viewpoints not traditionally included in AI development.

Comments may be submitted until 5 August 2021. 

You can read the press release here and the draft here.