
ISO/IEC NP TS 17847 - Information technology - Artificial intelligence - Verification and validation analysis of AI systems

Scope

This document describes approaches and provides guidance on processes for the verification and validation analysis of AI systems (comprising AI system components and the interaction of non-AI components with the AI system components) including formal methods, simulation and evaluation. 

This document is applicable for AI systems verification and validation in the context of the AI system life cycle stages described in ISO/IEC 22989.

This document is applicable to all types of organizations engaged in the development, deployment and use of AI systems.

Purpose

Traditional verification and validation approaches used for conventional software systems cannot be applied directly to AI systems. AI systems raise new challenges because they model decision-making processes over large numbers of features and large volumes of data, which makes them hard to check with conventional verification and validation methods.

The verification and validation of an AI system is an important stage, as described in ISO/IEC FDIS 22989, Figure 4, "Example AI system life cycle model with AI system-specific processes", and is also described in ISO/IEC CD 5338.

Conventional software verification and validation methods or practices are described in ISO/IEC/IEEE 29119-1:2022, Figure 1, "Verification and validation methods or practices", also reproduced below to show the focus of this document on verification and validation analysis.

ISO/IEC AWI TS 29119-11, Information technology — Artificial intelligence — Testing for AI systems — Part 11, is intended to address the static testing aspects of AI systems.

There is a need to expand upon AI system verification and validation analysis approaches such as formal methods, simulation and evaluation. AI systems should be verified and validated against their system specifications and requirements specifications to determine whether an AI system meets the expectations of its AI stakeholders. This verification and validation analysis can be done at all stages of the AI system life cycle, especially during the inception, design and development, deployment, operation and monitoring, and re-evaluation stages.

An AI system needs to address requirements and specifications dealing with dynamic and not fully testable situations, rare boundary conditions, exceptional inputs, and changing priorities and objectives. Approaches such as formal methods, simulation and evaluation of the AI system against its requirements and system specification are therefore important. The complexity of verification and validation analysis approaches also increases with the level of automation of the AI system, and specific approaches are needed at each level.

An AI system can be composed of both AI components and non-AI components. Further, there is significant diversity in the AI solution landscape, which necessitates careful inspection of every component of an AI system to devise the right verification and validation analysis strategy. This document decomposes the AI system into its constituent AI components and focuses on a comprehensive set of verification and validation analysis approaches that can be used in addition to testing. Since an AI system can be refined after initial implementation due to dynamic changes such as data drift, concept drift, poor generalization and continuous learning, this document describes relevant steps to address these dynamic changes during verification and validation analysis.
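To make the notion of detecting such dynamic changes concrete, the following is a minimal sketch of a data-drift check that compares a reference (training) sample against a live sample of a single feature. The threshold of 3.0 reference standard errors and all data values are hypothetical choices for illustration, not a prescribed method.

```python
import statistics

def mean_drift(reference, live, threshold=3.0):
    """Flag drift when the live-sample mean deviates from the reference
    mean by more than `threshold` standard errors of the reference."""
    ref_mean = statistics.fmean(reference)
    ref_sd = statistics.stdev(reference)
    std_err = ref_sd / (len(live) ** 0.5)
    z = abs(statistics.fmean(live) - ref_mean) / std_err
    return z > threshold

# Hypothetical feature values observed during training and in operation.
reference = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.3, 9.7]
stable = [10.1, 9.9, 10.0, 10.2]
shifted = [12.0, 11.8, 12.1, 11.9]

print(mean_drift(reference, stable))   # False: no drift flagged
print(mean_drift(reference, shifted))  # True: drift flagged
```

In practice a production monitor would use a more robust distributional test over many features, but the structure is the same: compare live data against a reference established at verification time, and trigger re-evaluation when the comparison exceeds a bound.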

This document refers to, reuses and expands on the AI system verification and validation aspects described in ISO/IEC FDIS 22989 and ISO/IEC CD 5338. ISO/IEC/IEEE 29119-1:2022, Figure 1, provides an overview of the verification and validation hierarchy; this document focuses on verification and validation analysis improvements for AI systems.

The following approaches to verification and validation analysis of AI systems are covered in the proposed document:

Formal methods

Formal methods provide approaches for establishing properties that an AI system should satisfy to meet its required functionality, and for proving or disproving whether the system can function as desired, without extensive testing or real-time use of the system. This is required when all possible scenarios that an AI system can encounter cannot be envisaged and tested upfront. Using formal methods, provable guarantees can be established for important components of the AI system. For example, formal methods such as satisfiability modulo theories, abstract interpretation and optimization-based approaches are important for determining whether fully autonomous systems and safety-critical systems can function as desired in real-world dynamic scenarios.
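As a small illustration of the abstract-interpretation style of guarantee, the sketch below propagates input intervals through a tiny two-layer ReLU network, yielding a sound bound on every possible output without enumerating inputs. The network weights are hypothetical; real tools (and real networks) are far larger, but the principle is the same.

```python
def interval_affine(lo, hi, weights, biases):
    """Propagate input intervals through an affine layer y = Wx + b.
    Positive weights pair lows with lows; negative weights swap them."""
    out_lo, out_hi = [], []
    for w_row, b in zip(weights, biases):
        low = b + sum(w * (lo[i] if w >= 0 else hi[i]) for i, w in enumerate(w_row))
        high = b + sum(w * (hi[i] if w >= 0 else lo[i]) for i, w in enumerate(w_row))
        out_lo.append(low)
        out_hi.append(high)
    return out_lo, out_hi

def interval_relu(lo, hi):
    """ReLU is monotone, so it can be applied to each bound directly."""
    return [max(0.0, v) for v in lo], [max(0.0, v) for v in hi]

# Hypothetical two-layer network.
W1, b1 = [[1.0, -1.0], [0.5, 0.5]], [0.0, -0.25]
W2, b2 = [[1.0, 1.0]], [0.0]

# Inputs constrained to the box [0, 1] x [0, 1].
lo, hi = [0.0, 0.0], [1.0, 1.0]
lo, hi = interval_affine(lo, hi, W1, b1)
lo, hi = interval_relu(lo, hi)
lo, hi = interval_affine(lo, hi, W2, b2)

# Sound (possibly loose) guarantee over ALL inputs in the box.
print(lo[0], hi[0])  # 0.0 1.75
```

The resulting interval is an over-approximation: if a safety requirement states the output must stay below 2.0, this bound proves it for every input in the box, with no test cases at all.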

Simulation

AI systems need to work in complex socio-technical environments exhibiting dynamic behaviour, with many heterogeneous components such as humans and other systems. The systemic simulation approach focuses on simulating the input, output and dynamic nature of large-scale socio-technical environments to determine the impact on, and the impact of, the AI system in such heterogeneous environments.
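A minimal sketch of this idea: a toy AI speed controller is exercised against many randomized episodes of a simulated human-driven lead vehicle, and the fraction of episodes satisfying a safety property is estimated. The controller policy, vehicle dynamics and all parameters are hypothetical and chosen only to show the simulation loop.

```python
import random

def controller(gap):
    """Toy AI policy under test: brake hard when the gap is small."""
    return -4.0 if gap < 10.0 else 0.5

def simulate_episode(rng, steps=100):
    """One episode against a randomized simulated human driver.
    Returns True if the safety property (no collision) holds."""
    gap, speed = 30.0, 10.0
    for _ in range(steps):
        lead_speed = 10.0 + rng.gauss(0.0, 2.0)  # heterogeneous human behaviour
        speed = max(0.0, speed + controller(gap))
        gap += lead_speed - speed
        if gap <= 0.0:
            return False  # collision: property violated
    return True

rng = random.Random(42)  # fixed seed for reproducible analysis
episodes = 1000
safe = sum(simulate_episode(rng) for _ in range(episodes))
print(f"estimated safe-episode rate: {safe / episodes:.3f}")
```

Scaling this loop up to many interacting agents and environment models is what allows rare boundary conditions and exceptional inputs to be explored before deployment.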

Evaluation

The evaluation approach for AI systems considers the intended use of the AI system through user objective evaluation, interpretability evaluation or transparency evaluation. It focuses on how an AI system is perceived in, and influences, the environment in which it operates.

This document will describe the above three approaches, covering verification and validation analysis of functional and non-functional requirements across different levels of automation.

This document will thus help AI stakeholders use verification and validation analysis approaches to determine whether an AI system can perform as per its specifications.
