Artificial intelligence, particularly generative AI, has advanced rapidly, with the technology insinuating itself into businesses big and small across the world. But the speed at which it has been adopted, and the scale of its impact, have led to many concerns about its use and misuse. This, in turn, has highlighted the importance of adequate governance for these complex systems.
Accounting professionals have long helped clients navigate the governance challenges of other complex systems, from financial data integrity to cybersecurity protocols. Consequently, they are well positioned to help with AI governance challenges as well, especially through standards such as the recently released ISO/IEC 42001.
The standard defines an AI management system (AIMS) as a set of interrelated or interacting elements of an organization intended to establish policies and objectives, as well as processes to achieve those objectives, in relation to the responsible development, provision or use of AI systems. ISO/IEC 42001 specifies the requirements and provides guidance for establishing, implementing, maintaining and continually improving an AIMS within the context of an organization.
It is distinct from other standards that pertain to AI, such as ISO/IEC 22989, which establishes terminology for AI and describes concepts in the field; ISO/IEC 23053, which establishes an AI and machine learning framework for describing a generic AI system using ML technology; and ISO/IEC 23894, which provides guidance on AI-related risk management for organizations.
ISO/IEC 42001, on the other hand, is a management system standard.
Implementing this standard means putting in place policies and procedures for the sound governance of an organization in relation to AI, using the Plan-Do-Check-Act methodology. Rather than looking at the details of specific AI applications, it aims to provide a practical way of managing AI-related risks and opportunities across an organization.
Organizations must demonstrate the commitment of top management to AI governance through policy, roles, responsibilities and authorities. In practice, management must be actively involved, particularly through an artificial intelligence policy and clearly communicated roles and responsibilities.
Organizations must also outline their AI objectives; determine AI risks, impacts and opportunities; and plan actions to address them. Schellman noted that the requirement to complete an AI impact assessment goes a little further than what other ISO standards demand.
Organizations are advised to:
- Define a process to assess the potential consequences that AI systems can have for individuals, groups, and societies;
- Outline the potential consequences of an AI deployment, intended use, and potential misuse for individuals, groups, and societies;
- Understand the technical and social context in which the AIMS is primarily deployed, taking into account applicable jurisdictions;
- Retain documented information on the AI impact assessment and make it available to internal and external interested parties (as determined by the organization's strategic alignment); and
- Use the results of the AI impact assessment as inputs for the AI risk assessment required by ISO/IEC 42001.
Organizations must also demonstrate that adequate resources have been allocated to support the AIMS, that the people doing work under it are competent and aware of it, and that communication and documented information about it are maintained. That means not only employing adequate personnel, but also deploying the necessary data, tooling, systems and other assets (including human capital) to support the system.
In addition, organizations must establish processes for their artificial intelligence offerings to ensure that AI operational planning and control conform to requirements throughout design, development and production, and that those processes are implemented effectively and efficiently.
There must also be monitoring, measurement, analysis, and evaluation of AIMS processes and performance; internal audits against the AIMS framework and other applicable controls; and a dedicated management review.
Finally, the standard calls for the correction of nonconformities and continual improvement of the AIMS. The compliance journey will involve correcting major or minor nonconformities, which can be raised by the organization itself, by internal auditors, or by an external certification body performing a readiness assessment or initial certification.