Seeking to establish global leadership in governing artificial intelligence, the EU AI Act lays down a risk-based regulatory framework in which AI systems are classified into four tiers: minimal risk, limited risk, high risk, or unacceptable risk, with obligations proportional to the level of risk posed. Systems posing unacceptable risk are prohibited outright; the legislation places stringent obligations on High-Risk AI Systems (HRAIS), transparency requirements on systems with limited risk, and no obligations on systems with minimal risk.
A system is considered a HRAIS if it is covered under Annex III of the EU AI Act and poses a significant risk of harm to an individual's health, safety, or fundamental rights. This second condition was added in the Act's latest compromise text, which was passed by the European Parliament on 14 June 2023.
A crucial requirement for HRAIS under the EU AI Act is that their providers conduct ex-ante conformity assessments. This blog provides an overview of these assessments, a new regulatory tool in the realm of AI governance.
Conformity assessments are a primary legal requirement under Article 43 of the EU AI Act, envisaged to create accountability in the development and deployment of HRAIS. The Act defines conformity assessments as a means of 'demonstrating whether requirements set out in Title III, Chapter 2' of the legislation relating to an AI system have been fulfilled. A snapshot of these requirements is provided in the diagram below:
A conformity assessment must be performed before a HRAIS is placed on the EU market or put into service there for the first time. Importantly, a new conformity assessment (CA) must be conducted in the event of significant modifications to a HRAIS, namely changes that affect the system's compliance with the requirements for high-risk AI systems or that alter the system's intended purpose.
Conformity assessments are typically undertaken by the providers of HRAIS. In certain instances, however, the responsibility for conducting a conformity assessment falls on other actors in the value chain, such as the manufacturer, distributor, or importer of the system.
The EU AI Act offers two options for conducting conformity assessments: either internally or through a notified third-party entity. The choice between these routes depends on the presence and usage of harmonized standards mentioned in Article 40 of the legislation.
These are conducted by the provider (or the manufacturer, distributor, or importer, as the case may be) of the HRAIS. While the EU AI Act aims to develop a regime of third-party assessors in the long term, it notes the following in Recital 64:
Given the current experience of professional premarket certifiers in the field of product safety and the different nature of risks involved, it is appropriate to limit, at least in an initial phase of application of this Regulation, the scope of application of third-party conformity assessment for high-risk AI systems other than those related to products. Therefore, the conformity assessment of such systems should be carried out as a general rule by the provider under its own responsibility.
Entities responsible for internal conformity assessments must verify their systems against the requirements set out in Title III, Chapter 2 of the Act.
Upon completion of the above, the entity is required to draft a written EU Declaration of Conformity for each relevant system, which must be retained for ten years after the system has been placed on the market or put into service. Additionally, the entity must affix a physical or digital Conformité Européenne (CE) mark to the product, indicating that it can move freely within the European internal market (Recital 67).
Although internal conformity assessments are generally favoured under the EU AI Act, there is an exception for HRAIS designed for remote biometric identification or for making inferences about personal characteristics from biometric data (including emotion recognition systems). In these cases, a third party must be involved in the conformity assessment process. Regardless, all developers of AI systems have the option to request a third-party assessment if they consider it necessary, irrespective of the system's level of risk. To conduct third-party conformity assessments, notified bodies must be designated as such by national notifying authorities upon satisfying the requirements provided in Article 33 of the EU AI Act.
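The routing logic described above can be sketched in code. This is an illustrative simplification only, assuming the rules as summarised in this blog; the class, field, and function names are invented for the example and are not terms from the Act:

```python
from dataclasses import dataclass

# Hypothetical profile of a high-risk AI system; fields are assumptions
# made for illustration, not categories defined by the EU AI Act.
@dataclass
class HRAISProfile:
    uses_harmonised_standards: bool   # Article 40 harmonised standards applied
    is_remote_biometric_id: bool      # remote biometric identification system
    infers_from_biometric_data: bool  # e.g. emotion recognition

def conformity_route(profile: HRAISProfile) -> str:
    """Return which Article 43 assessment route applies (simplified)."""
    # Biometric-related HRAIS must involve a notified third party.
    if profile.is_remote_biometric_id or profile.infers_from_biometric_data:
        return "third-party (notified body)"
    # Otherwise internal control is the general rule, with harmonised
    # standards under Article 40 supporting the internal route.
    if profile.uses_harmonised_standards:
        return "internal"
    # Providers may always opt for a third-party assessment voluntarily.
    return "internal (third-party optional)"
```

For instance, an emotion recognition system would be routed to a notified body even if harmonised standards were applied in full.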
After developers and providers of HRAIS select a notified body, the assessing entity must assess the conformity of the AI system's technical documentation and quality management system under the procedure outlined in Annex VII. Upon successful completion of the conformity assessment, the notified body is required to issue an EU Technical Documentation Certificate (Article 44). This certificate is valid for four years and may subsequently be renewed, on the basis of re-assessments, for a further maximum period of four years. However, if the notified body finds that the AI system no longer satisfies the requirements for HRAIS, it must suspend or withdraw the certificate unless the provider takes corrective action to restore compliance within an appropriate timeline established by the notified body. Following this, the provider has to draft the EU Declaration of Conformity and affix a physical or digital CE mark.
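The certificate's validity window lends itself to a small worked example. The sketch below is illustrative only: it models the four-year validity and one-time four-year renewal as described above, and the function and constant names are assumptions made for this example, not terms from the Act:

```python
from datetime import date

# Article 44 validity window of an EU Technical Documentation Certificate,
# as summarised in this blog: four years, renewable for a further maximum
# of four years upon re-assessment.
CERT_VALIDITY_YEARS = 4

def is_certificate_valid(issued_on: date, today: date, renewed: bool = False) -> bool:
    """Check whether a certificate issued on `issued_on` is still valid."""
    horizon_years = CERT_VALIDITY_YEARS * (2 if renewed else 1)
    try:
        expiry = issued_on.replace(year=issued_on.year + horizon_years)
    except ValueError:  # certificate issued on 29 February of a leap year
        expiry = issued_on.replace(year=issued_on.year + horizon_years, day=28)
    return today < expiry
```

So a certificate issued on 1 January 2025 lapses at the start of 2029 unless renewed, in which case it runs until the start of 2033 at the latest.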
Should the notified entity discover that the HRAIS does not meet its conformity requirements, it must offer a detailed explanation of the non-compliance to the provider. The provider then must undertake relevant corrective measures to ensure compliance – failing which, it must withdraw the system from the market. Interestingly, the EU AI Act provides a remediation mechanism in this instance, where, under Article 45, the provider is empowered to appeal against the notified body’s determination.
Further, the finalised compromise text from the European Parliament introduces a new provision to protect the interests of Small and Medium-sized Enterprises (SMEs), mandating that fees for third-party assessments be set in proportion to an SME's size and market share.
In accordance with Article 73 of the EU AI Act, the Commission is empowered to adopt delegated acts to amend the conformity assessment provisions in light of technical progress, as well as to update Annexes VI and VII. In doing so, the Commission will be required to consult the proposed AI Office and the affected stakeholders.
In the EU and beyond, AI regulation is gathering pace. New pieces of legislation often focus on fairness and harm mitigation – and at Holistic AI, those are our lifeblood. We fuse technical insight with policy acumen, ensuring we understand both the technology and the context in which it is used. Schedule a call with our expert team to learn how we can help you with your unique needs and industry requirements, giving you a pathway towards compliance.
Authored by Siddhant Chatterjee, Public Policy Associate at Holistic AI.
DISCLAIMER: This blog article is for informational purposes only. This blog article is not intended to, and does not, provide legal advice or a legal opinion. It is not a do-it-yourself guide to resolving legal issues or handling litigation. This blog article is not a substitute for experienced legal counsel and does not provide legal advice regarding any situation or employer.