On 6 May 2023, the European Commission released a draft outlining the procedures for conducting audits under the Digital Services Act (DSA). These rules specifically pertain to the 17 Very Large Online Platforms (VLOPs) and 2 Very Large Online Search Engines (VLOSEs) designated by the Commission on 25 April 2023, including Facebook, LinkedIn, Twitter, Bing and Google Search.
Drafted under Article 37 of the DSA, this delegated regulation seeks to enhance the transparency and public accountability of large platforms and search engines through annual independent audits. The draft provides a framework to guide VLOPs and VLOSEs (Audited Providers) and auditors (Auditing Organisations) on the methodologies, procedural steps and reporting templates that must be used for these audits.
In particular, the draft rules set out auditing methodologies for algorithmic systems, with provisions ranging from assessments before deploying new features to disclosure requirements and comprehensive risk assessments. Algorithmic systems in this context are defined to include advertising systems, recommendation engines, content moderation technologies and other features that may use novel technologies such as generative models.
Relationship between platforms and auditors
The Draft clarifies the relationship between Audited Providers and Auditing Organisations, lays down provisions for selecting auditors, and establishes mechanisms for data sharing and cooperation between the two. Given the complex and specialised nature of such audits, the draft permits Audited Providers to contract different Auditing Organisations, or a consortium of auditors, to conduct the audit.
Prior to conducting an audit, the provider is required to provide the following information to the auditor:
- Description of internal controls for each audited obligation, including historical data and benchmark metrics to measure performance
- Preliminary analysis of inherent and control risks
- Access to all data necessary for the performance of the audit, which may include personal data and information on internal processes and testing environments, among others
Final audit report
The Auditing Organisation is required to send a Final Report of the audit to the Audited Provider, a template for which was provided alongside the draft procedures. The provider, in turn, must submit this report to the European Commission and the Digital Services Coordinator of its Member State within one month of receipt. The provider is also required to publish an Audit Implementation Report responding to the Final Report within three months of receiving it.
The first Final Audit Report must be completed within one year from the date of application of the obligations to the Audited Provider.
Auditing Organisations are directed to submit Audit Conclusions in the Final Report, which shall be one of the following:
- Positive, where the auditor has concluded that the provider has complied with an audited obligation or commitment
- Positive with comments, where the audited obligations have been satisfied, but:
○ The auditor recommends improvements on meeting certain obligations
○ The auditor has relied on the Audit Criteria mentioned in Article 10(2) of the draft
- Negative, where obligations have not been complied with
Additionally, the auditor is required to include Audit Opinions and Recommendations corresponding to whichever of the three Conclusions is reached.
Audit risks analysis
The Final Report shall also include a Risk Analysis conducted by the auditor for the assessment of the Audited Provider’s compliance with each obligation. These should be conducted before and during the audit, and should consider the following:
- Inherent Risks: Risks of non-compliance arising from the nature and use of the audited service, and the context in which it is used
- Control Risks: The risk that misstatements are not prevented or detected by the provider's internal controls, and
- Detection Risks: The risk that misstatements are not detected by the auditor.
The Draft also provides guidance on audit methodologies specifically used for Risk Management, Risk Mitigation, Crisis Response and Independent Audits (covered under Articles 34, 35, 36 and 37 of the DSA, respectively), and stipulates requirements to ensure the quality of audit evidence used to create the Final Report.
The draft is open for public comments until 2 June 2023, which can be submitted here.
We're a part of the solution
Taking proactive steps is the best way to get ahead of the Digital Services Act and other global AI regulations. At Holistic AI, we have pioneered the field of AI ethics, carried out over 1000 risk mitigations and developed comprehensive auditing and assurance frameworks. Using our interdisciplinary approach that combines expertise from computer science, law, policy, ethics, and social science, we take a comprehensive approach to AI governance, risk, and compliance, ensuring that we understand both the technology and the context it is used in.
To find out how we can help you get your algorithms legally compliant, get in touch at firstname.lastname@example.org.
Authored by Siddhant Chatterjee, Public Policy Associate at Holistic AI.