
The EU’s Digital Services Act – The Need for Independent Third-Party AI Audits

Authored by Ashyana-Jasmine Kachra, Policy Associate at Holistic AI
Published on May 31, 2023

Key Takeaways

  • The Digital Services Act came into effect in November 2022 and will be fully applicable from February 17, 2024.
  • The DSA applies to hosting services, marketplaces, and online platforms that offer services in the EU, regardless of their place of establishment.
  • The DSA is underpinned by a risk governance approach where companies will be expected to conduct ongoing risk assessments, outline their mitigation efforts, and participate in annual independent third-party audits.
  • Non-compliance with the Digital Services Act can attract penalties of up to 6% of the company's annual worldwide turnover.

What is the Digital Services Act (DSA)?

The Digital Services Act (DSA) is an EU law that regulates digital services. It requires companies to assess risks, outline mitigation efforts, and undergo third-party audits for compliance. The DSA is part of the EU's approach to regulating digital technologies, along with the Digital Markets Act (DMA) and EU AI Act.

The Act is a lengthy (300 pages) and horizontal (cross-sector) piece of legislation that sets out a complex body of rules and legal obligations for technology companies. Notably, there is a focus on social media, user-oriented communities, and online services with an advertising-driven business model.

One of the central goals of the Digital Services Act is to put an end to the self-regulation of tech companies and force companies to be more transparent, particularly in the realm of algorithmic accountability and content moderation. To do so, the DSA includes clear responsibilities for the EU and member states to enforce these rules and obligations. The law will be in full effect starting February 17th, 2024.

Who does the Digital Services Act apply to?

The Digital Services Act applies to hosting services, marketplaces, and online platforms that offer services in the EU, regardless of their place of establishment. Therefore, the effect of the Act and the expectation to comply will be felt globally.

There is a specific focus on Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), defined as platforms and search engines with over 45 million average monthly active users in the EU.

The Digital Services Act: key provisions for accountability & compliance

Transparency and algorithmic accountability

  • All online platforms must publicly report how they use automated content moderation tools and the tools’ error rates.
  • All online platforms will be required to disclose the number of removal orders issued by national authorities (enforcers) and of notices of illegal content submitted by trusted flaggers (content moderators) or detected by automated means.

A risk governance approach to AI systems

A risk governance approach underpins the EU Digital Services Act, imposing obligations on regulated services to address systemic issues such as disinformation, hoaxes and manipulation during pandemics, harm to vulnerable groups, and other emerging societal harms. These issues are categorised as online harm/harmful content in the legislation and are governed by Articles 26, 27 and 28.

Referred to as the risk assessments provision, Article 26 stipulates that VLOPs and VLOSEs will need to conduct risk assessments on an annual basis or at the time of deploying a new relevant functionality to identify any systemic risks coming from the design and provision of services.

These assessments must identify risks related to all fundamental rights outlined in the EU Charter of Fundamental Rights, focusing on risks to freedom of expression, electoral processes and civic discourse, the protection of minors, public health and sexual violence. As the technology harms landscape shifts and evolves, risks may evolve too, so keeping risk assessments agile is critical.

As stipulated by Article 27, these risk assessments must be accompanied by mitigation measures that are reasonable, proportionate and effective. Efforts to mitigate the risks associated with harmful content should bear in mind that harmful content is not necessarily illegal and should not be treated in the same way as illegal content. The DSA’s rules only impose measures to remove, or encourage the removal of, illegal content, in full respect of freedom of expression.

Article 28 will require VLOPs to undergo annual independent third-party audits certifying that they comply with Articles 26 and 27 and with the overall reporting requirements. The audits will also verify compliance with Chapter III of the DSA, and the third-party auditor will have to prove independence from the industry for an audit to be considered valid.


Companies should also note that vetted researchers, including academics and civil society organizations, could gain access to relevant data to conduct their own research surrounding risk identification and mitigation.

The enforcement of the Digital Services Act

The DSA will be enforced by national authorities and the European Commission: each member state must designate a competent authority to supervise and enforce the rules within its territory, while the Commission will be the enforcement body for VLOPs and VLOSEs.

What are the penalties for the Digital Services Act?

  • Article 52: Non-compliance can attract penalties of up to 6% of annual worldwide revenue.
  • Article 54: Companies and platforms will also be exposed to civil suits and liability, as individuals, businesses, and other users can seek compensation for any damage or loss from non-compliance/infringement.

What comes next and what to prepare

The Digital Services Act came into effect on November 16th, 2022, giving digital service providers three months to publish their numbers of active users. As per Article 93, the new rules become applicable from February 17, 2024; designated VLOPs and VLOSEs have to comply earlier, four months after their designation.

Digital Services Act Timeline

Taking precautionary steps is the best way to get ahead of the Digital Services Act and other global AI regulations. The Holistic AI Governance platform, combined with a team of AI experts informed by relevant policy, can help you manage the risks of your AI systems and processes. Reach out to us at we@holisticai.com to learn more about how we can help you embrace AI confidently.

DISCLAIMER: This blog article is for informational purposes only. This blog article is not intended to, and does not, provide legal advice or a legal opinion. It is not a do-it-yourself guide to resolving legal issues or handling litigation. This blog article is not a substitute for experienced legal counsel and does not provide legal advice regarding any situation or employer.
