The Digital Services Act (DSA) brings EU values into the digital world

The Digital Services Act (DSA) brings EU values into the digital world and aims to protect our children, societies, and democracies by imposing stricter accountability and transparency rules on very large online platforms such as Facebook, X (formerly known as Twitter), and Instagram.

On Friday, 25 August 2023, the DSA began to apply to very large online platforms and search engines (VLOPs) visited by more than 45 million Europeans every month, placing stringent obligations on them. While the DSA's primary focus is the EU, its implications extend beyond European borders, including for Australia. The DSA represents a significant milestone in the regulation of digital services worldwide.

What does the DSA regulate and what does it mean for consumers and affected platforms?

Removing Illegal Content

Under the DSA, online platforms like Facebook and TikTok must swiftly remove content that is illegal under EU or member state law once notified by authorities or individuals. To enhance user involvement, these platforms must establish user-friendly mechanisms for reporting potentially illegal content. Users who frequently post manifestly illegal content can be suspended, but only after a prior warning. Online marketplaces, including Amazon and AliExpress, must make diligent efforts to trace and vet their traders, curbing the sale of illegal products.

Managing Harmful Content

VLOPs are obliged to provide detailed annual reports to the European Commission outlining systemic risks on their services, such as illegal content, disinformation, and cyberbullying, and the impact of those risks on fundamental rights and mental health. Companies must then take measures to mitigate the risks they identify, such as adjusting algorithms and labelling content. Independent auditing firms will evaluate these efforts to verify compliance.

Empowering Users

The DSA emphasizes transparent and understandable terms and conditions. Platforms must inform users when their content is removed or restricted and explain why. Users have the right to challenge these decisions through various channels, including out-of-court dispute settlement bodies and the courts. Tech companies must also disclose the main parameters of their recommender algorithms and offer at least one content recommendation option that is not based on profiling of users' personal data.

Restrictions on Targeted Ads

The DSA prohibits targeting online ads based on sensitive personal data, including religious beliefs, sexual orientation, and political affiliation. Showing targeted ads to children and teenagers based on their personal data is also banned. Additionally, manipulative interface designs known as "dark patterns" are outlawed.

Transparency and Disclosure

Platforms must disclose information about their content moderation teams, including their size, expertise, and the languages they work in. They are required to reveal where they use artificial intelligence to remove content, along with its error rate. Public reports assessing risks to society, such as threats to freedom of speech, public health, and elections, must also be made available.

The DSA's impact extends to countries like Australia, particularly for businesses and organizations that operate within the EU and meet the criteria of a VLOP. The DSA applies across the EU single market, without discrimination, including to online intermediaries established outside the European Union that offer their services in the single market. Providers not established in the EU must appoint a legal representative there, as many companies already do to meet obligations under other EU legal instruments.

Additionally, Australian consumers using VLOPs falling under the DSA's jurisdiction might expect enhanced content control, improved transparency, and greater protection against harmful digital practices.

For more information on the DSA, have a look at the European Commission website.