The Digital Services Act, adopted by the European Parliament, will impose significant obligations on the web giants. It is expected to take effect in 2024.
It is an unprecedented text in the history of the European Union. The European Digital Services Act (DSA), which will regulate large tech companies operating within the EU, was adopted on Saturday 23 April. After two years of deliberations, this foundational piece of digital legislation has finally been approved by the European Parliament, the Commission and the European Council.
Championed in particular by European Commissioner Thierry Breton, the text calls for closer cooperation between companies and authorities, but also for a fundamental rethink of user safety.
Targeted web giants
The text, the final version of which has not yet been published, concerns at least twenty companies, including social networks, search engines and e-commerce sites.
More specifically, the text targets GAFAM (Google, Amazon, Facebook, Apple, Microsoft), but also Twitter, AliExpress and TikTok. The obligations are proportional to the size of the companies, with the strictest requirements reserved for platforms reaching more than 45 million European users.
The companies involved have until January 1, 2024 to comply with these new regulations. After this date, penalties take effect. They range from a fine of up to 6% of turnover to, in the case of repeated violations, an outright ban from operating in the EU.
Moderation and transparency
The operation of social networks is one of the main focuses of the text. Better moderation, more transparency, and better protected rights: these are the three pillars on which it rests.
From the users' point of view, the changes to moderation will be the most noticeable. This practice, which consists of managing the content posted on a platform, is a frequent target of criticism.
Thus, platforms are called upon to have human resources commensurate with the number of their users, in order to better fight against unwanted and illegal content. They must also offer moderation that is appropriate for each country’s language, allowing each user to report a problem in their native language.
Two examples illustrate this decision. Last December, BFMTV revealed that Twitter employs fewer than 2,000 moderators worldwide for nearly 400 million monthly users. Facebook, for its part, has been criticized for failing to provide effective moderation in a large number of languages, leaving the door open to abuse and spam.
If an account is suspended or content is deleted, the social network must also systematically explain its decision to the user concerned.
The DSA also enshrines algorithmic transparency in law. These algorithms, which underpin the operation of every social network, are regularly criticized. Users must be able to choose how content is served to them: the social network will have to offer a choice between a news feed arranged chronologically and one sorted by its own algorithm.
In recent days, the Wall Street Journal revealed that TikTok's algorithm had often trapped very young users in what were described as "rabbit holes": the concept of filter bubbles, taken to the extreme. The main danger lies in the large volume of harmful content that these algorithms sustain, which can damage users' mental health.
The text also aims to better protect buyers on online marketplaces, for example by requiring sellers to provide a phone number and email address when registering. It should also make it easier to report illegal and dangerous content.
In November 2021, French authorities notably ordered the delisting of the e-commerce platform Wish. This online sales site was the subject of a report by the DGCCRF, France's consumer-protection agency, which noted the abnormally large number of dangerous products available for sale.
Clarity and respect for data
"Dark patterns" are also targeted. This term refers to interfaces specifically designed to mislead Internet users; for example, a platform making the checkboxes for refusing data sharing far less visible than those for accepting it. A counter-intuitive practice regularly denounced by privacy advocates.
Online advertising is also affected by the text. Platforms will no longer be able to use sensitive personal data, such as religion or political opinions, to serve targeted advertising to Internet users. Children and teenagers must no longer be exposed to it at all.
Finally, the platforms will have to undergo independent audits and publish annual reports on the risks their users face.