GDPR, DMA, DSA: how does the European Union want to frame the digital world?
What is personal data?
The French data protection authority, the Commission nationale de l'informatique et des libertés (CNIL), defines "personal data" as "any information relating to an identified or identifiable individual". This definition is taken up at the European level in the General Data Protection Regulation.
Personal data is a crucial issue of our era and constitutes a new black gold, as we noted in a previous article dedicated to big data.
We all disseminate it daily with every online interaction, whether registering on a website or buying on a marketplace, which leaves it particularly vulnerable and, until now, with no transparency about how third parties use it.
This situation, in which users lose control of their information, prompted the European authorities to act by adopting the GDPR after four years of negotiation between the various political groups, against a backdrop of intense lobbying by stakeholders.
What is the General Data Protection Regulation?
The General Data Protection Regulation (GDPR) is a European regulation that has applied since 25 May 2018. It replaces the outdated EU Data Protection Directive of 1995. The Regulation applies to all businesses that process or intend to process the data of individuals within the European Union, regardless of whether the processing takes place in the EU.
The GDPR sets out rules for collecting, using, and protecting personal data. It also gives individuals new rights over their data.
In concrete terms, what are the obligations set out in the GDPR?
Firstly, companies must obtain individuals' explicit consent before collecting, using, or sharing their data, giving rise to the little consent boxes that many people accept without paying much attention to them.
More demanding still, the Regulation imposes a series of compliance obligations on companies. This starts with providing individuals with clear and concise information about their rights under the Regulation. Companies must also, in many cases, appoint a Data Protection Officer (DPO) and keep a register of processing activities that documents and summarizes how personal data is used.
Finally, companies must take appropriate measures to protect personal data against unauthorized access, use, or disclosure, and report any data breach to the competent authority within 72 hours.
The Digital Markets Act, the first part of an ambitious plan
The Digital Markets Act (DMA) is the first of two texts proposed by Competition Commissioner Margrethe Vestager and her Internal Market counterpart Thierry Breton in December 2020. The aim is to establish a digital single market that defines the responsibilities of the players and protects users in the European Union.
The European Parliament's website describes these objectives as follows: "Both texts aim to address technology companies' societal and economic effects by setting clear standards for how they operate and provide services in the EU, in line with fundamental rights and European values."
Specifically, the Digital Markets Act (DMA) can be seen as an antitrust law that aims to prevent abuses of dominance, including by the GAFAMs, by establishing a level playing field for companies operating in the EU's digital single market.
According to the European Commission, 90% of the 10,000 platforms operating in the EU are small and medium-sized enterprises, yet most of the value is captured by the largest players; the Commission notes that the turnover of the GAFAMs is equivalent to France's tax revenues.
As we can see, the DMA targets platforms that operate as "gatekeepers", i.e. as intermediaries such as Amazon or Airbnb, in order to rebalance the competitive playing field and combat this oligopolistic situation. More specifically, all companies that meet the following cumulative criteria will be presumed to be gatekeepers (a simplified check is sketched after this list):
- Provide one or more core platform services in at least three European countries
- Have an annual turnover in Europe of more than €7.5 billion in each of the last three years, or a market capitalization of more than €75 billion
- Register more than 45 million European users per month and more than 10,000 business users per year over the last three years.
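To make these thresholds concrete, here is a minimal sketch in Python of how the quantitative presumption could be checked. The `Company` class, its field names, and the `presumed_gatekeeper` function are purely illustrative assumptions, not part of the Regulation; the actual legal test also involves qualitative assessments and rebuttal procedures.

```python
from dataclasses import dataclass


@dataclass
class Company:
    # Number of European countries where a core platform service is offered
    countries_with_core_platform_service: int
    # Annual European turnover for each of the last three years, in euros
    eu_turnover_last_three_years_eur: list
    # Market capitalization, in euros
    market_cap_eur: float
    # Monthly European users, one figure per year for the last three years
    monthly_eu_users_last_three_years: list
    # Yearly business users, one figure per year for the last three years
    yearly_business_users_last_three_years: list


def presumed_gatekeeper(c: Company) -> bool:
    """Check the three cumulative criteria listed above."""
    # 1. Core platform service in at least three European countries
    footprint = c.countries_with_core_platform_service >= 3

    # 2. European turnover above EUR 7.5 bn in each of the last three years,
    #    or a market capitalization above EUR 75 bn
    size = (all(t > 7.5e9 for t in c.eu_turnover_last_three_years_eur)
            or c.market_cap_eur > 75e9)

    # 3. More than 45 million monthly European users and more than 10,000
    #    business users per year over the last three years
    reach = (all(u > 45_000_000 for u in c.monthly_eu_users_last_three_years)
             and all(b > 10_000 for b in c.yearly_business_users_last_three_years))

    return footprint and size and reach


# Example: a hypothetical platform present in 5 countries, with around EUR 9 bn
# of European turnover per year, 60 million monthly users and 20,000 business users
print(presumed_gatekeeper(Company(5, [9e9, 9.5e9, 10e9], 50e9,
                                  [6.0e7, 6.2e7, 6.5e7], [20_000, 21_000, 22_000])))
# -> True
```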
In all, ten "core platform services" areas are listed, including marketplaces, browsers, search engines, social networks, cloud services, and virtual assistants.
In other words, the GAFAMs will no longer be able to force users to use certain software in order to access their services, or require developers to bundle certain services. Likewise, a platform will no longer be able to favor its own products or use data collected from sellers to compete with them.
DSA, a tool to make GAFAMs accountable?
The DSA is aimed at the same targets as the DMA and covers largely the same activities, namely the provision of internet access, marketplaces, and above all service platforms.
However, companies with fewer than 45 million users will be exempt from many of the obligations set out in the DSA.
Firstly, digital companies will have to appoint a legal representative located in one of the 27 EU countries.
From now on, platforms will have to be transparent, starting with content moderation. They will have to establish a complaints system that allows users whose accounts are suspended to challenge the decision.
Above all, the opacity surrounding advertising algorithms will have to be dispelled. Platforms will be obliged to explain how they work and to offer an option that is not based on profiling. It should also be noted that targeted advertising aimed at minors will be prohibited.
Finally, large platforms such as Meta (Facebook, Instagram), Amazon, or Google will be obliged to carry out an audit at their own expense to "analyze the systemic risks they generate and to put in place a risk reduction analysis". This will result in a report identifying and mitigating risks relating to:
- Dissemination of illegal content
- Adverse effects on fundamental rights
- Manipulation of their services with an impact on democratic processes and public security
- Gender-based violence, harm to minors, and serious consequences for the physical or mental health of users
Through this text, the public authorities want to combat the spread of false information. However, verifying such information, known as fact-checking, requires rare skills, and the processing capacity of a verification team can be very limited when it relies on conventional methods.
Faced with the constraint of complying with this legislation, what solutions will the platforms deploy? Will the European institutions take a proactive stance on the issue?
Some players, following the example of Buster.Ai, have taken up the subject and are banking on artificial intelligence to multiply the capacity to process information and improve the accuracy of verification verdicts.
New EU Laws for Digital Competition and User Data Protection
Having regulated the collection and use of personal data through the GDPR, the European Union will now apply two new texts: the DMA, which should establish a healthier competitive environment in which users are no longer forced to rely on certain platforms for basic services, and the DSA, which regulates the risks linked to the use of those platforms.
Compliance with the new legal obligations is a challenge for the GAFAMs, as the efforts made will have to be reported to the authorities. The European Commission also provides for heavy fines in case of non-compliance: up to 6% of the offending group's worldwide turnover under the DSA and 10% under the DMA, rising to 20%, with possible exclusion from the market, for repeated infringements.
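For a rough sense of scale, the short sketch below, an illustration rather than legal guidance, applies the caps cited above to a hypothetical worldwide turnover; the function name and figures are assumptions introduced for the example.

```python
# Illustrative only: fine ceilings expressed as percentages of worldwide
# turnover, using the caps cited above (6% DSA, 10% DMA, 20% for repeat
# infringements under the DMA).

def fine_ceilings(worldwide_turnover_eur: float) -> dict:
    return {
        "DSA (up to 6%)": 0.06 * worldwide_turnover_eur,
        "DMA (up to 10%)": 0.10 * worldwide_turnover_eur,
        "DMA, repeat infringements (up to 20%)": 0.20 * worldwide_turnover_eur,
    }


# Example: a hypothetical group with EUR 100 billion in worldwide turnover
# would face ceilings of EUR 6 bn, EUR 10 bn, and EUR 20 bn respectively.
print(fine_ceilings(100e9))
```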
In the spirit of antitrust law, a company that commits three violations within eight years could face a market investigation and be obliged to divest one or more of its activities.
With 327,300 French people having fallen victim to personal data theft since the beginning of 2022, will European regulation be enough to guarantee both free competition between digital players and the security of users?