EU: Social media regulation - tackling the sector's growth pains

The use of social media platforms has developed rapidly over the past decade, with many becoming household names globally and revolutionizing the way we communicate and consume information. Such growth, however, brings challenges, including the proliferation of misinformation, disinformation, hate speech, and online harassment, all of which highlight the need for comprehensive regulation of virtual platforms. In this Insight article, João Peixe, from Vasconcelos Arruda Advogados, explores the evolving landscape of EU social media regulation and the multifaceted challenges presented by the sector's rapid growth.

Introduction

In recent years, we have witnessed the rapid growth of social media in society. Whether for leisure or work, in an increasingly digital world social media has become an essential part of many people's daily lives, creating business opportunities that would not otherwise have been possible.

According to statistics published by Eurostat in December 2022:

  • the share of EU households with internet access has risen to 93%, up from 72% in 2011;
  • 68% of individuals in the EU ordered or bought goods or services over the internet for private use in 2022 - up from 54% in 2017; and
  • 90% of EU individuals aged between 16 and 74 years used the internet at least once within the three months prior to the survey date, for a wide range of purposes, such as communication, access to information, and entertainment, including the use of social media.

Need for regulation

For all the positive developments that this increasingly digital world has brought, unbridled growth has also produced growing pains, as we have seen and are seeing in other areas, such as the recent cryptocurrency boom. This unchecked expansion has led to a series of problems in the absence of regulation. In some cases, loopholes resulting from the lack of regulation have been exploited, jeopardizing the safety of social media users and their rights, some of which are fundamental rights enshrined in the EU Charter of Fundamental Rights.

It is undeniable that, despite the widespread use of social media, a dark side exists. Recent years have seen numerous reports of online harassment and bullying, the use of hate speech against individuals, ethnic groups, minorities, and immigrants, the spreading of politically motivated lies, the posting of obscene images, and the dissemination of divisive propaganda. Until recently, the regulation and monitoring of these offensive types of social media posts had been left to the discretion of individual social media companies, without any legally binding instrument. This approach has had significant consequences, as the regulatory landscape was primarily shaped by guidelines and directives issued by administrative authorities in each Member State or by consumer protection associations.

On February 10, 2021, following critical events reported around the world involving disinformation (commonly known as fake news) and hate speech on some social networks, an important debate unfolded among Members of the European Parliament. The enormous power of social networks and their worrying impact on politics and freedom of expression were criticized, and it was stressed that the companies that own social networks must take responsibility for encouraging citizen participation and ensuring freedom of expression. At the same time, the dangers of hate speech, disinformation, fake news, and incitement to violence need to be taken into account, and platforms need to be held accountable as sources of information.

Despite the efforts announced by the major players in the social media field to strengthen content moderation, whether through departments employing dozens or hundreds of people or through the increasingly frequent introduction of tools based on artificial intelligence (AI), the truth is that rules governing content moderation, and the policies and procedures to be implemented by the owners of digital platforms, have until recently been non-existent. This void allowed a system that, on the one hand, disadvantaged those most in need of protection (the users) and, on the other hand, favored those who operate and control the platforms, who, subject only to certain limits imposed indirectly by other legislation, set their own rules.

The recently published Digital Services Act (DSA) responds to a need for regulation that has been felt for several years, establishing a robust framework for supervising intermediary service providers, such as social networks and search engines.

In this Insight article, we will focus on the importance of the DSA in regulating social media as a way of addressing the growth pains suffered by an unregulated, fast-growing sector, although some comments on the GDPR and the Digital Markets Act (DMA) are also warranted.

GDPR

The GDPR was undoubtedly a major milestone in strengthening the protection of individuals as holders of personal data. It empowered each person with greater control over their personal data and standardized this control across the EU, while putting a number of brakes and limits on abusive and non-transparent practices. As some media reports have shown, this has proved to be a nightmare for some companies that own social networks, which, even five years after the GDPR became fully applicable, are still unable to ensure compliance with it.

Unlike the DSA, the focus of the GDPR was never to combat the growing pains of social media and the digital world described above, but rather, as stated in Article 1, to establish rules on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and to defend the fundamental rights and freedoms of natural persons, in particular their right to the protection of personal data.

However, it is worth noting that the GDPR laid the groundwork for some of the options outlined in the DSA. Concerning personal data, the GDPR paved the way for a number of obligations arising from the DSA, in particular on the subjects of profiling, risk assessment, the data minimization principle, and the access to and control over their personal data that providers of intermediary services (such as the owner of a social network) must grant to platform users.

DMA

On October 12, 2022, the DMA was published in the Official Journal of the European Union. With a focus on digital platforms, the European Commission seeks to regulate the activities of companies in the digital sector (the so-called gatekeepers). Although the DMA formally entered into force on November 1, 2022, it did not become applicable until May 2, 2023.

The DMA aims to prevent gatekeepers from imposing unfair conditions on businesses and end users and to ensure the openness of essential digital services. It is therefore safe to say that the core objective of the DMA is to make the internet more competitive, by establishing significant regulations in the field of competition law and by ensuring the safety of users.

While the DMA is, and will continue to be, an important legal instrument for regulating what happens in the digital world and for giving users more choice, unlike the DSA it does not require online platforms to be more transparent about how they collect data, nor to set out how they deal with illegal content and disinformation, the latter being the key issue analyzed in this article.

DSA

On October 27, 2022, the DSA was published in the Official Journal of the European Union. The DSA will apply in all Member States, without the need for local implementing legislation, from February 17, 2024, although some provisions have already applied since November 16, 2022.

The DSA follows the principle that what is illegal offline should also be illegal online. Its main objective is to update the liability regime for digital service providers, previously set out in Articles 12 to 15 of the Electronic Commerce Directive (2000/31/EC), which the DSA deletes. This liability regime for intermediary services for the transmission and hosting of illegal content, now more than 20 years old, needed to be updated in light of all the controversies surrounding the distribution of illegal content, including copyright infringements, support for terrorism, and child pornography. Attempts to moderate this content (involving errors in signaling and blocking algorithms, and false positives and negatives) further necessitated a revamp. These challenges resulted in restrictions on the fundamental rights of access to information and freedom of expression, leading to the so-called 'chilling effects' that have become apparent with the ubiquity of, and dependence on, these services in today's society.

While there is much to discuss regarding the DSA and what it aims to combat or regulate, this article's nature limits us to discussing the initiatives aimed at combating hate speech and disinformation, as well as strengthening the protection of vulnerable users of social networks, particularly minors, who, in some cases, constitute the largest user group on these platforms. A separate, dedicated article could explore the topic of combating dark patterns and targeted advertising.

VLOP and VLOSE as main actors

It should be noted that the rules proposed by the DSA, following a risk-based approach, are designed asymmetrically: larger intermediary services with significant societal impact are subject to stricter rules.

The DSA singles out Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) for some of the most stringent requirements. VLOPs and VLOSEs are defined in the DSA as platforms and search engines with 45 million or more average monthly active users in the EU (roughly 10% of the EU population) that have been designated as such by the European Commission.

On April 25, 2023, the European Commission designated 17 VLOPs and two VLOSEs, eight of those VLOPs being social networks.

Disinformation and hate speech - content moderation

Among the key obligations of companies subject to the DSA is more thorough content moderation, with the goal of reducing the spread of disinformation:

  • VLOPs and VLOSEs must take measures to combat the dissemination of illegal content online and, more generally, to address the negative impact of online content on freedom of expression and information.
  • Platforms must establish mitigation measures to reduce the spread of disinformation and the use of fake accounts that spread disinformation and hate speech and promote violence.

The DSA also requires providers of intermediary services to publish reports on their content moderation activities at least annually, in a machine-readable format and an easily accessible manner. These reports should be clear and readily understandable and cover any content moderation undertaken during the relevant period.

Vulnerable users - minors and their protection

Children are assumed to be less sophisticated and more vulnerable to advertising and predatory activity. As such, the DSA establishes certain rules that any intermediary service, such as a social network, must comply with. These include the requirement for transparent terms and conditions that are easy to understand, particularly when the service is primarily directed at minors or predominantly used by them.

The DSA requires platforms to redesign their systems to ensure a high level of privacy, security, and safety for minors. Platforms are also obligated to conduct a risk assessment of their sites to identify potential negative effects on the mental health of children. Following this assessment, platforms must adapt their services to mitigate these risks.

Enforceability

While the European Commission will exercise direct regulatory control over VLOPs and VLOSEs, the DSA creates a multi-national structure for the regulation of online activity, with each Member State appointing a Digital Services Coordinator with supervisory control over smaller platforms and search engines.

Non-compliance

Following the approach already established in the GDPR, the DSA uses potentially large fines to dissuade a potential offender from treating a fine as an acceptable cost, weighed (e.g., in a risk assessment) against the economic benefits of the offense.

For example, under the DSA, VLOPs and VLOSEs may have to pay fines of up to 6% of their total worldwide annual turnover in the preceding financial year for certain violations of the DSA.

Conclusion

Much more can and will be written on this subject, but now that the framework and the main European initiatives are in place, these initiatives deserve applause, even if, in my opinion, they have come somewhat late. The digital revolution, even if not on the scale we are experiencing today, began a long time ago.

Once again, the EU, as it has done with the GDPR (which has since inspired so many other countries, such as Brazil), stands as a pioneer in this field. It is to be hoped that giants such as the US, where many of the big tech companies affected by these issues are based, will be at the forefront in the near future, adopting measures to make the digital world increasingly safe for users, respecting their fundamental rights, especially freedom of expression, and avoiding the shameful disinformation and hate speech campaigns we have seen in the past.

Nevertheless, as in 2018 (again with the GDPR), it seems too early to assess the real impact these measures will have. For now, we welcome the initiative, the awareness it has raised, and the obvious concern that the issue has already generated among the public and also among VLOPs and VLOSEs (which are at the center of attention more than any other services). However, let's hope that it goes much further than that.

Next year will be crucial in determining whether the path that has been mapped out is indeed the one that was intended. It is expected that both the European Commission and each of the Digital Services Coordinators will take a proactive stance going beyond the DSA, publishing specific guidelines and directives aimed at informing platform users on the one hand and enabling platforms to implement all the mechanisms necessary to make the DSA a success on the other.

Only time will reveal the true outcome!

João Peixe, Senior Associate Lawyer
[email protected]
Vasconcelos Arruda Advogados, Portugal