EU: New horizons - the interplay of the GDPR with the Data Act and the Digital Services Act

The construction of a fair, thriving, and progressive data ecosystem in the European Union (EU) is a major goal of the European Commission (EC). Since the General Data Protection Regulation (GDPR), the backbone of the EU's rules on the processing of personal data, became applicable in 2018, the EC has outlined subsequent regulatory initiatives to address the challenges and unleash the opportunities presented by the data economy in the EU. The success of the EU's Digital Decade initiative hinges on various factors, such as the responsiveness of organizations, the availability of clear guidance, and the establishment of legal certainty and coherence among regulators. One thing is clear: the EU is exploring new horizons in digital rulemaking.

In this Insight article, Heidi Waem, Muhammed Demircan, and Simon Verschaeve, from DLA Piper UK LLP, outline the main points of interplay between the GDPR, Chapter II of the Data Act, and the Digital Services Act (DSA).


One significant example of the EU's ambitious goals is the Data Act, which entered into force on January 11, 2024, with most of its provisions becoming applicable as of September 12, 2025. This wide-ranging, sector-neutral legislation regulates several aspects of the data economy and will be relevant for many companies due to its horizontal nature. Notably, Chapter II of the Data Act lays down rules concerning access to and use of the data generated, collected, or obtained by connected products and related services in both B2B and B2C contexts. Chapter II empowers users of connected products and related services by granting them new access and data portability rights, and aims to shift control over data generated, collected, or obtained within the Internet of Things (IoT) industry towards those users.

Another key regulation is the DSA, which entered into force on November 16, 2022, with most provisions starting to apply as of February 17, 2024. The DSA's rules apply to categories of online intermediaries according to their role, size, and impact on the online ecosystem. Online intermediaries such as online marketplaces, app stores, collaborative economy platforms, search engines, and social media platforms will have to comply with a range of obligations to ensure transparency, accountability, and responsibility for their actions. While the DSA can be classified as a platform regulation, it also contains provisions directly affecting the processing of personal data.

Both the Data Act and the DSA explicitly state that the rules on the processing of personal data, i.e., the GDPR, are unaffected by these regulations. This requires the organizations concerned to carefully consider the GDPR implications of this new set of rules.

The Data Act and the GDPR

Chapter II of the Data Act introduces various design and data-sharing obligations for manufacturers, sellers, renters, and lessors of connected products and providers of related services. The key obligations are as follows:

  • Article 3.1 of the Data Act requires manufacturers of connected products and providers of related services placed on the market in the EU to ensure 'access by design' for sharing product data and/or related service data with the user. This means manufacturers and providers may need to modify the production cycle of such connected products and related services to ensure that data is shared with users by default. In practical terms, IoT products and related services, such as smartwatches and their supporting mobile applications, will need to automatically provide the required data to users, potentially through a real-time online portal accessible via the relevant mobile application (a minimal illustrative sketch of such an access layer follows this list). This obligation will apply to connected products and related services placed on the EU market after September 12, 2025.
  • Article 4.1 of the Data Act provides that where the user cannot directly access data from the connected product or related service, the data holder must make readily available data, as well as the relevant metadata necessary to interpret and use that data, accessible to the user.
  • Lastly, Article 5 provides for the right of a user to instruct the data holder to share their data with a third party. This novel data portability right has similarities with the data portability right under the GDPR; however, unlike the GDPR, which allows data portability only where 'technically feasible,' the Data Act aims to ensure that technical obstacles do not hinder or prevent access to such data. It is noteworthy that the Digital Markets Act also provides a similar data portability right for end users of core platform services.
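
The following minimal sketch illustrates what these access and sharing obligations could look like at the data holder's end. It is purely illustrative: the class names, fields, and functions are our own assumptions, and the Data Act does not prescribe any particular format, API, or technology for making data available.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical sketch of a data holder's user-facing access layer.
# None of these structures is mandated by the Data Act.

@dataclass
class ProductDataRecord:
    device_id: str          # identifier of the connected product
    recorded_at: datetime   # timestamp of the reading
    values: dict            # e.g., {"heart_rate_bpm": 72, "gps": (50.85, 4.35)}

@dataclass
class DataExport:
    records: list[ProductDataRecord]
    metadata: dict          # units, sampling frequency, schema version, etc.

def export_for_user(user_id: str, store: dict[str, list[ProductDataRecord]]) -> DataExport:
    """Article 4.1-style access: return readily available data together with
    the metadata necessary to interpret and use it (hypothetical structure)."""
    return DataExport(
        records=store.get(user_id, []),
        metadata={
            "units": {"heart_rate_bpm": "beats per minute", "gps": "WGS84 coordinates"},
            "sampling_frequency": "1 reading per minute",
            "exported_at": datetime.now(timezone.utc).isoformat(),
        },
    )

def share_with_third_party(user_id: str, third_party: str,
                           store: dict[str, list[ProductDataRecord]]) -> dict:
    """Article 5-style portability: package the same export for a recipient
    chosen by the user (contracts and transport are out of scope here)."""
    export = export_for_user(user_id, store)
    return {"recipient": third_party, "on_behalf_of_user": user_id, "export": export}

# Example: a user retrieves their data and directs it to a repair service.
store = {"user-123": [ProductDataRecord("watch-9", datetime.now(timezone.utc),
                                        {"heart_rate_bpm": 72})]}
print(share_with_third_party("user-123", "repair-service.example", store))
```

Where the exported fields qualify as personal data, such an export would, of course, also need to satisfy the GDPR requirements discussed below.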

Furthermore, Articles 3.2 and 3.3 of the Data Act require parties offering connected products and related services to inform users about various elements relating to data generation, classification, frequency, and the exercise of user access and data portability rights. These information notices must be presented to users before a contract is concluded.

The expanding scope of personal data and its implications for the Data Act

The Data Act states in Recital 2 that it aims to promote data sharing and to eliminate the barriers to it, which are caused by a variety of factors such as bottleneck situations, a lack of common data-sharing practices and investment, legal uncertainty, and a lack of technical standards. The Data Act has considerable potential to increase data sharing, including of personal data.

Establishing whether data qualifies as personal or non-personal data is a crucial step for ensuring compliance, firstly because the processing of personal data triggers the applicability of the GDPR. Secondly, the Data Act prohibits data holders from using non-personal data arising from a connected product or related service without the approval of the user. However, qualifying information as personal or non-personal data remains a tricky exercise, especially in sectors where personal data is inextricably mixed with non-personal data, for example, the GPS data generated by a smartwatch.

The GDPR defines personal data as 'any information relating to an identified or identifiable natural person.' The relevant test for qualifying data as personal data is found in Recital 26 of the GDPR, which states that 'to determine whether a natural person is identifiable, account should be taken of all the means reasonably likely to be used, such as singling out, either by the controller or by another person, to identify the natural person directly or indirectly.' The Court of Justice of the European Union (CJEU) has confirmed, in Patrick Breyer v Bundesrepublik Deutschland, that data such as IP addresses qualify as personal data even if legal means are required to make a person 'identifiable.' In a case more relevant to the IoT world, Gesamtverband Autoteile-Handel e.V. v Scania CV AB, the CJEU stated that vehicle identification numbers are capable of being personal data if parties have reasonable means at their disposal to link them to a person. National data protection authorities tend to follow this wide interpretation. For example, the Finnish data protection authority ruled in 2019 that vehicle service histories are considered personal data, as they could directly or indirectly reveal information about individuals' activities in the course of their daily lives (the decision is available only in Finnish).

Most organizations, especially in the IoT sector, are likely to store data arising from the use of connected products or related services in mixed datasets combining non-personal data with personal data. It should therefore be noted that if the non-personal data and the personal data are 'inextricably linked,' the data protection rights and obligations stemming from the GDPR fully apply to the whole mixed dataset, even when personal data represents only a small part of it.

Consequently, organizations should first carefully map out all the data that is generated, collected, or obtained by the connected products and related services they offer to users. Such a mapping exercise allows organizations to determine which regulatory regimes, including the GDPR, apply to which dataset.
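
As a purely illustrative aid, such a mapping exercise could be supported by a simple inventory that records, per data field, whether it can relate to an identifiable person and flags the dataset accordingly. The field names and classifications below are assumptions made for the sake of the example; in practice, this assessment is legal and context-dependent and must, per Breyer and Scania, consider all means reasonably likely to be used to identify a person.

```python
# Hypothetical data-mapping helper: classify the fields of a connected
# product's dataset and flag datasets containing personal data as GDPR-covered.

# Assumed, illustrative classification of fields.
FIELD_CLASSIFICATION = {
    "heart_rate_bpm": "personal",        # relates to an identifiable wearer
    "gps_position": "personal",          # location traces can identify a person
    "battery_temperature": "non-personal",
    "firmware_version": "non-personal",
}

def map_dataset(fields: list[str]) -> dict:
    """Return which regimes apply to a dataset made up of the given fields."""
    kinds = {FIELD_CLASSIFICATION.get(f, "unknown") for f in fields}
    # Treat unclassified fields as potentially personal (err on the safe side).
    contains_personal = "personal" in kinds or "unknown" in kinds
    return {
        "fields": fields,
        "contains_personal_data": contains_personal,
        # Where personal and non-personal data are inextricably linked,
        # the GDPR applies to the whole mixed dataset.
        "gdpr_applies_to_whole_dataset": contains_personal,
        "data_act_chapter_ii_in_scope": True,  # product / related-service data
    }

print(map_dataset(["gps_position", "battery_temperature"]))
```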

Transparency obligations

As noted above, the Data Act comes with a novel set of information notices to be presented to users before a contract is concluded. It could be argued that these notices serve the same goal as the GDPR's privacy notices under Articles 13 and 14: informing users about what data is collected and how it is handled.

A comprehensive approach that considers both the Data Act's information notices and the GDPR's privacy notices is likely to be the most appropriate strategy. Organizations will need to disclose extensive information regarding the data generated, collected, or obtained via the use of connected products or related services. Much of this data may qualify as personal data under the GDPR, necessitating inclusion in relevant privacy notices. As both types of notices reveal an organization's data practices to users, coherence and accuracy between these notices will reduce the risk of non-compliance.
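
One way to keep the two sets of notices coherent, a design choice on our part rather than anything prescribed by either regulation, is to generate both from a single maintained data inventory, so that the data categories, generation frequency, and recipients disclosed under the Data Act never diverge from those described in the GDPR privacy notice. The inventory entries below are hypothetical.

```python
# Hypothetical single source of truth feeding both sets of notices.
INVENTORY = [
    {"field": "heart_rate_bpm", "personal": True,
     "frequency": "continuous", "recipients": ["analytics provider"]},
    {"field": "battery_temperature", "personal": False,
     "frequency": "hourly", "recipients": []},
]

def data_act_notice(inventory: list[dict]) -> str:
    """Pre-contractual information on data generation and frequency
    (Articles 3.2/3.3 of the Data Act, heavily simplified)."""
    lines = [f"- {e['field']}: generated {e['frequency']}" for e in inventory]
    return "Data generated by this product and its related service:\n" + "\n".join(lines)

def gdpr_privacy_notice(inventory: list[dict]) -> str:
    """Articles 13/14 GDPR disclosures, limited here to categories and recipients."""
    lines = [
        f"- {e['field']} (shared with: {', '.join(e['recipients']) or 'no one'})"
        for e in inventory if e["personal"]
    ]
    return "Categories of personal data we process:\n" + "\n".join(lines)

print(data_act_notice(INVENTORY))
print(gdpr_privacy_notice(INVENTORY))
```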

Searching for legal bases to share personal data

The processing of personal data requires a valid legal basis under Article 6 of the GDPR. As the term 'processing' encompasses nearly all activities involving personal data, the collection, generation, or sharing of personal data also requires a legal basis.

This raises the question of which legal basis under the GDPR is appropriate for data holders to share personal data. Recital 7 of the Data Act states that the Data Act 'does not constitute a legal basis for the collection or generation of personal data by the data holder.' However, the same recital also states that the Data Act 'imposes an obligation on data holders to make personal data available to users or third parties of a user's choice upon that user's request.' Thus, the Data Act creates a legal obligation to share certain data, including personal data, with users and third parties chosen by the users. Consequently, for such situations, Article 6.1(c) of the GDPR (i.e., legal obligation) may provide a legal basis.

However, not every legal obligation allows reliance on this legal basis. According to the Article 29 Working Party, the predecessor of the European Data Protection Board (EDPB), '[…] the legal obligation itself must be sufficiently clear as to the processing of personal data it requires. Thus, Article [6.1(c)] applies on the basis of legal provisions referring explicitly to the nature and object of the processing.' The Data Act is arguably clear about the scope and purpose of personal data processing as part of access obligations and rights. However, this is likely to be a priority issue for national authorities and the CJEU to address.

The situation is markedly different if the user of a connected product or a related service is not the data subject. Often, the user under the Data Act is a legal entity (e.g., an employer), while the connected product is in fact used by an individual (e.g., an employee). Recital 7 of the Data Act states that 'where the user is not the data subject, the [Data Act] does not create a legal basis for providing access to personal data or for making personal data available to a third party […]. In those cases, it could be in the interest of the user to facilitate meeting the requirements of Article 6 of [the GDPR].' Consequently, in cases where the user requests the data holder to transfer the relevant data to a third party, data holders will need to carefully analyze whether there is a valid legal basis, such as legitimate interest under Article 6.1(f) of the GDPR. In order to rely on this legal basis, data holders will need to conduct and document a three-step test (i.e., the purpose, necessity, and balancing tests).

Recital 34 of the Data Act states that when the user is not the data subject but an enterprise, the user is considered the controller. To 'request personal data generated by the use of a connected product or related service,' users must 'have a legal basis for processing the data as required by Article 6(1) of [the GDPR], such as the consent of the data subject or the performance of a contract to which the data subject is a party.' Therefore, it is clear that users in such cases will not be able to rely on Article 6.1(c) of the GDPR (i.e., legal obligation) as a legal basis. Rather, a case-by-case analysis will be required in such scenarios to determine whether the users have a valid legal basis to request such data. Although this does not affect data holders directly, it implies that data holders must confirm that the user has such a legal basis before providing the data. The robustness and level of detail of such confirmation remain to be resolved by the national authorities and competent courts, including the CJEU.

The DSA and the GDPR

Unlike the Data Act, the DSA applies solely to a range of online intermediaries. The DSA aims to tackle issues such as illegal content, online misinformation, and similar harmful activities. Its relationship with the GDPR is therefore more indirect than that of the Data Act. Nonetheless, there are certain considerations to take into account. Namely, the DSA:

  • bans personalized advertisements based on profiling (a term defined by the GDPR) using special categories of personal data referred to in Article 9.1 of the GDPR;
  • requires providers of online platforms accessible to minors to implement appropriate and proportionate measures to ensure a high level of privacy;
  • prohibits personalized advertisements based on profiling using minors' personal data; and
  • requires providers of very large online platforms (VLOPs) and very large online search engines (VLOSEs) to conduct risk assessments in relation to the design and functioning of their services every year, with the goal of identifying and mitigating negative effects on the protection of personal data.

Personalized advertisements and processing of personal data

Article 26.3 of the DSA prohibits online platforms from presenting personalized advertisements based on special categories of personal data. Similarly, Article 28.2 bans personalized advertisements based on profiling using minors' personal data. The DSA does not impose obligations, such as conducting risk assessments for DSA compliance, on the customers of online platforms. However, since these personalized advertisements and the associated processing of special categories of personal data are now prohibited, online platforms will not have a valid legal basis under Article 6 of the GDPR or an exception ground under Article 9 of the GDPR to process such data.

Many online platforms, especially the larger ones, position themselves as 'processors' under the GDPR. If an online platform violates this ban, its customers, acting as 'controllers,' may also risk processing such data for a prohibited purpose. This could potentially lead to non-compliance with the GDPR. Therefore, it is important for organizations that rely on online platforms to raise awareness among their marketing and advertising teams and to develop internal guidance to ensure compliance. Even though the DSA does not directly impose responsibilities in this regard on customers of online platforms, taking these steps can help mitigate the risk of GDPR non-compliance.

Recommender systems and the GDPR fairness principle

The DSA regulates the operation of recommender systems, i.e., complex algorithms that decide what content is visible to users and the order in which it appears, using parameters set by the online platform. Online platforms are required to explain the main parameters of their recommender systems as well as any options available to users to modify or influence those parameters. In relation to this obligation, VLOPs and VLOSEs will need to assess risks to the protection of personal data. Recital 96 of the DSA establishes that recommender systems must be adjusted when biases in those systems lead to discrimination. Similarly, the GDPR's fairness principle prohibits processing personal data in a way that is unduly detrimental, unexpected, or misleading to the individuals concerned.

Similar to the aforementioned situation with personalized advertisements, although these DSA obligations apply solely to VLOPs and VLOSEs, customers of such platforms may have obligations arising from their qualification as controllers under the GDPR. Therefore, customers that plan to use personalized advertisements should duly analyze the reliability of recommender systems against the GDPR's fairness and accuracy principles. Online platforms may offer certain options, such as choosing the data attributes that will be used for personalized advertisements, allowing customers to configure how personal data is processed when their content is rolled out on those platforms. Such options may be leveraged and documented to demonstrate organizations' compliance efforts.

Heidi Waem Partner
[email protected]
Muhammed Demircan Associate
[email protected]
Simon Verschaeve Associate
[email protected]
DLA Piper UK LLP, Belgium