
UK: The ICO and CMA's joint paper on harmful design in digital markets

It was obvious from the start that the internet would transform how individuals interact. What perhaps wasn't appreciated at the time (understandably, given those early online interfaces) was how important design would become to the online experience – for both positive and harmful reasons. The online user experience and digital environments available in 2023 are light years away from the internet of the mid-1990s. As design methods became more sophisticated, companies realized they could build websites that wooed customers into spending more money (and time) on their platforms. Likewise, there are business benefits to targeted advertising and building insights where website interfaces are designed to encourage users to provide their personal data.

It has taken some time for European law to establish requirements around good design in digital markets. It wasn't until the General Data Protection Regulation (GDPR) applied from May 2018 that there was a legal requirement to build online interfaces in a way that promotes data protection by design and by default (Article 25) – itself a concept that reflects pioneering work by regulators in Canada. Beyond this requirement, failing to design online interfaces properly can undermine wider data protection principles such as fairness and transparency: it is much harder to argue that the use of personal data is fair when its collection is carried out in a way that is opaque to individuals. Victoria Hordern, from Taylor Wessing LLP, discusses what the joint paper covers on harmful design and considers whether we will see further enforcement in this area.


DRCF paper 

It's noteworthy that one of the first joint regulator position papers produced by the UK's Digital Regulation Cooperation Forum (DRCF) concerns online choice architecture and the way that online design can be used for harmful purposes. The authors – the Information Commissioner's Office (ICO) and Competition and Markets Authority (CMA) – set out their joint concerns in the paper that 'some common online design practices influence consumers' decisions in ways they are not aware of and may not want.' These practices can infringe data protection law and negatively impact consumers and competition. This is why the two regulators have a joint interest in examining the practices online, providing an overview of how design choices can lead to harm, and setting out examples of harmful design in digital markets. It's not the first time the two regulators have come together to provide their view on an area of mutual interest. In 2021, they issued a joint statement concerning competition and data protection in digital markets in which they identified meaningful choice and control as fundamental to robust data protection and effective competition.

This most recent paper sets out the regulators' primary expectations concerning Online Choice Architecture (OCA): organizations must put the user at the heart of design choices and use design that empowers user choice and control. OCA practices include the way prices are displayed and how consumers are presented with information about goods and services in digital markets. But OCA practices can also affect how individuals are tracked online and what information and choices they are presented with to control such tracking. Well-designed OCA can help users make decisions that align with their preferences. Conversely, practices such as 'confirmshaming' and bundled consent can confuse users and are highlighted as examples of harmful practices.   

Types of harms identified 

The data protection harms listed in the paper include: 

  • unwarranted intrusion where poor OCA practices manipulate and influence users of digital services to make choices about their personal data that do not align with their preferences; 
  • loss of control or autonomy where poor OCA practices make it unduly difficult for users to choose freely how their data is processed, depriving users of meaningful control of how their personal data is used; and 
  • costs of avoiding or mitigating harm where poor OCA practices can increase the amount of time users must spend to make informed choices about personal data processing.  

The competition and consumer protection harms that relate to data processing include: 

  • nudging consumers towards choices that reinforce a company's market position, thereby weakening competition; 
  • using consumer data to leverage network effects to strengthen their market position, create lock-ins that make it harder for consumers to switch to a different provider, and so make it harder for rivals to compete; and 
  • distorting consumer choices by making certain options easier or more desirable to choose over others.  

Examples of harmful design practices 

The ICO and CMA call out a number of potentially harmful OCA practices in their paper. These are: 

Harmful nudges and sludge  

A harmful nudge is where a company makes it easy for users to make inadvertent or ill-considered decisions. Sludge is where a user is steered towards one option because the alternative involves excessive or unjustified friction – the company simply makes one option much easier for the user than another. The classic example is a cookie banner where a user is presented with a one-click option to consent ('Accept') but must go through multiple clicks to reject all cookies. It is also misleading when the only options presented to a user in the cookie banner are 'Cookie settings' and 'Accept,' which can imply there is no option to reject cookies.  
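The sludge pattern described above lends itself to a simple structural check. Below is a minimal sketch (the configuration shape and function names are hypothetical, not drawn from the paper) testing whether a banner's first layer that offers one-click acceptance also offers one-click refusal:

```typescript
// Hypothetical shape of a cookie banner's first layer (illustrative only).
type ButtonAction = "acceptAll" | "rejectAll" | "openSettings";

interface BannerButton {
  label: string;
  action: ButtonAction;
}

// Sludge check: if accepting all cookies is a one-click action on the
// first layer, rejecting them must be too. A banner whose only controls
// are "Accept" and "Cookie settings" hides refusal behind extra clicks
// and fails this check.
function firstLayerIsBalanced(buttons: BannerButton[]): boolean {
  const actions = buttons.map((b) => b.action);
  return !actions.includes("acceptAll") || actions.includes("rejectAll");
}
```

On this sketch, the banner criticized in the text – 'Accept' alongside only 'Cookie settings' – fails the check, while a first layer pairing 'Accept all' with 'Reject all' passes.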

Confirmshaming  

This is when a user is pressured or shamed into carrying out an action by being made to feel guilty or embarrassed for not doing it. The language used can suggest to the individual that the choice is either good or bad. This kind of practice can cause a user to agree to the use of their personal data in a way they otherwise would not. An example is where a user is encouraged to provide their email address in return for a discount that is presented as an unmissable saving.  

Biased framing  

This practice operates in two ways. Positive framing presents choices in a way that emphasizes the supposed benefits or positive outcomes of a particular option. Negative framing presents the supposed risks or negative consequences of a particular option to discourage a user from selecting it. Typically, a company can use biased framing to favor its own service over rival services.  

Bundled consent  

This practice involves asking a user to consent to the use of their personal data for multiple separate purposes via a single consent option. Essentially, it prevents a user from exercising granular control over which purposes they may be willing to agree to, and can lead to users inadvertently consenting to uses of their personal data in order to access other requested functionality. A website could also request that a user clicks to accept the Terms and Conditions, Privacy Policy, and cookie preferences in a single consent. This approach falls short of data protection and privacy requirements in a number of ways.  
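To illustrate the alternative to bundling, granular consent can be modelled as one recorded choice per purpose rather than a single "I agree" flag. The purpose names and record shape below are assumptions for illustration, not taken from the paper or any legal standard:

```typescript
// Hypothetical processing purposes a site might seek consent for.
type Purpose = "analytics" | "personalisedAds" | "emailMarketing";

// One record per purpose: each can be granted or refused independently,
// and the timestamp preserves evidence of when the choice was made.
interface ConsentRecord {
  purpose: Purpose;
  granted: boolean;
  recordedAt: string; // ISO 8601 timestamp
}

// Turn the user's per-purpose selections into individual records,
// so refusing one purpose never blocks access tied to another.
function recordChoices(
  choices: Partial<Record<Purpose, boolean>>
): ConsentRecord[] {
  const now = new Date().toISOString();
  return (Object.entries(choices) as [Purpose, boolean][]).map(
    ([purpose, granted]) => ({ purpose, granted, recordedAt: now })
  );
}
```

A user could, for instance, grant analytics while refusing personalized ads, and each choice would be stored separately rather than collapsed into one bundled consent.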

Default settings  

This is where a company applies a predefined choice that the user needs to take active steps to change, such as automatically selected add-ons or pre-ticked boxes. Default settings can reduce the ability of users to make effective choices and are flagged in the paper as one of the strongest practices that influence user behavior. This can be because users are in a hurry or do not feel they have the understanding to overturn default settings, which can appear to be endorsed by the digital provider.  

Growing concerns around online design harms 

The concerns expressed by the ICO and CMA in this recent paper are echoed in other laws. For instance, amongst other things, the new EU Digital Services Act (DSA) prohibits dark patterns that affect the design of websites/apps in order to manipulate users into choices they do not intend to make. Article 25 requires that providers of online platforms do not design or operate their online interfaces to deceive, manipulate, or otherwise materially distort or impair the ability of users to make free and informed decisions.  

While the UK is not bound by the DSA, the newly approved UK Online Safety Act 2023 (OSA) includes provisions that reflect a design-based concern. From its opening provision, the OSA imposes a duty on providers to ensure that their services are designed both to protect children and to safeguard users' rights to privacy. Likewise, under Ofcom's recently released draft consultation proposals concerning the OSA, staff involved in the design and operational management of a service must be sufficiently trained in the service's approach to compliance.  

Cookie consent banners 

The ICO and CMA repeatedly refer to examples relating to the pervasive use of cookies when carrying out their critique of OCA. When the original regulations on cookies first emerged in 2002, the general practice was for websites to tell individuals that by continuing to use the website, they accepted the use of cookies. Of course, subsequent laws made clear that this approach was not sufficient: consent had to be actively obtained through a separate step, not inferred from continued use of the website. The use of cookies also became more business critical during the decade from 2000 due to the growth of the online advertising industry, which relied on cookies (or similar trackers) to collect data for targeting.  

However, European regulators have become more active in enforcing the rules around the use of cookies where there are concerns that users are not being given a clear and free choice on whether or not to accept them. Indeed, in January 2023, the European Data Protection Board (EDPB) published its report of the work undertaken by the Cookie Banner Taskforce, which highlighted a number of practices where dark patterns are used in cookie banners. The Taskforce was prompted to consider the minimum thresholds required to assess the use of cookies due to complaints from the privacy organization None of Your Business (NOYB). As part of its activities, the Taskforce highlighted the following concerns around design, a number of which chime with the DRCF paper from the ICO and CMA: 

  • no reject button on the first layer of the cookie banner – so while websites included an 'Accept All' button, users were not given a simple means of refusing cookies upfront;  
  • the use of pre-ticked boxes in banners where the website asks users for consent – since consent requires an active step from the individual user, pre-ticked boxes which are presented as already obtaining consent from the user are not valid; 
  • websites that provide only a link, not a button, for users to reject cookies, forcing users through to a second layer of the cookie banner to exercise their choice – and where these links are not displayed prominently, a user would not necessarily notice them; 
  • the use of deceptive button colors and contrasts – for example, where a website provides a green 'Accept all' button that is bigger than the alternative 'Select options' or 'Reject all' buttons, which are red and smaller. Likewise, websites can use contrast or other visual tools to make one option more attractive to a user; 
  • where the website argues it can rely on legitimate interest as a lawful basis for certain subsequent processing of data, but the list of purposes processed under legitimate interest is presented in a confusing way, so that users do not realize that, even if they refuse consent, they must separately object to the legitimate interest processing;  
  • where cookies are incorrectly classified as essential – since strictly necessary or essential cookies can be implemented without consent, websites can seek to frame certain cookies as essential where they are not; and  
  • where the website does not present an icon allowing users to easily withdraw their consent – if there is no visible icon or equivalent on each webpage, it is difficult for users to easily withdraw their consent.  

While there has not yet been publicized regulatory enforcement activity in the UK concerning cookie banners, there has been in other jurisdictions such as France and Spain. Just recently, the Spanish data protection authority fined a website €12,000 for a cookie banner that presented the user with only two options – 'Settings' or 'Consent.' Since there was no 'Reject All Cookies' button, a user had to go through more than 300 clicks to reject all cookies. The settings on this website reflect a widespread practice where legitimate interest is listed as a lawful basis for processing personal data and is pre-selected. Clearly, the amount of time and effort a user would have to expend to refuse cookies is too onerous in such a scenario. This is an example of a regulator taking action to sanction digital operators that deliberately design their platforms to deprive users of simple, straightforward options to reject cookies.  

What does a safe OCA design look like?  

According to the ICO and CMA in their paper, there are four key practices: 

  • Put the user at the heart of design choices – default settings should be designed to reflect the users' interests. User control should be enhanced and users should easily be able to exercise their privacy preferences;  
  • Use a design that empowers user choice and control – information provided to users about their choices should be easy to understand and balanced. Information should be presented so that users can understand the choice they face and can make effective decisions without being confused or overloaded with information;
  • Test and trial design choices – providers are expected to test user behavior and consumer comprehension, experience, and feelings of control. This could be through online experiments, customer surveys, or A/B testing. Testing can help understand how harm occurs and can mitigate the risk of poor consumer outcomes. Additionally, the results of such testing (especially interactions with users) can be useful as evidence should a regulator investigate the provider's OCA; and
  • Comply with data protection, consumer, and competition law – this may be stating the obvious but the paper underlines that some OCA practices may breach legislative requirements. 

The paper recognizes that all users are subject to behavioral biases that affect their decision-making. Just as in the offline world, digital market operators will use practices to exploit these biases to influence the decisions consumers make. So, does this mean that all digital market operators need to design their interfaces in a way that accounts for the behaviors and biases of some theoretical objective user? How different would the expectations be for a website designer targeting university students (who may be short of funds) compared to, say, a website targeting corporate finance professionals (who should be savvier about protecting their funds)? And should there be any difference in approach between the two websites? Of course, the potential harm can be greater where a user is more vulnerable; individuals who have addictions, for example, are more likely to be influenced by messages that exploit their addictive behaviors. But what about websites that can be used by all sections of society, from the young to the old? Should such a website be designed on the basis of interactions with its most vulnerable potential user? Regulators and the courts are likely to answer these questions in time and, of course, websites deliberately targeting specific sections of society should apply a commonsense approach when considering who their likely audience will be. 

Likelihood of enforcement 

The CMA has an ongoing program considering OCA practices in general, so is already taking action to deal with misleading online practices. In the paper, the ICO indicates that it will take enforcement action where necessary to protect people's privacy rights, and it especially underlines design practices that lead to harm for people who are vulnerable, i.e., children, those in ill health, and the elderly. The publication of this paper signals a degree of intent by both regulators to put digital operators on notice that OCA is an area on which both will focus their regulatory scrutiny. While other European data protection authorities have taken action against a number of companies (big and small) for contraventions in this area (especially connected with cookies), the ICO has historically and culturally tended to focus on big-ticket enforcement activity rather than pursuing investigations into smaller infringements by multiple businesses.   

While we've not yet seen significant enforcement activity from the ICO in this area, there has been a recent change of tone. On November 21, 2023, the ICO issued a statement warning the UK's top websites that they face enforcement action unless they provide users with fair choices over tracking for personalized advertising. A number of companies have received letters from the ICO setting out the regulator's concerns around cookie banner design and have been given a month to comply. These moves indicate that the ICO is stepping up its scrutiny and is more likely to start enforcement against digital operators that fail to comply.  

Victoria Hordern, Partner 
[email protected]  
Taylor Wessing LLP, London