UK: Online Safety Bill - who is in scope?
Ben Dunham and Neil O'Sullivan, from Osborne Clarke LLP, delve into the intricacies of the Online Safety Bill (OSB), highlighting its potential impact on internet services. They explore the OSB's scope, user-generated content, enforcement powers, and more, emphasizing the critical steps providers must take to ensure compliance in this evolving regulatory landscape.
The OSB, the UK's long-awaited landmark legislation designed to protect children and adults online, is finally due to become law in autumn 2023. The legislation marks a significant change in the way online spaces are regulated and will place wide-ranging responsibilities on online platforms to protect users from illegal and harmful content. The Office of Communications (Ofcom) has been entrusted with the responsibility of enforcing the OSB and will have extensive powers to investigate providers and sanction them for breaches.
As a first step when preparing for the introduction of the new regime, online service providers will need to determine whether they fall within its scope and, if so, what content on their platforms will be subject to regulation.
Which categories of service are within scope?
The OSB is targeted primarily at online user-to-user services and search services.
User-to-user services are internet services that allow content generated, uploaded, or shared by users on the service to be encountered by other users of the service. This includes social media platforms, messaging apps, online forums, message boards, dating apps and so on.
Even if the sharing of user-generated content is only a portion of a service's overall functionality, the service will be caught by the legislation, unless it benefits from an exemption.
Search services are services that are, or which include, a search engine. This encompasses the major search engines and other services that search through large numbers of websites and databases. It also covers speech-based digital assistants. However, websites that have internal search functionality, but do not present results from the wider internet, will not be covered.
The OSB also applies to internet services that publish or display pornographic content.
Which services are exempt?
Several categories of service are expressly exempt from the OSB. Email, SMS, MMS, and one-to-one live aural communications services are not covered (although video calling services and instant messaging apps are). Additionally, certain services offered by public bodies are exempt.
Internal business services are also exempt. This exemption is intended to cover tools that are available only to a closed group of people within a business, such as business intranets, customer relationship management systems, and database management software.
For the exemption to be applicable, the business using the service must also be the service provider. The provider of a user-to-user service is defined as the entity that has control over who can use the service. If a business licenses a software platform with user-to-user functionality from a third-party software-as-a-service provider and controls access to the platform, then the business (rather than the third-party software-as-a-service provider) will be considered the provider for the purposes of the OSB. The business will be able to rely on the exemption, as long as the other applicable conditions are satisfied.
Limited functionality services are also exempt. This covers services that permit only limited user interaction in relation to content published by the provider, such as posting comments or reviews or applying a 'like' button or an emoji. This is intended to put services outside the scope of the OSB where the only user interaction consists of 'below the line' content or user reviews of directly provided goods and services. Consequently, neither newspaper websites that host comments underneath articles nor company websites that host customer reviews of their goods or services are subject to the OSB.
The OSB gives the secretary of state the power to make regulations that change its scope, including the power to bring one-to-one live aural communications, comments, and reviews on provider content within scope if they are considered to pose harm to UK users. Providers should be aware that, even if their services are not in scope on 'day one,' they may be encompassed in the future. This is a reminder that compliance is not a one-off exercise but a continuous process.
What is meant by 'links with the UK'?
Services covered by the OSB that do not qualify for an exemption will fall under its jurisdiction if they have 'links with the UK.' A service will be considered to have links with the UK if any of the following conditions are met:
- it has a significant user base in the UK;
- UK users form a target market for the service; or
- the service is accessible in the UK, and there are reasonable grounds to believe there is a material risk of significant harm to UK individuals presented by the content of the service.
The OSB, therefore, has an extraterritorial effect and will apply to services that operate from outside the UK. Given the global nature of the internet, this cross-border application is necessary in order to create an effective regulatory regime. However, the concept of 'links with the UK' is defined very widely. A service may have little connection with the UK and may be used by very few UK individuals, yet it could still fall within the scope of the OSB if it is determined that the content of the service poses a material risk of significant harm to those individuals.
The OSB does not specify what threshold of UK users would constitute a 'significant number.' The major social media platforms will be caught. However, smaller platforms where UK users constitute a relatively small proportion of the global total may find it more challenging to determine whether they are within scope, particularly if there are few factors indicating that the UK is a target market for the service.
What content is within scope?
Providers that are within scope will have to consider which types of content are regulated by the OSB. The proposed legislation creates a regulatory regime based on 'duties of care,' which oblige providers to take steps to protect users from certain types of content. The duties focus on the systems and processes used to operate services and present content to users, rather than on the moderation of specific pieces of content. However, the ability to identify content that is of a type covered by the OSB will still be an essential part of compliance.
The OSB defines 'regulated user-generated content' as all content generated by users except content of the type that would appear on an exempt service, such as emails, SMS, one-to-one live aural communications, comments, and reviews on provider content. Additionally, content from news publishers, defined as any content published by a recognized news publisher (or reproductions of or links to such content) does not fall within the scope of 'regulated user-generated content.' However, certain providers will nevertheless need to be able to identify such content in order to comply with some of their duties.
Apart from this, effectively all user-generated content, spanning text, images, videos, and music, is potentially subject to the regulation. This includes content created by bots.
Paid-for advertising content, however, is exempt from this scope, as it is expected to be regulated through future legislation being explored as part of the government's Online Advertising Programme. The only exception to this is fraudulent advertising. Larger providers will bear the responsibility of safeguarding users against this specific category of advertising content as outlined in the OSB.
What do the duties of care look like?
Providers within the OSB's scope will face a range of duties of care pertaining to regulated content. The scope of a provider's duties will depend on that provider's categorization. Certain services will be designated as Category 1, 2A, or 2B services, with Category 1 services carrying the most rigorous obligations.
The categories will be defined by 'threshold conditions,' which will be set out in secondary legislation. Currently, the precise boundaries of these categories are not known; however, they will be contingent upon factors such as the number of UK users, the functionalities of the service, and the anticipated risk of harm to users. It is reasonable to assume that the largest social media platforms will be designated as Category 1 services. However, there may be instances of potential disagreement or uncertainty at the margins of the categorization process.
Regulated providers will need to carry out risk assessments to assess the risks of illegal content appearing on their services. Subsequently, they must implement measures to prevent users from encountering certain specified types of illegal content and ensure that, when such content does appear on the platform, it is quickly identified and removed.
Additionally, regulated providers will need to determine whether children are likely to access their service. If so, providers will be subject to additional duties to carry out a children's risk assessment and implement measures to prevent child users from encountering the most harmful forms of content while also mitigating the risks associated with other types of harmful content.
There are additional responsibilities concerning content reporting and complaints procedures, as well as obligations relating to providers' terms of service, which apply to users of all ages. Larger providers will also be subject to various 'user empowerment duties' to enable adult users to better control the content they are exposed to.
To counterbalance the risk that compliance might impede online freedom of expression, providers must give 'particular regard' to freedom of speech and other fundamental rights when complying with their duties. Larger providers will also have express duties relating to journalistic and news-provider content and must implement various safeguards governing the removal of such content. Despite this, concerns have been raised that these 'balancing measures' will be less impactful than the more robust safety duties and that the OSB may encourage the over-zealous removal of content, which would have a chilling effect on freedom of expression.
What are Ofcom's enforcement powers?
Ofcom will have wide-ranging powers to seek information from regulated providers to ensure they are complying with their obligations. In instances where Ofcom determines that a provider has violated an enforceable obligation, it can compel the provider to take steps to rectify the breach. Additionally, Ofcom has the authority to levy fines, with penalties reaching up to £18 million or 10% of the provider's global annual revenue, whichever is higher. Furthermore, Ofcom can require platforms to proactively employ accredited technology for identifying and removing content associated with terrorism or child sexual abuse.
Senior managers of regulated providers may also face criminal prosecution, both in respect of failures to comply with Ofcom's investigation and enforcement procedures and, in certain circumstances, in relation to providers' substantive failures to comply with the children's online safety duties.
Given the severe enforcement powers in place, it becomes essential for providers of internet services to take early and comprehensive steps to establish whether and to what extent their services are in scope, understand their duties under the OSB, and make appropriate changes to their systems and processes to ensure that they do not fall foul of the new law.