On 19 September 2023, the UK Parliament passed the Online Safety Bill (“OSB”). The OSB aims to protect individuals from illegal online content and focuses on the protection of children by requiring the removal of content that is legal but harmful to them. For example, social media platforms will be required to act rapidly to prevent children from viewing illegal material, or content that is harmful to them, such as pornography, online bullying, and the promotion of suicide, self-harm or eating disorders. The definition of illegal content covers content that is already unlawful under existing legislation, such as terrorism, hate speech and child sexual exploitation; the OSB also introduces new offences addressing more recent online phenomena, such as revenge pornography and ‘upskirting’ and ‘downblousing’ images. The OSB is one of the most significant pieces of UK legislation post-Brexit and reflects a distinctly UK approach to online harms with which businesses operating globally will need to comply. It will need to be considered alongside the EU Digital Services Act, which has similar goals of making Europe a safe online environment.
The OSB is expected to receive Royal Assent (the point at which it becomes law) shortly. Its wide scope is likely to give rise to implementation problems, as well as potential challenges arising from its impact on freedom of expression and personal privacy. The principles underlying the OSB differ markedly from those of US law, with its constitutional protections for free speech. The risks of non-compliance will be significant, with potential fines of up to 10% of a company’s global revenue.
In-scope services
The OSB imposes legal requirements on:
- ‘user-to-user services’ – internet services that allow users to “encounter” (read, view, hear or otherwise experience) content generated, uploaded or shared by other users (even if content is not actually shared). For example: social media and video sharing platforms, online marketplaces, and online messaging services such as WhatsApp and iMessage;
- ‘search services’ – search engines which enable users to search multiple websites or databases;
- any service that publishes pornographic content (rather than merely hosting user-generated content) and that can be accessed by UK users; and
- providers of an ‘access facility.’ For example: (i) internet access services by means of which a regulated service is made available; and (ii) app stores through which a mobile app for a regulated service may be downloaded or otherwise accessed.
Thousands of platforms are likely to fall within the scope of the OSB, including messaging services and websites through which information can be shared between users.
Extraterritorial effect
As well as UK service providers, the OSB will apply to services outside the UK which:
- have a significant number of UK users;
- have UK users as a target market; or
- are capable of being used in the UK where there are reasonable grounds to believe that there is a material risk of significant harm to individuals in the UK from user-generated content (user-to-user services) or search content of the service (search services).
Given this broad extraterritorial applicability, some international platforms may seek to fall outside the scope of the OSB by blocking UK users.
Categories of service
In-scope services will fall into one of three categories (Category 1, Category 2A or Category 2B) depending on factors such as user numbers and service functionality. Different obligations will apply to each category. The criteria for each category will be set out by the UK Secretary of State once the OSB becomes law.
Key obligations
In-scope services will be subject to a range of obligations, including:
- conducting a risk assessment and taking proportionate measures in relation to the design and operation of the service, including a duty to provide details in the applicable terms of service about how individuals are protected from illegal content;
- removing illegal content swiftly and preventing it from appearing;
- preventing children from accessing harmful and age-restricted content (such as violent or pornographic content, or content that promotes suicide, self-harm or eating disorders), which will require online service providers to implement age verification technologies;
- enforcing age limits and implementing age-assurance measures;
- implementing systems and processes allowing users to report content;
- operating a complaints procedure; and
- protecting content of democratic importance, news publisher content, journalistic content, and freedom of expression and privacy (only Category 1 service providers will need to comply with the free speech duties).
Exempt services
The following services will be exempt:
- any user-to-user service where emails, SMS messages, MMS messages or a combination of SMS and MMS messages are the only user-generated content (other than identifying content such as a username or profile picture) enabled by the service;
- any user-to-user service where one-to-one real-time live aural communications are the only user-generated content (other than identifying content) enabled by the service. If the communication is accompanied by user-generated content of any other description (such as written messages, videos or visual images), the exemption will not apply;
- any user-to-user service with limited functionalities, such that users can communicate by means of the service only in the following ways:
  - posting comments or reviews relating to provider content;
  - sharing such comments or reviews on a different internet service;
  - expressing a view on such comments or reviews, or on provider content, by specified means such as ‘like’ or ‘dislike’ buttons or emojis; or
  - producing or displaying identifying content in connection with any of these activities;
- any user-to-user service or search service comprising certain internal resources or tools for a business (such as internal intranets, portals, and message boards); and
- any user-to-user service or search service provided by certain public bodies exercising public functions.
Ofcom enforcement powers
Ofcom (the UK communications regulator) will be responsible for enforcing the OSB and will be granted enforcement powers, including the power to:
- enter and inspect a company’s premises;
- issue a notice of contravention requiring a company to do, or refrain from doing, any action required under the OSB;
- apply to court for business disruption measures such as blocking non-compliant services;
- issue fines of up to £18 million or 10% of a company’s global revenue, whichever is greater;
- bring criminal sanctions, including fines and imprisonment for up to two years, against senior managers who fail to ensure their company’s compliance with information requests from Ofcom; and
- issue an order requiring a provider of ‘ancillary services’ (i.e., a service that facilitates the provision of a regulated service or part of it, such as credit card services or advertising) to withdraw the ancillary service to the extent that it relates to the relevant service.
Data privacy implications
Significant privacy risks include:
- Age assurance – age-assurance techniques (such as scanning and checking passports, or collecting biometric data using facial recognition technology) reduce user anonymity and result in additional personal data being collected, which must then be protected.
- End-to-end encryption – the OSB empowers Ofcom to issue a notice requiring a regulated service provider to use accredited technology to identify terrorism and child sexual abuse content and to take that content down swiftly. However, it is not clear what criteria would govern the accreditation of any such technology. In an open letter to the UK Government, WhatsApp and several other encrypted messaging service providers expressed concern that such a power could enable Ofcom to try to force proactive scanning of private messages. In a separate open letter, a group of 68 researchers and scientists working in the fields of information security and cryptography expressed concern that surveillance technologies deployed in pursuit of online safety could undermine user privacy. End-to-end encryption means that only the sender and the intended recipient of a message can read it; the provider relaying it cannot. A service provider compelled to scan messages would therefore need either to monitor them before encryption (for example, on the user’s device) or to use a special key to gain access after encryption, undermining user security and creating privacy risks (a simplified technical illustration follows this list).
- GDPR compliance – in-scope services are likely to hold significant amounts of personal information about users, such as their preferences, beliefs and sexual orientation, and will need to ensure that such information is processed in accordance with GDPR requirements and that robust cybersecurity measures are implemented to protect it and mitigate the risk of data breaches.
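For technically minded readers, the short sketch below illustrates the end-to-end encryption point made above. It is a simplified illustration only, written in Python and assuming the open-source PyNaCl library (the key names and message are hypothetical and do not reflect any particular messaging service): a provider that merely relays ciphertext holds neither endpoint’s private key and cannot decrypt it, so any scanning obligation would have to be met before encryption on the sender’s device or through some form of privileged access.

```python
# Simplified illustration only – assumes the PyNaCl library (pip install pynacl).
# Shows why a relaying service provider cannot read an end-to-end
# encrypted message without one of the endpoints' private keys.
from nacl.public import PrivateKey, Box
from nacl.exceptions import CryptoError

# Each endpoint generates its own key pair; only public keys are ever shared.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts with their private key and the recipient's public key.
ciphertext = Box(sender_key, recipient_key.public_key).encrypt(b"private message")

# The recipient decrypts with their private key and the sender's public key.
assert Box(recipient_key, sender_key.public_key).decrypt(ciphertext) == b"private message"

# The provider relaying the ciphertext holds neither private key, so any key
# pair it generates for itself cannot decrypt the message.
provider_key = PrivateKey.generate()
try:
    Box(provider_key, sender_key.public_key).decrypt(ciphertext)
except CryptoError:
    print("Provider cannot decrypt: scanning would require pre-encryption "
          "access on the device or a privileged key.")
```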
In-scope service providers will need to think carefully about the data protection issues posed by the OSB, and will want to ensure that compliance with its requirements does not compromise user security or risk violating data privacy laws.
The material contained in this communication is informational, general in nature and does not constitute legal advice. The material contained in this communication should not be relied upon or used without consulting a lawyer to consider your specific circumstances. This communication was published on the date specified and may not include any changes in the topics, laws, rules or regulations covered. Receipt of this communication does not establish an attorney-client relationship. In some jurisdictions, this communication may be considered attorney advertising.