OSB in focus: what categories of content and communications are within the Online Safety Bill's scope?
Published on 10th Jun 2021
The threat of substantial fines based on annual global turnover means it is crucial that online services understand the nature and categories of content that can be encountered on their platforms
When an internet service provider determines that it is within the scope of the UK Online Safety Bill, the next step will be to consider the types of content on its platform and whether in relation to this a duty of care is owed. The opening article in this series offered an overview of the types of service caught by the Bill, but what categories of content and communication are within the Bill's scope?
The Bill sets out what type of content will be considered "regulated content" in relation to user-to-user services. Regulated content will attract the core duties of care requiring platforms to ensure that such content is not illegal, is not harmful to children and (for category one services) is not harmful to adults.
The definition of "regulated content" broadly aligns with the concepts used to define regulated services. It covers any user-generated content, in almost any form (such as text, photo, video or music), generated or uploaded by a user of the service that may be encountered by other users of the same service. However, the following types of content are expressly excluded from "regulated content" under the Bill:
- emails, SMS messages, MMS messages;
- one-to-one live aural communication, as long as it is not accompanied by any written, video or visual messages;
- comments and reviews on content published by or on behalf of the service provider (or on other users' comments or reviews on that content);
- paid-for ads; and
- news publisher content, being any content generated by a "recognised news publisher", or content which reproduces or links to the full version of an article originally published by a recognised news publisher, including a recording of an item originally broadcast by such a publisher.
'Recognised news publisher'
The definition of a "recognised news publisher" is notably limited by the Bill. To qualify, a publisher is required to hold a broadcasting licence under the Broadcasting Act 1990 or 1996 and publish news-related material in connection with the authority granted under that licence. Alternatively, a publisher is required to meet a comprehensive set of criteria, which require the publisher (amongst other things) to have a business address in the UK and to publish news-related material (subject to editorial control and in accordance with a standards code) as its principal purpose.
The requirement that a recognised news publisher must have a business address in the UK is of particular interest and suggests that foreign news publishers will not necessarily be able to avail themselves of the exemption for their content (even if they can satisfy the remaining criteria).
Even where content is capable of falling within this exemption, the current draft of the Bill appears to suggest that any commentary relating to such content will fall outside the scope of the exemption and will continue to be recognised as "regulated content". The exception for comments and reviews (referred to above) only extends to comments or reviews on content published by or on behalf of the service provider, which, in this case, would be the platform itself rather than the recognised news publisher.
Freedom of expression
The Bill imposes a duty on category one services to ensure that the importance of freedom of expression of "journalistic content" and "content of democratic importance" is taken into account when content-related decisions are being made.
The duty imposed for both types of content is merely to "take into account" the importance of freedom of expression. Query the extent to which that advances the current position in relation to many online platforms, where freedom of expression issues are already at the forefront of moderators' minds when deciding whether or not to remove content. When contrasted with the more robust "safety duties" contained in the Bill, these duties are inherently vague and appear designed to create the appearance of balance in the proposals. In practice, in their current form, these duties are likely to be no more than a box-ticking exercise rather than a meaningful counterbalance to the risk of over-removal.
The definition of "journalistic content" in the Bill is much wider than the relatively limited definition of "news publisher content" referred to earlier. The Bill defines "journalistic content" as news publisher content or regulated content (that is, user-generated content) which is generated for the purposes of journalism and is "UK-linked" (meaning the content is targeted towards UK users or is likely to be of interest to a significant number of UK users).
Despite the government's assertion that it is vitally important not to adversely affect journalistic content by introducing this type of online safety law, the concept of journalistic content under the Bill is not limited to traditional media. Any user-generated content is capable of being classed as journalistic content as long as it is generated for the purposes of journalism and is "UK-linked". Arguably, this broad notion of journalistic content waters down the concept of journalism, including the democratic importance that attaches to it.
The Bill's current broad-brush approach to this type of content runs a real risk of undermining the very importance of journalistic content which the government expressly seeks to protect, by also affording protection to content which historically may not have been viewed as journalistic. It also risks creating more of a moderation headache for online platforms, which may ultimately want greater clarity as to the scope of the journalistic content duty of care. Without such clarity, platforms are exposed to arguments that extremist user posts are made for the purposes of journalism and therefore carry additional protection from take-down.
In a surprising and controversial move, the government has added some extra protection of content of "democratic importance". The Bill defines "content of democratic importance" as news publisher content or regulated content (that is, user-generated content) which must be or at least appear to be specifically intended to contribute to democratic political debate within the UK. Notably, neither the Bill nor the accompanying explanatory notes explain what content is (and is not) capable of contributing to democratic political debate (presumably because any attempt to define such content would be a futile exercise).
The government has not fully explained why a specific duty has been introduced for content of democratic importance, nor why such content qualifies for special consideration over and above any other types of important speech, although it at least has the appearance of being a reaction to the removal of high-profile political posts from social media platforms.
What is of "democratic importance" is an extremely sensitive and subjective question and is likely to stir great debate, just as the removal of posts has done. Further clarity is likely to be welcomed by category one services, which will be the ones attempting to strike the right balance between harmful content and content of democratic importance in practice.
So why does this matter? The threat of fines reaching £18 million or 10 per cent of annual global turnover means that it is crucial that in-scope services fully understand the nature and categories of content that may be encountered on their platform. This is necessary to ascertain whether such content is regulated content and subject to the broad safety duties to protect different categories of user. Similarly, it will be necessary to understand whether any journalistic content or content of democratic importance can be encountered on the platform and, if so, how the platform will take into account the importance of freedom of expression in respect of these categories of content.
This is the second article in our OSB in Focus series about the UK's draft Online Safety Bill. The series can be found here and will take an in-depth look at the specific topics, issues and applications of the proposals, and how they should be dealt with in practice.
Osborne Clarke has a dedicated international Online Safety team which is analysing the issues arising from the Bill, as well as following legislative developments in this area in numerous jurisdictions, including at European level through the Digital Services Act. Please contact one of our experts if we can be of assistance.