Online Safety

OSB in focus: will the Online Safety Bill increase the scope for actions brought by individuals against online platforms?

Published on 2nd Jul 2021

Draft legislation designed to make the UK the safest place to be online (while defending freedom of expression) requires difficult decisions from regulated services that could turn aggrieved users to litigation


The forthcoming online safety legislation will create huge challenges for "regulated services" that fall within its scope, including significant compliance costs and the ever-present threat of regulatory action. Ofcom, the nominated regulator, will have extensive powers, including the power to levy fines of up to £18 million or 10% of global annual turnover, whichever is greater. But to what extent does the legislation pave the way for civil actions and complaints by internet users against the operators of regulated services?

Initial indications from the government

Prior to publication of the Online Safety Bill, there were some positive noises from the government in this regard. In its full response to the Online Harms White Paper, the government made clear that:

  • The new regulatory framework would not establish new avenues for individuals to sue companies.
  • "Illegal" content would not cover material that only gives rise to a risk of civil liability (for example, defamation). Harms resulting from breaches of intellectual property rights and data protection legislation were to be specifically excluded from scope.
  • Robust protections were promised for journalistic content. Ofcom would not be given the mandate of investigating individual pieces of content or arbitrating on individual cases (compare this with the Information Commissioner's Office's role in investigating individual alleged breaches of the General Data Protection Regulation (GDPR)). What is more, the government did not intend to establish an independent dispute resolution mechanism.
  • The intermediary defences for mere conduits, caching providers and hosts in Regulations 17-19 of the Electronic Commerce (EC Directive) Regulations 2002 (the E-Commerce Regulations) would remain in force.

This gave regulated services some cause for optimism that they would not face costly litigation or long-running complaints from individuals as a result of the new online safety regime. But now that the Bill has been published, the picture looks far less rosy.

Handling user complaints

To start with, regulated services must have systems and processes in place that allow users to "easily" report content or otherwise complain about it. This will not only have the practical effect of making it easier to fix internet platforms with knowledge of illegal content; it will also heighten expectations as to the speed of removal, putting pressure on platforms' ability to rely on the intermediary liability defences.

Next, while many regulated services will only be concerned with "illegal content" (that is, content that breaches criminal law), larger or higher-risk services with so-called "Category 1" status, as well as services "likely to be accessed by children", will have to grapple with the more slippery notion of "harmful content", which may be legal but nonetheless harmful.

Content is to be regarded as "harmful" where the regulated service has reasonable grounds to believe that the nature of the content is such that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact on (as appropriate) an adult or child of ordinary sensibilities.

Would severe distress (often found to have been suffered by victims of privacy intrusions or serious libels) in some cases amount to a "significant adverse physical or psychological impact"? If so, this potentially opens the door to both Category 1 regulated services and those likely to be accessed by children being deluged with user complaints over all manner of content.

To add balance, all regulated services will be under a duty to have regard to the importance of freedom of expression. Category 1 services will also be under additional duties to "take account of" the importance of "content of democratic importance" and journalistic content. These are extremely broad concepts that are notoriously difficult to apply in practice.

In practical terms, this means that when a Category 1 service removes content, and where the creator of the content, or the user who uploaded it, considers it to be journalistic content, the regulated service must make a "dedicated and expedited complaints procedure" available to that person. If such a complaint is upheld, the content must be "swiftly" reinstated.

All this lays the ground for some highly contentious user complaints. Regulated services, particularly those with Category 1 status, will need to take some difficult decisions on whether content should be removed and the extent to which regard should be had to the freedom of expression rights in play.

This is bound to leave one or more parties aggrieved. And while it is true that Ofcom will not investigate individual pieces of content or arbitrate in individual cases, it seems likely that individual cases will provide the impetus for complaints to Ofcom by disgruntled users. These complaints may in turn prompt Ofcom to scrutinise the regulated service's systems and processes and, ultimately, to take regulatory action, particularly if the complaints relate to high-profile content that garners substantial media attention.

* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
