OSB in focus: will the Online Safety Bill increase the scope for actions brought by individuals against online platforms?

Written on 2 Jul 2021

Draft legislation designed to make the UK the safest place to be online (while defending freedom of expression) requires difficult decisions from regulated services that could turn aggrieved users to litigation

The forthcoming online safety legislation will pose huge challenges for "regulated services" that fall within its scope, including significant compliance costs and the ever-present threat of regulatory action. Ofcom, the nominated regulator, will have extensive powers, including to levy fines of up to £18 million or 10% of global annual turnover, whichever is greater. But to what extent does the legislation pave the way for civil actions and complaints by internet users against the operators of regulated services?

Initial indications from the government

Prior to publication of the Online Safety Bill, there were some positive noises from the government in this regard. In its full response to the Online Harms White Paper, the government made clear that:

  • The new regulatory framework will not establish new avenues for individuals to sue companies.
  • "Illegal" content would not cover material that only gives rise to a risk of civil liability (for example, defamation). Harms resulting from breaches of intellectual property rights and data protection legislation were to be specifically excluded from scope.
  • Robust protections were promised for journalistic content. Ofcom would not be given the mandate of investigating individual pieces of content or arbitrating on individual cases (compare this with the Information Commissioner's Office's role in investigating individual alleged breaches of the General Data Protection Regulation (GDPR)). What is more, the government does not intend to establish an independent dispute resolution mechanism.
  • The intermediary defences for mere conduits, caching providers and hosts in Regulations 17-19 of the E-Commerce (EC Directive) Regulations 2002 (the E-Commerce Regulations) would remain in force.

This provided some cause for optimism for regulated services that they will not face costly litigation or long-running complaints by individuals as a result of the new online safety regime. But analysing the position now that the Bill has been published, the picture is far less rosy.

Handling user complaints

To start with, regulated services must have systems and processes in place that allow users to "easily" report content or otherwise complain about it. This will not only have the practical effect of making it easier to fix internet platforms with knowledge of illegal content but will also heighten expectations as to the speed of removal, with consequences for their ability to rely on the intermediary liability defences.

Next, while many regulated services will only be concerned with "illegal content" (that is, content that breaches criminal law), larger/higher-risk services with so-called "Category 1" status, as well as those services "likely to be accessed by children", will have to grapple with the more slippery notion of "harmful content" — content that may be legal but harmful.

Content is to be regarded as "harmful" where the regulated service has reasonable grounds to believe that the nature of the content is such that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact on (as appropriate) an adult or child of ordinary sensibilities.

Would severe distress (often found to have been suffered by victims of privacy intrusions or serious libels) in some cases amount to a "significant adverse physical or psychological impact"? If so, then this potentially opens the door to Category 1 regulated services, and services likely to be accessed by children, being deluged with user complaints over all manner of content.

To add balance, all regulated services will be under a duty to have regard to the importance of freedom of expression. Category 1 services will also be under additional duties to "take account of" the importance of "content of democratic importance" and journalistic content. These are extremely broad concepts that are notoriously difficult to apply in practice.

In practical terms, this means that when a Category 1 service removes content, and where the creator of or user who uploaded the content considers it to be journalistic content, the regulated service must make a "dedicated and expedited complaints procedure" open to that person. If such a complaint is upheld, then the content must be "swiftly" reinstated.

All this lays the ground for some highly contentious user complaints. Regulated services, particularly those with Category 1 status, will need to take some difficult decisions on whether content should be removed and the extent to which regard should be had of the freedom of expression rights in play.

This is bound to leave one or more parties aggrieved. And, while it is true that Ofcom will not investigate individual pieces of content or arbitrate in individual cases, it seems likely that individual cases will provide the impetus for complaints to Ofcom by disgruntled users. These complaints may in turn prompt Ofcom to start looking at the regulated service's systems and processes and, ultimately, lead to regulatory action, particularly if the complaints relate to high-profile content that garners substantial media attention.

Litigation risk

It may be that aggrieved users will wish to litigate their disputes. While the Bill does not create any freestanding causes of action or rights of compensation, the government is perhaps optimistic in thinking that the legislation will not lead to more litigation.

Once enacted, it seems only a matter of time before a regulated service (particularly a Category 1 service or a service likely to be accessed by children) is the subject of civil action in which it is accused of a failure to comply with one or more provisions of the legislation. A regulatory finding that a service provider has failed to put in place appropriate systems to deal with certain types of complaint (including the complaint that is the subject of legal action) will clearly not help the service provider’s attempts to rely on the intermediary liability defences.

The Bill contains detailed requirements for the “terms of service” of regulated services. For instance, all regulated services are under a duty to specify in their terms of service how individuals are to be protected from illegal content. This includes addressing how the regulated service will meet its obligations in clause 9(3) of the Bill to minimise the availability of illegal content and to swiftly take down such content once it becomes aware of it. There are additional requirements for the terms of service of Category 1 services, and services likely to be accessed by children.

If the terms of service are to be regarded as a contract between the regulated service and its users then, in principle, a user could bring a breach of contract claim over any alleged failure by the regulated service to comply with its own terms of service or the duties set out in the Bill.

Next, there’s negligence – the legal arena in which the concept of a “duty of care” is most often applied. The Bill imposes various duties on regulated services to protect their users. Given that the government was at pains to stress that the legislation should not create additional avenues by which individuals can sue companies, it would appear that these duties are not intended to be “duties of care” in the tortious sense. But this seems ripe for challenge: could a victim of some form of serious online harm not allege (in certain circumstances) that the regulated service concerned was negligent in failing to comply with its duties under the legislation? Claimant lawyers are almost certain to try these arguments, just as they run “duty of care” arguments under the GDPR (despite the GDPR already providing for a specific right to compensation).

Elsewhere, the Bill provides that the Ofcom codes of practice (yet to be published) will be admissible in court proceedings, and that in “any proceedings” a court must take into account a provision of a code of practice where that provision appears relevant to a question which the court is determining.

The Bill also requires a significant number of documents to be produced and published by regulated services. Not only terms of service, but various risk assessments and impact assessments as well as annual “transparency reports”. This will provide detailed information on the inner workings of these companies, and it seems inevitable that these documents will be deployed in future litigation against regulated services, possibly even as the subject of attempts to seek pre-action disclosure.

Finally, both the E-Commerce Directive and the E-Commerce Regulations in the UK provide good defences to claims for compensation but have never been defences to injunctions. In that respect, post-Brexit, service providers in the UK can no longer rely on the protection against states imposing general monitoring obligations under Article 15 of the E-Commerce Directive. This provision provided an important defence against attempts to seek wide-ranging blocking injunctions. However, it was never carried across to the E-Commerce Regulations, and so it is no longer part of English law now that the Directive has fallen away. Service providers may wish to seek additional protection in this area.

Osborne Clarke comment

The UK government’s commitment to introduce legislation to make the UK the safest place in the world to be online but at the same time defend freedom of expression was always going to be a tall order.

The Bill, as currently drafted, will mean that regulated services need to make difficult decisions when assessing content on their platforms. Inevitably, this will leave some users and content creators unhappy with the decisions made. This will not only create significant additional work for regulated services, but will likely see aggrieved parties looking to test the courts’ willingness to adjudicate on matters relating to this legislation.

Claimant lawyers, especially those seeking to bring group claims backed by litigation funders, are becoming ever more creative in pushing the boundaries of the law (and racking up costs in the process). Unless the drafting of the Bill is tightened up, the Online Safety Bill could well have many unintended litigation consequences.