Online Safety

Online Harms: UK government confirms new Online Safety Bill

Published on 17th Dec 2020

The UK government has confirmed that it will introduce an Online Safety Bill in 2021, setting out a strict new regime requiring the removal of illegal content online, including terrorist material, child sexual abuse, the promotion of suicide and cyberbullying.


The full consultation response broadly follows the government's previous proposals, outlined in an April 2019 White Paper and a subsequent initial consultation response published in February 2020, but provides further certainty as to the upcoming changes.

Who will the rules apply to?

The Bill will impose a duty of care on companies that:

  • host user-generated content which can be accessed in the UK; and
  • facilitate public or private interaction between users (one or more of whom is in the UK).

Social media platforms, dating apps, search engines, online marketplaces, P2P services, online forums and video games will therefore all be caught by the Bill. Low-risk businesses with limited functionality, as well as newspaper and broadcaster websites, will be exempt.

Importantly, the government has confirmed that Ofcom's powers under the Bill will have extraterritorial reach: the rules will apply to companies' overseas operations where they relate to services accessible by users situated in the UK.

What measures will companies be subject to?

Confirming earlier proposals, the legislation will place these companies under a statutory duty of care to protect users from illegal material, and to implement measures for the reporting and removal of content that is harmful (but not illegal). The Bill will also include powers for the government to make more detailed rules, via secondary legislation, on the most serious categories of harmful content, such as child sexual abuse, terrorism and violent content.

Companies will be categorised into two tiers according to the size of their online presence and the level of risk posed on the platform. Category 1 companies will likely include large household-name social media companies. As well as the duty to address relevant illegal content and content which is harmful to children, Category 1 companies will also be under a duty to take action against content which, while strictly legal, may be harmful. This will not be a requirement for Category 2 companies. The legislation will define content as harmful where it 'gives rise to a reasonably foreseeable risk of a significant adverse physical or psychological impact on individuals'.

Category 1 companies will also be under a legal requirement to publish transparency reports on the measures they have taken to tackle online harms.

Companies will be able to fulfil their duty of care by complying with statutory codes of practice published by Ofcom. This will involve the implementation of systems and processes to improve users' safety, such as specific user tools, content moderation, recommendation procedures, and reporting and redress mechanisms.

The government is promising to publish interim codes of practice on terrorism and child sexual exploitation and abuse which, whilst voluntary, will help companies to understand the changes they need to make before the publication of the statutory codes of practice.

One measure announced to address concerns regarding freedom of expression is that companies will need to implement effective complaint mechanisms to enable users to object if they feel their content has been unfairly removed.

What are the sanctions for non-compliance?

Ofcom, the media and communications regulator, will be responsible for enforcing the rules and will have the power to impose fines for non-compliance of up to 10% of a company's annual turnover or £18 million (whichever is higher).

The regulator will also be able to take enforcement action requiring providers to withdraw access to key services. For serious failures of the duty of care, Ofcom will have the power to block a company's services from being available in the UK entirely.

Earlier proposals also included the possibility of imposing criminal penalties on senior executives for failure to comply with the duty of care generally. While the government has chosen not to pursue such broad sanctions, it has decided to include a power to introduce criminal offences at a later date, via secondary legislation, where senior managers 'fail to respond fully, accurately, and in a timely manner, to information requests from the online harms regulator'. The power will expire after two years, and the government has said it will exercise it only if, on review of the new regime in its first year, it is apparent that industry has not complied with the new information-sharing requirements.

Osborne Clarke view

The 10% figure for fines will clearly be the headline from this latest development, as it dwarfs even the most serious GDPR fines. Nevertheless, the full response largely confirms earlier proposals. Despite some delays, we now have more certainty as to timings, with the introduction of a Bill in 2021.

There will still be much to debate when the Bill is published. Serious questions of free speech and technical implementation remain to be addressed in relation to the government's intention to police lawful content, an avenue that the European Commission is avoiding in its plans for the Digital Services Act, further details of which were published on the same day. This will put a post-Brexit UK out of kilter with Europe and give rise to countless moderation headaches.


* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
