Online Safety

Online harms regulation | Clarity awaited but reforms set to be delayed

Published on 11th May 2020


The proliferation of COVID-19 related disinformation has heightened the UK government's desire to tackle a whole variety of online harms. The full consultation response on a proposed Online Harms Bill, expected imminently, should shed some light on the government's thinking. With the legislation slated to be published in draft before it is introduced to parliament, however, and amid the many competing challenges the country is facing, the reforms look set to be pushed into 2021, or potentially even further down the line.

We set out a quick recap below, along with some of the major issues to watch out for.

Where have we got to so far?

Since sketching out its plans for the regulation of platforms and content in the UK last year, the government has provided further indications of what the future Online Harms Bill will look like in its Initial Consultation Response, published on 12 February 2020. Before the Parliamentary Committee for Digital, Culture, Media and Sport on 22 April 2020, the new DCMS Secretary of State, Oliver Dowden, suggested that he was considering carrying out pre-legislative scrutiny on the Online Harms Bill, leading some to conclude that the legislation will not actually be introduced until the next parliamentary session (potentially as late as 2022/23).

It is clear that the government is under considerable pressure to bring these rules into effect as soon as possible, with several MPs pressing for the publication of draft legislation and clarity over a date for its introduction. However, the legislative demands posed by COVID-19 are likely to eat into parliamentary time, so it is possible that legislation will not be introduced until early 2021. Given that the legislative agenda was already packed (with much delayed as a result of Brexit), we could even see some bills postponed further in the wake of COVID-19.

If pre-legislative scrutiny gets underway before the end of the year, we will see the publication of a draft bill accompanied by an invitation for industry to comment on its contents. This would be an opportunity for affected businesses to start focussing in on the detail and making representations.

What do we know so far about the Online Harms Bill?

The online harms white paper gives an indication of the main parameters of the proposed legislation:

  • Companies will have to demonstrate adherence to the new statutory "duty of care" by complying with Codes of Practice in relation to different types of online harms. The Codes will set the applicable "standard of care" to which a court would hold a company in the event of action taken by the regulator.
  • While voluntary, the interim codes of practice relating to online terrorism and child sexual exploitation (to be published soon alongside the full consultation response), are intended to enable industry to gear up its compliance in advance of the codes becoming the demonstrable (and enforceable) legal standard.
  • The existing broadcasting regulator, Ofcom, will be endowed with new powers to enforce the rules – with the government already addressing challenges from some opposition MPs to ensure that the powers go far enough and are 'future-proofed'.
  • The legislation will operate on twin tracks, with one set of standards for illegal content and another for "potentially harmful content".
  • In terms of regulatory sanctions, the government confirmed in the Commons that 'nothing is off the table', including directors' liability. The interim consultation response already confirmed that sanctions could extend to 'substantial fines', business disruption activities, and ISP blocking.
  • The government has also confirmed that Ofcom's new functions would initially be publicly financed, but that over time they expected to put into place an industry-financed scheme similar to those already operating in other regulated sectors, with further details in the full consultation response.

Questions to be answered

It is highly likely that the UK will lead the charge on this type of code of conduct-driven legislation and that we will see other countries take a similar approach. International platforms will therefore be keeping a very close eye on it and thinking about how international moderation, take-down procedures, and transparency will need to change. In that respect, how the various harms are defined will be critical, as standards of lawfulness and morality vary hugely across borders.

There are also a number of tricky legal issues that will be ever-present across all of the areas of online harms. Most notable is how the new codes of conduct and duty of care will sit alongside the protections from liability for internet platforms contained in Articles 12-14 of the E-Commerce Directive 2000, implemented into English law by the E-Commerce (EC Directive) Regulations 2002. Whilst the government has been keen to stress that such protections will remain in place, there is no doubt that the new duty of care will accelerate the pace with which platforms put themselves on notice of unlawful content and heighten expectations as to the speed with which unlawful content is removed. This is good news for obviously unlawful content, but it presents difficulties for the millions of items of online content that sit in the grey area between lawful and unlawful – some of which are the subject of litigation.

We will also be hearing much about the extent to which internet platforms are obligated to use technological measures to screen content, including machine learning, and how such obligations sit alongside the protection in Article 15 of the E-Commerce Directive, which prevents Member States from imposing general monitoring obligations on internet platforms.

Finally, while much of the debate will centre around the major internet platforms, the voice of smaller internet platforms will need to be heard. Despite the government's claim that the legislation will only capture 5% of UK businesses, the knock-on effects of the proposed legislation are likely to be extremely wide. The duty of care is intended to apply to all companies that "allow, enable, facilitate users to share or discover user-generated content, or interact with each other online." Proportionality will be repeatedly stressed, but some small internet businesses will not be ready for the complexity that is around the corner.


* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
