Digital Regulation | UK Regulatory Outlook November 2024
Published on 27 November 2024
Government consults on introducing sanctions for senior executives of online platforms and marketplaces for failure to remove content on weapons and knives | Ofcom explains how the OSA will apply to generative AI and chatbots | Ofcom calls for evidence to inform its report on researchers' access to information from regulated services under the OSA
UK updates
Government consults on introducing sanctions for senior executives of online platforms and marketplaces for failure to remove content on weapons and knives
The government is consulting on introducing personal liability measures for senior executives of online platforms and marketplaces who fail to remove illegal content relating to knives and offensive weapons.
Existing laws already make it a criminal offence to manufacture, sell and offer for sale prohibited offensive weapons, and to market knives. The Online Safety Act 2023 (OSA) also requires platforms to remove illegal content when they become aware of it and to protect children from harmful and age-inappropriate content.
However, the government is concerned that social media platforms are being used to sell prohibited weapons and knives, including to under-18s, in ways that encourage violence. It therefore believes that stronger action is needed in this area.
The government proposes giving the police the power to issue content removal notices to companies and designated senior executives, requiring the removal of illegal content within 48 hours. If the company fails to remove the content within the time limit, a second content removal notice would be sent to the senior executive. Continued non-compliance would result in a notice of intent being sent to the senior executive, stating that legal action will be taken against them if they fail to comply. The senior executive would have 28 days to object.
Non-compliant senior executives would then face civil action and the possibility of a fine of up to £10,000. The consultation closes on 11 December 2024.
Ofcom explains how the OSA will apply to generative AI and chatbots
Ofcom has published an open letter to UK online service providers on how the OSA will apply to generative AI and chatbots. Ofcom reminds providers that the following AI tools and content will be in scope of the OSA: user-to-user services, search services and pornographic material.
User-to-user services:
- Sites or apps which include a chatbot enabling users to share text, images or videos generated by the chatbot with other users.
- Services allowing users to upload or create their own chatbots (user chatbots), which are then made available to other users. Any content created by these chatbots is "user-generated content" and is regulated by the OSA.
- Any AI-generated content shared by users on a user-to-user service is user-generated content and would be regulated in the same way as human-generated content (for example, deepfake and human-generated fraud material). This applies regardless of whether the content was created on the platform where it is shared or uploaded from another site.
Search services: generative AI tools that enable the search of multiple websites and/or databases, including tools that modify or facilitate the delivery of search results, or which provide "live" internet results.
Pornographic material: sites and apps that include generative AI tools that can generate pornographic material. These services are required to use highly effective age assurance measures to ensure children cannot normally access such material.
Ofcom is ready to enforce and urges regulated services to start preparing for compliance now, as the first set of duties under the OSA will take effect in December 2024. See our recent Insight on Ofcom's OSA implementation roadmap.
Ofcom calls for evidence to inform its report on researchers' access to information from regulated services under the OSA
The OSA requires Ofcom to report on the ways and extent to which independent researchers access information on online safety matters from providers of regulated services. Ofcom wants to understand how researchers currently obtain information from providers, the challenges they encounter, and how greater access to the information might be achieved. The findings from this call for evidence will inform Ofcom's report. The deadline for responses is 17 January 2025.
UK government publishes draft Statement of Strategic Priorities for online safety under OSA
The government has set out its focus areas for online safety under the OSA. The draft Statement of Strategic Priorities for online safety, produced under section 172 of the OSA, highlights five priorities that Ofcom must have regard to when implementing the OSA and that industry is expected to adhere to.
The government will prioritise:
- Implementing safety by design to deliver safe online experiences for all users, especially children. The government wants industry to see this as a basic principle for operating in the UK market.
- Increasing industry transparency and accountability, including algorithmic transparency, to create a "culture of candour".
- Delivering an agile approach to regulation to keep pace with emerging technology and behaviour. The government wants Ofcom to design a "forward-looking" approach that can quickly mitigate significant risks that emerge.
- Developing an inclusive and resilient online society of well-informed users through Ofcom media literacy initiatives, interventions and research, as well as industry adoption of best practice principles for "literacy by design".
- Fostering the innovation of online safety technologies, including effective age assurance technologies.
Before finalising the statement next year, the government will seek additional input from online safety experts, individuals with lived experience of online harms and Ofcom. If it becomes clear that new legislation is the only way to solve certain issues, the government will consider taking this step, but its aim is to deliver results within the OSA's current provisions.
With the first set of online safety duties under the OSA due to commence in December 2024, now is the time for businesses to start bringing their practices into compliance.
Sharing intimate images made a priority offence under OSA
The Online Safety Act 2023 (Priority Offences) (Amendment) Regulations 2024, which make the new offence of sharing intimate images without consent a priority offence under the OSA, were made on 19 November 2024. See this Regulatory Outlook for more information.
The regulations come into force on 10 December 2024.
New inquiry into links between algorithms, genAI and the spread of harmful content online
The House of Commons Science, Innovation and Technology Committee has launched an inquiry to examine the connection between algorithms used by social media and search engines, generative AI, and the spread of harmful and false content online. The inquiry responds to the widespread dissemination of misinformation seen last summer, and to the online safety regulator, Ofcom, recently saying that algorithmic recommendations can contribute to "divisive narratives in a crisis period".
The inquiry will also assess whether existing and proposed regulations governing social media algorithms and generative AI, including the OSA, are effective, and whether additional measures are necessary. The inquiry is currently inviting written evidence to be submitted by 18 December 2024.
EU updates
EU Commission adopts implementing regulation on transparency reporting under the DSA
The implementing regulation standardises templates and reporting periods for the transparency reports that providers of intermediary services have to publish under the Digital Services Act (DSA) in relation to their content moderation practices.
Very large online platforms (VLOPs) and very large online search engines (VLOSEs) must report twice a year, while other providers must report annually.
The DSA details the specific categories of information that the transparency reports must contain, such as the number of content items and user accounts removed, the accuracy of any automated systems used and information on content moderation teams. The standardisation provisions in the new implementing regulation should simplify compliance for providers and ensure consistency in reporting practices so that comparisons can be made.
Under the implementing regulation, providers must start collecting data in line with the templates from 1 July 2025, with the first harmonised reports due at the beginning of 2026.
The Commission also plans to update the requirements for submitting statements of reasons to the DSA Transparency Database so that they align with the implementing regulation.
EU Commission consultation on draft delegated regulation on rules for researchers to access online platform data under the DSA
Under article 40(4) of the DSA, VLOPs and VLOSEs have to allow "vetted researchers" (who meet the relevant requirements in the DSA) to access their data, subject to approval from their Digital Services Coordinator, for the purposes of evaluating systemic risks and mitigation measures. The proposed regulation will further specify the procedures, conditions and purposes for such data sharing and use.
The consultation opened on 29 October and has been extended to 10 December 2024. The Commission plans to adopt the rules in the first quarter of 2025.