Opted out: UK government moves away from preferred position on AI and copyright following report
Published on 25th March 2026
No concrete policy decisions made; wide text and data mining copyright exception with opt-out no longer preferred approach; more information gathering to follow
At a glance
Wide text and data mining copyright exception with rightsholder opt-out no longer the government's preferred approach to AI training: all options remain on the table.
Further information gathering expected in relation to transparency, licensing, enforcement and copyright protection for computer-generated works.
Consultation on digital replicas and the potential introduction of a new personality right to take place this summer.
The UK government has published its long-awaited report on copyright and artificial intelligence (AI), alongside an economic impact assessment of the policy options, as required by the Data (Use and Access) Act 2025. The report follows the government's 2024 consultation, which attracted over 11,500 responses from interested stakeholders.
The government's initial preferred policy option of introducing a broad text and data mining (TDM) exception permitting TDM for commercial purposes subject to a rightsholder opt-out (similar to the position in the EU) was met with intense criticism from the creative industries. The government has stepped back from this but has not indicated what its preferred approach now is.
The cautious report notes that the government must take time to "get this right" and that it will not introduce legislative reforms unless they will meet its objectives of protecting the UK's creative industries and "unlocking the extraordinary potential of AI" for economic growth and social improvement.
In short, the government has kicked the can down the road on policy decisions. In many instances, it indicates that it wants to collect more evidence and monitor how ongoing litigation unfolds and what action is taken elsewhere. However, this "watch and wait" strategy could undermine the UK's position as a leading AI jurisdiction, and the lack of certainty presents risks for both the creative industries and the AI sector.
The dropping of any immediate plans for a commercial TDM exception will be cheered by many in the creative industries. However, the lack of substantive detail in the report creates legal uncertainty for both AI developers and the creative industries. AI developers lack a clear understanding of what their liability might be in the UK and will likely develop and train their models elsewhere. Meanwhile, rightsholders face enforcement issues, particularly when AI systems are trained in permissive foreign jurisdictions and subsequently used in the UK (as illustrated in the Getty v Stability AI litigation, which is on appeal).
AI training
No favoured policy option
The 2024 consultation set out four policy options, including the government's preferred option of introducing a broad TDM exception with a rightsholder opt-out mechanism and transparency requirements. However, following "strong opposition…from many creators and the creative industries", this is now "no longer the government's preferred way forward".
The report noted that there is "considerable uncertainty" about the outcomes of this approach and the availability of technical measures to ensure rightsholders are able to effectively opt out. The availability of technical measures is an issue that has also been raised in the EU, where the Commission is considering standardisation of opt-out methods due to reported difficulties with rights reservation.
At this stage, the government does not state what its new preferred policy approach is, merely that it will "give consideration to alternative approaches". Whether this will include reconsidering some of the other options from the consultation or new unspecified options is not yet clear.
The government also commits to gathering more evidence on "how copyright laws are impacting the development and deployment of AI across the economy and the economic benefits of reform". How this will be done without duplicating information received in the 11,500 responses to its previous consultation is unclear, as is the timeframe for undertaking this work. Nevertheless, once this additional evidence has been gathered, the government will consider whether "specific interventions" are needed.
Input transparency
The report notes that AI developers currently take varied approaches to transparency about training data, but that providing this information can help developers demonstrate their compliance with copyright law and help rightsholders assert their rights.
In response to the consultation, the creative industries "strongly supported" the introduction of mandatory standards on transparency, whereas technology companies supported transparency but maintained that commitments should be "high-level and industry-led". There were mixed views on whether transparency approaches from other jurisdictions should be followed.
Accordingly, the government has proposed to continue to monitor the effect of transparency rules in other countries. It has also pledged to work with industry and other experts to develop transparency best practice – this will include working with AI developers building systems outside of the UK.
Output transparency
The report acknowledged that there was "positive engagement" on the issue of output transparency, both in the consultation responses and in the government's subsequent working groups. The government proposes to continue working with industry to explore "good practice" on labelling AI-generated content, with the aim that labelling is balanced and effective and helps establish consumer confidence and public trust in AI outputs.
It will also monitor international developments to avoid duplicating work that will "help set global standards" and it will work with international partners to support the development of "common solutions". Again, no timeframes are attached to any of this work.
Technical tools and standards
The report welcomed the "pace of development of technical tools and standards" since the 2024 consultation that support rightsholders in enforcing their rights, managing permissions and licensing their content for AI training and development.
The government believes that it should not "stand in the way of the innovation that is taking place" and sees its role as supporting industry to develop best practice and encourage the uptake of technical tools and standards. The need for regulation in this area will be kept under review, with the government monitoring the effectiveness of approaches in other countries.
Licensing
Positively for rightsholders, a key takeaway from the report is the government's clear statement that rightsholders "should be fairly remunerated" for the value creative materials add in the AI supply chain. The report notes that the licensing market for AI training materials in the UK is "new and growing" and there is not sufficient evidence to justify government intervention at this stage.
The government suggests that its proposals to work with industry to develop best practice on input transparency and on technical tools and standards might have "positive outcomes" in relation to licensing. Market-led licensing approaches will be kept under review, including the effects on rightsholders, AI developers and users of AI systems who are individuals or small or medium-sized enterprises (SMEs).
With respect to AI systems developed outside of the UK, the government proposes to continue monitoring global developments and litigation in the UK and elsewhere, including how secondary liability may apply to imported AI models placed on the UK market. This will be an issue considered by the Court of Appeal in the Getty v Stability AI appeal.
Enforcement
The government states that it has been consistently clear about the "importance of effective, proportionate copyright enforcement". Again, it proposes to continue to work with partners across relevant sectors, as well as with law enforcement and the judiciary, to ensure that the UK enforcement framework provides "effective and accessible routes to redress", including where AI systems are developed outside of the UK.
The government concluded that the UK framework for enforcing copyright is "effective and capable of adapting to developments in AI". However, as wider policy decisions on AI and copyright are considered, it will undertake a programme of work to consider ways of enforcing requirements and restrictions relating to the use of copyright works to train AI systems, whether those systems are developed within or outside the UK. Particular focus will be given to individuals, SMEs and micro businesses with fewer resources when considering the impact of any policy proposals.
Computer-generated works
In a positive step for creatives, the report states that copyright should "incentivise and protect human creativity". In the 2024 consultation, the government was clear that its preferred approach was to remove the copyright protection currently provided for computer-generated works (CGWs), unless evidence of positive effects of the protection emerged from the consultation responses.
Despite noting that the consultation showed "minimal evidence that CGWs protection is being used or has significant economic effect", the government has still decided to wait and monitor the area before taking action to remove the protection. This is an unusual approach given that there was limited support for CGWs protection from both the creative industries and the AI sector. It highlights the government's hesitancy to make any move that might make the UK a less attractive jurisdiction – to either the creative industries or the AI sector.
Digital replicas
Unlike other jurisdictions such as France, Germany and the USA, the UK does not currently have a personality right, and there is no single IP right that provides protection against digital replicas or deepfakes. A combination of rights, including copyright, performers' rights, registered trade marks and passing off, may be applicable if an individual wishes to commercialise elements of their personal likeness or seeks redress for unauthorised use, but there are gaps in what is protected. Famous individuals, such as darts player Luke Littler and footballer Cole Palmer, have increasingly been applying for registered trade mark protection to attempt to combat deepfakes, but this does not provide a complete solution to the issue.
The report notes that "realistic impersonation through AI creates new risks – as well as opportunities – for artists and the general public". There were strong concerns in the consultation responses that the current legal framework is not "sufficiently robust to deal with unauthorised digital replicas".
The government proposes to explore options to address these risks and concerns, while promoting growth and innovation. This will include considering whether a new personality right might be appropriate. A(nother) consultation on this issue is set to be launched this summer, so progress is likely to be relatively slow.
Osborne Clarke comment
While the report and economic impact assessment have a combined length of nearly 200 pages, they give little detail on the policy approach the government intends to take to deal with the many issues surrounding AI and copyright. Indeed, the government has stepped back from its previously stated preferred approach to the use of copyright works to train AI models.
This is perhaps unsurprising given the complexity of the issues at play and the lack of consensus across industries. However, the government's "wait and see" approach risks upsetting both creative industries and the AI sector, leaving all players with a lack of legal certainty.
The government is hesitant to make a move that will undermine either the creative industries or the AI sector, both of which are extremely important to the UK economy. However, the lack of certainty that results from this delaying approach risks making the UK an unattractive location to develop and train AI systems, which then poses enforcement risks for rightsholders.
A market-led approach to licensing could potentially lead to the biggest players securing licensing deals (some of which we have already seen), but leave smaller AI developers without a clear route to quality datasets, and rightsholders without appropriate recompense for the use of their copyright works.
For the time being, it looks like the UK will be monitoring and potentially following other jurisdictions rather than leading on these important issues that will shape the future of the AI industry. While this reflects the political difficulty of the situation, it is a particularly unattractive outcome given the emphasis the government has placed on AI to generate economic growth.