04 March 2025

Will the UK Government mandate that AI users mark their content as AI generated?

Commentary by Managing Associate Adam Gilbertson has been featured in The Global Legal Post, Solicitor’s Journal and The Trademark Lawyer as he responds to the latest UK Government AI Consultation regarding the implications of marking AI generated content.

Read the extended press release below.


Will the UK Government mandate that AI users mark their content as AI generated?

The UK Government should seriously consider mandating that AI users mark their content as AI generated, says leading intellectual property law firm Mathys & Squire in response to the UK Government's Copyright and Artificial Intelligence consultation, which closes on 25 February.

In December 2024 the UK Government launched a consultation, Copyright and Artificial Intelligence, on how copyright law should be changed to take account of the rise of generative AI (Gen-AI). The document asks respondents whether AI-generated material should be flagged as such – and whether the Government should be responsible for regulating that.

Currently, there is no obligation for material generated using artificial intelligence to be marked as such.

Adam Gilbertson, Managing Associate at Mathys & Squire, says: “As content generated using Gen-AI becomes increasingly indistinguishable from genuine human-made material, in many cases it is in the public interest for people to know which material has been generated by AI so that they can form a properly informed opinion about it.”

That applies to a whole range of content, from Gen-AI-created reports, articles and newspaper stories through to Gen-AI images, and not just for copyright purposes but also for wider concerns about authenticity and bias.

UK Government must balance its responsibility to protect copyright holders with the need to drive AI R&D in the UK

In the Copyright and Artificial Intelligence consultation, the Government indicates that it favours an ‘opt-out’ model, under which copyright owners must opt out to make their material unavailable for AI training, similar to the EU’s current text and data mining (TDM) exception. This would mean copyright-protected material can be used to train AI for commercial purposes unless the copyright owner makes clear (in some way) that they have ‘reserved their rights’ to such material.

Adam Gilbertson says this is a sensible middle ground that should help ensure content creators can seek fair remuneration for use of their copyright-protected works, whilst providing a safe harbour for AI developers and helping to make the UK a welcoming environment for AI R&D without too much red tape.

Government must standardise how copyright holders opt out of AI models training on their data

Adam Gilbertson also recommends that the Government standardise the way in which copyright holders opt out of such an exception and reserve their rights, to help avoid potential disputes over what material is and is not available for commercial use.

Adam Gilbertson says: “There is a big question mark over how to implement such an opt-out in practice to avoid confusion over what material is and is not freely available for use. Lessons should be learnt from the uncertainty and disputes that have arisen in the EU, due to a lack of standardisation, over what counts as a valid opt-out of the EU’s TDM exception. If the UK goes down this route, some form of standardisation would help provide greater legal certainty and make it easier for AI companies to operate in the UK.”