As we have detailed previously, in December the UK Government launched a consultation into the implications of artificial intelligence for copyright law. While the consultation aims to address the tension between copyright and AI across multiple industries, the video games sector – which is both a significant copyright owner and a significant user of AI – is uniquely placed to provide evidence and input into the consultation, and to be affected by any resulting reforms.
In this article, we analyse how the UK Copyright and AI consultation may influence the future of the games industry and raise key considerations for those weighing up its potential impact on their business.
What is the UK AI and Copyright consultation?
As set out in our previous article, the Copyright and AI consultation has three aims:
- Enhancing right holders' control of their content and ability to be remunerated for its use;
- Supporting the development of world-leading AI models in the UK by ensuring wide and lawful access to high-quality data; and
- Promoting greater trust and transparency between the sectors.
While the Government is consulting on a range of policy options, it has strongly signalled the 'broad parameters' of its preferred approach and has outlined a package of measures which it believes will meet its three aims. This preferred approach includes:
- a new commercial text-and-data mining ('TDM') exception, permitting the use of copyright protected works for AI training purposes, provided the rightsholder has not "reserved their rights" (i.e. opted out); and
- enhanced transparency requirements aimed at increasing trust and accountability in AI training practices. These could include requirements for those engaging in TDM to disclose the use of specific works and datasets, keep records, or to evidence compliance with rights reservations.
Other recommendations include the following:
- A review of contracts and licensing to improve clarity and consistency for rightsholders and AI developers;
- Revisiting the protection of "computer-generated works" under s9(3) of the CDPA;
- A review of the current "temporary copies" exception in relation to AI training;
- Labelling of AI output; and
- Consideration of legal protections to address digital replicas and deepfakes.
Before considering the implications of the consultation for the video game sector, it is worth noting further developments in relation to AI and copyright. On 13 January, the UK government published the AI Opportunities Action Plan it commissioned following the election, and simultaneously announced that it would endorse and take forward the recommendations made in the Action Plan. Relevantly, the Action Plan contained a recommendation to "Reform the UK text and data mining regime so that it is at least as competitive as the EU". While the formal Government response to the Action Plan merely notes that "The government has launched a consultation" on the TDM issue, some commentators have expressed concern that the Government has effectively pre-empted the outcome of the consultation and is strongly committed to its proposed preferred approach of introducing a new commercial text and data mining exception.
More recently, on 28 January, the House of Lords voted to approve Baroness Kidron's proposed amendments to the Data (Use and Access) Bill which, among other things, would require overseas generative AI companies to respect UK copyright law if they sell their models in the UK. This vote represented a surprise defeat for the Government, with members of the House of Lords from across the political spectrum expressing their concerns during the debate about the Government's proposed approach to the commercial TDM exception. While Baroness Kidron's proposed amendments may be removed from the Data (Use and Access) Bill when it returns to the House of Commons, the vote in the House of Lords suggests that the legislative pathway for the Government in introducing its preferred TDM exception option following the consultation may not be straightforward. We have written more generally about the Data (Use and Access) Bill here.
How is AI currently used in video game development?
The video games industry is a well-established user of different forms of AI in game development and production. Primitive forms of AI were used in the very earliest interactive electronic games. For example, in 1951 the Nimrod computer used an early form of AI known as programmed behaviour to play a game of Nim against a human. By 1972, programmed behaviour techniques had advanced to the point where they could control the paddle of a computer opponent in Pong. By the 1980s this form of AI was able to control the movement and behaviours of individual NPCs, such as the ghosts in Pac-Man. The 2000s saw further advancements in AI techniques, enabling the simulation of the lives and interactions of virtual characters in games like The Sims. AI, including generative AI, is now more widely adopted in how games are both designed and played, with notable examples including the following:
- Middle-earth: Shadow of Mordor (2014): introduced the Nemesis System, in which unique, AI-generated enemies remember their past encounters with the player and adapt their behaviour accordingly.
- No Man's Sky (2016): used procedural generation to create vast worlds and ecosystems including landscapes, plants and animals, which can evolve and change.
- Unity's Muse (2024): a generative AI tool integrated directly into the Unity Editor that allows developers to generate prototype textures, code, animations, 2D art and character interactions using simple text-based prompts.
While video game development has long used AI tools for uses such as detecting cheating, procedural content generation or scripting enemy or NPC behaviour, the use of generative AI tools in video game development has been more controversial. 30% of respondents to the GDC's 2025 State of the Game Industry survey reported a belief that generative AI is having a negative impact on the games industry. Developers pointed to concerns surrounding intellectual property infringement, energy consumption, the quality of AI-generated content, potential biases and regulatory issues. The GDC survey for 2024 reflected concerns about a potential homogenisation of content, job displacement and the potential for AI to stifle creativity and innovation within the industry. Despite these concerns, it is expected that the use of generative AI in game development is likely to continue to grow, particularly given rising game production costs and broader financial pressures on the sector.
It is evident that there is a dividing line between the adoption of AI in general and the perception of generative AI within parts of the industry. However, as copyright material is one of the most important assets for a game, and as the use of generative AI is widespread in the sector, it is important for the video game sector to participate in this consultation.
How may this consultation impact my game?
Below we highlight the key areas of the consultation with the potential for the greatest impact on the video game sector.
1. The preferred option of introducing a commercial TDM exception, subject to opt out
As previously discussed, TDM refers to the use of automated techniques to analyse information. TDM is deployed at mass scale by AI companies to train AI models, often using automated web scraping bots to collate datasets of material from publicly available websites before using that material for AI training. While these automated web scraping bots are unlikely to be able to copy game source code itself (as it will not be freely available online), fans will often have shared game screenshots or streamed game footage, and it is likely that video game content will be included in AI training datasets.
The existing TDM exception in the Copyright, Designs and Patents Act 1988 ("CDPA") applies only where TDM is undertaken "for the sole purpose of research for a non-commercial purpose". This consultation proposes introducing a new, broader TDM exception which would permit commercial TDM. This new proposed exception would apply to all works to which the user has lawful access, unless the rights holder had "reserved their rights" (i.e. opted out). Where a rightsholder had opted out, the proposed new TDM exception would not apply, and training AI models on the content would only be permitted if a licence was granted by the rightsholder.
It is unclear whether in a practical sense video games companies will be able to reserve their rights to opt-out under the proposed commercial TDM exception. There is no consensus on appropriate opt-out methods for content made available online. The UK government has referenced several methods in the proposal, such as:
- metadata instructions: embed instructions in the metadata of a work itself that notifies AI models or developers that TDM is prohibited;
- robots.txt: the most widely adopted standard, which blocks web crawling at site level; and
- 'Do-Not-Train' registries: rightsholders individually notify AI firms directly that they do not want their works to be used for training AI, and this is captured in a registry.
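To illustrate the robots.txt method in concrete terms, a rightsholder wishing to signal a rights reservation on its own website might publish directives like the following. This is a minimal sketch only: the user-agent names shown are those published by individual AI companies for their own crawlers (for example, OpenAI's GPTBot), there is no single standard list of training crawlers, and compliance with robots.txt is voluntary on the crawler's part.

```
# robots.txt — illustrative rights-reservation directives
# (user-agent names are published by the respective AI companies;
#  this list is an example, not a comprehensive standard)

# Block a named AI training crawler from the whole site
User-agent: GPTBot
Disallow: /

# Block another named AI training crawler
User-agent: Google-Extended
Disallow: /

# All other crawlers (e.g. search indexing) remain permitted
User-agent: *
Allow: /
```

Note that these directives only govern crawling of the site that publishes them; copies of the same content hosted elsewhere, such as fan-shared screenshots or streams, are unaffected.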
These methods may be ineffective for video game companies for several reasons. Video game screenshots and gameplay are often distributed and shared by fans on online platforms beyond the control of the rightsholders. While game streaming may technically be a copyright infringement by players unless authorised, it is often ignored or encouraged by developers and publishers alike and has generated a large online streaming community of players. If screenshots and gameplay are shared by fans, the game developers may have no practical ability to apply metadata instructions or robots.txt opt outs to that content. Consequently, rightsholders may not be able to fully opt out, regardless of the method used, as their material will be shared online beyond their control.
Requiring AI developers to comply with 'do-not-train' registries and to use automated content recognition technologies is an opt-out approach supported by several other creative industries. At a technical level, however, automated content recognition may be more challenging to apply to video games, whose dynamic content changes based on player interactions.
These practical challenges with existing opt-out technologies will, however, need to be addressed by video game companies and AI developers irrespective of whether the UK proposals proceed, given that an opt-out commercial TDM exception already exists within the EU. In the Copyright and AI consultation, the UK government acknowledges that the EU approach "is still being developed" and that while it is a "useful precedent" there remains "some uncertainty about how it works in practice". We have written about some of the emerging case law from EU member states which have begun to interpret the EU opt-out requirements; however, this remains a contentious area.
Video games are composite intellectual property, made up of copyright-protected inputs from a series of individual contributors, such as developers, artists, musicians and voiceover performers. Many of these individual contributors are protective of their contributions and may not want them used in the training of an AI model, even where they have assigned their economic rights to an overall video game developer. If a commercial TDM exception is adopted in the UK, video game companies should be prepared to carefully review their contracts with these individuals and hold discussions surrounding the use of such materials for AI training and whether an opt out should be in place for the game content as a whole. It is therefore worth video game consultation respondents considering whether there is any effective method for opt out to operate in their industry, or whether standard forms of video game opt-out technologies could be developed. Many businesses in the creative industries have expressed opt-out concerns, which have been highly publicised, and remain of the view that the existing system should be retained or that alternative opt-in approaches to AI training should be adopted.
2. Labelling and watermarking
The Government is considering whether generative AI outputs should be labelled or watermarked as AI generated to differentiate them from human-created works. It is hoped that this will better inform the public of the consumption choices they make and support source attribution. However, the consultation also recognises the practical challenge of ensuring consistent AI output labelling, such as what degree of AI should require an AI output label, how works should be labelled and the degree of information that is reasonable to provide on a label.
Given the composite nature of video game IP, these challenges are magnified, and clarity may be needed on how any labelling requirements would apply to video game content. No player wants to hear an NPC declare "this interaction has been AI generated" upon meeting them. However, would a brief label in the credits of a lengthy campaign be sufficient to satisfy any government-mandated labelling requirement? This proposal raises questions from the earliest stages of a game's production: would a game need to disclose that a developer used Midjourney for initial game ideation, or another AI tool to generate prototype world textures? While it is unlikely that labelling would be required for the game as a whole where AI was only used in initial development and prototyping, a labelling requirement may oblige developers to carefully track every use of AI throughout the development process by any of a game's multiple contributors if they want to ensure that the released game does not include AI content. Again, irrespective of the outcome of the UK consultation, video game developers may need to consider these issues, as similar transparency and labelling requirements exist in the EU. The new EU AI Act imposes requirements on AI systems which interact directly with natural persons or generate synthetic audio, image, video or text content, but practical challenges remain in implementing these obligations, and the UK consultation explicitly asks respondents for their views on the EU's approach to AI output labelling.
The UK proposal also highlights the challenge of ensuring that AI labels are resilient to manipulation, whether by editing the label or removing it entirely. Where an AI label is included in a game's credits or displayed at the point of purchase, there is no guarantee that the label will be retained in second-level content such as online streams.
3. Section 9(3) of the Copyright, Designs and Patents Act 1988 ("CDPA")
Computer-generated works without a human author can be protected under section 9(3) of the CDPA. The author of such a work is deemed to be the person "by whom the arrangements necessary for the creation of the work are undertaken." It is, however, unclear whether this section requires such computer-generated works to demonstrate the usual originality requirement applying to works with a human author, or how a standard requiring the work to reflect its "author's personality and creative choices" might apply to AI outputs. There is therefore significant uncertainty around the application of the provision, and there has been very limited case law considering its operation. The government is therefore considering amending or removing the protection for computer-generated works, and has sought feedback on three options in relation to section 9(3) CDPA:
- no change: maintain the provision as it currently stands;
- reform the current protection: remove the "originality" requirement in relation to computer-generated works, or define originality in some other way, such as whether an identical human-authored work would have been considered original; or
- remove the current protection: remove the protection provided to computer-generated works entirely.
Video game developers and publishers should carefully consider the impact of amending or removing this provision, as the industry may be one of the few that can provide examples of substantive computer-generated works with no human author. For example, while an algorithm used in the procedural generation of unique virtual worlds, such as those discussed above in No Man's Sky, will be protected as a literary work, it may be less likely that the procedurally generated graphic outputs would meet the usual human originality requirement. Similarly, while the individual graphic elements used in a roguelike game may be protected as artistic works, there may be an argument that the procedurally generated levels combining these elements would only be protected by reliance on section 9(3) CDPA.
The protection currently offered in the UK to computer-generated works by section 9(3) of the CDPA is, however, unusual at an international level, with the US Copyright Office recently confirming that in the US copyright does not extend to purely AI-generated material, and broad consensus that in the EU such protection would not be compatible with the originality requirement under EU copyright law. There may be both advantages and disadvantages for the UK in taking an approach which is out of step with other major jurisdictions – for example, additional protection may have the benefit of incentivising game development work occurring in the UK.
Next steps
The deadline for responding to the consultation is 11:59pm on 25 February 2025. By engaging with the consultation, stakeholders in the interactive entertainment sector can ensure their interests are represented and help shape the future of AI and copyright law.
Please get in touch with a member of our Interactive Entertainment team below if you need support with drafting your consultation response or have concerns about how the consultation outcome may impact your business.
