When we had the second reading debate of my private member's bill, the Artificial Intelligence (Regulation) Bill, on Friday 22nd March, I spoke about why we need this new law. I also went through the bill, clause by clause, explaining the point and purpose of each. Clause 5 deals specifically with provisions relating to Intellectual Property (IP) and copyright and is set out below in full.
Clause 5: Transparency, IP obligations and labelling
(1) The Secretary of State, after consulting the AI Authority and such other persons as he or she considers appropriate, must by regulations provide that—
(a) any person involved in training AI must—
(i) supply to the AI Authority a record of all third-party data and intellectual property (“IP”) used in that training; and
(ii) assure the AI Authority that—
(A) they use all such data and IP by informed consent; and
(B) they comply with all applicable IP and copyright obligations;
(b) any person supplying a product or service involving AI must give customers clear and unambiguous health warnings, labelling and opportunities to give or withhold informed consent in advance; and
(c) any business which develops, deploys or uses AI must allow independent third parties accredited by the AI Authority to audit its processes and systems.
(2) Regulations under this section may provide for informed consent to be express (opt-in) or implied (opt-out) and may make different provision for different cases.
Full text of the AI (Regulation) Bill
The Alliance for Intellectual Property
In consulting on this important aspect of the bill, I met with 25 organisations representing all parts of the creative industries: music, publishing, TV and film, news media, photography and picture libraries, visual arts, entertainment retailing and audio.
Collectively, they constitute a significant part of a sector already worth £126 billion in GDP and growing at twice the rate of the rest of the economy (DCMS figures). This is an economic prize worth protecting, but one that highlights the asymmetry of power between those creatives and those currently benefiting from stealing their work.
Director General Dan Guthrie, who attended the meeting in the House of Lords, said:
“It really is extraordinary that large tech firms, worth trillions of dollars between them, feel they can use other people's creative works, that they have toiled over, without seeking their consent or making payment, whilst vigorously defending their own IP – it is hypocritical, morally reprehensible and illegal.”
Dan Guthrie, Director General, Alliance for Intellectual Property
Speakers at the meeting made clear that the creative industries are embracing AI and are likely to use it alongside their human creativity. Some, including picture libraries, are also licensing content to AI developers, though not to Large Language Models (LLMs).
Serious concerns raised during the meeting
- LLMs are ingesting creative content without the consent of rights-holders.
- No payments are being made for this ingestion.
- There is a lack of transparency about what has been ingested or what instructions the ‘crawlers’ are given.
- UK law is clear that this activity is illegal and infringes the rights of creators and rights-holders.
- Generative AI can create content ‘in the style of’ a particular artist, which is difficult to tackle.
- Requests to LLMs not to use content are going unheeded and retrospective ‘forgetting’ is not possible.
- Infringements are being committed by large, well-funded LLM developers who vigorously protect their own IP.
Other jurisdictions?
Despite suggestions to the contrary, this kind of mass copyright-infringing ingestion is not legal in other territories, including the USA and Japan. Cases are being brought in the US, and the Japanese Government has recently been forced to clarify its own laws.
UK Government Policy
Since the development of a code of practice for licensing was discontinued (because LLM developers argued they did not need to license content), the Government has said it is looking at other policy options. But the current lack of respect by LLMs for copyright law has left significant uncertainty over liability for those using them, which is likely to hinder adoption through a lack of trust and legal certainty. Some organisations think the Government should also look closely at some form of image or personality rights to protect their members.
AI Bill for Our AI Futures
It is absolutely clear we must do more. We cannot continue to wait and see while these processes carry on stealing the souls of artists who strive to perfect their art. We have to think hard about creativity and the value we attach to it. Creatives are not Luddites – far from it – many are keen to incorporate AI into their own work. But on the other side of this lack of clarity in the law sits the issue of liability and ethical use. Can people trust that what they do with AI tools is ethical, legal and safe? If we want to support a local AI industry, we have to address the liability issue. This is what my bill is about: thinking through these important questions for our AI futures.
Related posts:
Artificial Intelligence (Regulation) Bill – Lord Holmes of Richmond MBE (lordchrisholmes.com)
AI Bill – UK Leadership on AI Regulation – Lord Holmes of Richmond MBE (lordchrisholmes.com)
AI Bill – Ethical AI – Lord Holmes of Richmond MBE (lordchrisholmes.com)
AI Bill – Second Reading Debate – Lord Holmes of Richmond MBE (lordchrisholmes.com)
AI Bill – Regulatory Capability – Lord Holmes of Richmond MBE (lordchrisholmes.com)