On Friday 22 March 2024 my private member's bill – the Artificial Intelligence (Regulation) Bill – had its second reading in the House of Lords. This was an important opportunity for me to make the case for why we need a new law and to hear from colleagues, including the Minister, what they think of the bill.
I set out some of the reasons – social, democratic and economic – why I think this regulation is needed. I gave details about each of the clauses in the bill and I was clear about why the government must act now.
History tells us that right-size regulation is pro-citizen, pro-consumer and pro-innovation; it drives innovation and inward investment. I was taken by so much of what the Ada Lovelace Institute put in their report. It really is the case that the Government really have given themselves all the eyes and not the hands to act. It reminds me very much of a Yorkshire saying: see all, hear all, do nowt. What is required is for these technologies to be human led, in our human hands, and human in the loop throughout. Right-size regulation, because it is principles-based, is necessarily agile, adaptive and can move as the technology moves. It should be principles-based and outcomes-focused, with inputs that are transparent, understood, permissioned and, wherever and whenever applicable, paid for.
Lord Holmes of Richmond, House of Lords, 22 March 2024
Throughout the debate I was grateful for support from several colleagues. Lord Thomas of Cwmgiedd was kind enough to say he supported the bill “because it has the right balance of radicalism to fit the revolution in which we are living”. Viscount Chandos also indicated the scale of the challenge, saying “there can have been few Private Members’ Bills that have set out to address such towering issues as this bill”. He went on to state:
I strongly support this well-judged and balanced Bill, which recognises the fast-changing, dynamic nature of this technology—Moore’s law on steroids, as I have previously suggested—and sets out a logical and coherent role for the proposed AI authority, bringing a transparency and clarity to the regulation of AI for its developers and users that is currently lacking.
Viscount Chandos, House of Lords, 22 March 2024
Lord Young of Cookham was particularly positive about Clause 1(2)(c), which would ensure “a gap analysis of regulatory responsibilities in respect of AI”, and gave an excellent example of the current situation in just one sector, education:
We have a shortage of teachers in many disciplines, and many complain about paperwork and are thinking of leaving. There is a huge contribution to be made by AI. But who is in charge? If you put the question into Google, it says, “the DfE is responsible for children’s services and education”. Then there is Ofsted, which inspects schools; there is Ofqual, which deals with exams; and then there is the Office for Students. The Russell Group of universities have signed up to a set of principles ensuring that pupils would be taught to become AI literate. Who is looking at the huge volume of material which AI companies are drowning schools and teachers with, as new and more accessible chatbots are developed? Who is looking at AI for marking homework? What about AI for adaptive testing? Who is looking at AI being used for home tuition, as increasingly used by parents? Who is looking at AI for marking papers? As my noble friend said, what happens if they get it wrong? The education sector is trying to get a handle on this technological maelstrom and there may be some bad actors in there. However, the same may be happening elsewhere because the regulatory regimes lack clarity.
Lord Young of Cookham, House of Lords, 22 March 2024
Baroness Stowell of Beeston, Chair of the Communications and Digital Select Committee, made important points from a recent report:
Our large language model report looked in detail at what needs to happen over the next three years to catalyse AI innovation responsibly and mitigate risks proportionately. The UK is well-placed to be among the world leaders of this technology, but we can only achieve that by being positive and ambitious. The recent focus on existential sci-fi scenarios has shifted attention towards too narrow a view of AI safety. On its own, a concentration on safety will not deliver the broader capabilities and commercial heft that the UK needs to shape international norms. However, we cannot keep up with international competitors without more focus on supporting commercial opportunities and academic excellence. A re-balance in government strategy and a more positive vision is therefore needed. The Government should improve access to computing power, increase support for digital, and do more to help start-ups grow out of university research.
Baroness Stowell of Beeston, House of Lords, 22 March 2024
Baroness Kidron, who is an adviser to the Oxford Institute for Ethics in AI and the UN Secretary-General’s AI Advisory Body, proposed some excellent additions to the bill:
under Clause 2, which sets out regulatory principles, I would like to see consideration of children’s rights and development needs; employment rights, concerning both management by AI and job displacement; a public interest case; and more clarity that material that is an offence—such as creating viruses, CSAM or inciting violence—is also an offence, whether created by AI or not, with specific responsibilities that accrue to users, developers and distributors.
Baroness Kidron, House of Lords, 22 March 2024
Lord Kirkhope reminded us all of the importance of defining and understanding our humanity:
Yet, as we delve into the mechanics of regulation and oversight, we must also pause to reflect on the quintessentially human aspect of our existence that AI can never replicate: emotion. The depth and complexity of emotions that define our humanity remain beyond the realm of AI and always will. These elements, intrinsic to our being, highlight the irreplaceable value of the human touch. While AI can augment, it can never replace human experience. The challenge before us is to foster an environment where innovation thrives within a framework of ethical and responsible governance.
Lord Kirkhope of Harrogate, House of Lords, 22 March 2024
Unfortunately, the Minister was not persuaded, reiterating the government’s ‘light-touch’, ‘pro-innovation’ approach set out in the white paper response. He was clear:
our approach, combining a principles-based framework, international leadership and voluntary measures on developers, is right for today, as it allows us to keep pace with rapid and uncertain advances in AI.
Viscount Camrose, Parliamentary Under-Secretary of State, Department for Science, Innovation and Technology, House of Lords, 22 March 2024
It is not right for today. It is not enough. We must act now, for all #OurAIFutures.
Related posts:
Artificial Intelligence (Regulation) Bill – Lord Holmes of Richmond MBE (lordchrisholmes.com)
AI Bill – UK Leadership on AI Regulation – Lord Holmes of Richmond MBE (lordchrisholmes.com)
AI Bill – Ethical AI – Lord Holmes of Richmond MBE (lordchrisholmes.com)
AI Bill – IP and Copyright: Clause 5 – Lord Holmes of Richmond MBE (lordchrisholmes.com)
AI Bill – Regulatory Capability – Lord Holmes of Richmond MBE (lordchrisholmes.com)
Related articles:
Bid to create AI Authority amid pleas for swifter action from UK Government | The Independent