The Artificial Intelligence (Regulation) Bill received its second reading in the House of Lords, with various issues, risks and, most significantly, praise being voiced. We picked out the key parts of the AI Bill here; if enacted, the Bill would create a new UK Artificial Intelligence (“AI”) regulator, require the appointment of Chief AI Officers, and mandate the introduction of further AI regulation. The next stage is the Committee Stage, a line-by-line examination of the Bill, which is yet to be scheduled. For now, we pick out a few select points from the second reading in the House of Lords.
The need for AI regulation remains and is increasing, but not everyone agrees
Lord Holmes, the Bill's sponsor, said there are at least three reasons why we should legislate on this: social, democratic and economic. On social, there are potentially great benefits. On democracy, there are potentially great risks, especially in 2024 when more than 50% of the global population could head to the polls. On jurisdiction and system of law:
the UK has a unique opportunity at this moment in time. We do not have to fear being in the first mover spotlight—the EU has taken that with its Act, in all its 892 pages. The US has had the executive order but is still yet to commit fully to this phase. The UK, with our common-law tradition, respected right around the world, has such an opportunity to legislate in a way that will be adaptive, versatile and able to develop through precedent and case law.
Various other Lords who spoke at the second reading also recognised the need for an AI Bill, echoing the pressing opportunities and risks that AI presents and that warrant regulatory action, albeit with varying angles and emphasis. In particular, some took the view that certain of the government's proposals in the White Paper - such as the cross-sector principles - may only be effective if given legal force, with financial consequences for non-compliance.
However, some also made the case that a dedicated AI regulator is not needed: self-regulation can be good regulation, they argued, and the current approach of relying on existing regulators is seen by some as the right one.
The evolving regulatory landscape
Some also recognised that there are multiple regulators and bodies involved, but that there is no holistic view of who they are, how many there are, and where their remits start and end. There were calls for an AI regulator, or central government, to address this.
The draft Bill may change
Lord Holmes summarised each of the clauses and recognised that, for example, the definitions are deliberately drawn broadly, ready for further debate. At the Committee Stage, the Bill will be examined line by line. For now, the direction of travel of potential AI regulation in the UK, even if not the final destination, is clear.
Further, there is debate about whether the Bill strikes the right balance between risk and reward. Some considered that the Bill places significant emphasis on risk mitigation, with few measures aimed at innovation. The Bill does include some aspects specifically intended to support innovation, such as the creation of regulatory sandboxes. Some may argue that risk mitigation is itself central to innovation. However, the emphasis on risk mitigation may suggest that a regulator's role is focussed on risk management; some may want regulators to have a clear objective of boosting (AI-related) innovation. Achieving the right balance in regulation is a challenge and a matter for further debate.
The transcript of the second reading is available here: Artificial Intelligence (Regulation) Bill [HL] - Hansard - UK Parliament
If you would like to discuss how current or future regulations impact what you do with AI, please contact Tom Whittaker, Brian Wong, Lucy Pegler, David Varney, Martin Cook or any other member of our Technology team.