The Government’s long-running battle to get its data reforms passed before the next General Election is proving ever more arduous, with calls for yet more amendments as the Lords struggle to get through the 240 changes already before them.
The latest call for change comes in the form of an open letter from a group of more than 30 civil society organisations, academics, legal professionals, think tanks and unions, demanding that the Government be placed under a legal duty to be upfront about when it uses artificial intelligence to make decisions that affect people’s lives, such as in education, health or welfare.
As the Government uses AI and algorithms to make more and more decisions affecting people’s daily lives, from A-levels to Universal Credit, the group has asked Secretary of State for Science, Innovation and Technology Michelle Donelan to amend the Data Protection & Digital Information Bill currently before the House of Lords.
The group – spearheaded by the Public Law Project – argue that a change in the law is urgently needed if the Government wants to build public trust in how it uses technology and avoid catastrophes such as the Australian Robodebt and Dutch welfare scandals that harmed tens of thousands of people.
Public Law Project chief executive Shameem Ahmad said: “Any decision-making system, AI or human, will at some point go wrong and treat someone unfairly; whether that affects their A-level results, the time they must wait for an organ transplant or their access to vital welfare payments.
“The difference at the moment is that public bodies can get away with using automated tools in the shadows. Without transparency, there can be no accountability or trust in the systems that govern us.
“If the Government is serious about using this technology for the good of everyone, it must urgently make sure public sector AI use is transparent. Knowing how automated decisions are being made is the first step in seeking redress and putting things right if they go wrong.
“AI has great potential, but scandals in Australia and the Netherlands show that these systems can make incorrect decisions at speed, inflicting great harm on a huge number of individuals. Given the stakes, the Government must take this opportunity and act now.”
The letter states: “The speed and volume of decision-making that new technologies will deliver is unprecedented. Their introduction creates the potential for decisions to be made more efficiently and at lower costs. However, if the use of these systems is opaque, they cannot be properly scrutinised and those operating them cannot be held accountable.”
At the moment, public bodies can voluntarily publish information on how they use algorithms under the Algorithmic Transparency Recording Standard (ATRS), which was created in 2021.
However, since its inception, only seven transparency reports have been released.
Many of the key government departments using tools that fall within the scope of the ATRS, such as the Home Office and Department for Work & Pensions, have never submitted a report.
The Tracking Automated Government (TAG) register, launched in 2023 by the Public Law Project, currently lists 55 automated tools used by public authorities. The register was not built by the Government; it was pieced together by the Public Law Project from information hard-won through investigations by journalists, civil society organisations and academics.
In its response to the AI regulation White Paper consultation, the Government announced that the ATRS will become a ‘requirement’ for government departments using algorithmic and automated tools, but this requirement will not be on a statutory footing; it will sit in guidance only.
The letter to the Secretary of State says an explicit legal obligation is needed: “Such a duty is proportionate to the nature and impact of the risk posed by the widespread and fast-growing use of AI and algorithmic tools and will ensure that public authorities can be held accountable for failure to comply with the duty.”
The move comes as the Lords struggle with line-by-line scrutiny of the current version of the Bill, which contains the 240 last-minute amendments waved through at third reading in the Commons in November.
Parliament has already added a further four days to the five originally allotted to the Lords committee stage, but whether this will be enough is a moot point.
Once the committee stage is complete, the Bill then goes to another report stage, which gives all members of the Lords a further opportunity to examine and make amendments. After that, the Bill will have its third reading in the Lords, and then pass back to the Commons for any amendments made by the second House to be considered.
Only when the exact wording has been agreed by both the Commons and the Lords is the Bill ready for royal assent, at which point it becomes an Act of Parliament.
At the moment, at least, this appears to be a long way off.