Government plans to scrap human review of automated decisions made by AI – including recruitment and loan eligibility – could have life-changing effects on individuals and need a major rethink.
That is according to the professional body for the technology industry – BCS, The Chartered Institute for IT – in its response to the Department for Digital, Culture, Media & Sport (DCMS) consultation on data reforms, Data: A New Direction.
DCMS is seeking further evidence before forming firm proposals on reform of the UK’s existing data legislation, including considering the removal of Article 22 of GDPR, which focuses specifically on the right to review fully automated decisions.
The consultation states that “automated decision making is likely to increase greatly in many industries in the coming years. The need to maintain a capability to provide human review may, in future, not be practicable or proportionate, and it is important to assess when this safeguard is needed and how it works in practice”.
It does acknowledge, however, that there may be “legitimate need for certain ‘high risk’ AI-derived decisions to require a human review, even if this restricts the scope of use of such systems or makes them slower”.
But BCS insists the right to human review of decisions made fully by computers should not be removed “while AI is still in its infancy”, and that true protection of the right to review must consider wider regulation of AI.
Dr Sam De Silva, chair of BCS’ Law Specialist Group and a partner at law firm CMS, explained: “We need clarity on the rights someone has in the scenario where there is fully automated decision making which could have significant impact on that individual.
“We would also welcome clarity on whether Article 22(1) should be interpreted as a blanket prohibition of all automated data processing that fits the criteria or a more limited right to challenge a decision resulting from such processing. BCS is not convinced that either retaining Article 22 in its current form or removing it achieves such clarity.
“We also need to consider that protection of human review of fully automated decisions is currently in a piece of legislation dealing with personal data. If no personal data is involved the protection does not apply, but the decision could still have a life-changing impact on us.”
De Silva cites the example of an algorithm created to decide whether a consumer should get a vaccine. The data entered into the system would likely include date of birth, ethnicity and other attributes, but not a name or anything else that could identify the individual.
He added: “Based on the input, the decision could be that the person is not eligible for a vaccine. But any protections in the GDPR would not apply, as there is no personal data.
“So, if we think the protection is important enough, it should not sit in the GDPR. It begs the question – do we need to regulate AI generally, and not through the ‘back door’ via GDPR?
“It’s welcome that the government is consulting carefully before making any changes to people’s right to appeal decisions about them by algorithms and automated systems – but the technology is still in its infancy.”
BCS is currently gathering views on the proposals from across its membership.
The Information Commissioner’s Office has also taken issue with the proposed data reforms, warning that an overhaul of the regulator’s structure risks its independence and could pave the way for government meddling.