ICO whacked for ‘pathetic’ response to A-level outrage

The Information Commissioner’s Office is facing mounting criticism for not doing enough to intervene in the escalating row over the algorithmic marking system used to award A-level grades, amid claims that the entire process is in breach of the GDPR.

The issue, triggered by exam regulator Ofqual downgrading millions of pupils’ results, has already sparked a raft of legal action. There are also numerous petitions on Change.org, with one, set up by pupil Curtis Parfitt-North to call for a fairer system, attracting more than 245,000 signatures and counting.

At the heart of the issue is Ofqual’s data protection impact assessment, which many argue is not worth the paper it is written on and fails to cover the basics. In response to a tweet by Pat Walshe of Privacy Matters containing the document, one expert wrote: “No identification of risks, just occasional complacent statements that imply they have been mitigated. No consultation. Necessity and proportionality looking only at input data not processing itself.”

Another said: “Wow – so many questions – most glaring for me is the lack of prior consultation (some data protection expertise, data subject or focus groups) and the mitigation of huge risks (not identified) the impact of this processing is clearly high risk.”

A third wrote: “No analysis in there about potential adverse impact on students. Chronic omission as they knew (from back testing) average grade accuracy of algorithm was only 60%.”

Meanwhile, digital rights group Foxglove argues that one of the key problems is that schools, rather than individual students, are being assessed.

The organisation, which is also threatening legal action, said: “Automating a major decision about pupils in this way potentially violates GDPR and UK Data Protection Act. Those laws (e.g., Art. 22 GDPR) provide significant protections from automated decisions about people which may have significant consequences – and this is a significant consequence for every student expecting GCSE/A-Level results this week.

“Teachers know best. To fix this system, government should institute a) an appeal route for students to challenge unfair results, and b) give greater weight to teachers’ assessments of their pupils.”

Foxglove co-founder and director Cori Crider commented: “Ofqual’s algorithm deserves an F for unfairness. This system fails bright kids in bad schools, and treats pupils as statistics, not individuals. This will damage social mobility and undermine the sense that grades reward individual effort and achievement. Where’s the meritocracy in doling out grades on the basis of some made-up bell curve?”
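Crider’s “made-up bell curve” point is easiest to see with a toy example. The following sketch is purely illustrative, with an invented school, invented grade data and a simplified ranking step, and is not Ofqual’s published model; it simply shows how a standardisation step that fits this year’s pupils to their school’s historical grade distribution can pull down a strong pupil at a school that rarely awarded top grades, whatever their teacher predicted.

# Illustrative only: a toy "standardisation" that hands out grades from a
# school's historical results rather than from each pupil's own assessment.
# The school, data and ranking logic are invented for demonstration and are
# not Ofqual's actual model.

def standardise(teacher_grades, historical_distribution):
    """Rank pupils by teacher-assessed grade, then overwrite each grade
    with one drawn from the school's historical grade distribution."""
    # "A" sorts before "B" and "C", so ascending order puts the strongest first
    ranked = sorted(teacher_grades, key=teacher_grades.get)
    # Expand the historical distribution into an ordered list of grades
    pool = [grade for grade, count in historical_distribution for _ in range(count)]
    return {pupil: pool[i] for i, pupil in enumerate(ranked)}

# A school that historically awarded one A, two Bs and two Cs to its five entrants
history = [("A", 1), ("B", 2), ("C", 2)]

# This year's teacher assessments: two pupils are predicted an A
teacher_grades = {"Asha": "A", "Ben": "A", "Cara": "B", "Dev": "C", "Ed": "C"}

print(standardise(teacher_grades, history))
# {'Asha': 'A', 'Ben': 'B', 'Cara': 'B', 'Dev': 'C', 'Ed': 'C'}
# Ben's teacher-assessed A becomes a B because the school's past results only
# "allow" one A: the decisive input is the school, not the pupil.

In a scheme of this shape, the ceiling on any individual’s result is set by where their school has historically sat on the curve, which is the crux of Foxglove’s objection.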

Richard Murphy, a visiting professor at Anglia Ruskin University and at the University of Sheffield Business School, added: “So far there is no personal appeal allowed for resulting errors. The process is then in breach of these [GDPR] rules. I would suggest for that reason alone that what is happening is likely to be illegal.

“But there is also a complete failure to provide to all A-level students information on how their decision was reached. That too is a failure. And it can fairly be said that consent was never sought for this process. In that case the chance that this algorithmic process was legal looks to be low to me. If anyone is planning a legal challenge this seems like the way to go to me.”

In response to the uproar, the ICO has issued a statement saying: “We understand how important A-level results and other qualifications are to students across the country. When so much is at stake, it’s especially important that their personal data is used fairly and transparently. We have been engaging with Ofqual to understand how it has responded to the exceptional circumstances posed by the Covid-19 pandemic, and we will continue to discuss any concerns that may arise following the publication of results.

“The GDPR places strict restrictions on organisations making solely automated decisions that have a legal or similarly significant effect on individuals. The law also requires the processing to be fair, even where decisions are not automated.

“Ofqual has stated that automated decision making does not take place when the standardisation model is applied, and that teachers and exam board officers are involved in decisions on calculated grades. Anyone with any concerns about how their data has been handled should raise those concerns with the exam boards first, then report to us if they are not satisfied.

“The ICO will continue to monitor the situation and engage with Ofqual.”

However, Privacy Matters’ Walshe said: “[There are] multiple issues. But sadly, here in the UK, we don’t have a data protection regulator that seems to want to enforce the law and uphold rights. The ICO is proving to be as much use as a chocolate teapot.”

And data protection consultant Tim Turner tweeted: “This is a generation let down; the ICO response is pitiful.”

Others have replied to the ICO statement on Twitter with comments including: “Do your job”; “this is a cover-up”; “weak”; and “bollocks”.

