Brand owners should be forced to provide consumers with real-time information, via a single online registry, on all the personal data they hold on them, because the right to privacy may exist in law but is simply not being enforced.
That is just one of the key recommendations of a new report from the Parliamentary Joint Committee on Human Rights, following its inquiry into the Right to Privacy and the Digital Revolution, amid claims that GDPR has yet to be effectively implemented by companies or enforced by regulators.
The committee reports serious grounds for concern about the nature of the "consent" consumers provide when handing over information about themselves to be used for commercial gain by private companies.
It insists that privacy policies are too complicated for the vast majority of people to understand: while individuals may understand they are consenting to data collection from a given site in exchange for “free” access to content, they may not understand that information is being compiled, without their knowledge, across sites to create a profile.
The committee said it heard "alarming evidence" about eye-tracking software being used to make assumptions about people's sexual orientation, whether they have a mental illness, and whether they are drunk or have taken drugs – all then added to their profile.
It claims that all too often the use of a service or website is conditional on consent being given – raising questions about whether it is freely given – yet consumers cannot find out what they have consented to.
The committee also maintains that it is difficult, if not impossible, for tech experts – let alone consumers – to find out who their data has been shared with, to stop it being shared, or to delete inaccurate information about themselves.
The report adds: “The consent model relies on individuals knowing about the risks associated with using web based services when the system should provide adequate protection from the risks as a default.
“It is completely inappropriate to use consent when processing children’s data: children aged 13 and older are, under the current legal framework, considered old enough to consent to their data being used, even though many adults struggle to understand what they are consenting to.”
The committee points out that there is a real risk of discrimination against some groups and individuals through the way this data is used: it heard “deeply troubling” evidence about some companies using personal data to ensure that only people of a certain age or race, for example, see a particular job opportunity or housing advertisement.
There are also long-established concerns about the use of such data to discriminate in provision of insurance or credit products, it said.
The report continued: “Unlike traditional print advertising, where such blatant discrimination would be obvious and potentially illegal, personalisation of content means people have no way of knowing how what they see online compares to anyone else.”
The committee calls on the Government to ensure there is robust regulation over how data can be collected and used, and for better enforcement of that regulation.
The report states: “The consent model is broken and should not be used as a blanket basis for processing. It is impossible for people to know what they are consenting to when making a non-negotiable, take-it-or-leave-it ‘choice’ about joining services like Facebook, Snapchat and YouTube based on lengthy, complex T&Cs, subject to future changes to terms.
“This model puts too much onus on the individual, but the responsibility of knowing about the risks with using web based services cannot be on the individual. The Government should strengthen regulation to ensure there is safe passage on the Internet guaranteed.”
The report argues that it should be made much simpler for individuals to see what data has been shared about them, and with whom, and to prevent some or all of their data being shared.
It adds: “The Government should look at creating a single online registry that would allow people to see, in real time, all the companies that hold personal data on them, and what data they hold.”
Committee chair Harriet Harman said: “Individuals are giving away lots of information about themselves when using web based services and the expectation is that they should know about the risks of using the Internet. Individuals cannot be expected to know whether their data is being used appropriately and what risks this poses to their right to privacy. Instead there should be adequate regulation in place to ensure that everyone’s privacy is protected online.
“It should be simple to know what data is shared about individuals, and it must be as easy to correct or delete data held about us as it was for us to sign up to the service in the first place. These rights already exist, but they clearly have yet to be effectively implemented by companies and enforced by regulators. The Government must address this, urgently. We say it often but it bears repeating again now: rights are meaningless if not enforced.”
Some 18 months ago, just days after GDPR was adopted into UK law through the Data Protection Act 2018, Labour MP Darren Jones claimed the legislation was already out of date and did nothing to tackle the issues of companies exploiting personal data.