The Government must bring in new legislation to tighten up the use of algorithms that drive content on social media – including advertising, videos and posts – and hand back control to consumers rather than let big tech firms ride roughshod over everyone.
That is just one of the recommendations contained in the first report into data ethics and data-driven technology published by the Government’s own advisory body, the Centre for Data Ethics & Innovation (CDEI), which insists that existing legislation is “out of step” with consumers’ expectations.
The report, which follows a year-long review of current practices, also calls for the Government to consider forcing social media platforms to hand over their data to independent academics to study issues of public concern, such as the effects of social media on mental health, or its role in spreading misinformation.
Online platforms should also create publicly accessible archives for “high-risk” advertisements in areas such as jobs, housing, credit and age-restricted products, building on the voluntary directories they already operate for political ads, the CDEI said.
The report includes a new analysis of public attitudes towards online targeting, which reveals that while consumers welcome the convenience of targeting systems, there are widespread concerns that platforms are unaccountable.
Conducted with Ipsos MORI, the analysis shows that only 29% of consumers trust platforms to target them in a responsible way, and when they try to change settings, only around a third (34%) trust these companies to do what they ask. Over three-fifths (61%) of consumers want greater regulatory oversight of online targeting, compared with just 17% who support self-regulation.
CDEI chair Roger Taylor said: “Most people do not want targeting stopped. But they do want to know that it is being done safely and responsibly. And they want more control. Tech platforms’ ability to decide what information people see puts them in a position of real power. To build public trust over the long term it is vital for the Government to ensure that the new online harms regulator looks at how platforms recommend content, establishing robust processes to protect vulnerable people.”
Dr Bernadka Dubicka, chair of the Child & Adolescent Faculty at the Royal College of Psychiatrists, added: “We completely agree that there needs to be greater accountability, transparency and control in the online world. It is fantastic to see the CDEI join our call for the regulator to be able to compel social media companies to give independent researchers secure access to their data.”