Facebook intentionally and knowingly violated both data privacy and competition laws in order to maximise profits from user data and only acted when serious breaches became public.
So says a damning report from the Digital, Culture, Media & Sport (DCMS) select committee, following an 18-month investigation into fake news which also gathered evidence on Facebook’s business practices before and after the Cambridge Analytica scandal.
According to the committee, the social media giant was willing to override its users’ privacy settings in order to transfer data to app developers and was able to starve some developers of data and force them out of business.
The report states: “Companies like Facebook should not be allowed to behave like ‘digital gangsters’ in the online world, considering themselves to be ahead of and beyond the law.
“Facebook continues to choose profit over data security, taking risks in order to prioritise their aim of making money from user data. It seems clear to us that Facebook acts only when serious breaches become public.”
Among the many recommendations, the report calls for a compulsory code of ethics for tech companies, overseen by an independent regulator who would be given powers to launch legal action if companies breach the code.
It also wants the Government to reform current electoral laws and the rules on overseas involvement in UK elections, and calls for tech companies operating in the UK to be taxed to help fund the work of the Information Commissioner’s Office and any new regulator set up to oversee them.
DCMS committee chair Damian Collins said: “Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalised ‘dark adverts’ from unidentifiable sources, delivered through the major social media platforms we use every day. Much of this is directed from agencies working in foreign countries, including Russia.
“Companies like Facebook exercise massive market power, which enables them to make money by bullying the smaller technology companies and developers who rely on this platform to reach their customers.
“These are issues that the major tech companies are well aware of, yet continually fail to address. The guiding principle of the ‘move fast and break things’ culture often seems to be that it is better to apologise than ask permission.”
In response to the report, Karim Palant, from Facebook UK’s public policy department, said: “We share the committee’s concerns about false news and election integrity and are pleased to have made a significant contribution to their investigation over the past 18 months, answering more than 700 questions and with four of our most senior executives giving evidence.
“[But] we have already made substantial changes so that every political ad on Facebook has to be authorised, state who is paying for it and then is stored in a searchable archive for seven years. No other channel for political advertising is as transparent and offers the tools that we do.
“We also support effective privacy legislation that holds companies to high standards in their use of data and transparency for users.
“While we still have more to do, we are not the same company we were a year ago. We have tripled the size of the team working to detect and protect users from bad content to 30,000 people and invested heavily in machine learning, artificial intelligence and computer vision technology to help prevent this type of abuse.”