Social media giants face ‘full force of Online Safety Act’

Online platforms such as Facebook, TikTok, WhatsApp, Instagram, X, YouTube and Google face multi-million pound fines and even being shut down if they fail to implement tough new measures to tackle illegal content – including fraud, terrorism and child sexual abuse material – under new rules that kick in today.

While the tech giants account for the lion’s share of the £41bn-plus UK advertising market, the Online Safety Act actually covers more than 100,000 services. Each site and app must now start implementing measures to remove illegal material quickly when they become aware of it, and to reduce the risk of ‘priority’ criminal content appearing in the first place.

In the coming weeks, Ofcom says it will be assessing platforms’ compliance with the new illegal harms obligations under the Act, and launching targeted enforcement action where it uncovers concerns – for example, over content that encourages suicide, extreme pornography or the sale of illegal drugs.

Assessing providers’ compliance with their safety duties over online child sexual abuse material (CSAM) has been identified as one of Ofcom’s early priorities for enforcement.

The regulator’s evidence shows file-sharing and file-storage services are particularly susceptible to being used for the sharing of image-based CSAM.

Among the 40 safety measures set out in its illegal harms codes of practice, it recommends, for example, that certain services – including all file-sharing services at high risk of hosting CSAM, regardless of size – use automated moderation technology, including ‘perceptual hash-matching’, to assess whether content is CSAM and, if so, to swiftly take it down.
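To illustrate the principle behind hash-matching, the sketch below implements a deliberately simplified ‘average hash’ with Hamming-distance comparison. It is a hypothetical toy example only – production systems such as those run by the IWF or Microsoft’s PhotoDNA use far more robust, proprietary perceptual algorithms – but it shows why hashing beats exact byte comparison: a re-encoded or lightly edited copy of a known image still produces a nearby hash.

```python
# Toy sketch of perceptual hash-matching (NOT a production CSAM detector).
# An 'average hash' sets one bit per pixel: 1 if the pixel is brighter than
# the image mean. Minor re-encoding barely changes these bits, so known
# images can be recognised by small Hamming distance rather than exact match.

def average_hash(pixels):
    """Hash a grayscale image (2-D list of 0-255 values) to a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits between two equal-length hashes."""
    return sum(a != b for a, b in zip(h1, h2))

def is_match(candidate_hash, known_hashes, threshold=4):
    """Flag the candidate if it is within `threshold` bits of any known hash."""
    return any(hamming_distance(candidate_hash, k) <= threshold
               for k in known_hashes)

# A slightly re-compressed copy of an image hashes to the same bits:
original = [[10, 200], [200, 10]]
tweaked = [[12, 198], [201, 11]]
assert hamming_distance(average_hash(original), average_hash(tweaked)) == 0
```

In practice, services hold databases of hashes of known illegal images (supplied by bodies such as the IWF or NCMEC) and compare uploads against them, so the material itself never needs to be redistributed for matching to work.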

Today, Ofcom has launched an enforcement programme to assess the safety measures being taken, or that will soon be taken, by file-sharing and file-storage providers to prevent offenders from disseminating CSAM on their services.

It has written to a number of these services to put them on notice that it will shortly be sending them formal information requests regarding the measures they have in place, or will soon have in place, to tackle the issue, and requiring them to submit their illegal harms risk assessments.

If any platform does not engage with Ofcom or come into compliance, the regulator says it will not hesitate to open investigations into individual services. It has strong enforcement powers at its disposal, including being able to issue fines of up to 10% of turnover or £18m – whichever is greater – or to apply to a court to block a site in the UK in the most serious cases.

Ofcom’s preliminary supervision activity has involved working closely with law enforcement agencies and other organisations – including the Internet Watch Foundation (IWF), the Canadian Centre for Child Protection (C3P) and the National Centre for Missing and Exploited Children (NCMEC) – to identify file-sharing and file-storage services at highest risk of hosting image-based CSAM.

In recent months, the regulator says it has been engaging with the largest file-sharing and file-storage services about their obligations under the Act. Additionally, a taskforce dedicated to driving compliance among small but risky services has identified and engaged with providers of smaller file-sharing and file-storage services to assess whether they are already taking appropriate measures.

Today’s enforcement programme represents the third opened by Ofcom as online safety regulator since the start of this year. In January, it opened an enforcement programme into age assurance measures in the adult sector.

Two weeks ago, it issued formal information requests to providers of a number of services, setting them a deadline of March 31 by which to submit their illegal harms risk assessments.

It expects to make additional announcements on formal enforcement action over the coming weeks.

Ofcom enforcement director Suzanne Cater said: “Platforms must now act quickly to come into compliance with their legal duties, and our codes are designed to help them do that. But, make no mistake, any provider who fails to introduce the necessary protections can expect to face the full force of our enforcement action.”

Internet Watch Foundation interim chief executive Derek Ray-Hill added: “We stand ready to work alongside Ofcom as it enforces the Online Safety Act, and to help companies to do everything they can to comply with the new duties. We have been at the forefront of the fight against online child sexual abuse for nearly three decades, and our tools, tech, and data are cutting edge.

“The Online Safety Act has the potential to be transformational in protecting children from online exploitation. Now is the time for online platforms to join the fight and make sure they are doing everything they can to stop the spread of this dangerous and devastating material.”

Related stories
Three-pronged probe into abuse of children’s privacy
Social media giants warned over children’s privacy
Hands off our kids’ data, ICO warns social media giants
Social media giants cough up €3bn for privacy failings
TikTok insists ‘we’ve changed’ following €345m EU fine
TikTok hit by £1.9m fine for data governance failings