Social media and video sharing platforms have been warned to improve their children’s privacy practices or face enforcement action amid claims many are in potential breach of the children’s code.
The threat from the UK’s Information Commissioner’s Office (ICO) follows an ongoing review of social media platforms (SMPs) and video sharing platforms (VSPs) as part of the ICO’s children’s code strategy, which examined 34 SMPs and VSPs, focusing on the process young people go through to sign up for accounts.
Varying levels of adherence to the children’s code were found, with some platforms not doing enough to protect children’s privacy.
Eleven of the 34 platforms are being questioned about issues relating to default privacy settings, geolocation or age assurance, and asked to explain how their approach conforms with the code, following concerns raised by the review.
The regulator is also speaking to some of the platforms about targeted advertising, setting out its expectations for changes to ensure practices are in line with both the law and the code.
Deputy commissioner Emily Keaney said: “There is no excuse for online services likely to be accessed by children to have poor privacy practices. Where organisations fail to protect children’s personal information, we will step in and take action.
“Online services and platforms have a duty of care to children. Poorly designed products and services can leave children at risk of serious harm from abuse, bullying and even loss of control of their personal information.”
The ICO has also identified areas where further evidence is needed to improve its understanding of how these services affect children’s privacy, and is launching a call for stakeholders, including online services, academics and civil society, to share their views and evidence on aspects of children’s privacy.
These include how children’s personal information is currently being used in recommender systems (algorithms that use people’s details to learn their interests and preferences in order to deliver content to them), and recent developments in the use of age assurance to identify children under 13.
The evidence gathered will be used to inform the ICO’s ongoing work to secure further improvements in how SMPs and VSPs protect children’s privacy.
Keaney concluded: “Our world-leading children’s code has helped stop targeted advertising at children on some of the biggest social media platforms. The code has even encouraged other areas, including tech-famous California, to create their own codes. We’re now building on the code’s achievements to gather more evidence and push for further changes.”