Online companies will soon face the threat of mega fines for ignoring a raft of new measures designed to protect children from content that could leave them open to sexual abuse, self-harm and suicide, following the publication of new rules for “age-appropriate design”.
The Information Commissioner’s Office has been working on the new rules since they were enshrined in the Data Protection Act 2018, which brought GDPR into UK law. The ICO submitted the code to the Secretary of State in November, and it must complete a statutory process before it is laid in Parliament for approval.
After that, organisations will have 12 months to update their practices before the code comes into full effect in the autumn of 2021; firms that break the law can face GDPR-size sanctions, including fines of up to £17m or 4% of global turnover.
Companies that make services likely to be accessed by a child will have to take account of 15 principles designed to ensure their services do not cause harm by default. These include:
– Settings must be “high privacy” by default, unless there is a compelling reason not to;
– Only the minimum amount of personal data should be collected and retained;
– Children’s data should not usually be shared;
– Geolocation services should be switched off by default;
– Nudge techniques should not be used to encourage children to provide unnecessary personal data, or to weaken or turn off their privacy settings.
Information Commissioner Elizabeth Denham said: “Personal data often drives the content that our children are exposed to – what they like, what they search for, when they log on and off and even how they are feeling.
“In an age when children learn how to use an iPad before they ride a bike, it is right that organisations designing and developing online services do so with the best interests of children in mind. Children’s privacy must not be traded in the chase for profit.”
In the final version of the code, the ICO says it will take a “commonsense” approach to the question of which services are likely to be accessed by children, but notes that “if your service is the kind of service that you would not want children to use in any case, then your focus should be on how you prevent access”.
Denham added: “One in five Internet users in the UK is a child, but they are using an Internet that was not designed for them. There are laws to protect children in the real world – film ratings, car seats, age restrictions on drinking and smoking. We need our laws to protect children in the digital world too.
“In a generation from now, we will look back and find it astonishing that online services weren’t always designed with children in mind.”
The code is expected to apply to large social media companies, including YouTube, TikTok and Snapchat, all of which have significant numbers of child users.
Google has already tightened up its YouTube kids’ video policies, meaning brand owners will no longer be able to run targeted ads and the comments section will also be disabled as part of a $170m settlement with US authorities over alleged violations of children’s privacy laws.