ICO insists kids are much safer online, but are they?

The Information Commissioner’s Office insists that one year on from the introduction of the Age Appropriate Design Code – commonly known as the Children’s Code – kids are far better protected online, despite not a single enforcement notice or fine being issued.

The Children’s Code was fully rolled out in September 2021, requiring online services – including websites, apps and games – to provide better privacy protections for children, ensuring their personal data is protected within the digital world.

In the past year, the ICO claims its action has prompted changes by social media platforms, gaming websites and video streaming services.

Changes include targeted and personalised ads being blocked for children, children’s accounts set to private by default, adults blocked from directly messaging children and notifications turned off at bedtime.

The regulator maintains the code has also had an international effect, inspiring reviews of children’s privacy protections in California, Europe, Canada and Australia.

Information Commissioner John Edwards said: “We’ve seen real changes since the code came into force a year ago. These changes come as a result of the ICO’s action enforcing the code, making clear to industry the changes that are required.

“The result is that children are better protected online in 2022 than they were in 2021.

“This code makes clear that children are not like adults online, and their data needs greater protections. We want children to be online, learning, playing and experiencing the world, but with the right protections in place to do so.

“There’s more for us to achieve. We are currently looking into a number of different online services and their conformance with the code as well as ongoing investigations. And we’ll use our enforcement powers where they are required.”

The regulator claims the code has been instrumental in changing the behaviour of both big tech platforms and smaller online services. Changes over the past year include Facebook and Instagram limiting ad targeting to age, gender and location for under-18s.

Both Facebook and Instagram now ask for people’s date of birth at sign-up, prevent them from signing up if they repeatedly enter different dates, and disable accounts where people cannot prove they are over 13. Instagram has also launched parental supervision tools, along with new features such as Take a Break to help teens manage their time on the app.

In addition, YouTube has turned off autoplay by default and turned on Take a Break and bedtime reminders by default for Google accounts belonging to under-18s.

Meanwhile, Google has enabled anyone under 18 – or their parent or guardian – to request the removal of their images from Google image search results, has disabled location history for Google accounts of under-18s, and has expanded safeguards to prohibit age-sensitive ad categories from being shown to these users.

Finally, Nintendo now only allows users aged 16 and over to create their own account and set their own preferences.

While the code applies to any service being used by children living in the UK, the ICO insists the changes have been adopted by companies around the world. There has also been the launch of the California Age Appropriate Design Code Bill, which uses the ICO’s Children’s Code as a template, while Unicef is looking at how protections can be brought in globally.

The ICO says it is currently looking into how over 50 different online services are conforming with the code, with four ongoing investigations. It has also audited nine organisations and is currently assessing their outcomes.

Even so, a number of thorny issues remain, including growing evidence that children are accessing adult-only services. These pose data protection harms – with children losing control of their data or being manipulated into giving more of it away – in addition to content harms.

As well as engaging with adult-only services directly to ensure they conform with the code, the regulator says it will be working closely with Ofcom and the Department for Digital, Culture, Media and Sport to establish how the code works in practice in relation to adult-only services and what they should expect.

Meanwhile, a new investigation by children’s digital rights charity 5Rights has accused education technology companies of making children’s data vulnerable to commercial exploitation.

The 5Rights Foundation has conducted research showing how Google and other third parties have tracked children’s clicks on external links while they were using Google Classroom and ClassDojo. This data can be used to determine preferences and display personalised advertising, the organisation claims.

Finally, a decision on a complaint against Instagram’s handling of children’s data in the European Union is expected within weeks. The inquiry was opened by the Irish Data Protection Commission back in September 2020.
