Apps for children must offer privacy by default
Apps, social media platforms and online games that are specifically targeted at children will now have to put privacy at the heart of their design.
A code of practice outlining how children’s data should be protected has come into force, and firms have 12 months to comply with the new rules.
If they do not, they could face huge fines imposed by the Information Commissioner’s Office.
Some campaigners questioned whether the code would bring about real change.
Information commissioner Elizabeth Denham said it was an important step towards protecting children online.
“A generation from now we will all be astonished that there was ever a time when there wasn’t specific regulation to protect kids online. It will be as normal as putting on a seatbelt.
“This code makes clear that kids are not like adults online, and their data needs greater protections.”
She said the Information Commissioner’s Office (ICO) recognised that it could be difficult for smaller businesses to comply with the code and would offer “help and support” over the coming year.
Among the tenets of the code are:
- the best interests of the child should be a primary consideration when designing and developing online services
- high levels of privacy must be set by default
- only a minimum amount of personal data should be collected and retained
- children’s data should not be shared unless there is a compelling reason to do so
- children’s personal data should not be used in ways that could be detrimental to their wellbeing
- geo-location should be switched off by default
Others who must conform to the code include educational websites; streaming services that use, analyse and profile children’s data; and the makers of connected toys.
‘Well-intentioned’
The ICO has the power to fine firms up to 4% of their global turnover if they breach data protection guidelines. The organisation has previously said it will take more severe action when it sees harm to children.
In September last year, YouTube was fined $170m (£139m) for collecting data on children under 13 without the consent of their parents, following a US investigation by the Federal Trade Commission.
The scope of protections needed for children online was huge and the ICO might not be up to the job, said one digital rights campaigner, Jen Persson.
“The code is well-intentioned and, if enforced, may bring about some more focused change in the approach of some apps and platforms, for example to stop collecting excessive data from children and to start meeting the requirements of core data protection law that has been in place for over 20 years.
“The key risks are that, since the ICO has not enforced to date on behalf of children under its current remit of concrete data protection law, it may be seen as not having the capability to enforce those new parts of the code that go beyond that and are subjective, such as the best interests of the child, or that outstrip the ICO’s technical knowledge and capacity.”
Andy Burrows, head of child safety online policy at the NSPCC, said he hoped the code would force a rethink on the content provided to children.
“Tech firms have a year to prepare for this transformative code that will force them to take online harms seriously, so there can be no more excuses for putting children at risk.
“For the first time, high-risk social networks will have a legal duty to assess their sites for sexual abuse risks and no longer serve up harmful self-harm and suicide content to children.”