The ICO’s Age Appropriate Design Code, or Children’s Code, which came into force last year, exemplifies this, and its impact has continued to grow. The Code sets out 15 standards that companies must meet to safeguard children online and uphold their data rights, in recognition of children’s status as vulnerable data subjects. These include turning geolocation tracking off by default for under-18s, turning profiling off by default, and requiring companies to uphold their own published policies on user behaviour and content.
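To illustrate what the ‘off by default’ standards look like in practice, here is a minimal sketch of age-gated defaults, assuming hypothetical setting names and an eighteenth-birthday cut-off; it is not taken from the Code or from any particular product:

```python
# Illustrative sketch only: privacy-by-default settings for child users,
# loosely modelled on the Code's "off by default" standards. The setting
# names and the adult defaults are assumptions, not taken from the Code.
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    geolocation_tracking: bool
    profiling: bool

def default_settings(age: int) -> PrivacySettings:
    """Return privacy defaults for a new account of the given age."""
    if age < 18:
        # Children's Code standards: geolocation and profiling off by default.
        return PrivacySettings(geolocation_tracking=False, profiling=False)
    # Adult defaults are a separate product decision; shown here for contrast.
    return PrivacySettings(geolocation_tracking=True, profiling=True)
```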
The Code applies to edtech services that are likely to be accessed by children on a direct-to-consumer basis. It also applies where an edtech service is provided to children through a school and the edtech provider influences the nature and purpose of the children’s data processing.
The secretary of state laid the Age Appropriate Design Code before Parliament under section 125(1)(b) of the Data Protection Act 2018 (the Act) on 11 June 2020. The ICO issued the Code on 12 August 2020, and it is now in force. The Act also details further responsibilities for schools when procuring edtech services, including due diligence, safeguarding and oversight.
Guidance on the ICO’s website indicates: “Schools must think carefully about the responsibilities they and the EdTech provider will hold under a specific contractual agreement. More specifically, the degree to which the edtech provider will be able to influence how children’s data is used. Schools need to consider who is acting as a sole data controller, or whether they are joint controllers and processors.”
The IEEE standards body issued the Age Appropriate Design framework, which details a set of processes for organisations designing digital products and services with children in mind. To operationalise the principles and processes detailed in both the Code and the IEEE standard, TrustElevate has developed a Child Rights Impact Assessment (CRIA) as part of a UK Government-funded programme of work, in association with a range of stakeholders including edtech companies and device distributors. A CRIA is a series of interrelated decision trees requiring engineers, data scientists and commercial teams to consider the risks, harms and safeguards associated with product features made accessible to children in specific age bands.
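As a rough sketch of what one branch of such a decision tree might look like in code, the fragment below maps a product feature and an age band to required safeguards. The feature names, age bands and safeguards are illustrative assumptions, not TrustElevate’s actual assessment:

```python
# Hypothetical, simplified sketch of one branch of a CRIA-style decision
# tree: mapping a product feature and an age band to required safeguards.
# Feature names, age bands and safeguards are illustrative assumptions.
def assess_feature(feature: str, age_band: str) -> list[str]:
    """Return the safeguards a feature needs before release to an age band."""
    safeguards: list[str] = []
    if feature == "direct_messaging":
        if age_band in ("0-5", "6-12"):
            safeguards.append("disable by default; require verified parental consent")
        else:  # "13-15", "16-17"
            safeguards.append("restrict to known contacts; provide reporting tools")
    if feature == "location_sharing":
        safeguards.append("off by default for every under-18 band")
    return safeguards
```

Commercial and engineering teams would work through trees like this feature by feature, so the safeguarding decision is recorded per age band rather than made once for all children.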
Blended education is increasingly the norm, and edtech companies are responsive to safeguarding issues, in part because the value chain within which they operate includes schools that are answerable to the Department for Education.
Edtech companies are also trialling a child age verification and parental consent service developed by TrustElevate, which enables businesses to ensure that children’s data are protected and that sufficient thought is given to age-appropriate safeguarding measures.
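TrustElevate’s API is not detailed here, so the sketch below shows only the general shape of a consent-gated flow under stated assumptions: the function names are placeholders to be wired to a real provider, and the age-13 threshold reflects the age at which UK GDPR allows a child to consent to information society services:

```python
# Illustrative consent-gated flow for processing a child's data. The two
# placeholder functions stand in for whatever a real verification service
# (such as TrustElevate's) provides; their names are assumptions.
from typing import Optional

def verify_age_band(user_id: str) -> Optional[str]:
    """Placeholder: a real implementation would call the verification provider."""
    return None  # treat users as unverified until a provider is wired in

def request_parental_consent(user_id: str) -> bool:
    """Placeholder: a real implementation would contact a verified parent."""
    return False  # no consent recorded in this stub

def can_process_child_data(user_id: str) -> bool:
    """Gate processing on verified age and, where needed, parental consent."""
    age_band = verify_age_band(user_id)
    if age_band is None:
        return False  # unverified: fail closed
    lower_bound = int(age_band.split("-")[0])
    if lower_bound < 13:
        # Under the UK age of digital consent (13), a verified parent or
        # guardian must consent before the service processes the child's data.
        return request_parental_consent(user_id)
    return True
```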
While this may be challenging for some to adhere to, given many businesses’ reliance on gathering and selling personal data, it is a positive step for our children and young people. As we look ahead, I’m sure we can agree that it is critical that children and young people are able to grow and learn in digital environments without worrying about harmful data practices or minimal safeguarding, and that we all do our part in facilitating that.
We must acknowledge and respect that children are different from adults and require greater protections. We must acknowledge, too, that one in three internet users is a child, and that ensuring they are seen and treated accordingly will be a combined effort.
There are few sectors to which these principles are more vital: edtech is instrumental in shaping children’s day-to-day experiences of digital environments, informing their understanding of acceptable conduct, content and contact online, and demonstrating the duty of care digital service providers should show them. As those duties of care become clearer over the next few years and standards evolve, the providers demonstrating best practice will be the most successful.