Playing catch-up with technology is not enough when it comes to safeguarding issues

Internet safeguarding issues and legal requirements for educators are complex and stringent, with institutions at constant risk of playing ‘catch-up’ with emerging technology. There are calls for legislation to be strengthened and for greater training for educators. Giving children and young people agency to help control safety issues is crucial because they can talk directly to their peers, writes Angela Youngman

Internet safeguarding issues have never been more complex, nor the legal duties on schools and colleges more stringent. It’s a situation that’s set to get even tougher with the widespread use of artificial intelligence (AI), biometrics and surveillance, such as embedded sensors and cameras.

Research undertaken by 5Rights Foundation and Revealing Reality has indicated that within 24 hours of opening an account on TikTok or Facebook, children can be targeted with a stream of unsolicited content, including material promoting eating disorders and suicide, sexualised imagery and distorted body images. When interviewing engineers and designers, Revealing Reality found that the aim was to maximise engagement, activity and followers, not to keep children safe. In September 2021, NSPCC analysis of police records revealed that reports of child sexual abuse offences involving an online element had surged by 79% in just four years.

Legislation needs strengthening

In 2020, the Age Appropriate Design Code, which incorporates a range of design features relating to duty of care, became law and, following a 12-month transition period, online services now have to comply with it. Further legislation is under way, with an Online Safety Bill expected to come into effect in late 2023 or early 2024.

“The Online Safety Bill introduces the concept of a regulator for the internet. Ofcom will have powers to ensure that the major internet companies demonstrate a duty of care to their users. The bill seeks to ensure that children will not be exposed to online harm, and that companies can and will be fined if they fail in their duty of care,” explains Carolyn Bunting of Internet Matters.

Although the Online Safety Bill is regarded as a broadly workable model, the NSPCC believes it needs strengthening to:

  • Stop grooming and abuse spreading between apps
  • Disrupt abuse at the earliest possible stage
  • Fix major gaps in the child safety duty, since high-risk sites such as Telegram and OnlyFans could be excluded because only companies with a ‘significant’ number of children on their apps would be required to protect them, potentially displacing high-risk services to smaller sites
  • Hold senior management accountable, with companies liable for criminal sanctions if the duty of care is not upheld
  • Commit to a statutory user advocate for children.

Sonia Livingstone from the London School of Economics (LSE) points out a further problem with both the Online Safety Bill and the Age Appropriate Design Code: neither covers technology used in schools for learning or safeguarding, “because the contract is not provider-to-user but provider-to-school, the legal responsibility seems to be with the school rather than the digital provider”.


She adds, “Given the pace of technological change, it is vital for schools and also businesses to make use of anticipatory strategies like data protection impact assessments, safety by design and child rights impact assessments.”

The Child Rights Impact Assessment (CRIA) was introduced as a way of assessing the impact of policies and programmes on children’s rights. The Digital Futures Commission is now considering the feasibility of using CRIAs to embed children’s best interests in the digital world.

Creators of new systems are more interested in devising products than in safeguarding. With technology constantly evolving, the risk is that legislators and educators play ‘catch-up’ rather than taking the initiative.

Educators need constant training

Equipping educators with the tools to safeguard students as the line between digital and physical lives continues to blur requires constant training, greater awareness of the issues, and an understanding of the technology and how it accelerates the problem in order to combat it. Organisations like the NSPCC provide training and resources on procedures and policies, identifying potential risks including online radicalisation, extremism, sexting and cyberbullying, together with management training, setting up online communities and ensuring virtual environments are safe. Videoconferencing has increased in use and requires consideration of consent, adult-to-child ratios and maintaining professional boundaries. All educational establishments have a legal responsibility to have a clear strategy and named people to contact when problems arise, whether internal or external, such as Childline.

Jon Farrer, designated safeguarding lead (DSL) at Windlesham House School, says, “It is key for educators to work to implement effective digital citizenship programmes in schools. It is vital that educators keep themselves up to date online. Schools can implement monitoring and filtering systems, encouraging and educating families to use the available tools at home to develop positive digital wellbeing habits.”


Technology can play an important role, as Carolyn Bunting points out. “New technologies are emerging all the time, particularly the use of artificial intelligence (AI). This is key in tackling the experience young people have online. We need AI to make better recommendations, so children are not exposed repeatedly to inappropriate content or fake news and misinformation.”

Give children and young people agency

Ultimately, one of the most crucial elements in effective safeguarding is ensuring children and young people become active participants in the process. Louise Willis-Keeler of Psych-Logical says: “They need to know where there is a safe space, where to go if something doesn’t feel right, and what to do about it, not ignore it. It could be a matter of refusing to participate, of blocking a website or contact, or of using a wider support network to alert others. Clear signposts are needed. Students have the right to feel safe and the right to speak out.”

Children and young people have the right to be involved alongside educators in ensuring effective long-term safeguarding for themselves and their wider community, understanding that speaking out can make a difference. By acquiring the skills and knowledge to recognise risk and prevent it turning into a harmful situation, they gain awareness of what is acceptable and can advise friends and peers accordingly.

Childnet has made this a priority, as Will Gardner explains. “They are the target audience and should be involved as part of the solution. It is important to build up strategies with them, not for them. Childnet has a Digital Leaders Programme involving online courses in safety. Young people become designated leaders within their community and seek ways to raise awareness. We know young people try to make decisions about unacceptable content every day, and we aim to give them agency to undertake this effectively.”

Recent digital innovations have helped make children more reflective users of technology. The BBC Own It app offers a default keyboard overlay that responds to the user’s interactions, while Children in Need has assisted with the development of Ollee, a digital friend who can provide help and advice.

Companies and technology providers need to treat safeguarding as integral to whatever they are producing or amending, as it is much harder to add on afterwards. Educators have their own role in ensuring that technology can be used safely by children and young people and, if necessary, adding relevant controls.
