Playing catch-up with technology is not enough when it comes to safeguarding issues

In 2020, the Age Appropriate Design Code, which incorporates a range of design features relating to duty of care, became law, and, following a 12-month transition period, online services now have to comply with it. Further legislation is under way, with the Online Safety Bill expected to come into effect in late 2023/early 2024.

“The Online Safety Bill introduces the concept of a regulator for the internet. Ofcom will have powers to ensure that the major internet companies demonstrate a duty of care to their users. The bill seeks to ensure that children will not be exposed to online harm, and that companies can and will be fined if they fail in their duty of care,” explains Carolyn Bunting of Internet Matters.

Although the Online Safety Bill is regarded as a broadly workable model, the NSPCC believe it needs strengthening to:

  • Stop grooming and abuse spreading between apps
  • Disrupt abuse at the earliest possible stage
  • Fix major gaps in the child safety duty, since only companies with a ‘significant’ number of children on their apps would be required to protect them; high-risk sites such as Telegram and OnlyFans could be excluded, and abuse could be displaced to smaller services
  • Hold senior management accountable, with companies liable for criminal sanctions if duty of care is not upheld
  • Commit to a statutory user advocate for children.

Sonia Livingstone of the London School of Economics (LSE) points out a further problem with both the Online Safety Bill and the Age Appropriate Design Code: neither covers technology used in schools for learning or safeguarding, “because the contract is not provider-to-user but provider-to-school, the legal responsibility seems to be with the school rather than the digital provider”.

She adds, “Given the pace of technological change, it is vital for schools and also businesses to make use of anticipatory strategies like data protection impact assessments, safety by design and child rights impact.”

CRIA (Child Rights Impact Assessment) was introduced as a way of assessing the impact of policies and programmes on children’s rights. The Digital Futures Commission is now considering the feasibility of using CRIA as a means of embedding children’s best interests in a digital world.

Creators of new systems are more interested in developing products than in safeguarding. With technology constantly evolving, the risk is that legislators and educators play ‘catch-up’ rather than taking the initiative.
