New online safety legislation outlines greater responsibilities for platforms

 

Mark Bentley, Safeguarding and Cyber Security Lead at edtech charity LGfL–The National Grid for Learning

 

Schools’ vital role in keeping children safe online continues; however, new legislation outlines greater online safety responsibilities for platforms.

The urgent need for online platforms to take responsibility for content on their sites was highlighted by the shocking levels of unwanted and inappropriate contact experienced by children online. Ofcom’s recent research, Children and Parents: Media Use and Attitudes, reported that 60% of 11- to 18-year-olds have been contacted online in a way that potentially made them feel uncomfortable; 30% have received an unwanted friend or follow request; and 16% of secondary school pupils have either been sent naked or half-dressed photos, or been asked to share such images themselves.

Commenting on the report, Dame Melanie Dawes, Ofcom’s Chief Executive, said: “If these unwanted approaches occurred so often in the outside world, most parents would hardly want their children to leave the house. Yet somehow, in the online space, they have become almost routine. That cannot continue.

“Regulation is here, and we’re wasting no time in setting out how we expect tech firms to protect people from illegal harm online, while upholding freedom of expression. Children have told us about the dangers they face, and we’re determined to create a safer life online for young people in particular.”

Under the draft Codes that Ofcom published this November to implement the Online Safety Act, large platforms with higher-risk services should ensure that, by default:

  • Children are not presented with lists of suggested friends
  • Children do not appear in other users’ lists of suggested friends
  • Children are not visible in other users’ connection lists
  • Children’s connection lists are not visible to other users
  • Accounts outside a child’s connection list cannot send them direct messages, and
  • Children’s location information is not visible to any other users.

 

Ofcom is also proposing that larger platforms with higher-risk services should:

  • Use a technology called ‘hash matching’ – a way of identifying illegal images of child sexual abuse by matching them against a database of known illegal images – to help detect and remove child sexual abuse material (CSAM) circulating online (a simplified illustration follows this list)
  • Use automated tools to detect URLs that have been identified as hosting CSAM, and
  • Provide crisis prevention information in response to search requests regarding suicide, and queries seeking specific, practical or instructive information regarding suicide methods.
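To make the idea concrete, here is a minimal, purely illustrative sketch of hash matching in Python. It is not how any real platform implements it: production systems rely on perceptual hashes (such as PhotoDNA), which survive resizing and re-encoding, and on curated hash databases supplied by bodies such as the Internet Watch Foundation. The hash function, the example digest and the in-memory set below are placeholder assumptions for illustration only.

    import hashlib

    # Placeholder set standing in for a curated database of hashes of known
    # illegal images (a real database would hold perceptual hashes, not SHA-256).
    KNOWN_ILLEGAL_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",  # example digest only
    }

    def image_hash(image_bytes: bytes) -> str:
        """Return a hex digest for uploaded image bytes (stand-in for a perceptual hash)."""
        return hashlib.sha256(image_bytes).hexdigest()

    def should_block_upload(image_bytes: bytes) -> bool:
        """True if the image matches the database, so the upload can be blocked and reported."""
        return image_hash(image_bytes) in KNOWN_ILLEGAL_HASHES

    # Example: check an upload before it is published.
    if should_block_upload(b"...uploaded image bytes..."):
        print("Match found: block the upload and report it.")
    else:
        print("No match: continue normal moderation.")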

The bottom line is that this will only work where platforms know the age of their users. To date, platforms have only needed to state that their site is not intended for under-18s, and it has been fairly easy for a child to give a false date of birth or answer yes to “are you over 18?”. The new law, however, brings in a duty for platforms to have “highly effective” age checking, especially for the most harmful content – ‘primary priority content’ such as pornography and the encouragement of suicide, self-harm and eating disorders.

Hopefully, in the future we will see a new ecosystem of apps that are appropriate for children, keeping them safe from many other harms too – helped by new duties of care that apply to all sites likely to be used by children.

The good news is that some companies are already quietly releasing new systems for age and identity verification to test the waters – mostly voluntary so far – ahead of enforcement, which begins towards the end of 2024 and carries fines of up to £18 million or 10 per cent of revenue.

I would like to see more focus on simplifying parental controls, which are often difficult to use. I’d also like to see smaller sites brought into the fray – those that by their nature have a niche user base but can cause enormous harm, such as sites that encourage eating disorders.

It’s hoped that previous reliance on parents and schools will be bolstered by industry’s best efforts. In the meantime, schools – which have a responsibility to keep children safe – must continue to hold honest conversations with young people and parents about the risks and harms presented by the online world.

And although these proposed changes and the new online safety legislation will not take effect overnight, the future is looking brighter.

 

Mark Bentley

Online safety expert Mark Bentley sits on the Education Group of the Department for Education’s UK Council for Internet Safety and works for London education charity LGfL–The National Grid for Learning, where he helps schools across the country understand the safeguarding implications of today’s online world for children and young people.

Mark has been involved in the development of the Online Safety Act since 2017.