
Nine out of ten adults in East of England back social network regulation

12 Feb

More than nine out of ten adults in the East of England (92%) back regulation of social networks to make tech firms legally responsible for protecting children, a new NSPCC survey has revealed.

Almost six out of ten adults in the East (59%) do not think social networks protect children from sexual grooming, and 55 per cent do not think networks protect children from inappropriate content such as self-harm, violence or suicide.

Nationally, six out of ten British parents do not think social networks protect children from sexual grooming and inappropriate content.

The figures emerged as the children’s charity released a detailed proposal setting out how a robust independent regulator should enforce a legal duty of care to children on social networks.

The NSPCC’s ‘Taming The Wild West Web’ vision, drawn up with the assistance of international law firm Herbert Smith Freehills, proposes the introduction of a social media regulator to force social networks to protect children on their platforms.

The regulator would:

– have legal powers to investigate tech firms and demand information about their child safety measures;
– require social networks to meet a set of minimum child safeguarding standards (making their platforms safe by design) and to proactively tackle online harms including grooming;
– deploy tough sanctions for failures to protect young users, including steep fines for tech firms of up to €20m, bans for boardroom directors, shaming tactics and a new criminal offence for platforms that commit gross breaches of their duty of care (akin to corporate negligence and corporate manslaughter).

A huge majority of adults in the NSPCC’s survey also backed a call for social networks to be legally required to make children’s accounts safe, including having the highest privacy settings on by default, friend suggestions turned off, accounts not publicly searchable and geolocation settings switched off.

Ruth Moss, whose daughter Sophie took her own life at the age of 13 after looking at self-harm and suicide content on social media, is backing the NSPCC’s campaign for statutory regulation.

Ruth said: “Sophie’s death devastated me. No mother, or family, should have to go through that. It was so unnecessary; she had so much to live for. She was only 13.

“I found out that she had been looking at completely inappropriate things online. Some of the images were so graphic that even as an adult, I was shocked. She was also communicating with people in their 30s and pretending to be older than she was, under a made-up persona. Whilst the internet was heavily controlled at home and at school, Sophie had free Wi-Fi when she was out, making it very hard to ‘police’ her internet use 24 hours a day.

“Social networks should have a duty of care to protect children and vulnerable people from damaging material, and self-regulation is clearly not working. The protection of our children is too important to leave to the goodwill of large, profit-orientated organisations. Statutory regulation is needed, and as a matter of urgency.”