Law change for anyone using a phone over ‘toxic’ group chats

Children may be able to secretly exit online group chats, shielding them from “toxic” groups, as part of a set of measures designed to keep young people safe online.

It builds on existing measures in the Online Safety Act, set to take effect in early 2025, which aims to prevent children from accessing harmful content.

Ofcom will enforce the new rules, and its chief executive, Dame Melanie Dawes, said: “Young people should be able to take themselves out of group chats that they know are toxic for them, without everybody being able to see.”

In response to an Instagram post revealing the changes, one user agreed: “Everyone should be able to remove themselves from group chats that they don’t want to be part of without feeling the anxiety of being seen!”

Another added: “This would be such a powerful thing for children and adults!”

The Online Safety Act means companies such as Facebook, Instagram and WhatsApp could face hefty fines of up to 10% of global revenue, or £18 million, if they fail to comply. Bosses could also face prison time.

One commenter believed the fines wouldn’t be effective: “Large fines won’t work, these are trillion dollar companies!”

Under the act, cyber-flashing – sending unsolicited sexual imagery online – becomes a crime, as does sharing deepfake pornography, in which AI is used to superimpose someone’s face onto pornographic material.

It will also require pornography sites to verify users’ ages to prevent children from viewing the content.

Bereaved parents should also be able to obtain information about their children from tech firms more easily.

The act faced opposition over its requirement that firms be able to access private messages in cases involving potential child sexual abuse content.

Messaging services such as WhatsApp and Apple’s iMessage use end-to-end encryption to protect users’ privacy, and their operators have threatened to leave the UK rather than compromise message security.

The government said the regulator Ofcom would only ask tech firms to access messages once “feasible technology” had been developed.

As part of the act, social media firms must put in place measures to prevent children from accessing content involving child sexual abuse; controlling or coercive behaviour; extreme sexual violence; illegal immigration and people smuggling; promoting or facilitating suicide; promoting self-harm; animal cruelty; selling illegal drugs or weapons; and terrorism.


