
Ofcom launches investigation into online suicide forum


THE UK’s communications regulator, Ofcom, has launched an investigation into an online suicide forum that local media reports have linked to at least 50 deaths in the country.

The investigation will look into whether the forum’s service provider failed to implement adequate safety measures to protect UK users from illegal content and activity.

This is the first time Ofcom is investigating a specific service provider under the Online Safety Act 2023. The law requires service providers to remove illegal content once they become aware of it.

Ofcom said in a statement, "We have made several attempts to engage with this service provider in respect of its duties under the act and issued a legally binding request to submit the record of its illegal harms risk assessment to us."

"Having received a limited response to our request, and unsatisfactory information about the steps being taken to protect UK users from illegal content, we have today launched an investigation into whether the provider is complying with its legal obligations under the act."

The regulator has not named the service provider or the website, citing the nature of the content.

According to the BBC, the forum is hosted in the United States and has tens of thousands of users, including children.

The report said that members of the forum discuss suicide methods and share instructions on how to obtain and use a toxic chemical.

The BBC also reported that around 50 suicides in the UK have been linked to the forum.

If the provider is found to be in breach of the law, Ofcom could seek a court order to remove the content. The provider could also face a fine of up to £18 million or 10 percent of its worldwide revenue, whichever is greater.

More For You

UK moves to ban DeepNude-style AI ‘nudification’ apps in online abuse crackdown

Creating explicit deepfake images of someone without their consent is already a criminal offence

Highlights

  • Government plans to ban AI tools that digitally remove clothing from images
  • New offences target the creation and supply of nudification apps
  • Measures form part of a wider strategy to cut violence against women and girls

Ban targets AI-powered image abuse

The UK government says it will ban so-called “nudification” apps, describing them as tools that fuel misogyny and online abuse. The announcement was made on Thursday as part of a broader plan to halve violence against women and girls.

Under the proposed laws, it will become illegal to create or supply artificial intelligence tools that allow users to edit images to make it appear as though a person’s clothing has been removed. The government says the offences will strengthen existing rules on sexually explicit deepfakes and intimate image abuse.
