Charity helps Asian nations rein in child sex abuse websites



By Nadeem Badshah

CHILD sex abuse websites hosted in south Asia are being shut down thanks to a partnership between the countries’ authorities and a British charity.

The UK-based Internet Watch Foundation (IWF) in 2018 helped to remove seven websites whose obscene material or servers were hosted in India. Action was also taken against a website in Bangladesh.

The IWF works with partners across the world to remove criminal images and videos and is launching a portal in Pakistan this year.

Figures obtained by Eastern Eye also showed that in 2019, 318 reports were made by people in India about potentially illegal websites through the IWF’s English language portal. In 2018, there were 316 reports. The complaints could relate to material hosted anywhere in the world.

A spokesman for the IWF said: “This does not relate to the countries in which child sexual abuse material is created in or where people are looking at it, rather, it is where this material is being stored or the servers are based.

“We have taken action to remove material and worked with local partners to make the internet safer wherever we can.”

The IWF said it has seen an “astronomical” rise in videos and images on paedophile websites that girls as young as 11 had been groomed into taking of themselves.

Self-generated images now made up a third of the material it found on the 132,700 websites it removed in 2019, a rise of 26 per cent on 2018.

“We would definitely urge people to report criminal content to us whenever they spot it online so we can continue to make the internet a safer place, but would encourage people to read through the guidance on our website first to make sure what they are reporting is relevant to the IWF and our work,” the IWF said.

“If we can’t help, our website does have advice for how people can report other types of material which fall outside our remit.”

IWF analysts investigate reported web links to check whether they contain child sexual abuse material. If they are found to contain illegal images, the websites can be taken down. The IWF also saw a rise last year in the number of reports of abuse material, up to 260,400 from 229,328 in 2018.

It comes after a landmark inquiry in January urged the government to ramp up protection of children abroad from abuse and exploitation by UK nationals.

The inquiry heard there has been no national plan to protect youngsters from abuse by Britons since 2001, and there is a lack of coordinated response from the Home Office, the Foreign and Commonwealth Office and the National Crime Agency.

Sinead Geoghegan, media manager for Every Child Protected Against Trafficking (ECPAT UK) which was involved in the inquiry, told Eastern Eye: “Internet service providers have a responsibility to keep children safe on their platforms but they must be held to account by governments, who are ultimately responsible for protecting children from harm and safeguarding victims.

“Due to the transnational nature of these crimes, offenders, victims and internet platforms may be located in different countries. It’s therefore vital that governments put in place mechanisms for greater international cooperation and information sharing to prevent child sexual exploitation online and investigate offences.”

She added: “ECPAT UK has long expressed major concerns about the sexual exploitation of vulnerable children abroad by UK nationals, who have been able to offend with impunity.

“For the last 25 years, we have monitored this issue and we know that offenders from the UK are exploiting poverty, inequality and the anonymity of the internet to abuse children abroad both remotely via the internet, as well as in person.”

Meanwhile, the Information Commissioner’s Office unveiled its Code of Practice in January, which called for age verification for some websites. Internet giants will face fresh legal requirements to set up age-check barriers to stop under-18s accessing unsuitable material or being exploited.

On the trend of youngsters being groomed into taking pictures of themselves, Andy Burrows, head of child safety online policy at the NSPCC charity, said: “Too many of these self-generated images will be the result of children being groomed on social networks. Once an abuser has coerced a young person into taking these photos or videos, the child could lose control of who views them and could be blackmailed into sending more.

“We cannot emphasise enough the need for the government to tackle this horrendous problem by introducing a comprehensive Duty of Care that forces tech companies to keep children safe on their sites.”

Meanwhile, Baroness Shields, the UK’s former online child safety czar, said technology companies should pay compensation for child abuse on their websites “in the same way as oil spills”.

She said social networks needed to pay “reparations” to help victims due to the “vast” scale of abuse unfolding on their sites.

The former senior Facebook executive added she was “absolutely appalled” by the firm’s plan to encrypt its Messenger app.

It would mean even Facebook would be unable to see what was being sent on the app, a move slammed by police, politicians and children’s charities.

Home secretary Priti Patel wrote to Facebook CEO Mark Zuckerberg last year, warning him that encryption risked creating a “digital blindspot” where paedophiles and terrorists would be able to hide their “despicable” crimes.