
Instagram 'most important platform' for child sex abuse networks: report

2023-06-09 03:49

Instagram is the main platform used by pedophile networks to promote and sell content showing child sexual abuse, according to a report by Stanford University and the Wall Street Journal.

"Large networks of accounts that appear to be operated by minors are openly advertising self-generated child sexual abuse material for sale," said researchers at the US university's Cyber Policy Center.

"Instagram is currently the most important platform for these networks with features like recommendation algorithms and direct messaging that help connect buyers and sellers."

A Meta spokesperson on Thursday told AFP that the company works "aggressively" to fight child exploitation and support police efforts to capture those involved.

"Child exploitation is a horrific crime," the Meta spokesperson said in response to an AFP inquiry.

"We're continuously exploring ways to actively defend against this behavior, and we set up an internal task force to investigate these claims and immediately address them."

Meta teams dismantled 27 abusive networks between 2020 and 2022, and in January of this year disabled more than 490,000 accounts for violating the tech company's child safety policies, the spokesperson added.

"We're committed to continuing our work to protect teens, obstruct criminals, and support law enforcement in bringing them to justice," the Meta spokesperson said.

According to the Journal, a simple search for sexually explicit keywords specifically referencing children leads to accounts that use these terms to advertise content showing sexual abuse of minors.

The profiles often "claim to be run by the children themselves and use overtly sexual pseudonyms", the article detailed.

While not explicitly stating that they sell these images, the accounts feature menus of content options, in some cases including specific sex acts.

Stanford researchers also spotted offers for videos depicting bestiality and self-harm.

"At a certain price, children are available for in-person 'meetings'," the article continued.

Last March, pension and investment funds filed a complaint against Meta for having "turned a blind eye" to human trafficking and child sex abuse images on its platforms.

As of the end of last year, technology put in place by Meta had removed more than 34 million pieces of child exploitation content from Facebook and Instagram, all but a small percentage of it detected automatically, according to the Silicon Valley tech firm.
