Controversy Erupts as CSAM Continues to Circulate on Meta's Apps

Dec 24, 2023 | Instagram News

  • Child abuse content continues to be distributed on Meta’s networks, according to new investigations by independent research groups.
  • Groups on Facebook and Instagram have been found distributing child sexual abuse material, including live-streamed videos of abuse on Instagram.
  • Meta says it is working with other platforms to strengthen enforcement and has upgraded its technology for identifying offensive content.

Child Abuse Content Still Being Distributed on Meta’s Networks

Meta, formerly known as Facebook, is facing further scrutiny over its enforcement of policies against child sexual abuse material (CSAM) on its platforms. Investigations by independent research groups have found that child abuse content is still being distributed across Meta's networks, including Facebook and Instagram. Despite efforts to hide and disable offending groups and accounts, CSAM continues to be shared across the platforms. Meta has stated that it is working to improve its enforcement and has enhanced its technology for identifying offensive content. The company is also expanding its network detection efforts to prevent pedophiles from connecting with one another on its apps. However, CSAM actors constantly revise their methods to evade detection, posing an ongoing challenge for Meta.

CSAM is a significant concern for every social and messaging platform, and Meta's size and reach give it an outsized responsibility. Meta's own statistics reveal the scale of the problem: in 2021, the company detected and reported 22 million pieces of child abuse imagery to the National Center for Missing & Exploited Children (NCMEC). In 2020, Facebook accounted for 94% of the 69 million child sex abuse images reported by U.S. technology companies. The prevalence of CSAM on Meta's platforms has raised concerns about the company's push for full messaging encryption, which could hinder efforts to combat the distribution of such content.

While encryption would give users more privacy, it also raises the risk of wider CSAM distribution, since encrypted groups are far harder to monitor and intervene in. The balance between privacy and moderation is one that social platforms constantly grapple with. Elon Musk, for example, has advocated for more speech on his platform, X, but that stance has led some advertisers to pull their promotions from it. Meta, facing its own revenue pressure, has instructed its integrity teams to prioritize objectives that reduce "advertiser friction" and avoid limiting usage of its products.

Another challenge is that Meta's recommendation systems can inadvertently connect users seeking this material, potentially amplifying CSAM-related activity. Meta says it is continually working to restrict the spread of CSAM, but as CSAM groups adapt their communication methods, it becomes harder for Meta's systems to detect the content and avoid recommending it. The company also faces scrutiny from European regulators over child safety on Instagram, and compliance with regulations such as the Digital Services Act may be necessary to avoid fines and further sanctions in the EU.

Under the EU's formal request for information, Meta has until December 22nd to outline its efforts to combat CSAM and ensure greater safety on its platforms.


