Meta Accused of Misrepresenting Moderation Performance and Failing to Protect Young Users
Meta, the parent company of Facebook and Instagram, has been accused of misrepresenting the performance of its moderation teams and failing to protect young users in a newly unsealed complaint filed on behalf of 33 states. The complaint alleges that Meta’s Community Standards Enforcement Reports do not accurately reflect the company’s internal data on violations because they exclude key findings from user experience surveys, which show far higher rates of encounters with harmful content than Meta reports publicly. For example, while Meta claims that only 10 or 11 out of every 10,000 content views on its platforms contain hate speech, internal surveys found that 19.3% of Instagram users and 17.6% of Facebook users reported witnessing hate speech or discrimination. The complaint also accuses Meta of allowing users under the age of 13 onto Instagram: the company has reportedly received more than 1.1 million reports of underage accounts since 2019 but disabled only a fraction of them.
The allegations are laid out in a federal lawsuit filed in the U.S. District Court for the Northern District of California. If Meta is found to have violated children’s privacy laws, it could face significant fines and heightened scrutiny of its protection and moderation measures, particularly those covering younger users. The case may also yield more accurate insight into actual rates of exposure to harmful content within Meta’s apps.
In response, Meta has said that the complaint mischaracterizes its work and relies on selective quotes and cherry-picked documents. Even so, the complaint raises broader concerns about Meta’s moderation practices and approach to user safety, and it could prompt stricter regulation around young users and data access. Europe’s Digital Services Act (DSA) already includes provisions aimed at protecting younger users, such as a ban on targeted advertising based on profiling of minors. Similar restrictions could be introduced in the U.S. if the complaint succeeds.
While Meta has improved its age-detection and safety measures, many children still reach the adult versions of its apps simply by entering a false birth date at sign-up. The complaint underscores the need for parents to monitor their children’s screen time and ensure they are not accessing inappropriate content. If the court finds that Meta knowingly allowed underage access, the consequences could be significant for both the company and the social media sector as a whole.
Overall, the complaint raises serious questions about the accuracy of Meta’s public reporting and the effectiveness of its moderation and protection measures. The outcome of the lawsuit should shed light on these issues and could drive further improvements in user safety.