Meta Faces Unredacted Complaint, Denying Allegations Linked to Child Exploitation Content


Meta's Facebook and Instagram face allegations in a recent legal filing regarding child exploitation, which claims a 2021 internal company estimate found that as many as 100,000 children receive sexual harassment on the platforms daily, including pictures of adult genitalia.

Child Protection Lawsuit Concerns

The allegations were revealed in newly unredacted portions of a complaint against the social media giant in an ongoing child protection lawsuit brought by the attorney general of New Mexico, as the platforms grow increasingly popular, especially among young people.

The complaint also details a 2020 internal chat at Meta in which an employee asked a colleague, "What exactly are we doing about child grooming (something I just heard is happening a lot on TikTok)?" and the colleague responded, "Child safety is an explicit non-goal this half."

In the same year, Meta executives scrambled to address a complaint from an Apple executive whose 12-year-old child had been "solicited" on Facebook. Meta employees worried that such incidents could anger Apple enough for it to consider removing Meta's apps from the App Store. They also asked when measures would be implemented to prevent adults from messaging minors on Instagram Direct.

Meta's Commitment to Age-Appropriate Online Safety

A Meta spokesperson said the company has addressed numerous issues highlighted in the complaint, including deactivating over half a million accounts for violating child safety policies in a single month.

The spokesperson said Meta is committed to providing safe, age-appropriate online experiences, citing more than 30 tools that support teens and their parents. They pointed to a decade of work on these concerns and to hiring people dedicated to young people's online safety, asserting that the complaint mischaracterizes Meta's efforts through selective quotes and cherry-picked documents.

The lawsuit claims that Facebook and Instagram failed to safeguard underage users from online predators and that Meta employees recommended safety improvements the company did not implement. Filed on December 5, the lawsuit asserts that the company declined to make the suggested changes because it prioritized boosting social media engagement and advertising growth over child safety. Mark Zuckerberg, Meta's founder and CEO, is named as a defendant.

Meta Employees' Attempt to Raise Concerns

New Mexico Attorney General Raul Torrez said on Thursday that Meta employees had for years tried to raise concerns about how executives' decisions exposed children to harmful solicitations and sexual exploitation, asserting that Meta executives, including Zuckerberg, consistently prioritized growth over children's safety. Although the company has downplayed the extent of illegal and harmful activity children face on its platforms, Torrez said Meta's internal data and presentations reveal a severe and pervasive problem.

Meta has long been criticized for how it handles troubling content aimed at younger users. In 2021, whistleblower Frances Haugen leaked internal documents to the Wall Street Journal showing that Meta knew damaging content on Instagram was harming teenage girls yet took no action to address it. Haugen later testified before a Senate panel, where outraged lawmakers questioned her over concerns that the company prioritized profits over users' safety.
