Meta Faces Another EU Probe Over Potential Digital Services Act Breach on Child Safety

(Photo: Unsplash/Dima Solomin)

Meta, Facebook's parent company, is the subject of a significant investigation by the European Union over potential violations of the bloc's strict online content rules related to child safety.

Meta's Probe Under EU Commission

The European Commission, the EU's executive body, announced that it is investigating whether Facebook and Instagram, owned by Meta, might encourage behavioral addictions in children and create what is known as 'rabbit-hole effects.' The commission also expressed concerns about age verification processes on Meta's platforms and privacy risks associated with the company's recommendation algorithms.

A Meta spokesperson informed CNBC via email that the company is committed to providing young people with safe and age-appropriate experiences online. The spokesperson cited over 50 tools and policies Meta has developed over the past decade to protect them. The spokesperson added that addressing these concerns is a challenge faced by the entire industry, and Meta looks forward to sharing details of its efforts with the European Commission.

The EU's Digital Services Act

The commission stated that its decision to launch an investigation follows a preliminary analysis of Meta's risk assessment report submitted in September 2023.

Thierry Breton, the EU's commissioner for the internal market, expressed in a statement that the regulator is not convinced that Meta has taken sufficient measures to comply with the obligations outlined in the Digital Services Act (DSA) to reduce the risks of adverse effects on the physical and mental health of young Europeans using its platforms.

The EU announced that it will conduct a thorough investigation into Meta's child protection measures as a priority. The bloc can gather further evidence through requests for information, interviews, or inspections. Opening a DSA probe enables the EU to pursue additional enforcement measures, such as interim actions and non-compliance decisions. The commission can also take into account any commitments Meta makes to address its concerns.

Meta and Other Tech Companies Face Scrutiny Over Children's Online Safety

Meta and other major US tech companies have faced scrutiny from the EU since the introduction of its Digital Services Act, a pioneering law aimed at addressing harmful content. Companies found in violation can be fined up to 6% of their global annual revenues. However, the bloc has not yet imposed fines on tech giants under the new law.

In December 2023, the EU initiated infringement proceedings against X, formerly known as Twitter, over suspected inadequacies in addressing disinformation and content manipulation.

The commission is also probing Meta for suspected breaches of the DSA concerning its management of election disinformation.

In April, the bloc initiated an investigation into the company, expressing concerns that Meta has not taken sufficient measures to address disinformation before the upcoming European Parliament elections.

In addition to the EU, other authorities are taking action against Meta over child safety concerns. The attorney general of New Mexico has sued the company, alleging that Facebook and Instagram facilitated child sexual abuse, solicitation, and trafficking in the US. A Meta spokesperson stated that the company uses "sophisticated technology" and other preventive measures to identify and address predators.
