Pulse360
Tech · 2 min read

Meta isn’t doing enough to keep kids off Facebook and Instagram, rules EU

Meta is breaching Europe’s Digital Services Act (DSA) by failing to prevent children under 13 from using Facebook and Instagram, according to a preliminary decision issued by the European Commission.

Meta Faces Scrutiny Over Child Safety on Social Media Platforms

The European Commission has issued a preliminary decision indicating that Meta Platforms, Inc. is not doing enough to prevent children under the age of 13 from accessing its social media platforms, Facebook and Instagram. This ruling comes as part of an extensive investigation that has lasted nearly two years, examining the company’s compliance with the Digital Services Act (DSA).

Background of the Investigation

The DSA, which came into effect to enhance online safety and accountability, mandates that digital service providers implement robust measures to protect vulnerable users, particularly minors. The European Commission’s investigation focused on whether Meta has adequately enforced age restrictions and implemented effective verification processes to prevent underage users from accessing its platforms.

Findings of the European Commission

In its preliminary ruling announced on Wednesday, the European Commission found that Meta’s current measures are insufficient. The Commission highlighted that the company has not effectively restricted access to Facebook and Instagram for children under 13, raising significant concerns about the potential risks these platforms pose to young users.

The investigation revealed that while Meta has established some age verification protocols, they are not stringent enough to prevent underage users from creating accounts. The Commission emphasized the need for more robust systems to ensure that children are not exposed to inappropriate content or online interactions that could be harmful to their well-being.

Meta’s Response

In response to the Commission’s findings, Meta stated that it is committed to ensuring the safety of its users, particularly children. The company expressed its intention to work closely with regulators to address the concerns raised. Meta has previously introduced features aimed at enhancing user safety, such as parental controls and content moderation tools. However, the effectiveness of these measures remains under scrutiny.

Implications of the Ruling

If the preliminary ruling is finalized, Meta could face significant penalties under the DSA, which allows fines of up to 6% of a company’s global annual turnover — potentially amounting to billions of euros. The decision would also set a precedent for how tech companies must approach user safety and compliance with regulations aimed at protecting minors online.

The ruling reflects a broader trend in Europe, where regulators are increasingly scrutinizing the practices of major tech companies regarding user safety and data protection. As concerns about online safety for children continue to grow, this decision may prompt other jurisdictions to reevaluate their regulations and enforcement mechanisms.

Conclusion

The European Commission’s preliminary ruling against Meta underscores the ongoing challenges and responsibilities that social media platforms face in safeguarding young users. As the investigation progresses, it remains to be seen how Meta will adapt its policies and practices to meet regulatory expectations and enhance the safety of its platforms for all users, particularly the most vulnerable. The outcome of this case could have far-reaching implications for the tech industry and its approach to child safety in the digital age.
