Pulse360
Tech · 2 min read

Facebook and Instagram are using AI bone structure analysis to identify photos of kids


Meta Implements AI Technology to Enhance Age Verification on Social Media

In a significant move to bolster child safety on its platforms, Meta, the parent company of Facebook and Instagram, has announced an artificial intelligence (AI) system designed to detect and remove users under the age of 13. The initiative, detailed in a blog post published on Tuesday, employs bone structure analysis to estimate the age of individuals depicted in photos and videos shared on these platforms.

The Role of AI in Age Verification

The AI system developed by Meta aims to enhance the existing measures for age verification, a critical issue given the increasing concerns surrounding the safety of minors on social media. By analyzing “general themes and visual cues,” including height and bone structure, the AI can assess the likelihood that a user is underage. This technology represents a significant advancement in how social media platforms can leverage AI to protect younger audiences from inappropriate content and interactions.
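Meta has not published implementation details, so the following is purely illustrative: a hypothetical sketch of how a platform might act on such a likelihood estimate, assuming some image-based model supplies a predicted age and a confidence score (both names and thresholds here are invented for the example).

```python
from dataclasses import dataclass

@dataclass
class AgeEstimate:
    predicted_age: float   # model's age estimate in years (hypothetical)
    confidence: float      # 0.0-1.0 confidence in that estimate (hypothetical)

def flag_for_review(estimate: AgeEstimate,
                    age_threshold: float = 13.0,
                    confidence_threshold: float = 0.8) -> bool:
    """Flag an account for human review when the model is reasonably
    confident the user is under the platform's minimum age.

    Acting on low-confidence estimates risks false positives, so
    uncertain cases are left alone in this sketch."""
    return (estimate.predicted_age < age_threshold
            and estimate.confidence >= confidence_threshold)

# A confident under-13 estimate is flagged; an uncertain one is not.
print(flag_for_review(AgeEstimate(predicted_age=11.0, confidence=0.9)))  # True
print(flag_for_review(AgeEstimate(predicted_age=11.0, confidence=0.5)))  # False
```

Routing flags to human review rather than automatic removal is one common way platforms hedge against model error; whether Meta does so here is not stated in the announcement.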

Addressing Underage Use of Social Media

The challenge of underage users accessing social media platforms is not new. Regulations in various jurisdictions, including the Children’s Online Privacy Protection Act (COPPA) in the United States, mandate that platforms must take steps to prevent children under 13 from creating accounts. Despite these regulations, many children manage to bypass age restrictions, raising concerns among parents and regulators alike.

Meta’s introduction of AI bone structure analysis is part of a broader strategy to create a safer online environment for children. By employing this technology, the company aims to proactively identify and remove underage accounts before they can engage with potentially harmful content or interactions.

Ethical Considerations and Privacy Concerns

While the use of AI for age verification may enhance safety, it also raises important ethical and privacy considerations. Critics argue that such technology could enable misuse, including the collection of biometric data without user consent. Meta has stated that it is committed to ensuring privacy and security throughout this process, but concerns remain regarding how data will be handled and the potential for false positives in age detection.

The effectiveness of AI in accurately determining age from physical characteristics is still debated. Although the underlying technology has made significant strides, accuracy varies across systems and populations, raising questions about the reliability of the results and the consequences for users who are incorrectly identified as underage.
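To see why accuracy matters at platform scale, consider a back-of-the-envelope calculation with purely hypothetical figures (none of these numbers come from Meta): when genuinely underage accounts are rare, even a classifier that is rarely wrong can produce mostly false positives among the accounts it flags.

```python
# All figures below are illustrative assumptions, not Meta's numbers.
base_rate = 0.02    # assumed fraction of accounts truly under 13
sensitivity = 0.95  # assumed chance the model flags a truly underage account
fp_rate = 0.05      # assumed chance the model flags an of-age account

true_flags = sensitivity * base_rate      # underage accounts correctly flagged
false_flags = fp_rate * (1 - base_rate)   # of-age accounts wrongly flagged
precision = true_flags / (true_flags + false_flags)

print(f"Share of flagged accounts actually underage: {precision:.0%}")
```

Under these assumptions, only about 28% of flagged accounts would actually be underage, which is why the false-positive concerns raised above are not merely theoretical.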

Looking Ahead

As Meta rolls out this AI-driven approach, it will be essential to monitor its impact on user experience and safety. The company has indicated that it will continue to refine its systems based on user feedback and technological advancements. The success of this initiative could set a precedent for other social media platforms grappling with similar challenges regarding age verification and child safety.

In conclusion, Meta’s introduction of AI bone structure analysis marks a noteworthy step towards enhancing the safety of children on social media. However, as with any technological advancement, it is crucial to balance safety measures with ethical considerations and user privacy. The ongoing dialogue surrounding these issues will be vital as the digital landscape continues to evolve.
