Meta is rolling out an AI system that analyzes visual cues in photos, including height and bone structure, to identify underage users on its platforms. The system is currently operating in select countries, with a broader rollout planned. The approach targets Meta's persistent age-verification challenge on platforms like Instagram, where underage users often misrepresent their age at signup.
The system raises significant privacy and accuracy concerns. Visual analysis for age determination is inherently imprecise: it produces both false positives (adults wrongly flagged as minors) and false negatives (minors wrongly passed as adults). It also requires analyzing physical characteristics from user-uploaded photos, which in some jurisdictions may be regulated as biometric data.
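To make the accuracy tradeoff concrete, here is a minimal sketch. It assumes a simplified model of visual age estimation (predicted age equals true age plus random error) rather than anything specific to Meta's actual system, and shows why any hard cutoff produces both kinds of mistakes when the estimator is noisy:

```python
# Hypothetical illustration, not Meta's system: model the age estimator
# as "true age plus Gaussian error" and count both error types at a cutoff.
import random

random.seed(0)
CUTOFF = 13  # assumed minimum age for the platform

def predicted_age(true_age, error_sd=3.0):
    """Simulated estimator output: true age plus random error."""
    return true_age + random.gauss(0, error_sd)

# Simulated user base with true ages spanning the cutoff.
users = [random.randint(8, 25) for _ in range(10_000)]

false_positives = sum(  # adults wrongly flagged as underage
    1 for age in users if age >= CUTOFF and predicted_age(age) < CUTOFF
)
false_negatives = sum(  # minors wrongly passed as adults
    1 for age in users if age < CUTOFF and predicted_age(age) >= CUTOFF
)

print(f"false positives (adults wrongly flagged): {false_positives}")
print(f"false negatives (minors wrongly passed):  {false_negatives}")
```

Moving the cutoff or flagging threshold only trades one error type for the other; with a noisy estimator, neither count can be driven to zero, which is the core accuracy concern.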
What This Means for Your Business
This signals that regulators increasingly expect platforms to implement active age verification rather than rely on user self-reporting. If your company operates any platform where age matters (gaming, social media, commerce), expect similar pressure to deploy verification systems. Be cautious about the accuracy and privacy tradeoffs, however: biometric-based age detection is unproven at scale and may create liability. Consider third-party age verification services or digital ID integration as more transparent alternatives.