Post by: Shweta
The European Commission has preliminarily determined that Meta Platforms may have breached EU digital safety regulations by inadequately safeguarding children on Facebook and Instagram.
As per the Commission's findings, Meta's measures fell short in preventing users under 13 from accessing their platforms, despite internal policies prohibiting account creation for that age group.
This investigation falls under the EU's Digital Services Act (DSA), which requires very large online platforms to shield users, especially minors, from harmful online content.
Regulators noted that Meta's existing safeguards are ineffective since children can easily circumvent age checks merely by inputting incorrect birthdates during registration. They contend the company lacks robust age verification mechanisms to prevent underage access.
Additionally, the Commission criticized the complexity of Meta’s reporting tools, stating users face multiple hurdles before accessing interfaces designed to report harmful content.
Regulators also indicated that Meta’s internal assessment underestimated how many children under 13 access Facebook and Instagram, with EU estimates suggesting that 10 to 12 percent of minors in the region may still engage with these platforms.
Henna Virkkunen, the EU’s technology chief, emphasized that online safety regulations need to be actionable, urging companies to implement effective protections for younger users.
In response, Meta disagreed with the preliminary conclusions but acknowledged that age verification is an industry-wide challenge. The company also announced forthcoming safety enhancements.
The investigation into Meta was opened in May 2024 as part of broader European efforts to reinforce oversight of major tech firms under the DSA. The legislation gives EU regulators enhanced powers to probe issues such as child safety and misinformation.
If these preliminary findings are confirmed, Meta could face significant financial penalties. Under the DSA, violating firms can be fined up to six percent of their global annual revenue, which for Meta could amount to billions of dollars.
This case underscores escalating global scrutiny of major social media platforms regarding child safety. Governments and regulatory bodies are intensifying their examination of digital companies’ handling of underage users, harmful content, and related mental health matters.
The European Commission has also launched inquiries into additional social media platforms regarding child safety concerns, including recent investigations into Snapchat.
The EU's actions are reflective of a broader international dialogue about the accountability of large tech companies in protecting younger users. Policymakers are increasingly advocating for enhanced identity verification, more rigorous content moderation, and streamlined reporting solutions to mitigate online risks for children.
The investigation remains ongoing, giving Meta an opportunity to formally respond to the allegations and proposed changes before EU authorities render a final decision.