Social media platforms Facebook and Instagram are currently being investigated by the European Union (EU) over child safety concerns.
In a press release on Thursday, the EU said it was concerned that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioral addictions in children.
The EU wrote,
“Today, the Commission has opened formal proceedings to assess whether Meta, the provider of Facebook and Instagram, may have breached the Digital Services Act (DSA) in areas linked to the protection of minors. The Commission is concerned that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioral addictions in children, as well as create so-called ‘rabbit-hole effects’.
“In addition, the Commission is also concerned about age-assurance and verification methods put in place by Meta. Today’s opening of proceedings is based on a preliminary analysis of the risk assessment report sent by Meta in September 2023.”
In a briefing with journalists, senior Commission officials said they suspect Meta of failing to properly assess and mitigate risk affecting underage accounts.
The EU suspects Meta of infringing DSA Articles 28, 34, and 35. The Commission therefore announced plans to commence an in-depth investigation of the two platforms’ approach to child protection.
The EU investigation will address the following areas,
• Meta’s compliance with DSA obligations on assessment and mitigation of risks caused by the design of Facebook’s and Instagram’s online interfaces, which may exploit the weaknesses and inexperience of minors and cause addictive behavior, and/or reinforce the so-called ‘rabbit hole’ effect. Such an assessment is required to counter potential risks for the exercise of the fundamental right to the physical and mental well-being of children as well as to the respect of their rights.
• Meta’s compliance with DSA requirements in relation to the mitigation measures to prevent access by minors to inappropriate content, notably age-verification tools used by Meta, which may not be reasonable, proportionate, and effective.
• Meta’s compliance with DSA obligations to put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security for minors, particularly with regard to default privacy settings for minors as part of the design and functioning of their recommender systems.
In response to the EU investigation, a Meta spokesperson said,
“We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”
Meta further explained that it verifies the age of users on Facebook and Instagram through a blend of self-declared age and AI assessments designed to identify minors misrepresenting their age. The social media giant also noted that users have the option to report suspected underage accounts.