
Meta Faces EU Scrutiny For Failing To Protect Children On Social Media

The European Union has initiated a fresh investigation into Meta’s Facebook and Instagram over concerns that the platforms are failing to adequately protect children. The inquiry focuses on whether the social media giant’s recommendation systems exploit the vulnerabilities of young users, fostering addictive behaviors and exposing them to disturbing content.

This probe is part of the EU’s efforts to enforce the newly implemented Digital Services Act, which mandates stringent safety measures for minors on large online platforms.

EU Investigates Meta for Failing to Protect Children

  • Investigation Trigger: The EU is concerned that Facebook and Instagram exploit children’s inexperience and promote addictive behavior. It fears the platforms lead children to harmful content.
  • Age Verification Concerns: The probe will check whether Meta effectively blocks children under 13 from using Facebook and Instagram, and whether Meta complies with the Digital Services Act (DSA).
  • Digital Services Act (DSA) Requirements: The DSA requires very large platforms (those with over 45 million EU users) to offer recommendation options that are not based on user profiling and to share data with the EU.
  • Protecting Minors: Platforms must shield minors from harmful content and use age verification and parental controls. Facebook and Instagram fall under these rules.
  • Evidence Gathering: The EU will gather evidence by requesting information, conducting interviews, and inspecting Meta’s practices.
  • Potential Solutions: The Commission may accept commitments proposed by Meta to remedy the issues found during the investigation.

Meta’s Efforts to Protect Children

  • AI-Driven Nudity Protection: Earlier this year, Meta tested an AI tool that blurs images containing nudity sent to minors through its messaging system.
  • Enhanced Safety Measures: Meta announced plans to tighten content restrictions and improve parental supervision tools for users under 18.

Other Investigations

  • Deceptive Advertising and Disinformation: In April, the EU investigated Meta for failing to address deceptive advertising and disinformation before the European Parliament elections. The regulator was concerned about disinformation from Russia, China, and Iran.
  • Scrutiny Beyond the EU: Before the DSA, Meta’s Instagram faced criticism in the U.S. A Wall Street Journal report highlighted the platform’s role in promoting content that sexualises minors. Meta responded by improving internal controls, dismantling 27 paedophile networks, and removing 490,000 accounts that violated its child safety rules.

Initiatives by the Indian Government To Protect Children Online

The Indian government has a two-pronged approach to protect children online:

Legal Measures:

  • Strict Laws: The Information Technology Act and the Protection of Children from Sexual Offences (POCSO) Act contain provisions with serious penalties for online child sexual abuse content and cyberstalking.
  • Reporting Platform: A dedicated online portal (https://cybercrime.gov.in/) lets people report suspected child sexual abuse material.

Empowering Users:

  • Social Media Guidelines: The government issued guidelines for social media platforms to proactively identify and remove harmful content, making them more accountable for user safety.
  • Cyber Safety Awareness: Initiatives like handbooks and social media campaigns aim to educate children and parents about online safety practices.
