Meta Platforms Inc. is facing intensified scrutiny over its impact on minors. In October 2023, 34 U.S. states filed a lawsuit against Meta, accusing the company of manipulating minors on its platforms in ways that potentially damage their mental health. The legal action reflects growing apprehension about the influence of Meta's digital offerings on young users, according to Cointelegraph.
In parallel with the U.S. legal challenges, the European Commission has opened a formal investigation into Meta's child protection measures. The inquiry probes whether Meta has adequately mitigated risks to the physical and mental well-being of children using its platforms, underscoring the global focus on the safety of social media's youngest users, as reported by NECN.
In response to mounting criticism, Meta announced new policies in January 2024 aimed at enhancing safety for teen users on Facebook and Instagram. The changes introduce stricter content controls to prevent minors from seeing unsuitable material, including themes related to self-harm and eating disorders, even from accounts they follow, as noted by CNN. Despite these efforts, questions remain about the effectiveness of Meta's measures, including newly introduced parental controls, in safeguarding vulnerable adolescents.