Meta is once again putting the burden on parents to monitor and protect their kids rather than building a safe product to begin with. And this change doesn’t address the fundamental problem: The main function of Meta’s chatbots is to manipulate young people into spending more time on the platform by encouraging teens to form unhealthy emotional connections to bots. If Meta really cared about children’s well-being, it would pull the plug on chatbots for minors rather than adding to parents’ already full plates.
Furthermore, given the company’s history of breaking its promises to protect teens and support parents, there is no reason to take today’s announcement with anything other than a heavy dose of skepticism. As we showed in the “Teen Accounts, Broken Promises” report we co-authored last fall, nearly two-thirds of the safety tools Meta advertised for Instagram Teen Accounts either did not work or were nonexistent.
- There is widespread documentation of Meta’s AI characters engaging in sexual conversations with young people and discussing suicide and self-harm with them.
- A Wall Street Journal report found that Meta’s companion bots will engage in overtly sexual discussions with users, going so far as to explicitly describe specific sexual acts, even when users self-identify as minors.
- Some of Meta’s most popular companion bots are designed to impersonate children and teens, and reporters found that adults can use these bots to simulate sexual roleplay with minors.
Meta:
- Meta cannot be trusted when it comes to teen safety.
- The 2025 “Teen Accounts, Broken Promises” report extensively documents how nearly two-thirds of Instagram’s promoted safety tools for teens were ineffective or nonexistent.
- Meta spent a record $24.4 million on lobbying in 2024, helping to kill the Kids Online Safety Act in the US House.
- Meta’s own current and former employees told The Washington Post that Meta “deployed its legal team to screen, edit and sometimes veto internal research about youth safety in VR … seeking to ‘establish plausible deniability’ about negative effects of the company’s products.”
- Meta has used announcements like these to stave off government regulation before. The company announced Instagram Teen Accounts in September 2024 right before an important hearing on the Kids Online Safety Act in the US House.