Meta boosts teen safety with new features on Facebook and Messenger

Reuters

Meta is bringing its “Teen Accounts” feature to Facebook and Messenger, offering enhanced privacy and parental controls to protect young users online. The move comes as lawmakers, responding to growing concerns over children’s safety, push for stricter social media regulations.

Meta Platforms is rolling out its “Teen Accounts” feature to Facebook and Messenger, a move designed to address growing concerns over the safety of young users online. The feature, which was first introduced on Instagram last year, offers enhanced privacy and parental controls aimed at giving parents more oversight and protecting teens from potential online harms.

This expansion comes amid increased pressure from lawmakers, who are pushing forward with legislative efforts like the Kids Online Safety Act (KOSA), which seeks to impose stricter regulations on how social media platforms engage with minors. Meta, alongside other tech giants like TikTok and YouTube, has faced multiple lawsuits over the addictive nature of social media and its negative effects on children.

Under the new guidelines, teens under 16 will need parental permission to go live on Facebook and Messenger, and to disable a feature that automatically blurs images suspected of containing nudity in direct messages. These changes are expected to roll out in the coming months.

As regulatory scrutiny intensifies, Meta’s expansion of teen safety measures highlights the growing need for platforms to take responsibility for how their services affect children and teens in the digital age.
