Meta adds teen safety features to Instagram, Facebook

Meta, the parent company of Instagram and Facebook, has rolled out new safety features to protect teenage users on its platforms. The company says it has removed over 600,000 predatory accounts from Instagram and Facebook as part of its effort to create a safer online environment for young users.

The enhanced protections limit how adults can interact with minors, including restrictions on direct messaging and new age verification measures. Meta has also added tools that give teenagers more control over their online experience, such as the ability to bulk delete posts and to limit who can comment on their content.

These updates are part of Meta's broader push to prioritize the safety and well-being of younger users, who are particularly vulnerable to online risks such as cyberbullying, exploitation, and misinformation. With these changes, the company aims to offer teenagers a more secure and age-appropriate experience on its platforms.
Note: This is an AI-generated summary of the original article. For the full story, please visit the source.