
Meta Rolls Out Teen-Focused Instagram Accounts with Enhanced Safety Features


Meta, the parent company of Instagram, has unveiled a new initiative aimed at enhancing the safety and well-being of teenage users on its platform. The company is rolling out specialized Instagram accounts for teens that come equipped with built-in safeguards designed to protect young users from potentially harmful content and interactions.

These new Teen Accounts will automatically limit who can contact the user, filter the content they see, and provide new avenues for teens to explore their interests safely. For users under 16, any reduction in these protective measures will require parental consent, ensuring that younger teens remain shielded from potential online risks.

The initial rollout of these enhanced accounts will take place over the next 60 days in the United States, United Kingdom, Canada, and Australia. Meta plans to expand this feature globally in January, with further implementation across other Meta platforms scheduled for 2025.

Meta’s decision to introduce these safeguards stems from growing concerns about the impact of social media on young users. The company acknowledges parents’ worries about their children encountering mature or inappropriate content online and has implemented stricter content rules for teen users.

One key feature of the new Teen Accounts is the default private setting. This applies to both new and existing accounts, requiring users under 16 (and new users under 18) to approve new followers. Non-followers will be unable to view or interact with the teen’s content, providing an additional layer of privacy.

Messaging settings for teen users have also been tightened. Teens on Instagram will only be able to receive messages from people they follow or are connected to, reducing the risk of unsolicited or inappropriate contact.

To combat excessive screen time, the platform will remind teens to log off after 60 minutes of daily use. Additionally, a sleep mode will activate between 10 PM and 7 AM, muting notifications and sending automatic replies to direct messages during these hours.

Meta is also addressing the issue of age misrepresentation. The company is developing technology to identify teen accounts even when an adult birthday is listed, ensuring that all teen users receive appropriate protections regardless of the age they claim.

In a related move announced in January 2024, Meta began hiding search results for terms related to sensitive topics such as suicide, self-harm, and eating disorders. Instead, users searching for these terms will be directed to expert resources for help, further demonstrating the company’s commitment to user safety.

As social media continues to play a significant role in the lives of young people, these new measures represent Meta’s efforts to create a safer online environment for its teenage users. By implementing these safeguards, the company aims to address the concerns of parents and policymakers while still allowing teens to engage with the platform in meaningful ways.


Written by Sylvia Duruson
