
Meta Announces New Safety Features for Instagram Teen Accounts, Adult-Managed Profiles for Children

Meta, the parent company of Instagram, has announced new safety features for teen accounts on the platform, part of an effort to provide a safer environment for its teenage users.

The Menlo Park-based company revealed plans to expand its Teen Account protection and safety features, with a focus on direct messaging (DMs). The move responds to growing concerns about online safety for young users and reflects Instagram’s stated commitment to a positive, secure experience for its teenage community.

With the new features, teenagers gain additional tools to control their interactions on the platform, including the ability to filter unwanted messages and to limit who can message them in the first place. These features will be available to all users under the age of 18 and can be accessed through account settings.

One of the key features being introduced is the ability for teens to restrict messages from adults they don’t follow. This means that if a teenager receives a DM from an adult they are not following, the message will be automatically filtered into a “hidden requests” folder. This gives the teen the option to either delete the message or approve it and move it to their main inbox. This feature aims to prevent unwanted interactions and protect young users from potential online predators.

In addition, teenagers will have the option to restrict their DMs to people they follow, giving them more control over who they communicate with and shielding them from messages from strangers. This is especially important for young users, who may be more vulnerable to online harassment or bullying.

Meta has also introduced a feature that notifies teens when they receive direct messages from adults who have previously been reported for inappropriate behavior. The notification serves as a warning and encourages teenagers to be cautious when interacting with those individuals.

In a statement, Instagram’s Head of Safety, Antigone Davis, said, “We want Instagram to be a safe place for people of all ages, especially teenagers who are just starting to navigate the online world. With these new safety features, we hope to empower teens to make informed decisions and take control of their own online experience.”

The company has also emphasized the importance of educating teenagers about online safety and responsible social media use. Instagram has partnered with organizations such as ConnectSafely and the National Parent Teacher Association to provide resources and guidance for both parents and teens.

This is not the first time Instagram has taken steps to improve safety for its users, particularly younger ones. In 2019, the platform introduced a feature that allows users to see the age of accounts they interact with. It also launched a “parent’s guide” to help parents understand the platform and support their children’s online activity.

It’s commendable to see a social media giant like Instagram taking proactive measures to protect its young users. With the rise of cyberbullying, harassment, and online predators, safety features like these are crucial to ensuring a positive and secure experience for all users.

In conclusion, Meta’s latest announcement of new safety features for teen accounts on Instagram is a step in the right direction. With these tools, teenagers can feel more confident and in control of their online interactions, and parents can have peace of mind knowing their children are using the platform safely. We applaud Instagram’s efforts and hope to see more initiatives like this in the future.