Instagram Tightens Privacy for Teens with New Features

Parents and guardians can now monitor children's interactions and limit app usage


Meta Platforms, the parent company of Instagram, has announced the rollout of new teen accounts designed with advanced privacy settings and parental controls. This move is part of the company’s efforts to address growing concerns about the safety of young users on social media, particularly as scrutiny around the mental health impacts of social platforms intensifies.

Instagram’s new teen accounts will feature default privacy settings that limit exposure to unwanted interactions and sensitive content. According to Meta, all designated teen accounts will be private by default, meaning only approved followers can interact with the user. Teens under 16 can only be messaged and tagged by accounts they follow or are already connected with, further safeguarding their online experience.

These accounts also include heightened content filtering, with sensitive content controls set to the strictest level available. This prevents younger users from viewing potentially harmful or inappropriate material while using the platform.

In addition to privacy measures, Instagram’s new teen accounts will have a robust suite of parental control options. Parents and guardians will be able to monitor their children’s interactions and limit app usage, providing them with a greater ability to oversee and protect their children online. Importantly, any changes to the default settings can only be made with parental consent, adding an extra layer of protection.

Meta is set to roll out these teen accounts within the next 60 days in key markets, including the US, UK, Canada, and Australia. Expansion to the European Union is expected by the end of the year, with plans to make the feature available globally by January 2025.

The release of these teen accounts comes at a time when Meta and other social media giants like TikTok and YouTube are under increasing scrutiny regarding the effects of their platforms on young users. Studies have shown a correlation between excessive social media use and increased levels of anxiety, depression, and other mental health concerns, particularly among teens.

Meta has enhanced safety features across its platforms in response to mounting pressure from regulators and parents. Last year, Meta faced lawsuits filed on behalf of children and school districts, accusing the company of failing to address the addictive nature of social media.

The rollout of Instagram’s teen accounts also follows legislative advancements in the United States aimed at protecting children and teens online. In July, the US Senate passed two significant bills—the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act—requiring social media companies to take responsibility for the safety and well-being of their younger users. By launching these new teen accounts, Instagram aims to provide a safer, more controlled environment for its younger audience while demonstrating its commitment to addressing the concerns of parents, regulators, and lawmakers alike.
