Meta Platforms has begun overhauling privacy and parental controls for Instagram users under 18, converting their accounts into “Teen Accounts” that are private by default, in response to growing concerns about the harmful effects of social media on young people.
Under the new controls, Teen Account users will be able to receive messages and tags only from accounts they follow or are already connected to, and the platform’s sensitive content settings will default to the most restrictive level available. Users under 16 can change these default settings only with a parent’s permission. Parents will also gain a suite of controls to monitor whom their children engage with and to set limits on their app usage.
In addition to the privacy updates, users under 18 will receive reminders to close the app after 60 minutes of use each day, and a default sleep mode will silence notifications overnight. Meta plans to roll out these updates within 60 days for teens in the U.S., UK, Canada, and Australia, with European Union users following later in the year. A global rollout will begin in January 2025.
This move comes in response to studies linking social media use with increased levels of depression, anxiety, and learning disabilities, particularly among young users. Meta, along with TikTok and YouTube, has faced lawsuits from U.S. school districts and families over the addictive nature of social media platforms. Last year, 33 U.S. states, including California and New York, sued Meta for allegedly misleading the public about the dangers posed by its platforms.
Meta’s changes follow the U.S. Senate’s advancement of two key online safety bills: the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act, which aim to hold social media companies accountable for their platforms’ impact on young users.