Social Media

Under-16s Need Parental Consent for Instagram Live: Meta’s New Rules Explained

Internet Desk: In a major move to enhance the safety of young users, Meta has expanded its teen safety features from Instagram to Facebook and Messenger. The updated features aim to provide greater parental control and reduce risks associated with online interaction for teenagers.

One of the most notable changes is a restriction on the use of Instagram Live. Teens under 16 years of age will no longer be allowed to go live unless they have explicit permission from a parent.

Additionally, Meta is strengthening its policy on nudity in direct messages (DMs). A feature that blurs suspected explicit images will now require parental approval to disable.

Teen Account Settings Rolled Out to Facebook and Messenger

Meta will also extend the Teen Account system, first introduced on Instagram, to Facebook and Messenger. These accounts come with built-in safety settings for users under 18.

Parents will now be able to:

  • Set daily screen time limits
  • Restrict usage during specific hours
  • View who their child is messaging

With this update:

  • Users under 16 will need parental consent to change any safety settings.
  • Users aged 16 and 17 will have settings turned on by default, but will have the option to adjust them independently.

The update will initially launch in the US, UK, Canada, and Australia.

Majority of Teens Keep Safety Features On

Meta revealed that more than 90% of 13- to 15-year-olds on Instagram have kept the restrictions enabled. Currently, around 54 million teenagers are using Meta’s teen account system worldwide.

Global Pressure for Safer Online Platforms

The new features come as governments tighten regulations. In the UK, the Online Safety Act — implemented in March — now requires online platforms to protect under-18s from harmful and illegal content, including material related to suicide, self-harm, terrorism, and child abuse.

Meta Shifts Power to Parents

Meta’s former Global Affairs President Nick Clegg previously stated that these changes aim to “shift the balance in favour of parents.” He emphasized that many parents are still not making full use of the available safety tools, and that the new system is designed to increase their involvement in their children’s online safety.