Meta Introduces New Parental Control Features for Instagram and Messenger

Meta is launching new parental control tools across its platforms, including Instagram, Facebook, and Messenger. The features focus on giving parents more oversight and helping teenagers manage their time on the apps.

New parental control features in Meta's apps

The update includes a parental supervision hub in Messenger, a feature that blocks unwanted direct messages, and nudges that remind teenagers to take a break from the apps.

The supervision controls, which will first roll out in the U.S., the U.K., and Canada, let parents view their teens' privacy and safety settings, monitor changes to the Messenger contact list, and see how much time their teens spend on the app. Parents will be notified when their teen reports someone, provided the teen allows it. Parents can also adjust who can message their teens and view their stories, and they will receive a notification if any of these settings are changed.


To limit interactions between teens and unknown adults, Instagram will now require adults who aren't connected to a teen to send an invitation and get permission before they can interact. Instagram is also introducing controls that encourage users to take breaks from prolonged usage, including "Quiet mode," introduced in January, which pauses notifications and auto-replies to direct messages. That feature is now rolling out globally.

Meta is also extending these nudges to Facebook, prompting teens to take a break after 20 minutes of use. Teens watching Reels late at night will likewise be prompted to close the app.


Meta will also show teens a new notice on Instagram encouraging them to let a parent or guardian supervise their account. In addition, parents can now view the mutual connections of accounts their teen follows or is followed by.

These features build on Meta's earlier efforts to make its apps safer for young users, including restrictions on ad targeting for teens and tools to prevent the sharing of intimate images. They also come after the company was fined more than $400 million last year for violating GDPR rules concerning children's privacy.