Meta is adding new safety rules for teens on Instagram, Facebook, and Messenger. Starting soon, anyone under 16 won’t be able to use Instagram Live or unblur possible nudity in direct messages unless their parents approve.

The updates are part of Meta’s teen safety program, which launched last September and is meant to give parents more control over what their kids can do online. So far, the company says 54 million teen accounts have been enrolled in the program.

For Instagram, teens under 16 will now need permission from a parent to go live or view images in messages that may contain nudity. Meta said the blurring feature stays on unless a parent allows it to be turned off.

The new rules will first roll out in the US, UK, Canada, and Australia. More countries will get the changes over the next few months.

Facebook and Messenger are getting similar protections. Teen accounts will be private by default, and strangers won’t be able to send messages. There will also be limits on sensitive content like violent videos, reminders to log off after an hour, and fewer notifications at night.

“Teen Accounts on Facebook and Messenger will offer similar, automatic protections to limit inappropriate content and unwanted contact,” Meta said in a blog post. The goal is to make sure teens are using the apps in a safer, more mindful way.

Meta is one of many social media companies facing complaints over how their platforms affect young users. More than 30 US states, for example, have sued the company, accusing it of deliberately designing its platforms to be addictive to young people for profit and of failing to safeguard them against unhealthy behavior. The lawsuits also alleged that the company unlawfully shared user data with advertisers without consent.

“Increasing the time spent on Meta’s platforms increases the effective delivery of targeted ads — a pivotal factor in Meta’s ability to generate revenue,” a state prosecutors’ complaint said.