Tech giant Meta announced on Thursday that it will begin removing users under 16 in Australia from Instagram, Threads, and Facebook, in preparation for the country’s world-first youth social media law.

Australia’s legislation, set to take effect on December 10, requires major online platforms—including TikTok and YouTube—to block underage users. Companies that fail to take “reasonable steps” to comply face fines of up to Aus$49.5 million (US$32 million).

A Meta spokesperson said: “While we are working hard to remove all users who we understand to be under the age of 16 by 10 December, compliance with the law will be an ongoing and multi-layered process.”

The spokesperson added that affected users will be able to save and download their content and account history before losing access: “Before you turn 16, we will notify you that you will soon be allowed to regain access to these platforms, and your content will be restored exactly as you left it.”

Instagram alone reported about 350,000 Australian users aged 13 to 15, indicating hundreds of thousands of adolescents will be affected. Some popular apps, including Roblox, Pinterest, and WhatsApp, are exempt, though the list is still under review.


Meta expressed its commitment to comply with the law but suggested that app stores should share responsibility for age verification. “The government should require app stores to verify age and obtain parental approval whenever teens under 16 download apps, eliminating the need for teens to verify their age multiple times across different apps,” the spokesperson said.

“Social media platforms could then use this verified age information to ensure teens are in age-appropriate experiences.”

YouTube has also criticized the ban, warning that it could make young Australians “less safe”, as under-16s could still access content without an account but would lose YouTube’s safety filters.

Australia’s Communications Minister Anika Wells dismissed this argument as “weird”, saying: “If YouTube is reminding us all that it is not safe and there’s content not appropriate for age-restricted users on their website, that’s a problem that YouTube needs to fix.”

She highlighted that some Australian teens had died by suicide after algorithms “latched on”, targeting them with content that drained their self-esteem. “This specific law will not fix every harm occurring on the internet, but it will make it easier for kids to chase a better version of themselves,” Wells said.

The law has faced legal challenges. Last week, the Digital Freedom Project filed a High Court case, calling the legislation an “unfair” attack on freedom of speech.

Regulatory guidance acknowledges that some teens may try to bypass the restrictions using fake IDs or AI-generated images, and platforms are expected to develop measures to counter such workarounds. However, the internet safety watchdog has cautioned that “no solution is likely to be 100 percent effective.”

The Australian law has drawn international attention as regulators worldwide consider similar restrictions. Malaysia plans to block under-16s from social media accounts next year, while New Zealand is introducing a comparable ban.