The Australian government has enacted a landmark law requiring social media platforms to actively prevent Australians under 16 from creating accounts, signaling a major shift in how platforms manage young users and online safety.
Key Points
- Australia has enacted a landmark law requiring social media platforms to prevent users under 16 from creating accounts, aiming to significantly enhance online safety for young Australians.
- Effective December 10, 2025, age-restricted platforms must prevent under-16s from creating or holding accounts. Responsibility for enforcement lies solely with the tech companies, not with minors or their parents.
- Communications Minister Anika Wells stands firm against tech company opposition, prioritizing child safety. The law broadly covers social media but exempts essential services like messaging and education.
Beginning December 10, 2025, social media platforms classified as age-restricted must take reasonable steps to prevent Australians under 16 from creating accounts. The requirement follows late-2024 amendments to the Online Safety Act 2021, which established the Social Media Minimum Age (SMMA) framework.
The new rules for “age-restricted social media platforms” apply both to existing accounts held by users under 16 and to any newly created accounts. A platform is covered if a significant purpose of the service is to enable online social interaction between two or more users, it allows users to link to or interact with one another, and it lets users post content. This broad definition ensures the minimum age requirement applies across a wide range of social media services.
The definition of age-restricted social media platforms also allows for flexibility, enabling the government to narrow or adjust the scope through legislative measures. In July, Australia’s Minister for Communications, Anika Wells, introduced the Online Safety Rules 2025 (the Rules), which clarify which online services are exempt from the Social Media Minimum Age framework.
Exemptions include messaging, email, voice, and video calling services, online games, platforms primarily providing information about products or services, professional networking and development services, as well as education and health services. The Rules are designed to protect young users from potential harms associated with social media while ensuring continued access to essential communication, educational, and health services.
Social media platforms meeting these criteria, such as Snapchat, TikTok, YouTube, and Meta’s Facebook and Instagram, will be required to show that they have taken reasonable steps to prevent users under 16 from creating accounts, placing the responsibility squarely on the platforms themselves. Children under 16 who gain access to an age-restricted platform will not face any penalties, nor will their parents or guardians be held accountable.
In a BBC interview, Wells said she is undeterred by tech companies that oppose the country’s “world-leading” social media ban. “We stand firm on the side of parents and not on the platforms,” Wells stated. She argued that tech companies have had years to improve their practices, particularly given research highlighting the potential harms caused by their platforms. “I am not intimidated by big tech because I understand the moral imperative of what we’re doing,” she added.
The announcement of the social media ban has sparked widespread discussion online. Many have praised the law, with some international observers calling on their own governments to adopt similar measures to protect children. Critics, however, argue that the legislation amounts to censorship, and some of those affected say it infringes on their personal freedom.
Concerns over the impact of social media and AI on children have intensified, leading to new laws focused on safeguarding young users from the risks posed by AI chatbots. These measures come in response to lawsuits from parents who claim that certain AI models have negatively affected their children’s mental health and wellbeing. The ongoing debate spotlights the broader challenge of ensuring social media platforms provide a safe environment for children online.
