Australia Removed 4.7 Million Teen Accounts in 30 Days. Here's What Every Parent Needs to Know Now.
On 10 December 2025, Australia became the first country in the world to ban social media for everyone under 16. In the first month, 4.7 million teen accounts were removed. France, the UK, Denmark and Malaysia are watching.

Avery Hayes
Mom Of Two
April 26, 2026 · 14 min read

The phone call I got from another mum in early December went something like this. "Did you see what Australia just did? They actually banned it. They actually went and did it." She was, like me, a mother of a tween. We had been having the same conversation, in different forms, for two years. Whether to allow Instagram. When to allow TikTok. How to handle Snapchat. What our daughters' school felt about it.
And then one country, in one parliament, made the decision for itself.
On 10 December 2025, Australia became the first country in the world to legally ban social media for everyone under the age of 16. Not a nudge. Not a guideline. A ban. With $49.5 million AUD fines per platform for non-compliance. And in the first month after the ban took effect, the country's eSafety Commissioner reported that 4.7 million accounts had been removed, deactivated, or restricted.
This post is a careful, honest look at what Australia actually did, what the early data shows, which countries are following next, and (most usefully) what every parent should be taking from this moment, even if you live somewhere the ban will never reach.

What the law actually does
Let me start with what the law is, because the headlines have been confusing. The Online Safety Amendment (Social Media Minimum Age) Act 2024 was passed by Australia's parliament on 28 November 2024 and came into effect on 10 December 2025. It does four specific things.
One. It sets a legal minimum age of 16 for social media accounts. Below that age, you legally cannot have an account on the platforms covered.
Two. It puts the legal burden on the platforms, not the parents. If your 14-year-old somehow has an Instagram account, it is Meta that is legally exposed, not you.
Three. It removes parental consent as an override. Even if a parent wants to grant permission, the law does not allow it. This is the genuinely radical part. It treats the question of whether a 13-year-old should have TikTok the same way most countries treat whether a 13-year-old should have a beer. Not the parent's call.
Four. The penalty is $49.5 million AUD per platform per breach. Real money, even by Meta's standards.
The law applies to ten platforms specifically: Facebook, Instagram, TikTok, Snapchat, YouTube, X, Reddit, Threads, Twitch, and Kick. WhatsApp, iMessage, and other private messaging services are not included. Educational platforms and games without strong social features are not included either. The law is targeted at the platforms with algorithmic public feeds, which is where most of the documented harm to adolescent mental health has been measured.
The first 4 months of data
It has been just over four months since the ban took effect. Here is what we actually know, separated from the political noise.
The headline figure is real. In the first month, the eSafety Commissioner reported that 4.7 million accounts had been removed, deactivated, or restricted. That equates to roughly two accounts per Australian aged 10 to 16, suggesting many children had accounts on multiple platforms before the ban.
The eSafety Commissioner's first 90-day report, summarised by Tech Policy Press, found "significant concerns" about five platforms (Facebook, Instagram, Snapchat, TikTok, YouTube), particularly around age verification. In some cases, users who had already declared themselves under 16 were being prompted to "correct" their age and were able to regain access. By late March 2026, Communications Minister Anika Wells confirmed all five platforms were under formal investigation, with potential enforcement actions to follow by mid-2026.
The Guardian Australia reported in February 2026 that some teens had found ways around the ban using VPNs, age misrepresentation, or borrowed adult accounts. This was always going to happen and is not, in itself, evidence the law is failing. As eSafety Commissioner Julie Inman Grant put it, "We don't expect safety laws to eliminate every single breach. If we did, speed limits would have failed because people speed."
The polling numbers are striking. YouGov polling in November 2024 (before the law passed) showed 77% support. By December 2025 (right as it came into force), Sydney Morning Herald polling showed 70% endorsing the ban with only 15% opposed. Confidence that it will work remains lower (33%), but support for the principle is high and growing.
Reddit is suing the Australian government in the High Court, arguing the law violates constitutional protections on political speech. The Digital Freedom Project has filed a parallel challenge with two 15-year-olds named as plaintiffs. The state governments of NSW, South Australia and Western Australia have all confirmed they will defend the law. Hearings are scheduled for early 2026.
The honest answer on whether the ban is improving teen mental health is: we do not know yet. The mental health benefits, if they materialise, will take 12 to 24 months to show up in measurable population data. Anyone telling you otherwise (in either direction) is ahead of the evidence.
"Today, we can announce that this is working. This is a source of Australian pride. This was world-leading legislation, but it is now being followed up around the world."
Australian Prime Minister Anthony Albanese, January 2026 press conference
Which countries are following next
This is where the story stops being just about Australia. As of February 2026, multiple countries are at various stages of following Australia's lead, each in their own way.
France
Already has a law from 2023 requiring parental consent for under-15s on social media, but it has been weakly enforced. President Macron's government is now actively considering an Australian-style hard ban.
United Kingdom
The Online Safety Act is in force but stops short of an age ban. As of early 2026, ministers have signalled openness to considering an Australian-style minimum age, with active parliamentary debate underway.
Denmark
Publicly committed to following Australia's lead. The Danish government announced in early 2026 that it would pursue similar legislation, citing the scale of Australian compliance as evidence the approach can work.
Malaysia
Announced its own under-16 social media ban in January 2026, becoming the second country to commit to the policy.
Italy and Germany
Both have active proposals being debated in their respective parliaments. Italy's draft proposal is more aggressive; Germany's leans toward stronger parental controls rather than a hard ban.
One further government has reframed the issue as national security, citing online recruitment of minors into criminal organisations. A bill was under consideration as of March 2026.
United States
No federal action expected. Several state-level proposals exist (Utah, Florida, Texas) but most have been challenged on First Amendment grounds. The US is more likely to act on school phones (which it already is, see below) than on social media age limits.
The pattern is clear. The Overton window on adolescent social media access has shifted permanently. Even in countries that will not pass an outright ban, the conversation now starts from "should we restrict?" rather than "is there a problem?" Australia broke the floor.
What this means for you, even if you live elsewhere
Most readers of this post do not live in Australia. So why does any of this matter for your family?
Three reasons.
One. The platforms are under pressure they have never been under before. Meta, TikTok, Snap, and the rest now have a working blueprint of what real regulatory enforcement looks like. They will, sensibly, start to make changes globally rather than only in jurisdictions with hard laws. Already in 2026, Instagram has rolled out enhanced "Teen Accounts" with default privacy settings, time limits, and content restrictions in markets without ban legislation. That is Australia's pressure, applied to your phone, even if your country has no law.
Two. The data Australia produces will inform every other country's decision. The mental health outcomes (or lack of them) over the next 24 to 36 months will shape global policy. If teen anxiety, sleep quality, and suicide rates measurably improve in Australia, the political case for similar laws elsewhere becomes very hard to resist. If they do not, the case weakens. Either way, you will be living in the country that gets the answer.
Three. The cultural permission has shifted. Two years ago, parents who delayed their children's social media access were quietly seen as overprotective. After Australia, that has reversed. Delaying is now the position with international, governmental, and increasingly research backing. Jonathan Haidt's The Anxious Generation, which made the cultural case for delay, has been substantially vindicated by Australia's choice to act.
Whatever your country's law looks like, your family's policy can be informed by what we now know. Specifically, we know that delay until 16 is implementable. It is enforceable. It is supported by the public. And the platforms have demonstrated, when forced, that they can comply.
The fair criticisms of the ban
I want to be honest about the criticisms, because they are not all bad-faith and a parent thinking through this seriously should hear them.
Critique one: it cuts off support for vulnerable teens. Some teens, especially LGBTQ+ teens in unsupportive households, find their main community online. A blanket ban removes that. UNICEF Australia explicitly raised this concern, supporting the spirit of the law but worrying about isolation effects.
Critique two: it pushes teens to less-regulated spaces. If you remove TikTok from a 14-year-old, they may end up on Discord, Telegram, or harder-to-monitor platforms. The argument is that the well-known platforms are at least surface-level moderated; pushing teens to underground alternatives can be worse.
Critique three: it is a parental authority issue. Some parents, including some who delay social media themselves, are uncomfortable with a law that overrides their right to make the call for their own family. The Reddit lawsuit makes a constitutional version of this argument.
Critique four: enforcement is imperfect. A determined teen with a VPN and an older friend's phone can still access most platforms. The eSafety Commissioner's own 90-day report acknowledges this.
None of these are silly. A serious parent thinking about this should hold them as real considerations. The honest position is that the ban is a blunt tool that solves a big problem imperfectly, and reasonable people disagree about whether the imperfect tool is better than no tool.
My own view, for what it is worth: the evidence on adolescent harm is now substantial enough, and the platform incentives misaligned enough, that delay until mid-teens is the right default. Whether the right mechanism is law or family policy depends on your country and your family. The Australian experiment will help us learn.
What the research consistently supports: delaying smartphones and social media past mid-adolescence is associated with better mental health outcomes. The US Surgeon General's 2023 advisory on social media and youth mental health reached the same broad conclusion. The mechanism (law vs family rule vs school policy) is genuinely up for debate. The principle of delay is increasingly well-supported.
The 6 family rules to set this week
Whatever your country does or does not do legally, here are the family rules that are now, after Australia, evidence-based defaults rather than overprotective choices.
Rule 1: Delay social media accounts until 16, or 14 at the earliest
If you can hold the line until 16, both the research and now Australia's law back you. If 16 is genuinely impossible in your social context, 14 is a reasonable middle ground that aligns with most platforms' actual minimum age plus one year. Below 13, there is no real defence; the platforms themselves disallow it under COPPA and equivalent laws, and parents who facilitate underage accounts are bypassing both the legal floor and the developmental research.
Rule 2: No phones in the bedroom overnight
This is the single most-supported intervention in the entire literature. Phones charge in the kitchen, downstairs, or in a parent's room. Not in the child's bedroom. Bedroom phone use is where most of the worst outcomes happen.
Rule 3: Parental access is non-negotiable
Frame it from day one as the entry ticket, not surveillance. "If you want a device, the deal is I have access. Always." Most teens accept this if it is consistent and non-punitive.
Rule 4: Keep a casual, recurring conversation going
"What's been on your feed lately?" "What did you watch this week?" "Did anything bother you?" The teenager who talks about what they are seeing is safer than the one going through it alone. Make it a casual, recurring, non-punitive conversation, not an interrogation.
Rule 5: Find allied families
The single biggest predictor of whether your "no social media until 14" rule survives middle school is whether at least one or two other families in your child's friend group are doing the same. Find them. Have the conversation. Many parents are quietly waiting for someone else to suggest it.
Rule 6: Offer alternatives that communicate without social media
For tweens who need to communicate with friends but are not ready for full smartphones, options like the Tin Can landline (a Wi-Fi-based dumb phone for kids) have exploded in popularity in 2026. With a 100,000-person waiting list as of April 2026, it has become the symbol of the analog-childhood movement. There are also dumb phones, basic flip phones, and watch phones that allow communication without social media access.
Frequently asked questions
Will the Australian ban be reversed by the courts?
Probably not. The two main legal challenges (Reddit's High Court action and the Digital Freedom Project's parallel suit) are both substantial, but the Australian Constitution gives parliament broader latitude on this kind of legislation than the US First Amendment would allow. Most legal commentators expect the law to survive in some form. The hearings are scheduled for early 2026, and the law remains in force while they proceed.
If WhatsApp is allowed, what's the point? Aren't kids just moving to that?
The Australian government's reasoning, supported by most adolescent mental health research, is that the harms come predominantly from algorithmic public feeds, not from one-to-one or small-group messaging. WhatsApp resembles old-fashioned texting much more than it resembles TikTok. The mental health research does not show the same pattern of harm from messaging that it does from public algorithmic platforms.
Won't this just make tech companies leave Australia?
None has, four months in. The Australian market is large enough, and the precedent global enough, that platforms are choosing to comply rather than withdraw. If they had withdrawn, that itself would have been telling.
My country won't pass this. What can I actually do?
Set the family rule yourself. The 6 rules in the section above are implementable in any country, by any family, regardless of legislation. The Australian law makes "no social media until 16" the new normal globally, even where it is not the legal minimum.
My child says they will be socially excluded.
This is a real concern, not a manipulation, and the answer matters. The research is consistent that children who delay social media maintain in-person friendships at similar rates to those who do not. The exclusion that does happen tends to be temporary and is often, on close inspection, not really about the platform itself. That said, finding allied families (rule 5 above) substantially reduces this issue.
What about YouTube? My kid uses it for homework.
Australia included YouTube in the ban after initially considering an exemption. The reasoning was that YouTube's algorithmic feed is not meaningfully different from other social platforms, even when used for educational content. Most families that delay YouTube account access still allow YouTube viewing through a parent account or via the YouTube Kids app. There is a real difference between watching a tutorial on a parent's account and having an algorithmically-driven personal feed.
The line worth holding
Whatever happens with Australia's law over the next two years, something has changed. A country said: this is harming our children, the platforms will not fix it themselves, parents cannot fix it alone, and so we are going to fix it. And then they did, and the platforms complied, and 4.7 million accounts disappeared in a month.
You do not need a law to make the same call for your own family. You need a rule, allied families, and the willingness to hold the line through the protests. The cultural cover for that decision is now the strongest it has ever been.
Your child wanting Instagram at 11 is not new. What is new, after December 2025, is that you have a country, a research base, and a global parent community telling you it is okay to say no.
What's your family's current rule? Tell me in the comments. Other parents need to know they're not the only ones holding the line.

Avery Hayes is a mother of two and a parenting writer passionate about helping families through honest, relatable content.