Australia's Social Media Age Limit: Missed Opportunity for Nuanced Regulation
TL;DR
- Australia's social media age limit, intended to curb harms, was implemented as a broad ban rather than a nuanced regulatory approach, missing opportunities to incentivize platforms to create safer user experiences.
- The legislative process removed an "exemption framework" that would have allowed platforms to offer curated, less harmful versions of their apps, forfeiting a mechanism to drive product improvement.
- The effectiveness of the age limit is questionable due to its implementation, which relies on porous age verification methods and excludes many platforms and logged-out usage, potentially leading to simple substitution of online activities.
- The focus on a blanket age ban diverted attention and resources from other potential regulatory avenues, such as algorithmic transparency or privacy reforms, that could have addressed broader platform harms.
- The government shifted the law's justification from addressing specific harms like cyberbullying and radicalization to simply starting conversations, lowering the bar for success and questioning the policy's core purpose.
- Australia's approach to tech regulation, including the social media age limit and chatbot rules, is being watched globally, but its implementation highlights the challenge of balancing broad bans with nuanced, effective policy.
- The reliance on US technology and the close geopolitical relationship with the US create complexities for Australian digital sovereignty, with limited public discussion on data privacy and potential US government access.
Deep Dive
Australia's recent social media age limit, while a bold move to protect minors, represents a missed opportunity for more effective, nuanced regulation. The law, which prohibits individuals under 16 from holding social media accounts, was enacted quickly, and its implementation has been criticized as "shoddy" and potentially ineffective because of its blanket scope and its exclusion of certain platforms and features. By opting for a broad ban rather than incentivizing platforms to create safer environments, the policy fails to address systemic issues and may prompt simple substitution rather than genuine change in user behavior.
The implementation of Australia's social media age limit reveals significant shortcomings in its approach to regulating online harms. The law, which aims to prevent minors under 16 from having social media accounts, relies on a "waterfall" system of age verification: platforms start with data inference, progress to age estimation (such as facial scanning), and only as a last resort require a government ID upload. While this system is designed to be less intrusive, widespread circumvention was observed on the first day of enforcement, calling its effectiveness into question. More critically, the policy sidelined more sophisticated regulatory avenues. An initial "exemption framework", which would have incentivized platforms to develop child-friendly versions of their services by removing harmful features like endless scroll and push notifications, was dropped through political deal-making. That decision forfeited a potentially more impactful regulatory tool, one that would have forced platforms to adapt their core functionality, in favor of a simpler but less effective ban. Focusing on a broad ban rather than platform reform risks writing off the internet's potential for positive use instead of guiding its evolution toward safer online experiences.
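The "waterfall" described above is essentially a fallback cascade: try the least intrusive method first, and only escalate when a method cannot produce a confident answer. A minimal sketch of that logic follows; the function names, the `AgeSignal` shape, the confidence threshold, and the fail-closed default are illustrative assumptions, not the scheme's actual specification or any platform's real API.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

MIN_AGE = 16  # the law's account-holding threshold


@dataclass
class AgeSignal:
    """Hypothetical result of one age-assurance method."""
    estimated_age: Optional[int]  # None if the method couldn't produce an estimate
    confidence: float             # 0.0-1.0, how much the method trusts its estimate


def waterfall_age_check(
    checks: List[Callable[[], AgeSignal]],
    threshold: float = 0.9,  # assumed confidence cut-off, for illustration
) -> bool:
    """Run methods from least to most intrusive; stop at the first confident result.

    Returns True if the user may hold an account. Fails closed (denies the
    account) when no method yields a confident estimate.
    """
    for check in checks:
        signal = check()
        if signal.estimated_age is not None and signal.confidence >= threshold:
            return signal.estimated_age >= MIN_AGE
    return False


# Illustrative stubs standing in for the three tiers mentioned in the law's
# rollout: data inference, facial age estimation, government ID upload.
inference = lambda: AgeSignal(estimated_age=None, confidence=0.0)  # inconclusive
face_scan = lambda: AgeSignal(estimated_age=17, confidence=0.95)   # confident
id_upload = lambda: AgeSignal(estimated_age=17, confidence=1.0)    # never reached here

print(waterfall_age_check([inference, face_scan, id_upload]))  # True: face scan suffices
```

The design choice the cascade encodes, escalating intrusiveness only on failure, is also its weakness: any early tier that can be fooled (for example, by an account whose inferred data looks adult) short-circuits the stronger checks below it, which is consistent with the widespread first-day circumvention reported above.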
The policy's effectiveness is further undermined by its narrow scope and the government's shifting justifications. Initially framed as a response to social harm, predatory algorithms, and cyberbullying, the law excludes messaging apps (WhatsApp, Telegram) and gaming platforms (Roblox), and still permits logged-out use of platforms like YouTube and TikTok, which continue to serve algorithmic recommendations. Users may therefore simply substitute one platform for another, or carry on the same behaviors without logging in, negating the intended impact on screen time and online safety. The government's pivot to claiming success on the basis of initiating conversations and changing social expectations lowers the bar and diverts attention from the need for more substantive reforms. It also overshadows existing, more nuanced regulatory efforts, such as the Online Safety Act and the Children's Online Privacy Code, which risk being rendered redundant by the age limit.
The broad ban approach, while politically expedient, misses a crucial opportunity to use regulation to fundamentally reshape platform incentives. Instead of demanding that platforms adapt to user expectations and build safer environments, the current law essentially tells users to stay offline until they are 16. It leaves untouched the underlying issues that make online spaces harmful for all ages, including the dominance of algorithmic feeds and the lack of genuine competition. The rollout of generative AI and chatbots underscores the limits of this approach: these technologies pose risks that also require careful regulation, including age verification and the prevention of harmful conversations with minors. Australia is taking some steps here, but the area remains largely under-addressed in the broader public discourse. Ultimately, while the age limit may be a bold step, in its current form it risks failing to deliver lasting improvements to online safety, leaving young people to re-enter an internet landscape that has not fundamentally improved.
Action Items
- Audit social media age verification: Assess effectiveness of age inference, estimation, and ID upload methods across 3-5 major platforms for systemic vulnerabilities.
- Design alternative platform features: Propose 3-5 curated features for youth users that address harms like endless scroll and push notifications, creating an opt-in framework.
- Evaluate chatbot regulation: Analyze current age verification and content restriction measures for chatbots, identifying 2-3 potential gaps for minors.
- Track platform compliance: Monitor 5-10 platforms for adherence to new age restrictions and identify common circumvention tactics used by minors.
- Advocate for nuanced regulation: Propose 2-3 alternative regulatory approaches beyond blanket bans, focusing on algorithmic transparency and content moderation standards.
Key Quotes
"Australia last week introduced a legal expectation that social media platforms will take reasonable steps to stop Australians under the age of 16 from having accounts on their platform. It applies to all platforms that fit a very broad definition of social media platforms, but the government, to give some kind of certainty, said: we're giving you a list of 10 platforms that definitely fit this requirement. And that is all the major ones you can think of: Facebook, Instagram, TikTok, YouTube, which was a kind of controversial inclusion at points, Snapchat, a few others as well. The law itself was passed a year ago; it was given a 12-month lead time to figure out the details, as certain aspects of implementation were figured out. At the same time the government also ran a government-commissioned trial of age check technologies. And that law was actually passed after a pretty quick process. Earlier in 2024 there was a little bit of support from some state premiers, like heads of Australian states, for something like this, and then in a radio interview our prime minister was asked: do you support a campaign, launched that month by a group backed by a popular radio station, that said, hey, we want to raise the minimum age from 13 to 16? There was also another mainstream media organization running a very similar campaign. The prime minister said in May: yep, I back it, I support this. And then six months later it was a law. So we saw a pretty quick process, you know, a way for it to actually be locked in, and the actual details of implementation, which kind of went back and forth, and there were lots of questions about how do you actually take this idea, which is a very kind of broad law, and actually implement it, took a little bit longer. And then, yes, starting last week, that was the deadline by which platforms had to do something, and we did see widespread children's accounts being restricted. We also saw widespread circumventing, so it definitely didn't go off without a hitch, which we'll get a chance to talk about later on. But now, you know, as of this week, we're almost a week into our post social media world for teens, and, you know, largely I guess life has continued in a way."
Cam Wilson explains that Australia's new social media age limit, which prohibits individuals under 16 from having accounts on designated platforms, was enacted quickly following a campaign supported by mainstream media. The law, passed a year prior, allowed for a 12-month implementation period, during which age verification technologies were explored. Wilson notes that the rollout has seen both account restrictions and widespread circumvention, indicating it has not been entirely seamless.
"We've just got a real mishmash of justifications. The prime minister started off by saying social media is causing social harm, and we saw that message honed over time into an argument about what they would call predatory algorithms, and the features of platforms that were encouraging harmful use of their products, which is to keep people on there longer, radicalization, all that kind of stuff. Essentially it became like a touch grass policy, as in: get the kids off their damn phones and onto the footy fields or whatever. The problem with that is that when they had to then translate that idea into practice, what we ended up with was this policy that, while it had a very wide definition of what a social media platform was, also had exclusions. Things like messaging apps weren't included, and then the follow-on question is, what's a messaging app? Snapchat was ruled as not a messaging app, but WhatsApp and Telegram were. And WhatsApp, I don't know if people are familiar with these features, I don't really use them that often, but it actually has a lot of social features: you've got stories, you've got kind of like a Facebook page style mass broadcast thing. Gaming was also excluded as well, and particularly in the world of youth safety at the moment, Roblox has been such a hot button issue, and Roblox of course is excluded because it's a game. And then also the way the law was written, it only applies to people using accounts on the platforms, which allows those platforms still to be used in a logged-out state. You can still watch YouTube, or you can actually use the TikTok app without being logged in, and you will still get customized feeds, you still get algorithmic recommendations. So when they had to translate this law into something that actually addressed those things that they had raised as the issues, you kind of saw how it almost drifted away from that in a way. Because it's like, okay, great, so you want kids to be off their phones, and yet, for example, you can still watch as much YouTube and TikTok as you want, you can still play Roblox, you can still do all these things. I would not be surprised if we didn't see a drastic change in kids' screen time, because maybe they'll go outside, but I think a lot of them will just substitute one thing for another, whether it's going from Snapchat to WhatsApp, whether it's going from logged-in YouTube to logged-out YouTube. So for those reasons, the success of this as a law, while it partly is about how well it's keeping people out, and there have certainly been questions over that at the moment, but we'll have more idea later on, I don't know how well it will work. And I also think maybe they just kind of ratchet it up as it goes along; maybe they catch more and more people and restrict them more and more from these social media platforms that they aren't allowed to be on. But these other purposes for the law, they're not even addressing, so it's just very bizarre to kind of understand the effectiveness of. And so age verification technology and age estimation technology, which is such a hot button issue, really to me became only one part of how we understand how effective it is and how we look at it in the future. And particularly as the law came close to being in effect, we saw a real change of the goalposts by the government, to go from being like, we're going to do something about cyberbullying, we're going to do something about radicalization, we're going to do something about kids spending too long online and not getting enough sleep, to, eventually, a week before the law came into effect, the prime minister saying this law is already a success because it's started conversations. And they said the point of the law is to change social expectations, so not everyone will just assume that everyone else is on social media at that age. And to me, I feel like that is such a drift of the law from its original purposes, and such a low bar, that you kind of are questioning, what is the point? Why did we end up doing it like this? Because if that's the success, I guess you could argue that there's been plenty of conversations; there's been widespread usage. And I think, you know, to their credit, I've definitely thought about my screen time in the meantime; I've definitely thought, more than I have in a long time, about how kids are using social media. In terms of effectiveness, and maybe we'll get a chance to chat about this later, I have just been critical about it, because I think, like..."
Resources
External Resources
Books
- "The Anxious Generation" by Jonathan Haidt - Mentioned as an early influence that prompted consideration of social media age limits.
Articles & Papers
- "Conspiracy Nation: Exposing the Dangerous World of Australian Conspiracy Theories" - Co-authored by Cam Wilson.
- "The Sizzle" (Newsletter) - Written by Cam Wilson.
People
- Cam Wilson - Associate editor at Crikey, writer of "The Sizzle" newsletter, co-author of "Conspiracy Nation," and guest on the podcast discussing Australia's social media age limit.
- Jonathan Haidt - Author of "The Anxious Generation," whose book influenced discussions on social media age limits.
- Paris Marx - Host of the "Tech Won't Save Us" podcast, who wrote about social media age limits and the need for comprehensive regulations.
- Kyla Hewson - Producer of the "Tech Won't Save Us" podcast.
- Sagar - Patreon supporter.
- Paul - Patreon supporter.
- Antoine - Patreon supporter.
- Anthony Albanese - Prime Minister of Australia, mentioned in relation to his comments and social media presence regarding the age limit law.
Organizations & Institutions
- Crikey - Employer of Cam Wilson.
- The Nation Magazine - Partner of the "Tech Won't Save Us" podcast.
- News Corp - Media organization that ran a campaign called "Let Them Be Kids" advocating for a social media age limit.
- 36 Months - Group co-founded by a radio host and a video producer, which campaigned for raising the social media age limit to 16.
- Change.org - Platform where the "36 Months" group gathered signatures for their campaign.
- Meta (formerly Facebook) - Social media platform mentioned in relation to its age verification technologies and its role in the social media age limit debate.
- Google - Mentioned in the context of the News Media Bargaining Code.
- TikTok - Social media platform on the government's list of platforms subject to the age limit; also mentioned for its use by the Prime Minister.
- YouTube - Social media platform mentioned as a controversial inclusion in the list of platforms subject to the age limit.
- Snapchat - Social media platform on the government's list of platforms subject to the age limit; ruled not to be a messaging app and therefore covered by the law.
- WhatsApp - Messaging app ruled exempt from the law despite having social features such as stories and mass broadcast channels.
- Telegram - Messaging app ruled exempt from the law alongside WhatsApp.
- Roblox - Game platform excluded from the social media age limit law.
- Yoti - Third-party provider for age estimation technology.
- Kod - Third-party provider for age estimation technology.
- Microsoft - Mentioned in relation to its Copilot chatbot and data center investments.
- OpenAI - Mentioned in relation to signing a contract with the Australian government.
- Palantir - Mentioned as having contracts with the Australian government.
- AUKUS - Defense pact involving Australia, the UK, and the US.
- ABC (Australian Broadcasting Corporation) - Public broadcaster mentioned as a potential alternative universe for developing youth-focused online spaces.
Websites & Online Resources
- Patreon - Platform for supporting the "Tech Won't Save Us" podcast.
- mailbag@techwontsave.us - Email address for submitting questions to Paris Marx for a mailbag episode.
- patreon.com/techwontsaveus - URL for supporting the "Tech Won't Save Us" podcast.
Other Resources
- Tech Won't Save Us (Podcast) - Podcast offering a critical perspective on tech, its worldview, and society.
- News Media Bargaining Code - Australian legislation requiring platforms like Google and Meta to pay news outlets.
- Online Safety Act - Australian legislation introduced in the early 2020s concerning platform content regulation.
- Children's Online Privacy Code - Australian privacy reforms restricting data collection on users.
- Fix Our Feed - Campaign advocating for chronological social media feeds by default.
- Microsoft Copilot - Chatbot used by Australian public servants.