SYDNEY, Australia — Nearly four months after Australia implemented a groundbreaking ban on social media access for children under 16, the country's online safety regulator has raised serious concerns about compliance by major platforms. The eSafety Commissioner released its first detailed update report on Thursday, revealing that while millions of accounts have been removed, significant gaps remain in enforcement, prompting investigations into five of the world's largest social media companies.
The ban, which took effect on December 10, 2024, marks the world's first national prohibition on under-16s creating or maintaining accounts on social media platforms. Enforced by the eSafety Commissioner, the legislation requires companies to take "reasonable steps" to restrict access, with potential fines of up to 49.5 million Australian dollars (about $32 million USD) for non-compliance. According to the report, social media giants had removed some 4.7 million accounts by mid-January 2025 and an additional 310,000 by early March, totaling over 5 million deletions in the initial period.
Despite these figures, the report paints a "complicated picture," highlighting "compliance concerns" across four key areas. It notes that messaging to under-16s on some platforms "encouraged children to attempt age assurance even where they declared themselves to be underage." Additionally, certain platforms allowed minors to "repeatedly attempt the same age-assurance method to ultimately pass age checks." Pathways for reporting age-restricted accounts were described as "generally not... accessible and effective, particularly for parents," and some platforms "appear not to have done enough to prevent under-16s having accounts."
eSafety Commissioner Julie Inman Grant announced that her office is now investigating Facebook, Instagram, Snapchat, TikTok, and YouTube for "potential non-compliance." None of these companies has yet been fined, the report states, adding that a decision on enforcement action is expected by mid-2025. Representatives from Meta, which owns Facebook and Instagram, and Google, parent of YouTube, did not immediately respond to requests for comment on the investigations. Snap and TikTok declined to comment, citing ongoing reviews.
The report's release comes amid growing international interest in Australia's experiment. Journalists from Canada, France, Germany, Japan, New Zealand, the United Kingdom, and other nations have been closely following the ban's implementation, often asking two central questions: how successful is it, and are children still accessing social media? The update provides partial answers but leaves many uncertainties, such as the exact number of under-16s still active on platforms. The report acknowledges that the 5 million account removals "sounds impressive," but notes that "many people hold several social media accounts," making it unclear how many children remain online through one or more profiles.
Furthermore, the document does not specify how many new accounts under-16s may have created since the ban's rollout. It also avoids estimating the shift to alternative platforms, though separate reports indicate a "significant spike in downloads" of non-mainstream apps like RedNote, Yope, and Lemon8 since December 2024. eSafety maintains a list of platforms initially covered by the ban, including services that self-assessed as in scope, such as Bluesky, dating apps such as Tinder, and Lemon8. However, the regulator admits the social media landscape is "constantly evolving," rendering a complete list "impossible to maintain."
Adding to the challenges, the report arrives just a week after the Australian government registered a new legislative rule expanding the ban's scope to include platforms "that have addictive or otherwise harmful design features." This change was enacted in the same week that a U.S. jury found Meta and Google liable for the addictive elements of Instagram and YouTube, respectively, in a high-profile lawsuit over youth mental health harms. The timing underscores ongoing global scrutiny of social media's impact on young users.
Questions about the ban's reach extend to potential loopholes. Reports have pointed to exclusions for gaming apps and messaging services like WhatsApp and Messenger, despite their social networking elements. Roblox, initially considered for inclusion but later exempted, has drawn fresh attention: according to eSafety updates, it is currently being reviewed by the government over concerns about child grooming.
As investigations proceed, eSafety emphasized that determining "reasonable steps" for compliance is "ultimately a question for the courts to determine." The report explains this assessment must consider "the platform's service, technological feasibility, and the regulatory landscape." Experts have raised doubts about age-assurance technologies, which carry error rates that could allow some children to slip past checks. Whether such systems meet the "reasonable steps" threshold remains unresolved.
The probe is currently limited to the five mainstream platforms, but broader enforcement questions linger. "As new platforms are launched, and as children continue to seek new ways to connect with peers online, the potential spaces where they can encounter harm continues to grow," the report observes. It questions whether self-assessment by tech companies is sufficient to cover all qualifying services.
Beyond account restrictions, the ban does not directly tackle content harms, algorithms, or other platform features that experts link to risks for youth. The Australian government completed consultations on proposed "digital duty of care" legislation in early 2025, which would require platforms to proactively mitigate harms. However, no timeline for its introduction has been announced, leaving a gap in addressing root causes.
Australia's policy has inspired similar debates abroad. In the U.K., lawmakers are advancing an under-16 ban, while the European Union considers stricter age verification under its Digital Services Act. Canadian officials have expressed interest in adapting elements of the Australian model. Yet, critics argue that enforcement challenges, like those highlighted in the eSafety report, could undermine such efforts globally.
For parents and educators in Australia, the report offers mixed reassurance. Organizations like the Australian Parents Council have welcomed the account removals but called for better reporting tools. "Pathways for reporting... have generally not been accessible," one parent advocate noted in response to the findings, echoing the regulator's concerns.
Looking ahead, eSafety plans to expand compliance monitoring as the landscape shifts. The mid-2025 enforcement decisions could set precedents for fines and further rules. Meanwhile, with children reportedly flocking to unregulated alternatives, the ban's long-term effectiveness hinges on adaptive regulation and international cooperation.
In a statement, Commissioner Inman Grant reiterated the office's commitment: "The new report on social media restrictions shows there is a long road ahead for compliance." As Australia navigates this evolving digital frontier, the world watches to see if the under-16 ban can truly shield the youngest users from online perils.
