Ban social media for teens? What Switzerland can learn from Australia
Australia's ban on social media for people under 16 has sparked global debate. Switzerland is also weighing rules for online platforms, but experts warn bans alone won’t fix the problems caused by harmful algorithms and addictive design.
Countries around the world have debated age limits on social media for years, but Australia was the first to act. In December 2025, it blocked access to ten platforms, including TikTok, Instagram and YouTube, for minors under 16. Prime Minister Anthony Albanese called it “the day Australian families take back control from Big Tech”.
Major tech platforms are under scrutiny worldwide. Recently, US courts ruled against Meta (which owns Facebook, Instagram and WhatsApp) and Google’s YouTube in separate cases, over harms ranging from child sexual exploitation to mental health issues and addiction.
Australia’s move triggered a chain reaction. Countries such as Spain, France and the United Kingdom are considering similar measures.
As Switzerland weighs its own response, experts and civil society organisations warn that the central issue is not just who uses social media, but how platforms and their algorithms operate and the influence they exert on users.
A long-delayed Swiss law to regulate communication platforms and search engines has recently undergone public consultation, involving political parties, private companies and other stakeholders. While the law requires platforms to build tools for reporting abuse, it does not compel them to prevent harm or protect minors. Nor does it include a mechanism to sanction major tech companies for violations.
Critically, artificial intelligence (AI)-driven chatbots and systems, which determine what content users see and how long they stay online, remain largely unregulated.
This is a major shortcoming, says Australian expert Daniel Angus, director of the Digital Media Research Centre at Queensland University of Technology.
“If we want to improve platforms for society as a whole, we need to intervene in how they are designed and in their economic logic, not just who is allowed to use them,” he says.
‘A ban does not solve the problem’
Angus sees Australia’s law as too simplistic and says it fails to address the structural causes of online harm. “It is a law that does not support young people, does not educate them and does not solve the underlying problems,” he says.
The real issue, Angus says, is the platforms’ business model, which relies on AI algorithms to profile users, maximise attention and increase time spent online in order to sell advertising.
Content recommendation systems often operate in opaque ways, yet Angus says the Australian law does not impose greater transparency.
He also believes that excluding young people could reduce pressure on lawmakers to curb harmful posts and advertising, on the assumption that less moderation is needed if minors are not present.
“Why not clean up platforms instead of excluding young people? Why not remove harmful content and improve the experience for everyone?” he asks.
Canberra defends its law
The Australian Government, for its part, defends its approach. In response to questions from Swissinfo, a spokesperson for Australia’s eSafety Commissioner stressed that the ban is only one part of a broader framework.
This includes measures to tackle online abuse, cyberbullying and illegal content, including material generated through AI, such as deepfakes. Authorities can also require platforms to provide information on how they manage AI-related risks.
According to the spokesperson, these measures are already having an impact. One example is a UK-based company offering widely used “nudify” services – which generate fake intimate images, often of minors, using AI – that withdrew its platforms from the Australian market.
Is the Australian law working? Early data is unclear
Initial data following the introduction of the ban shows mixed results. On the one hand, social media platforms have reportedly removed millions of accounts linked to minors. Meanwhile, 61% of parents who participated in a government survey say they have noticed positive effects on their children, including more in-person interactions.
However, several concerns are emerging. Around a quarter of parents say their children have moved to alternative platforms and report a decline in social interaction and creativity.
In addition, the ban appears easy to circumvent, according to several Australian media reports. This issue was already highlighted in a UNICEF survey of more than 2,000 young people aged 13 to 17 in Australia, which found that nearly a quarter are often able to bypass restrictions. “This shows how important it is to create safer digital platforms, rather than simply restricting access,” Katie Maskiell, head of policy and advocacy at UNICEF Australia, wrote in a post online.
Angus confirms this trend: “I hear stories every day of young people who are still on Instagram despite the restrictions,” he says.
Debate remains open in Switzerland
In Switzerland, the draft law is still under discussion, but divisions are already emerging.
The law requires platforms to explain why content is removed or accounts are blocked, and to give users a way to challenge these decisions. It also allows users to report illegal content through an internal complaints system. However, it does not oblige companies to actively prevent harmful content. “If a platform identifies a risk, it is not required to address it,” notes Estelle Pannatier, Senior Policy Manager at AlgorithmWatch CH.
Civil society organisations are therefore asking for stricter rules, particularly on recommendation algorithms, which can expose users to harmful content, encourage prolonged use and exploit sensitive data for advertising purposes.
They also see generative AI chatbots, which are increasingly integrated into social media and search engines, as a potential threat, since their responses can shape personal opinions – pointing to the broader influence of communication platforms on democratic processes. “Switzerland currently lacks the tools to intervene effectively with platforms, even when democracy is at risk,” says Rahel Estermann, co-director of the digital consumer rights organisation Digitale Gesellschaft.
These positions contrast with those of the industry. Swico, the association representing digital companies, opposes the proposed regulation of social media platforms – especially if it includes restrictions on AI.
“AI is already part of a separate regulatory process. Regulating it in this law would unnecessarily increase the risk of uncoordinated and harmful overlaps,” says Simon Ruesch, head of legal and public affairs at Swico.
Why Europe may offer a better model than Australia
The Australian case and the Swiss debate highlight how complex it is to regulate digital platforms.
According to Daniel Angus, age-based restrictions are politically appealing because they are easy to communicate, but they risk leaving more complex issues unresolved. “The key policy question is how to address the commercial logic and the algorithms underlying these systems,” he says.
For this reason, Angus urges countries, including Switzerland, to reflect carefully before taking Australia as a model. Instead, he suggests following the European Union’s Digital Services Act, which focuses on transparency and platform accountability. For instance, the EU law requires platforms to explain how their recommendation algorithms function and to limit targeted advertising to minors, with fines of up to 6% of global turnover for violations. “European law, although imperfect, is far more advanced than Australian law,” Angus concludes.
The Swiss Government is expected to review the positions of all stakeholders and decide on the next steps for the proposed regulation by the end of the year.
Edited by Gabe Bullard/VdV/ds
In compliance with the JTI standards
SWI swissinfo.ch is certified by the Journalism Trust Initiative.
If you want to start a conversation about a topic raised in this article or want to report factual errors, email us at english@swissinfo.ch.