
For the world's online safety overseers, 2026 will be the year of the child

By Mike Swift, Sara Brandstätter, Patricia Figueiredo and Maria Dinzeo

December 9, 2025, 14:41 GMT
Regulators worldwide have woken up to the need to police online safety for minors. Australia has arguably gone the furthest, banning social media for under-16s from this week. In Europe, America, Asia and beyond, governments are reacting to concerns over a surge in mental-health problems among teens, pressuring social-media and gaming platforms as never before to improve safety — even if it means losing users. This article introduces a suite of four deep-dive MLex stories, to be published through this week, focused on video gaming, algorithmically recommended content, age assurance and the US's unique regulatory landscape.
For two decades, social media has been a juggernaut rolling forward relentlessly, as Facebook, Instagram and TikTok surged into the billions of users. But the once unthinkable is happening: the brakes are being applied.
 
Regulatory pressure has forced social-media platforms to start purging the highly coveted demographic of younger teen users, most notably in preparation for Australia’s under-16 social-media ban due to take effect Dec. 10. That ban covers Reddit, Facebook, Twitch, Instagram, Snapchat, Threads, Kick, TikTok, X and YouTube.
 
Australia may be in the vanguard, but it is hardly alone in confronting the problem of children's social-media use. Around the world, in the United States, in Europe and in countries ranging from Brazil to Indonesia to India, litigation is being filed and laws are being passed that will increasingly segregate the online experience of child and adult users.
 
Governments are imposing new age-assurance rules that require online platforms to determine the age of their users — adults as well as minors — to protect young users from sexual or violent content, sexual predators and excessive use that may trigger mental-health problems such as eating disorders and even suicide attempts.

At the same time, child-rights groups and researchers are urging regulators to take a proportionate approach that keeps children’s rights in mind, as exemplified by the case of two Australian teenagers challenging the social-media ban before the country’s highest court. 
 
“In terms of the risks, I think just recognizing that age assurance is so important, but we're probably at the start of quite a big societal change here, of, you know, getting people to access and use the Internet differently,” said Anna Lucas, a director of online safety supervision at UK regulator Ofcom.
 
Behind Lucas’ insight is the evolving science about whether, and to what extent, excessive social-media use driven by platforms’ addictive design harms the mental health of teenagers and children, disrupts schooling and causes other social impacts.
 
Limiting or blocking minors will deal a blow to the pocketbooks of social-media companies, many of which are based in the US. Snapchat, for example, earned $3.16 in average revenue per user in its most recent quarter, while Meta earned $14.46.
 
— MLex special series —

MLex has been reporting daily on this volatile, shifting global regulatory landscape. In addition, we have spent the past few months digging deeper into efforts to limit social-media harms to younger users, approaching the issue from multiple viewpoints. The result is four analytical stories to be published this week; we'll link to them below as each is published.

• One story is on video gaming, one of the world's most lucrative entertainment industries, which isn't specifically captured by European content-moderation laws and which, until now, has operated under the regulatory radar compared with social media.
 
• MLex has also examined efforts to regulate the algorithms that personalize what billions of users see in their feeds on Instagram and TikTok, as regulators focus on engagement-driven recommendation systems as the next frontier of online-safety policy.

• As age-assurance regimes spread from London to Los Angeles, MLex examined the privacy and security costs of regulator-imposed mandates that platforms collect even more personal data about their users in order to verify their ages.
 
• Finally, with courts in the US poised to hear the first jury trials in 2026 over whether the attention-driven business models of Meta, Snap, YouTube and TikTok have triggered a mental health crisis for teens, MLex examined how US states are circumventing the free-speech protections of the US Constitution, as they seek to limit teens’ social media use (see the story here).
 
In the dock in this global regulatory movement is the attention economy itself, which generates hundreds of billions of dollars in ad revenue by using personal data to target content that welds eyeballs to smartphone screens, enabling the collection of even more personal data in a self-reinforcing cycle.

MLex hopes this broad but deep-diving group of stories will clearly illustrate this global sea change in the online world.

Please email editors@mlex.com to contact the editorial staff regarding this story, or to submit the names of lawyers and advisers.
