The video-games industry has moved into the global regulatory spotlight. Policymakers, particularly in the EU, are advancing new rules and stricter enforcement, while court cases are multiplying in the US, all aimed at protecting children from online exploitation. After years of scant scrutiny, the sector now faces the prospect of a patchwork of regional regulations as regulators probe its business models, including the foundational practice of spending real money on in-game items.
This article is part of an MLex online safety special series running this week. Other stories focus on US-specific regulation, algorithmic content and age verification (see here).

“We might have to leave the EU market,” worries Michael Zillmer, co-founder of German video-games studio InnoGames, about moves to tighten digital and consumer rules for gaming across Europe.
If exiting a market altogether sounds extreme, Zillmer is far from the only one voicing deep fears about the direction of new regulations that the European Commission is drawing up.
After years of relative obscurity, the likes of Roblox, Epic Games, Ubisoft and Supercell are being dragged into the spotlight as regulators and lawmakers shift their focus from social media and explicit websites toward video games, in response to mounting pressure from consumer and child-safety groups.
Developers are seeing compliance pressures and legal uncertainty mount up as governments sharpen scrutiny on how their games are designed, how they are monetized and how they verify their users' ages (see here).
A report by the UN agency for children, Unicef, laid out in October how video games have powered up to become the world’s most profitable form of entertainment — with more than 3.4 billion players globally, it's now a sector worth close to $190 billion.
Responding to pressure from child safety advocates, anticipating compliance requirements, and perhaps also hoping to ward off lawsuits, major gaming companies are quickly rolling out new policies to better protect children. The latest has been Roblox, one of the world's most popular gaming platforms, which recently said children will, from this month, be blocked from talking to adults or older teenagers.
At the same time, industry leaders are pushing back, emphasizing the sector’s social, educational and economic benefits in an effort to shape the regulatory debate (see here).
— Multifaceted risks —
Child protection is at the heart of the regulatory march, prompting the question: What exactly are the areas of risk stemming from children’s use of video games?
While acknowledged to be multifaceted, they divide into two main categories: on the one hand, financial and commercial risks, such as spending too much money on games; on the other, online safety risks, including physical and psychological harm, such as grooming or cyberbullying.
On commercial risks, the European Consumer Organization, known as BEUC, has taken a strong stance, saying that the use of premium "in-game" currencies misleads consumers, breaches EU consumer-protection laws and exploits children’s vulnerability through opaque pricing and manipulative design practices.
A study published this summer by the UK’s online safety regulator, Ofcom, looked into how persuasive design leads young users to spend money. It found that around half of those who played spent money inside games, with around a third “often regretting purchases.”
Debates are live in the EU and the UK about restricting or banning “loot boxes” — mechanisms that offer players mystery in-game items in exchange for real money — as well as addressing risks linked to “free to play” games that rely on in-game currencies as their business model (see here and here).
On safety risks, Unicef recently warned that while gaming does offer many benefits, criminal organizations are increasingly exploiting online platforms “to socialize and recruit young players, including children, to involve them in violence.”
A separate report by the London School of Economics and the child campaign group 5Rights examined the risks children face across the entire gaming ecosystem, focusing on the UK and Brazil. The study found that risks for young users vary depending on the type of video games, stating that “mobile gaming environments differ significantly from console-based environments.”
To categorize them, the report uses a "4Cs" classification for online risk to children: content risks from exposure to violent or sexual material; contact risks from harmful interactions such as grooming; conduct risks involving bullying or harassment; and contract risks tied to economic exploitation and digital agreements they may not understand.
Alexandra Walsh, who is leading major litigation against Roblox in the US and represents families of children who say they were harmed by using the platform, told MLex that “going after the online currency and the consumer aspect is also a way to help, at least, reduce some of the grooming and exploitation activities.” That’s because “predators use the currency as a way of coercing kids into talking to them, sending a picture, meeting them in person, meeting them on another app.”
Clare Daly, a legal adviser and board member of the Irish pressure group CyberSafeKids, says regulators must broaden their lens to see the full range of risks. “To capture all the new developments, such as big platforms like Roblox, regulators need to take a broad approach,” she told MLex.
— Europe moves first —
As the risks gain visibility and urgency, regulators and policymakers around the world are beginning to respond.
Momentum is particularly strong in Europe: The EU is taking the lead, preparing to propose a suite of stringent, sector-specific rules next year, while the UK’s advertising regulator is prioritizing enforcement of existing frameworks, particularly on tackling gambling ads with strong appeal to children.
Across the Atlantic, the US is following a familiar path of litigation first, with courtrooms rather than regulators driving much of the change.
These may be different routes, but all have the potential to reshape the global video-games industry in profound ways in the name of protecting young users.
In Europe, video games have historically been governed largely by consumer-protection laws. Because those rules were in place well before Brexit, the EU and the UK have maintained a broadly similar regulatory approach in the decade since. In addition, the sector developed its own age-rating system, PEGI, and a self-regulatory code of conduct.
More recently, the UK’s Online Safety Act and the EU’s Digital Services Act have sought to make the online environment safer by targeting user-generated content and systemic risks, respectively, with a particular focus on protecting children, and partly capturing video games.
There are differences: the UK regime applies only to user-to-user platforms, while the EU one is broader in scope. This is the first point of divergence for video-games companies operating in Europe, though many of them remain outside both laws due to the definitions used.
Current discussions on a planned overhaul of the EU’s consumer rules, the Digital Fairness Act, could further shake up the situation. The DFA is set to include some of the most tailored rules for the video games industry, such as regulating or banning in-game currencies, loot boxes, and also addictive design features (see here).
That’s a move that could have broader implications.
“If the DFA was designed in such a way that it really significantly updated or changed the law in Europe around how virtual currencies operate, you would pretty much overnight have two fundamentally different systems operating within Europe,” Will Deller, a UK-based lawyer at Bird & Bird who advises major video-games companies, told MLex. “You'd have the UK position and the EU position, and they would be quite different.”
EU-based developers and companies might be forced to overhaul their business models or exit the market altogether, while US- and Asia-based companies seeking entry to Europe would have to choose between complying with tougher EU rules or releasing separate EU and UK versions of their games.
Celia Pontin, director of policy and public affairs at Flux Digital Policy, which advises the gaming sector in Europe, told MLex that the longstanding parallel — with consumer protection regulation focusing on economic harm, and newer online safety regulation mainly addressing physical and psychological harm — held broadly true “until the Digital Fairness Act started happening,” because the new rules aim to tackle both.
Pontin, who previously worked at the UK’s Advertising Standards Authority, said that “the games industry is always going to be regulated by both,” referring to online safety and consumer protection laws, but that a more complex regulatory situation is in the offing.
— Around the world —
As the EU moves to capture the video games sector within online safety and consumer laws, a similar move is underway in Brazil, where child-protection and consumer-safety proposals are emerging.
Under the Brazilian rulebook, from next March the sale of loot boxes to users under 18 will be forbidden in games considered likely to be accessed by minors. Gaming companies will also be required to introduce complaints systems and to provide the regulator with data on the number of reports received, duties that echo the UK's Online Safety Act.
Meanwhile, focusing mostly on the consumer side, India has imposed a nationwide ban on all online games that involve real money, plunging the country's $25 billion gaming sector into crisis (see here).
In Australia, where a world-first social-media ban for under-16s came into force this week (see here), pressure is already building to extend it to video-game platforms.
The ban could already catch some services, as the Australian government notes: “Online gaming services that enable online social interaction through features and functions such as direct messaging, group chats and livestreaming may be included in the age restrictions if the service’s sole or primary purpose changes.”
Categorization questions such as these are likely to gain more importance as other jurisdictions, such as the EU, may follow suit and introduce age limits for social media platforms (see here and here).
In what is widely interpreted as a response to these regulatory signals, Roblox has announced that children will no longer be able to chat with adult strangers. Mandatory age checks will be introduced for accounts using chat features, rolling out this month across Australia, New Zealand and the Netherlands, before expanding globally from January.
— Litigation in US —
The jurisdictions mentioned above focus on regulating the video-games sector through consumer and online safety laws. By contrast, the focus in the US lies on high-profile litigation against platforms, most notably Roblox, with most action currently unfolding at the state level.
Roblox has seen its global audience mushroom by 70 percent over the last year, to more than 151 million users on an average day, including 34 million daily users in Europe and 26 million in the US and Canada. A landmark child safety lawsuit filed against the gaming platform in 2023 has prompted a wave of additional complaints from families and state attorneys general across the US, who are now pursuing broader action on behalf of children in their jurisdictions (see here).
Kentucky’s attorney general has sued the platform, alleging it has become a “hunting ground for child predators” and failed to implement basic safety controls, while Louisiana brought a similar case in August (see here and here). This month, Texas became the third state to sue Roblox (see here).
For Alexandra Walsh, the lawyer representing families in some cases against Roblox, “we’re at an inflection point where awareness is growing, which is great.” The pattern mirrors earlier mass-harm litigation — such as tobacco cases — in which individual suits set the stage for statewide action and increased political pressure on lawmakers, she said.
There has also been movement through regulatory channels: the US Federal Trade Commission has recently taken enforcement action, imposing significant fines on Epic Games and Cognosphere, the maker of Genshin Impact (see here and here). The FTC’s leadership has signaled that this focus will continue in 2026 (see here).
— Industry pushback —
In the face of these developments, the video-games industry is on the defensive, feeling misunderstood. Much of that unease centers on the EU, where the rules are the same for small and large companies alike, and the concerns are shared across the board, whether voiced by Zillmer, the German game developer, or by Supercell chief executive Ilkka Paananen, who aired his worries in October (see here).
“There is a lack of understanding from some regulators on how massively the sector would be impacted,” said Ann Becker, senior vice president and head of policy and public affairs for Video Games Europe, the biggest industry association in the EU, counting Epic Games and Ubisoft among its members.
Becker sees the EU’s move to adopt new rules as especially critical, noting that the video-games sector has always operated globally. “If the restrictions in the EU would be too severe, some developers may consider geo-blocking Europe,” she told MLex.
Her industry group argues that overly specific regulation and banning in-game purchases could ruin the industry, one of the few digital sectors in which European companies have achieved global success (see here). It argues that there is huge value in its free-to-play business model, which relies on in-app purchases to make it possible for games to be downloaded for free.
While the EU is working on simplifying its digital rulebook (see here), “they are adding regulation that’s hitting the creative content sectors more than tech and social media,” Becker said, adding that Europe “is the leader in original content creation, but doesn't necessarily have the big platforms on which you play games."
— Closing the gap —
The industry is on the brink of a new regulatory era, emerging as a key focus of policymakers seeking to strengthen online protections for children.
In particular, the EU’s framework is tightening — and as its digital agenda often sets a global benchmark, its changes are likely to ripple far beyond Europe, influencing how games are designed, monetized and moderated worldwide.
“The risk that video-game companies are dealing with now isn’t ‘will I be regulated?’ It’s the question: Will those regulations reflect the industry, or will they reflect what someone else thinks the industry is?” said Pontin, who has worked in the sector for many years.
“That’s the gap, then, that the industry needs to navigate and needs to try and close as much as possible, as well as prepare to be regulated.”
Please email editors@mlex.com to contact the editorial staff regarding this story, or to submit the names of lawyers and advisers.