On paper, the headline figures look like a policy success. A 2022 report by the state-affiliated China Game Industry Research Institute declared that over 75% of minors spent fewer than three hours a week gaming and that authorities had curbed “internet addiction.” The government presented these numbers as proof that the framework was working.
Research by the Cato Institute found no causal link between the playtime restrictions and improved health outcomes; instead, it found a 14% higher chance that gamers would play heavily in any given week. A survey revealed that 77% of minors used someone else's identity, typically a parent's or an older friend's, to register game accounts, and 59% of teen gamers simply migrated to Douyin, China's TikTok equivalent, which restricts only users under 14, compared with the under-18 limits on video games.
Following the 2021 regulation changes, Tencent's stock dropped more than 8% while NetEase's fell around 11%. When the government merely proposed further restrictions in December 2023, Tencent lost 12.3% of its market value, and roughly US$100 billion was erased from the Chinese gaming sector in a single trading session. Ubisoft and Prosus saw stock declines as the ripple effects spread internationally. A black market for gaming accounts also emerged quickly, with sellers offering unregulated accounts to minors at inflated prices; thousands of children were scammed in the process.
What China’s experiment demonstrates, above all else, is that restricting one platform or category of content does not reduce screen time; it simply redirects it. Moving a child from Honor of Kings to Douyin is not a public health win, even if it looks like one in the gaming-specific data.
Ghosh recognizes this dynamic immediately when it is described to her. “Children are very clever. If you close one door, they will find the window. The question is not which app they are on; it is why they need to be on it for so many hours. A government cannot substitute good parenting. We need parents to take responsibility too and spend time with their children to help with their addiction.”
South Korea offers a different kind of case study, one that at least ends with an honest conclusion. The Shutdown Law, popularly known as the Cinderella Law, was introduced in 2011 and banned children under 16 from playing online games between midnight and 6 a.m., enforced through South Korea’s national resident registration numbers.
For a decade, it sat on the books, generating roughly as much controversy as compliance. The practical problems were significant. The law caused a major headache for Microsoft when Minecraft unintentionally became an adults-only game in South Korea: because of its Xbox Live integration, a game recommended for children aged 12 and older became accessible only to those 19 and above. The law also did not apply to console games or smartphones, meaning a sizable loophole was built into the framework from day one.
In January 2022, the National Assembly repealed the Shutdown Law. The decision acknowledged what had become difficult to ignore: government-mandated curfews could not replace family-level decisions about screen time. South Korea replaced it with a parental opt-in system, where guardians designate approved gaming hours for their children. The shift was, in effect, a public admission that the law had not achieved what it set out to do, and that sustainable intervention required putting decision-making back with parents rather than servers.
Australia is the most recent and most closely watched example. From December 10, 2025, platforms including TikTok, Instagram, Snapchat, Facebook, and YouTube were required to prevent Australians under 16 from holding accounts, with fines of up to AU$49.5 million for systemic non-compliance.
Within the first three months, the eSafety Commissioner, Australia’s online safety regulator, raised significant concerns about five major gaming and social media platforms, finding that many children under 16 still had active accounts or could create new ones. Enforcement was largely reactive, and account removals had not yet translated into measurable reductions in reported harm from cyberbullying or image-based abuse.
Australia’s ban is still young and deserves more time before a full verdict. But the first-quarter data does suggest that removing accounts is not the same thing as reducing harm, and that teenagers motivated to stay online will find ways to do so.