Australia’s online safety regulator has ordered major gaming platforms — including Roblox, Minecraft, Fortnite and Steam — to explain how they protect children from sexual predators and radicalization. The eSafety Commissioner’s office said it had issued legally enforceable transparency notices requiring details of safety systems, staffing and moderation practices; companies that fail to comply could face penalties and civil action.
eSafety Commissioner Julie Inman Grant said online games have become social hubs for young people, with nine in 10 Australians aged eight to 17 playing online games. She warned that predators use these platforms to make initial contact with children in-game before moving them to private messaging services. Inman Grant said “predatory adults” target children through grooming or by embedding terrorist and violent extremist narratives in gameplay, raising the risk of contact offending, radicalization and other off‑platform harms.
The move comes as Australia steps up efforts to curb online harms to minors after banning under‑16s from major social media platforms last year. The online safety watchdog found that a substantial proportion of Australian children were still accessing the restricted platforms three months after the ban took effect.
Roblox faces more than 140 US lawsuits alleging it failed to stop the sexual exploitation of children. The company agreed to settlements with the US states of Alabama and West Virginia totaling more than $23 million and recently announced tailored accounts for young users.
Edited by: Louis Oelofse