Australia’s online safety regulator has required major gaming platforms — including Roblox, Minecraft, Fortnite and Steam — to disclose how they protect children from sexual predators and radicalisation.
eSafety said it had issued legally enforceable transparency notices requiring the companies to supply detailed information about their safety systems, moderation practices, staffing and other protections. Platforms that fail to comply may face enforcement action, including penalties and civil proceedings.
eSafety Commissioner Julie Inman Grant said online games are now key social spaces for young people, noting that nine in 10 Australians aged eight to 17 play online games. She warned that predators are using game environments to make contact with children and then shift conversations to private messaging. Inman Grant also cautioned that some adults exploit games to groom children or to insert terrorist or violent extremist narratives into gameplay, raising the risk of contact offending, radicalisation and other harms that spill beyond the platforms themselves.
The regulator’s intervention follows a broader push by Australia to curb online harms to minors. Last year the government banned under-16s from major social media services, but the eSafety watchdog reported that a substantial share of children were still accessing blocked platforms three months after the ban.
Roblox, meanwhile, has faced a string of legal challenges in the United States, with more than 140 lawsuits alleging the company failed to prevent sexual exploitation of children. The company reached settlements with the states of Alabama and West Virginia totalling over $23 million and has introduced age-tailored account options for younger users.
Edited by Louis Oelofse