Roblox reported 13,316 instances of child exploitation last year

UPDATE 5.30pm UK: Roblox has responded further to Bloomberg's article, saying the report fails to properly contextualise the scale of the game's player safety issues relative to the size of its platform.
“A recent article contained glaring mischaracterisations about how we protect users of all ages on the platform and failed to reflect both the complexities of online child safety and the realities of the overwhelmingly positive experiences that tens of millions of people of all ages have on Roblox every single day,” a Roblox spokesperson said in a statement to Eurogamer, providing a link to a lengthy blog post by the company’s chief safety officer Matt Kaufman.

ORIGINAL STORY 4.30pm UK:
Roblox's record on child safety has come under fire yet again, this time in a damning new Bloomberg report detailing the scale of a problem the game's developers have been unable to solve: keeping all of its young players safe.

Last year, Roblox itself reported 13,316 instances of child exploitation to US authorities, and responded to over 1,300 requests from authorities for details on predatory players.

In the last six years, at least two dozen people have been arrested in the US for abusing victims met via Roblox.

Bloomberg's report goes into detail on the case of DoctorRofatnik, real name Arnold Castillo, who ran a hugely successful Sonic the Hedgehog rip-off and earned tens of thousands of dollars in revenue.

The 22-year-old Castillo was admired by an army of children who played his Roblox game, some of whom he paid to work on the project with him. But despite reports by Roblox players of inappropriate contact between Castillo and players as young as 12, it wasn’t until a 15-year-old girl he had preyed on went missing from her home that Castillo was caught by the FBI.

Castillo is now serving a 15-year sentence for grooming and sexually abusing the child.

The scale of the issue and the work needed to solve what Bloomberg calls a "pedophile problem" is enormous. Roblox filters 50,000 chat messages a second between its tens of millions of concurrent users, most of whom are under the age of 18. It employs 3,000 human moderators, though it relies on AI for much of the work.

Bloomberg has spoken with Roblox moderation staff who say they feel overwhelmed, with hundreds of reports of child safety issues reportedly flooding in every day.

The report alleges that Roblox is putting growth over player safety as the company pushes towards an eye-opening target of one billion daily users (it currently has around 100 million).

Responding within the report, Roblox chief safety officer Matt Kaufman declined to comment on specific cases, but said that safety and civility were “foundational”.

“Tens of millions of people of all ages have a safe and positive experience on Roblox every single day,” Kaufman said.

Roblox has repeatedly faced criticism that it isn't doing enough to ensure its young audience is safe on its platform. Earlier this year, Roblox also responded to criticism over comments made by studio head Stefano Corazza regarding payments to children who create games on the platform.
