


In theory, playing an immersive video game sounds fun, maybe even a little exhilarating. That’s the kind of entertainment people want in the metaverse — but it isn’t all fun and games.
Police in the United Kingdom are currently investigating the case of a 16-year-old girl whose avatar was sexually assaulted by a gang of male avatars while in an unidentified metaverse. The girl was not physically injured, according to the New York Post, but she did suffer psychological trauma. That should not come as a surprise — after all, her innocence was violated whether or not she experienced physical harm. (READ MORE: Confronting the ‘Artificial Intelligence’ Backward-Looking Bureaucracy Trap)
While this is the first time a case like this has come under investigation, it’s certainly not the first instance of virtual violence on these platforms. A 21-year-old researcher studying users’ behavior was sexually assaulted in a virtual landscape within her first hour on one such platform, and just last year, Nina Patel, a psychologist, reported on X that she had been sexually assaulted in a metaverse within 60 seconds of entering it for the first time.
Just Like the Real World
Patel, who has spent the last six years studying metaverses, recently wrote about that experience for the Telegraph, arguing that it was just as psychologically traumatic as a physical experience would have been:
My heart was racing and a sense of panic bubbled up inside, just as it would if the attack had occurred in the physical world. I entered fight-or-flight mode. Even though I was physically in my living room in London, wearing a headset and experiencing this all in virtual reality, my response was similar to if I’d been attacked in the street.
The problem is that virtual reality experiences are immersive. Meta, and companies like it, have one job: Make the experience as realistic as possible. And so they have. While it’s true that users could simply take off their headsets and end the experience, it’s just as true that doing so is likely not the first thought that comes to mind in fight-or-flight mode. (READ MORE from Aubrey Gulick: If You Could Talk to the Dead, Would You?)
“I know it is easy to dismiss this as being not real, but the whole point of these virtual environments is they are incredibly immersive,” U.K. Home Secretary James Cleverly told LBC’s Nick Ferrari at Breakfast. “And we’re talking about a child here, and a child has gone through sexual trauma…. It will have had a very significant psychological effect and we should be very, very careful about being dismissive of this.”
The companies creating virtual realities are trying to address the issue. In Meta’s case, it has built a “Personal Boundary” feature that keeps other users’ avatars a set distance away from your own — a fix that addresses the virtual rape problem but not other forms of inappropriate behavior that have become common on the platform. For instance, sexual perverts can (and do) groom children in a metaverse, and plenty of people have received explicit messages from fellow users.
Does the Law Apply to the Metaverse?
Given that this 16-year-old’s case is the first of its kind in the U.K., police aren’t even sure they can prosecute the perpetrators, according to the New York Post.
Both the U.K. and the U.S. have laws on sexual content on virtual platforms. In California, for instance, sexual messages that put another user “in reasonable fear for his or her safety” and that are sent by a user who intends to “imminently [cause] that other person unwanted physical contact, injury, or harassment” are punishable offenses. (READ MORE from Aubrey Gulick: The Dark Side of AI: Generating Child Porn)
Of course, making any kind of sexual advances toward a child on the internet is also a crime in both the U.S. and the U.K. — particularly forms of online grooming. According to the Sylff Association, “knowingly seducing or enticing a minor to engage in unlawful sexual conduct can be punished” under Florida law; “[a]pproaching children online with intent to meet them or to engage in sexual conduct is considered a more serious crime because such acts put children at the real risk of sexual exploitation.”
But the question legal experts are going to have to answer is whether that law applies to instances of sexual predators who are treating their victims’ avatars like virtual porn but don’t have any intention of tracking them down in the physical world.
The question parents should be asking themselves — namely, whether to allow their children to play games in the metaverse — should be a lot easier to answer: No.