This story is part of CNET’s exploration of the next stage in the evolution of the Internet.
Watching a friend or family member use a VR headset is a strange experience. They gesticulate wildly, swatting at empty air. But they’re reacting to what feels like a very real set of stimuli: bad guys who need to be taken down, or haunted hallways with ghosts around every corner. It looks absurd to you. It feels real to them.
That’s why sexual assault in the metaverse is a problem we can’t afford to wait to address.
On Tuesday, a researcher from the nonprofit SumOfUs spent time in Horizon Worlds, Meta’s flagship VR world. According to a report by the organization, it took less than an hour for her to be “raped.”
What does it mean to be sexually assaulted in the metaverse? In this case, the researcher was cornered in a room by two male avatars. One of them came face to face with her and made sexually abusive comments, while the other stood back and appeared to drink from a virtual vodka bottle.
The report was mocked on social media, with many arguing that “rape” is too strong a word for what happened in the clip. That semantic debate obscures the fact that virtual interactions can cause real trauma. Young as the metaverse is, there are already numerous cases of boundary violations in virtual spaces.
In February, virtual reality researcher Nina Jane Patel said she was “gang raped” by four male avatars in Horizon Worlds. They crowded around her, capturing screenshots as they groped her character while saying, among other lewd comments, “don’t pretend you didn’t like it.” Last June, a woman was playing the Echo VR sports game with strangers when one of them said he had recorded her voice to “jerk off” to later.
Abuse in the metaverse is likely to be as rampant as it is on social media. But these incidents illustrate how much more traumatic it could be, given the immersive experience these worlds provide. The idea of living inside a virtual world, once a selling point for virtual reality, is being turned on its head in the darkest way possible.
“It was surreal,” Patel said of her experience in a blog post. “Virtual reality was basically designed so that the mind and body could not differentiate between virtual/digital experiences and realities. To some extent, my physiological and psychological response was as if it were happening in reality.”
VR worlds need to offer their users better protections and tools. Social media moderators already have a tough but crucial job, but in the metaverse they will likely have to act more like a police force patrolling the streets of a big city. Instead of removing content after the fact, they’ll need to catch abuse as it happens.
But that puts a lot of pressure on moderators, and it’s unclear if any company is prepared for this kind of proactive response.
Meta did not respond to requests for comment for this story.
Metaverses are open, sprawling worlds where hundreds or thousands of people socialize. They can take the form of a game, like Fortnite or World of Warcraft, or a social simulator like Second Life. It’s an old concept. The reason you’ve heard the phrase so often over the past year is that metaverses are moving into their next phase.
What this next phase looks like depends on who you talk to. When blockchain enthusiasts talk about the metaverse, they’re referring to a new way for us to interact online. CNET editor Scott Stein says the metaverse isn’t necessarily one specific thing, but a broader shift in how we interact with technology.
Then there’s Meta’s vision. When the company formerly known as Facebook says “metaverse,” it means a massive virtual reality world that simulates the real thing. Blockchain metaverses will live in PC browsers; Meta’s metaverse lives in a VR headset. (Meta CEO Mark Zuckerberg has laid out big ambitions for his company’s metaverse, but what it will look like is still unknown.)
The advantage of VR metaverses is that they’re more immersive. Unfortunately, that also makes abuse more visceral. This is especially true when users wear haptic vests, which let them physically feel unwanted touches.
“It was a nightmare,” Patel said.
Moderating the metaverse
The metaverse will undoubtedly be more difficult to moderate than existing social media. On social media, you can block people who cause you grief, and moderators can remove malicious content. Even with these benefits, platforms like Facebook and Twitter are full of harmful content.
In addition to extensive moderation, companies will need systems in place to limit abuse in the first place. This will be hard enough in browser-based metaverses where, like in today’s MMO games, damage can be done via voice chat. It will be even more difficult in VR worlds where you can be touched virtually and have your space invaded.
Meta CTO Andrew Bosworth has said user moderation “on any meaningful scale is virtually impossible,” according to an internal memo seen by the Financial Times. But he also called widespread harassment an “existential threat” to the success of the metaverse.
Meta has been tinkering with safety tools in recent months. While tweaking Horizon Worlds, which is still in beta, it added safety precautions like a personal boundary feature. If enabled, it prevents people from approaching within one meter of your avatar.
“We have way more tools available than we’ve touched,” said Aaron Stanton, co-creator of the Oculus archery game QuiVR. After being alerted that a woman had been groped by another avatar in the game, Stanton and his co-designer implemented a gesture that allowed threatened users to fend off attackers.
Now director of the VR Health Institute, Stanton believes developers of these worlds should focus more on features that empower users. His reasoning is not that victims are responsible for their own protection. Rather, he says, protective tools are often inadequate and can leave abused people feeling helpless. But he thinks VR worlds open the door to better moderation than social media platforms offer.
He gives the example of a gesture that turns you into a giant, able to brush off harassers. To the abuser, your avatar would simply disappear. But in your view of the game, you’d feel like you have the power to get yourself out of a bad situation.
“The problem with purely protective tools is that they still leave the threat inside the headset,” he said. Protective tools “don’t really remove the threat, they just wall it off online. I think we need solutions that actually address the problem without forcing players out of virtual spaces.”
Many uncertainties swirl around the metaverse. Arguably the most important task is making sure it’s built in a way that doesn’t allow abuse to flourish the way it currently does on the internet.
“Over the past twenty years, we have integrated the Internet into our daily lives,” Patel wrote. “The non-negotiable this time around is that we can’t ignore the dark side.”