Campaigners claim that a 21-year-old researcher was sexually assaulted by an avatar on Horizon Worlds, Meta’s virtual reality platform.
The researcher works for SumOfUs, a corporate accountability group, which says Meta needs a better plan to reduce harms in the metaverse.
Meta’s annual shareholder meeting takes place on Wednesday.
The company says Horizon Worlds has safety tools designed to ensure people have a positive experience. A spokesperson said Meta wanted everyone using Horizon Worlds to be able to find and use those safety tools easily, and to “help us investigate” situations like this.
Horizon Worlds is currently only available to users in the US and Canada. The platform’s avatars have a simplified, cartoonish appearance.
SumOfUs claims that virtual assaults can cause severe trauma.
The group’s campaign director, Vicky Wyatt, said it “still counts” and has a real impact on users.
Ms Wyatt said the researcher who experienced the alleged assault felt “partially shocked”. Part of her thought it was not her real body, only an avatar; another part felt this was important research and that she needed to capture the footage.
The BBC has seen some of the footage. The researcher’s own avatar is not visible, because the footage is recorded from her viewpoint, but two male avatars are present in the room: one stands very close to her while the other watches. They make inappropriate comments and share a virtual glass of wine.
Proponents acknowledge that there is no one-size-fits-all definition of the metaverse and that it is still in development.
It draws on several pre-existing technologies, such as virtual reality and augmented reality. The metaverse is not a single space: many different 3D virtual worlds, including established games and virtual platforms, are considered part of it.
Only a few parts of the metaverse, such as Horizon Worlds, currently exist. Meta remains a champion of the idea, has invested billions of dollars in developing the concept and plans to hire thousands of people to work on it.
Meta introduced new safeguards in its virtual worlds in February, after earlier reports of avatar assaults and “creepy” behaviour.
The Personal Boundary feature prevents avatars from coming within a set distance of each other, making it easier for users to avoid unwanted interactions. Meta said it stops others from “invading” your avatar’s personal space.
“If someone attempts to enter your Personal Boundary the system will stop their forward movement once they reach it.”
Meta says that by default, the Personal Boundary keeps approximately 4ft (1.2m) of virtual distance between your avatar and the avatars of people not on your friends list.
According to the company, there are many ways to report and block users.
SumOfUs reports that the researcher was encouraged to disable the Personal Boundary feature.
According to the group, the researcher also encountered homophobic slurs and virtual gun violence.
SumOfUs, together with a small number of shareholders, has filed a resolution asking the company to carry out a risk assessment of the metaverse’s potential human rights impacts.
The group also criticised a recent blog post by Meta’s president for global affairs, Nick Clegg.
In it, he wrote: “We wouldn’t hold a bar manager responsible for real-time moderation in their bar, as if they should stand over your table, pay close attention to your conversations, and silence you if they hear something they don’t like.”
Ms Wyatt said Meta must act now to address these issues: “Rather than Facebook rushing headlong into building the metaverse, we’re saying, look, you need to look at all of the harms happening on your platforms now that you don’t even deal with.
“Let’s stop repeating and replicating those harms in the metaverse.” The group argues that a better strategy is needed to reduce online harms there.