Britain’s first police investigation into a “virtual rape” has little chance of leading to a prosecution, according to legal experts.
The victim of the alleged assault was a girl aged under 16. According to a report this week in the Mail Online, the child was wearing a VR headset and playing a video game when a group of adult men “sexually attacked” her avatar.
Although there was no physical injury, officers said she suffered real psychological and emotional trauma. They added that such attacks are rife on metaverse platforms — but none have resulted in prosecution in the UK. That trend looks set to continue.
“The current legislation surrounding sexual offences such as sexual assault and rape requires a form of physical touching,” Alice Trotter, a criminal litigation associate at law firm Kingsley Napley, told TNW via email.
“This means that in order to cover a virtual sexual offence, the law would need to attribute a legal personality to the avatars, so that their ‘physical’ touching would be covered by the law.”
As this is yet to happen, existing legislation wouldn’t cover a contact offence in a virtual world.
A clearer route to conviction could involve a prosecution for harassment or stalking. These offences are more tailored to apply online, where they’re increasingly common.
Yet they too are challenging to prosecute, as the behaviour typically needs to have occurred on more than one occasion. Furthermore, the suspect must have known or suspected that their conduct would cause alarm or distress.
“These occasions can occur in quick succession, but if it is one single (albeit horrible) instance, there may be some difficulties in prosecuting it under this law,” said Trotter, who last year published a blog on virtual policing.
A second potential route to prosecution involves the age of the alleged victim. As she was reportedly under 16 years old, a section of the Sexual Offences Act 2003 that covers interactions with children could be applied.
“It could be that police could say that perpetrators are having a sexual communication with a child,” said Gregor Pryor, a partner in the entertainment and media team at law firm Reed Smith, which has created a legal guide to the metaverse.
New rules needed?
The limits of existing legislation have sparked calls for new criminal offences that cover sexual conduct in the metaverse.
This could involve amending current laws or creating entirely new ones. Either way, the move will face a common obstacle for tech oversight: lawmaking struggles to keep pace with digital advances.
It’s an issue that’s caused chaos in the passage of the EU’s Artificial Intelligence Act. The first draft of the regulation was written back in 2021, but the generative AI boom triggered extensive amendments that were only agreed upon last month. By the time the act applies, there may be another new AI wave that the rulebook can’t contain.
“Some aspects of digital laws are outdated the moment they land in society,” Jake Moore, a global cybersecurity advisor at Slovakian antivirus firm ESET, told TNW. “This can leave victims fending for themselves in many situations.”
Still, that doesn’t mean that new tech legislation is useless. To illustrate the positive impact it can have, Moore points to the UK’s Online Safety Act, which became law last October.
The law introduced four new criminal offences: harm-based communication, false communication, threatening communication, and cyber-flashing.
“These measures represent a significant step forward,” Moore said. “This is of course if those behind the offences can be identified — the most difficult aspect of online crime investigations.”
Policing the metaverse
Another problem for prosecutors is the blurry geographical borders of the metaverse, which don’t neatly align with legal jurisdictions. Moreover, a virtual world doesn’t typically produce physical evidence of crime.
There’s also an argument that resources would be better used elsewhere. In response to the virtual rape investigation, critics argued that any new funding should instead target the record backlog of physical rape cases.
Because of all these issues, policing the metaverse will largely remain the responsibility of platform operators.
Some have responded by adding digital safeguards. Meta’s Horizon Worlds, for instance, now offers a “personal boundary” setting that stops unknown avatars from entering a specific radius of another player. The feature was launched after numerous reports of sexual harassment in the VR game.
Pryor expects tech firms to take on greater responsibility for their users. But he warns that they can’t stop every online offence.
“Technology companies cannot possibly be expected to prevent every single act that may be harmful on the internet,” he said.
“However, as artificial intelligence and detection capacity improve, it may be the case that technology solves the problem that technology may have caused.”