Legal experts believe that the first-ever police inquiry into a “virtual rape” in Britain is unlikely to result in a prosecution. The victim, a girl under 16, reportedly experienced the alleged assault while using a VR headset and playing a video game. The Mail Online reported this incident, stating that a group of adult men “sexually attacked” her virtual character.
Although she suffered no physical injuries, police said she experienced genuine emotional and psychological distress. They also noted that attacks of this kind are widespread on metaverse platforms, yet none has led to a prosecution in the UK so far. That pattern looks likely to persist.
“The current legislation surrounding sexual offenses such as sexual assault and rape requires a form of physical touching,” Alice Trotter, a criminal litigation associate at law firm Kingsley Napley, told TNW. “This means that in order to cover a virtual sexual offense, the law would need to attribute a legal personality to the avatars so that their ‘physical’ touching would be covered by the law.”
Since the law attributes no such personality to avatars, current legislation would not cover a contact offense committed in a virtual environment. A more promising route to a conviction might be to charge the individual with harassment or stalking. These offenses are better suited to online situations, where they are becoming more prevalent.
However, prosecuting these offenses is also difficult since the behavior usually has to happen more than once. Additionally, the suspect must have been aware or had a suspicion that their actions would cause alarm or distress.
“These occasions can occur in quick succession, but if it is one single (albeit horrible) instance, there may be some difficulties in prosecuting it under this law,” said Trotter, who last year published a blog on virtual policing.
Another possible route to legal action involves the age of the alleged victim. If reports that she is under 16 are accurate, a section of the Sexual Offences Act 2003 that deals with interactions with children could be invoked.
“It could be that police could say that perpetrators are having a sexual communication with a child,” said Gregor Pryor of law firm Reed Smith, which has created a legal guide to the metaverse.
The Reed Smith guide was first published in May 2021, almost three decades after the term “metaverse” was coined in Neal Stephenson’s science fiction novel Snow Crash. That was five years after the launch of Pokémon GO’s virtual world and five months before Facebook rebranded itself as Meta.
The shortcomings of current laws have prompted calls for new criminal offenses covering sexual behavior in the metaverse. That could mean amending existing legislation or drafting entirely new statutes. Either way, the challenge is a familiar one in tech regulation: legislation lags behind the pace of digital advances.
This problem has dogged the progress of the EU’s Artificial Intelligence Act. Although the initial draft was penned in 2021, the surge in generative AI forced significant changes that were only finalized last month. By the time the act takes effect, another wave of AI developments may have emerged that the regulations cannot handle.
“Some aspects of digital laws are outdated the moment they land in society,” Jake Moore, a global cybersecurity advisor at Slovakian antivirus firm ESET, told TNW. “This can leave victims fending for themselves in many situations.”
That doesn’t mean new tech laws are ineffective, however. To illustrate the impact they can have, Moore points to the UK’s Online Safety Act, enacted last October. The legislation introduced four new criminal offenses: harm-based communication, false communication, threatening communication, and cyber-flashing.
“These measures represent a significant step forward,” Moore said. “This is of course if those behind the offences can be identified — the most difficult aspect of online crime investigations.”
Prosecutors face another hurdle due to the unclear geographical boundaries of the metaverse, which don’t neatly match up with legal jurisdictions. Additionally, a virtual world usually doesn’t generate tangible evidence of a crime.
Some argue that resources could be more effectively allocated elsewhere. In light of the virtual rape investigation, critics point out that any additional funding should prioritize addressing the extensive backlog of physical rape cases.
Due to these challenges, the responsibility of policing the metaverse will largely continue to rest on platform operators. In response, some have taken steps to implement digital safeguards. For example, Meta’s Horizon Worlds now includes a “personal boundary” setting, preventing unknown avatars from entering a specific radius around another player. This feature was introduced following multiple reports of sexual harassment in the VR game.
The personal boundary acts as an invisible fence that gives people more room, stopping others from getting too close and making it easier to avoid unwanted interactions. Users can choose between three Personal Boundary options, adjustable at any time via the Safety tab in the Settings menu.
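The mechanism described above can be sketched as a simple proximity check: if a non-friend avatar would come within the boundary radius, its position is pushed back to the edge of that radius. This is a minimal illustrative sketch; the radii, option names, and friend exemption are assumptions for the example, not Meta's published implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class Avatar:
    x: float
    y: float
    is_friend: bool = False

# Hypothetical radius values for the three boundary settings (in metres).
BOUNDARY_RADII = {"off": 0.0, "small": 0.6, "default": 1.2}

def enforce_boundary(me: Avatar, other: Avatar, setting: str = "default"):
    """Return a corrected (x, y) for `other`, pushed outside the
    personal-boundary radius around `me` unless they are a friend."""
    radius = BOUNDARY_RADII[setting]
    dx, dy = other.x - me.x, other.y - me.y
    dist = math.hypot(dx, dy)
    if other.is_friend or radius == 0.0 or dist >= radius:
        return (other.x, other.y)  # already outside the boundary
    if dist == 0.0:
        return (me.x + radius, me.y)  # degenerate overlap: push along x-axis
    scale = radius / dist  # project the avatar out to the boundary edge
    return (me.x + dx * scale, me.y + dy * scale)
```

In practice such a check would run server-side on every position update, so a client cannot simply ignore it; that design choice is what makes the boundary enforceable rather than advisory.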
Pryor anticipates that tech companies will assume more responsibility for their users. However, he cautions that they won’t be able to prevent every online offense.
“Technology companies cannot possibly be expected to prevent every single act that may be harmful on the internet,” he said. “However, as artificial intelligence and detection capacity improve, it may be the case that technology solves the problem that technology may have caused.”