As virtual reality (VR) and augmented reality (AR) grow in popularity and are poised to dominate the next era of cyberspace, the hardware in headsets and the interfaces of virtual keyboards may give hackers new opportunities to spy on users.
Computer scientists from the University of California, Riverside documented these risks in two papers, which they will present at the annual USENIX Security Symposium, an international cybersecurity conference.
Mark Zuckerberg's Meta (formerly Facebook) and other top tech companies are racing to build these metaverse technologies, which rely on hardware that can recognize bodily movements such as blinks, steps, and nods. Through them, users will be able to explore virtual and augmented reality through games, meet and socialize with new people, and establish brand-new ways of doing business.
The UCR team, led by Professors Nael Abu-Ghazaleh and Jiasi Chen, has demonstrated how spyware can record and track a user's motions and then apply AI to translate them into words or text with at least 90% accuracy.
According to Abu-Ghazaleh, if a user has multiple software programs open at the same time, one of them may be spyware that tracks activity in the other programs; it can also detect people near the user and measure their distance from the user. The hacker likewise gains access to the user's interactions with the metaverse hardware devices.
The spyware can also capture the user's personal information as they move from one application to another and enter passwords on the headset's virtual keyboard, further illustrating the privacy hazards. The same techniques give access to the user's body movements during a virtual meeting, and hackers can decode those movements to extract private information.
The two research articles that Professors Chen and Abu-Ghazaleh will present at the conference were co-written with Yicheng Zhang, a doctoral student in computer science at UC Riverside, and Slocum, a visiting assistant professor from Harvey Mudd College.
Zhang is the lead author of one of the papers, titled “It’s all in your head(set): Side-channel attacks on AR/VR systems.” The study explores how malicious hackers can infer users’ bodily gestures, vocalizations, and typing on a virtual keyboard with an accuracy of more than 90%. It also shows how these attackers can identify applications as users launch them, and how they can estimate the distance of other people from the headset wearer to within 10.3 cm.
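To make the side-channel idea concrete, here is a toy illustration of the general technique, with heavy caveats: the paper's actual channels are hardware and software signals measured on real headsets, not this code, and the frame-time values below are invented. The point is only that a co-resident app observing coarse, seemingly harmless timing data can still pick out when the victim app does distinctive work, such as rendering a keystroke.

```python
import statistics

# Hypothetical per-frame render times (seconds) observed by a spy app
# sharing the headset. Keystrokes on the virtual keyboard trigger extra
# haptic/render work, which shows up as timing spikes.
frame_times = [0.011, 0.011, 0.025, 0.011, 0.011, 0.011, 0.026, 0.011, 0.024, 0.011]

# Flag frames whose duration exceeds mean + one standard deviation.
threshold = statistics.mean(frame_times) + statistics.stdev(frame_times)
keystroke_frames = [i for i, t in enumerate(frame_times) if t > threshold]
print("frames containing keystrokes:", keystroke_frames)
```

Once the attacker knows *when* keystrokes occur, the inter-keystroke intervals themselves leak information, since the time between presses correlates with the distance between keys on the virtual keyboard.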
The second paper, “Going through the motions: AR/VR keylogging from user head motions,” led by Slocum, exposes the more urgent security risks associated with virtual keyboards. It shows that subtle body gestures, such as the head movements a user makes while wearing the headset and entering a password on the virtual keyboard, are sufficient for hackers to recover the text being typed. The team then developed TyPose, a machine-learning-based system that captures these motions and translates them into the characters or words the user is typing.
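A TyPose-style pipeline can be sketched roughly as follows. Everything here is an assumption for illustration: the key-to-motion offsets, noise level, feature set, and nearest-centroid classifier are invented stand-ins for the paper's real sensor data and model. The sketch only shows the overall shape of the attack: segment head-motion sensor windows, extract features, and classify each window as a key.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical assumption: typing each virtual-keyboard key induces a
# slightly different average head orientation (x, y, z offsets).
KEY_OFFSETS = {"a": (0.1, -0.2, 0.0), "s": (0.3, 0.1, -0.1), "d": (-0.2, 0.2, 0.1)}

def synth_windows(key, n=40):
    """Generate n synthetic motion windows (50 samples x 3 axes) for one key."""
    base = np.array(KEY_OFFSETS[key])
    return base + 0.05 * rng.standard_normal((n, 50, 3))

def features(windows):
    # Per-axis mean and standard deviation as a simple feature vector
    # (an assumption; the paper's feature set is not reproduced here).
    return np.concatenate([windows.mean(axis=1), windows.std(axis=1)], axis=1)

# "Train": compute one feature centroid per key from labeled windows.
keys = list(KEY_OFFSETS)
centroids = {k: features(synth_windows(k)).mean(axis=0) for k in keys}

def infer_key(window):
    """Classify a single motion window by its nearest key centroid."""
    f = features(window[None])[0]
    return min(keys, key=lambda k: np.linalg.norm(f - centroids[k]))

# Evaluate on fresh synthetic windows.
correct = total = 0
for k in keys:
    for w in synth_windows(k, n=20):
        correct += infer_key(w) == k
        total += 1
accuracy = correct / total
print(f"inferred-key accuracy: {accuracy:.2f}")
```

On this idealized synthetic data the classifier separates the keys easily; the research's contribution is showing that comparable inference works on real, noisy head-motion signals.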
Both studies share the same objective: to warn the tech sector about the cybersecurity risks of the metaverse as the industry plans to adopt it at a larger scale. Abu-Ghazaleh said that before making their findings publicly available, the researchers disclosed the potential threats to the companies involved, giving them an opportunity to address the security vulnerabilities.