
In recent years, the combination of artificial intelligence (AI) and extended reality (XR) technologies, such as augmented reality (AR) and virtual reality (VR), has driven tremendous progress. A recent study, however, contends that this combination may also give rise to serious privacy issues.

According to the research, AI systems can analyze users' motion data in AR and VR settings and infer personal information with startling precision. This possibility has prompted discussions about how to balance technological advancement with personal privacy.

Researchers from the University of California, Berkeley, conducted two studies earlier this year, and the results showed that users of augmented and virtual worlds unintentionally disclose more information through their head and hand movements than was previously thought. This data, gathered in a matter of minutes, can be used to infer a number of attributes, such as age and level of disability.

The studies were carried out as part of the metaverse security and privacy research project at the Center for Responsible, Decentralized Intelligence. One study used open-source data from more than 50,000 Beat Saber VR players, providing a dataset more than 100 times larger than those of earlier studies. The researchers found that bodily movements are as distinctive and recognizable as fingerprints.

Using machine learning classification models, the researchers were able to correctly identify users from just 10 to 100 seconds of motion data with 73% to 94% accuracy. In another study, the researchers used an adversarially designed virtual reality game to gather data from 50 players in a lab setting, accurately inferring attributes such as location, age, and height.
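The study's own modeling code is not reproduced here, but the general approach it describes, treating short windows of head- and hand-tracking telemetry as feature vectors and training a classifier to match them to known users, can be sketched in a few lines. The snippet below is an illustrative outline only: the synthetic data, the summary features, the window length, and the choice of a random-forest classifier are assumptions for demonstration, not details taken from the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Illustrative only: synthetic stand-in for head/hand tracking telemetry.
# Each "recording" is a short window of 3D positions for the headset and
# both controllers (9 channels), sampled over a few hundred frames.
rng = np.random.default_rng(0)
n_users, recordings_per_user, frames, channels = 20, 30, 300, 9

def summarize(window: np.ndarray) -> np.ndarray:
    """Collapse a (frames, channels) window into simple summary features:
    per-channel mean, standard deviation, and range."""
    return np.concatenate([window.mean(0), window.std(0),
                           window.max(0) - window.min(0)])

# Give each synthetic user a slightly different motion "signature",
# loosely mimicking stable traits such as height or movement amplitude.
X, y = [], []
for user in range(n_users):
    offset = rng.normal(0, 1, channels)
    scale = rng.uniform(0.5, 1.5, channels)
    for _ in range(recordings_per_user):
        window = offset + scale * rng.normal(0, 1, (frames, channels))
        X.append(summarize(window))
        y.append(user)
X, y = np.array(X), np.array(y)

# Train a classifier to match unseen recordings back to known users.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"identification accuracy: {accuracy_score(y_te, clf.predict(X_te)):.2f}")
```

Even this toy setup shows why the finding is plausible: a handful of summary statistics over a short motion window is often enough for a standard classifier to separate individuals, because stable physical traits leave a consistent imprint on the telemetry.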


Due to ethical considerations, the study did not attempt to assess traits such as sexual orientation or political affiliation, although the researchers believe such inferences are also possible. Vivek Nair, the study's lead author, emphasizes the need to develop protective measures that safeguard users' privacy and prevent identity theft or the exposure of sensitive information.

The privacy and security problems identified in this research are especially pertinent given the growing popularity of virtual reality headsets, an estimated 10 million of which were purchased last year. To mitigate these risks, Nair and his team will focus on finding ways to alter the data or regulate access to it.