Discussions of privacy are everywhere: on this site, in the mainstream media, and across blogs and academic publications. It seems every day brings a new privacy concern, including a recent article in JAMA Pediatrics questioning the privacy we may be giving up when we use virtual reality devices.
By virtual reality devices, I am referring to the headsets and speaker systems that allow us to immerse ourselves within a game or image – becoming one with the action. In creating this reality, the system must track our body's movements, our eye movements, and how physically close we come to our virtual friends or enemies. These systems collect our physicality, our body language. That would be of no particular consequence on its own until it is mashed up with newer forms of analytics – for example, gait analysis. The mechanics of how we walk are in many ways unique; as a physician, I can pick out patients with a foot drop, or wearing a prosthesis under their clothes, by the anomalies in their gait.
Digitizing our body language and combining it with big data and machine learning yields patterns that can identify individuals. Facial recognition is convenient, but in a pinch, gait analysis may help out. The author's concern is how all this voluntarily shared information is being used. But rather than focus on his immediate message, it may be worthwhile to recognize two interconnected ideas. First, privacy now includes activity we scarcely consider private, or useful in identifying us. Second, we give this information freely to others in exchange for an experience.
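To make the identification idea concrete, here is a minimal sketch of how a handful of movement measurements could pick a person out of a small set of known profiles. Everything here is invented for illustration – the names, the feature values, and the choice of features (stride length, cadence, head sway) are hypothetical stand-ins for the kind of motion data a headset could derive; real gait-recognition systems use far richer models.

```python
import math

# Hypothetical "gait signatures": (stride length in m, cadence in steps/min,
# lateral head sway in cm). All values are invented for illustration.
known_gaits = {
    "alice": (0.72, 112.0, 2.1),
    "bob":   (0.81, 101.0, 3.4),
    "carol": (0.65, 118.0, 1.6),
}

def identify(sample, signatures):
    """Return the known identity whose signature is nearest (Euclidean)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(signatures, key=lambda name: dist(sample, signatures[name]))

# A new, nominally anonymous session whose movement resembles Bob's:
print(identify((0.80, 102.5, 3.2), known_gaits))  # prints "bob"
```

The point is not the algorithm – a nearest-neighbor match is the crudest possible approach – but that once movement is digitized, "anonymous" sessions can be linked back to a person without a name, a face, or a login.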
Source: Protecting Nonverbal Data Tracked in Virtual Reality, JAMA Pediatrics. DOI: 10.1001/jamapediatrics.2018.1909