Data Privacy in VR/AR: Enough is Enough

Alberto Elias
Apr 15, 2019

Imagine you just bought a great new VR headset. You arrive home excitedly, looking forward to opening the minimalist box that holds the headset. You take it out and put it on. You’re instantly transported to a custom room made just for you, which you can personalize. You look out over an infinite sea and feel proud of your new companion.

Now, at work, you use your VR headset to see multiple screens, easily move information around, visualize data in 3D, and, most importantly, occlude your colleagues so you can focus properly. It’s something you just weren’t able to do in the open office space outside the headset.

One morning, you get a message from your company saying you’ve been fired. You’re perplexed: you’ve been meeting your monthly goals and have never had a bad encounter with anyone, in neither the physical nor the virtual company space.

All your friends and ex-colleagues ignore you, and you just want to understand what happened, what you did wrong. You’re being frozen out. You need a new job, and you have the grit to get through this, so you apply to several companies, all of which send an immediate rejection message. You try easier jobs you’re way overqualified for; same thing. And now your landlord sends a two weeks’ notice email.

Finally, Zoe replies. She was working with you and is also a childhood friend. These kinds of situations truly show who your real friends are. She overheard the manager talking about a programme the company recently signed up for: something about detecting violent people based on their movements, which are tracked using VR headsets and controllers.

You can’t believe what you’re hearing. You’ve never considered yourself aggressive at all. What would this technology know? Nobody knows you better than yourself, especially not some analytics software.

You’re completely lost in eternal self-doubt, torn between thinking that maybe you are violent and just not fit for the world, and hating technology and how people have decided to embrace it. Your once-great VR headset now feels like the devil.

This story sounds crazy, but is it really? The Virtual Human Interaction Lab at Stanford has done a ton of research on what can be inferred from just those 18 types of tracked movements. From their paper on nonverbal data tracking: “commercial systems typically track body movements 90 times per second to display the scene appropriately, and high-end systems record 18 types of movements across the head and hands. Consequently, spending 20 minutes in a VR simulation leaves just under 2 million unique recordings of body language.”
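The arithmetic behind that figure is easy to check for yourself: 18 movement types sampled 90 times per second, over a 20-minute session, works out to just under 2 million recordings.

```python
# Sanity-checking the Virtual Human Interaction Lab's figure:
# recordings of body language generated by one 20-minute VR session.
SAMPLES_PER_SECOND = 90   # typical commercial tracking rate
MOVEMENT_TYPES = 18       # head and hand movements on high-end systems
SESSION_MINUTES = 20

recordings = SAMPLES_PER_SECOND * MOVEMENT_TYPES * SESSION_MINUTES * 60
print(f"{recordings:,}")  # prints 1,944,000 — "just under 2 million"
```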

This data can be used to do things like:

  • Diagnose autism in students.
  • Diagnose attention deficit disorder.
  • Predict how positively a viewer rates the content of a scene.
  • Determine whether a person is a high or low performer.

But people aren’t black or white, and we definitely aren’t a bunch of 0s and 1s, however much some technology fanatics may fantasize about that idea. We’re complex beings trying to make the most of our time alive and trying to overcome difficulties the best we can. We’re a collection of mostly inconclusive ideas and sensations that change day to day.

Yes, we also output information from which, with some clever math, one can infer a specific thing about us at a specific point in time. But we change. New inputs constantly bombard us. Punishments and rewards shape our behaviors, and persuasion modifies our beliefs.

If digital data becomes our new overlord, we’re going down a path of jailing everyone based on ever more complex categories, but never complex enough.


This is only one example of a worst-case scenario based on unwarranted mass data collection. We’ve already seen dangerous issues come up, because this is a problem we already have today.

These fears, and the necessity to talk about them openly from the beginning, are why the VR Privacy Summit happened at Stanford last November. The story told above actually comes from several of the scenarios we outlined during the event, in a wonderful workshop hosted by the Virtual Human Interaction Lab.

But what can we do? We definitely didn’t fix the world during the summit, but it was a good conversation starter that ended with a group-wide brainstorm of possible solutions, some of which I’d like to briefly mention below:

  • Collected data must be deleted after a specific timeframe.
  • Access to data:
    • may be revoked at any time;
    • is always opt-in.
  • Terms of service and privacy notices must be written in an easy-to-understand way.
  • If the ToS change, a company must not assume automatic acceptance of the new terms.
  • Specific industry-wide guidelines should exist for how organizations handle data.
  • Some kind of certification system should be set up, where well-behaved companies that follow the guidelines above are recognized.
  • Requested data must be limited to only the essential.
  • All data must stay on the device or in an individual’s controlled cloud.
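To make the first two points concrete, here is a toy sketch of what opt-in, revocable access with time-limited retention could look like in code. This is an illustration only: the class and method names are hypothetical, not from any real system or standard.

```python
from datetime import datetime, timedelta, timezone

class TrackedDataStore:
    """Toy sketch: opt-in consent, revocable at any time, with expiry."""

    def __init__(self, retention=timedelta(days=30)):
        self.retention = retention  # data deleted after this timeframe
        self.consent = set()        # user ids that have explicitly opted in
        self.records = []           # (user_id, timestamp, payload) tuples

    def grant(self, user_id):
        # Access to data is always opt-in: nothing is stored before this.
        self.consent.add(user_id)

    def revoke(self, user_id):
        # Consent may be revoked at any time; existing data is deleted too.
        self.consent.discard(user_id)
        self.records = [r for r in self.records if r[0] != user_id]

    def store(self, user_id, payload):
        if user_id not in self.consent:
            raise PermissionError("no opt-in consent for this user")
        self.records.append((user_id, datetime.now(timezone.utc), payload))

    def purge_expired(self):
        # Collected data must be deleted after the retention window.
        cutoff = datetime.now(timezone.utc) - self.retention
        self.records = [r for r in self.records if r[1] >= cutoff]
```

Real systems would need much more (encryption, audit logs, on-device storage), but the point is that these policies are mechanically enforceable, not just promises in a privacy notice.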

Mass data collection is already happening. XR makes the issue at least tenfold worse by adding an enormous number of new data points: not just from the headsets, but also from the information about the physical world that AR gathers through IoT devices. We must build technology for people; they come first. We all dream of making the world better, but for that, we must ensure privacy. I’d like to open up the table further and hear more suggestions about what we should be doing.

Simbol is a VR/AR first identity system where people control their data. Follow our work here.
