A wave of new VR headsets is expected this year: Sony's PlayStation VR 2, Meta's pro-level headset code-named Project Cambria and maybe even a long-rumored Apple device. There’s one thing all of them will likely have in common: eye tracking.
The kind of consumer VR headsets you could buy for everyday use haven’t had eye tracking before. Yes, there are headsets that use infrared cameras to track eye movement, but those have been more expensive, business-focused VR and AR headsets, such as Microsoft's HoloLens 2 and HTC's Vive Pro Eye.
Eye tracking on a consumer device you can get in a store is a whole other step, and one that makes a lot of people uncomfortable.
I’ve used eye tracking in VR headsets and I know its benefits. You can more easily control things in VR by looking right at them instead of trying to use your hands, and it could be the gateway to better-looking VR graphics and smaller, smarter headsets. Eye tracking could also open up ways of having your VR avatar make more human-like eye contact with other avatars in future virtual spaces.
But eye tracking also brings a ton of questions about data privacy and how big companies will manage that extra data responsibly.
There are no clear answers to these questions, but I spoke with Anand Srivatsa, CEO of Tobii, the largest manufacturer of eye-tracking components for VR, AR, PCs and other devices. He sees eye tracking as the key steppingstone for where headsets and metaverses need to go next.
Foveated rendering could boost graphics and shrink future 5G headsets
Your eyes aren’t capable of seeing all the detail you think they’re taking in. The fovea at the center of the eye jumps around to take in fine detail, while the rest of the world is perceived peripherally, as if in lower resolution. The same trick can be applied in eye-tracked VR using a technique called foveated rendering: bumping up the graphics processing in the small area your fovea is focused on and rendering the rest at lower resolution. It saves processing power, works surprisingly well, and it’s exactly what the next wave of VR headsets will lean on to deliver higher-res graphics on small headsets with limited battery life.
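As a rough sketch of the idea, here's how a renderer might pick a quality tier for each screen tile based on its distance from the tracked gaze point. The radii and quality tiers below are illustrative assumptions, not values from any real headset:

```python
import math

def shading_rate(px, py, gaze_x, gaze_y, fovea_radius=200, mid_radius=500):
    """Pick a render-quality tier for a screen tile based on its distance
    (in pixels) from the tracked gaze point. Radii are made-up examples."""
    dist = math.hypot(px - gaze_x, py - gaze_y)
    if dist <= fovea_radius:
        return 1.0    # full resolution where the fovea is pointed
    if dist <= mid_radius:
        return 0.5    # half resolution in the near periphery
    return 0.25       # quarter resolution everywhere else

# A tile near the gaze point renders at full quality; a far one at a quarter.
print(shading_rate(1100, 900, 1000, 900))  # 1.0
print(shading_rate(1800, 900, 1000, 900))  # 0.25
```

Real implementations hook this kind of decision into the GPU's variable-rate shading hardware rather than per-pixel code, but the logic is the same: spend detail only where the eye can see it.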
“There is this physical limitation on how much compute and graphics you need to render a full field of view, which, of course, is wasteful — the user doesn’t actually need it to be rendered that way,” says Srivatsa. What’s interesting is that this could also lead to smaller headsets with smaller batteries, and that long-promised cloud-rendered streaming tech (similar to how we can already stream games on the fly) could work for VR over 5G.
“If you think about the advent of 5G, the next steps beyond foveated rendering are things like foveated transport — not only can you go in and say we’re going to reduce what we render, but on that pipe on the 5G network, you could reduce the quality of those parts of your image that you’re not going to see,” Srivatsa adds, saying that’ll help shrink headsets down to the real goal of glasses-size tech: lighter-weight, phone-connected glasses.
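Back-of-the-envelope math shows why foveated transport is appealing. If only a small foveal window travels at full resolution and the periphery is downscaled before hitting the network, the pixel budget per frame collapses. The fractions below are illustrative assumptions, not figures from Tobii or any carrier:

```python
def foveated_pixel_budget(width, height, fovea_frac=0.05, periphery_scale=0.25):
    """Estimate pixels sent per frame when a foveal window (fovea_frac of
    the frame's area) is transmitted at full resolution and the rest is
    downscaled on both axes. All parameters are illustrative assumptions."""
    total = width * height
    fovea = total * fovea_frac                                  # full-res window
    periphery = total * (1 - fovea_frac) * periphery_scale ** 2  # downscaled rest
    return fovea + periphery

full = 2000 * 2000                       # hypothetical per-eye frame: 4,000,000 px
fov = foveated_pixel_budget(2000, 2000)  # 437,500 px
print(f"savings: {1 - fov / full:.0%}")  # roughly 89% fewer pixels on the wire
```

Under these made-up numbers, the stream shrinks by nearly an order of magnitude, which is the kind of headroom that could make cloud-rendered VR plausible over a 5G link.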
“The desire, of course, is everybody has glasses like yours, right? Easy to take off, it doesn’t weigh your ears down, you can have it on for hours, you won’t feel weird. But of course you want really high visual fidelity and really good interaction. Otherwise you’re not going to bother being in an experience that’s kind of half baked.”
Eye contact in social VR worlds may make things less awkward
Being a cartoon avatar in a social VR app right now is a forced compromise, offering more immersion in some ways while trading off actual eye contact. It can get off-putting fast. Facebook has promised that its next-gen VR headset, coming this year, will use eye and face tracking for its avatars, meaning avatars will be able to see eye to eye. Will that be better, or more uncanny?
I’ve tried eye tracking in some avatar-based demos, and it does seem to make a big difference in making it feel like someone actually intends to talk to me. Srivatsa sees social metaverse uses as the other big driving force for VR eye tracking in 2022.
“There are these two aspects. One is to fool your senses to say this is real, so you want to spend time there. And then the second thing is to go and have interaction that feels like real life, that really allows you to be immersed and not feel like this is something weird — because you still have the other parts of the awkwardness, like a headset. But if we can get really high visual fidelity, and you can get really natural interaction, then maybe that allows you to forgive that, hey, I do have something on my head.”
Srivatsa acknowledges that Tobii, as a manufacturer of the eye-tracking tech, isn’t building social experiences: that will be up to companies like Meta, Sony and Apple. But the potential is there, he thinks, to use eye contact to focus conversations and audio, too, which would help in games, larger-scale social spaces and virtual workplace apps. Srivatsa sees scenarios where looking at a person could focus their audio and even tune out other conversations nearby. “I think it can be really powerful in enabling the kind of interactions we may not expect, where digital can even transcend, in some ways, what you can do in physical.”
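One minimal way to sketch that gaze-focused audio idea: weight each speaker's volume by how closely they sit to the listener's gaze direction. The falloff curve here is a made-up illustration, not Tobii's or Meta's actual algorithm:

```python
import math

def gaze_weighted_gains(gaze_dir, speaker_dirs, focus_sharpness=4.0):
    """Attenuate each avatar's voice by its angular distance from the
    listener's gaze. Directions are unit vectors; the exponent controlling
    how quickly off-gaze voices fade is an arbitrary assumption."""
    gains = []
    for d in speaker_dirs:
        alignment = max(0.0, sum(g * s for g, s in zip(gaze_dir, d)))
        gains.append(alignment ** focus_sharpness)  # 1.0 = looking right at them
    return gains

# Looking straight ahead (+z): the avatar in front stays at full volume,
# while one about 57 degrees off to the side fades toward silence.
ahead, side = (0, 0, 1), (math.sin(1.0), 0, math.cos(1.0))
print(gaze_weighted_gains((0, 0, 1), [ahead, side]))
```

A production system would smooth the gains over time so voices don't pop in and out with every eye saccade, but the core idea is just this dot product.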
Eye tracking and hand tracking could change tomorrow’s VR and AR controllers
Further off, eye tracking could help bring an end to VR’s currently useful but bulky game controllers. Today’s handheld controllers are fine for casual home games, but for the all-day headsets and glasses many companies are striving to create, they’re useless.
No one’s solved the perfect smart glasses controller yet, but hand tracking (through cameras or some sort of wearable, like a watch, ring or wristband) plus eye tracking could maybe get us to a future where controls start to feel like a form of mind reading. This is what Meta’s already trying to solve with its wrist-worn neural input research. Eye tracking in VR headsets could start to bridge how companies solve those next-wave controller challenges.
Srivatsa sees eye tracking as a way to get past the overly literal style of current hand tracking in VR, which assumes you can only grab things within arm’s reach. If the technology sees you looking hundreds of feet away, hand tracking could “leap” there, sort of like how joysticks let us teleport quickly in VR now. It could also open up a new gesture language for VR and AR. Headsets like the HoloLens already do this to some extent, but there isn’t a consistent language yet.
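A minimal sketch of that "leap" idea: cast a ray along the gaze direction, pick whichever object falls inside a small cone around it regardless of distance, and let a pinch gesture act on that target. The object names, cone angle and scene layout below are hypothetical:

```python
import math

def pick_target(gaze_origin, gaze_dir, objects, max_angle_deg=3.0):
    """Gaze-assisted selection sketch: return the object whose position
    lies closest to the gaze ray, within a small angular cone. A pinch
    gesture would then act on it. All thresholds are assumptions."""
    best, best_angle = None, max_angle_deg
    for name, pos in objects.items():
        v = [p - o for p, o in zip(pos, gaze_origin)]
        norm = math.sqrt(sum(c * c for c in v)) or 1.0
        cos_a = sum(g * c for g, c in zip(gaze_dir, v)) / norm
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

scene = {"near_cube": (0.1, 0, 1), "far_door": (0.2, 0, 60)}
# Looking straight down +z: the distant door is the tighter angular match,
# so the "grab" leaps to it even though it's 60 meters away.
print(pick_target((0, 0, 0), (0, 0, 1), scene))  # far_door
```

The key property is that angular distance, not physical distance, decides the target, which is what lets interaction escape arm's reach.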
“It’s like pinch and zoom. Years ago, nobody would understand what you were doing. Now, you can do that on the touchpad, and people understand that this is a gesture. There has to be a little bit of teaching and user behavior change,” Srivatsa says, but this type of use may not come into play for a while.
Privacy questions remain: Who’s responsible?
Eye tracking and face tracking raise serious privacy concerns about how companies will handle an ever-expanding set of user data. Those concerns already exist for smartwatches, smart speakers, video doorbells and tons of mobile apps, but with eye tracking, VR and AR would be crossing into whole new unexplored territory.
Srivatsa sees the concerns, but doesn’t necessarily have clear answers. “For us, this is super important. We have taken a very strong stance here, which is we think that users need to be in control of their data.” Srivatsa sees two levels of eye tracking data. The first is where eye control is basically used like a mouse, which he considers less concerning. “That information isn’t recorded, it’s just driving a particular interaction. For those kinds of things, we see very little impact in terms of stored data about the user.”
The other level, which involves possible recording of “heat maps” of where someone’s looking, or capturing images of people’s eyes, is the bigger unknown. “We think that, of course, that can be perceived as very invasive. We have a data transparency policy. And the intention of that is to make sure that the users have control and understand what their data is being used for if it is being stored,” Srivatsa says. “We expect our customers to be in compliance with our data transparency policy. So if they are storing eye tracking data, they have to go and get the user consent, they have to explain what they’re doing with it. And again, this is very much in the spirit of being transparent, allowing the users to opt in.”
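To make concrete what a gaze "heat map" is, here's a toy aggregation of normalized gaze samples into a coarse grid. It's an illustration of why even aggregated gaze data is sensitive: a few binned fixations already reveal exactly what drew someone's attention. The sample coordinates and grid size are invented:

```python
from collections import Counter

def accumulate_heatmap(gaze_samples, cell=0.1):
    """Bin normalized (x, y) gaze points into a coarse grid, the kind of
    aggregate 'where did users look' record described above. Cell size
    and samples are illustrative assumptions."""
    heat = Counter()
    for x, y in gaze_samples:
        heat[(int(x / cell), int(y / cell))] += 1
    return heat

# Three fixations near screen center, one up in a corner.
samples = [(0.52, 0.48), (0.53, 0.47), (0.51, 0.49), (0.91, 0.12)]
heat = accumulate_heatmap(samples)
print(heat.most_common(1))  # the (5, 4) center cell holds three of four fixations
```

Under a consent-and-transparency policy like the one Srivatsa describes, a user would have to opt in before a record like this could be stored and be told what it's used for.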
Srivatsa says Tobii expects companies using its tech to do so responsibly and ethically. Enforcing that will be the bigger challenge. Srivatsa sees transparency about data use as the answer. I ask about secure local storage of data, but that isn’t required as part of how headsets use Tobii’s tech.
“I think there are realms of government regulation around data privacy and protection, and I think that’s sort of not where we are playing,” Srivatsa says. “We are really in the realm of data transparency. What we’re talking about is actually a little bit outside of the purview of where the governments are legislating today. We would expect, of course, that the governments in general would have some kind of requirements around biometric data that’s stored, however you collect it. What we want to be very clear about is, we know this is a new type of data that’s being collected. And that if you intend to store it, we want to make sure that users feel like there’s a huge amount of transparency and openness to include the user and the decision to use the data, or to use the technology and to understand what that data is being used for.”
That sounds like companies will be free to make their own decisions on how to use eye-tracking tech. It also means that keeping an eye, so to speak, on how Meta and others incorporate the data will be a big part of how the next wave of VR and AR plays out. While Meta has already begun shifting its language on data privacy to prepare for future AR glasses, we don’t know whether those policies will change or be upheld. But it’s something that will likely be in play with your next VR headset.