A ton of companies are racing to get people interacting with each other using smart glasses. Right now, though, few glasses even exist that can work with the phones in our pockets. Qualcomm's latest initiative, called Snapdragon Spaces, seeks to act as the glue between phones and future smart glasses, with a focus on supporting open web platforms.
The catch is that, for now, it'll only work with certain premium Android phones with Qualcomm chips, and it won't work with iPhones at all.
Qualcomm's chips already live inside most of the AR and VR headsets in existence. The company's been trying to bridge its VR and AR devices with phones for a while, and the interest in making that work may be mounting now that more companies like Google are showing renewed interest in the space. Other AR/VR players are focusing on shared virtual worlds (call it the metaverse, or whatever you'd like) for headsets and phones, and Qualcomm's move looks entirely focused on making that headset-to-phone connection finally happen.
Snapdragon Spaces isn't a chip, though. Hardware has been Qualcomm's strength in the AR/VR landscape; Spaces is software that Qualcomm sees as necessary to make glasses work with more phone apps. The platform downloads to compatible premium Snapdragon-based phones (although Qualcomm won't yet specify which chips will be supported) and allows plug-in glasses to connect, while also allowing apps on existing app stores like Google Play to hook in and become glasses-compatible. The glasses become an extension of a phone, using the phone's processing power and cellular connection to drive experiences.
Lenovo (and Motorola), Oppo and Xiaomi are the first hardware partners expected to have compatible phone-connected AR glasses using Snapdragon Spaces next year (Lenovo's glasses will work with the platform), while Deutsche Telekom (along with T-Mobile) and NTT Docomo are also partnered to test features on their networks when the glasses arrive.
Qualcomm also announced the acquisition of hand-tracking technology company Clay AIR. Along with its acquisition of object-recognition company Wikitude, the aim for Qualcomm is to lock up more of the tools needed to make a complete set of AR glasses work. The developer platform will work with Unreal Engine 4 and Unity, and will also be OpenXR compatible.
A lot of the promised features of AR glasses and apps using Qualcomm's software sound like what we've seen in AR over the last few years: pop-up virtual objects tagged to real-life objects or images; recognition of floors, desks and other surfaces; meshing of spaces to get a full 3D map of a room; hand tracking; and even glimpses of ways to remotely control real-world things layered with virtual AR objects (driving an RC car in a room filled with virtual race track obstacles, for instance). Qualcomm's glasses will also be able to share location-based AR in the real world (via spatial anchors) that other people could see at the same time with their glasses.
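At its core, a shared spatial anchor is just a pose (a position plus an orientation) pinned to a real-world spot, with a stable ID that lets other nearby devices resolve the same pose in their own view. As a rough illustration of that idea only — none of these class or field names come from the Snapdragon Spaces SDK — a minimal sketch of serializing and sharing such an anchor might look like:

```python
import json
import uuid
from dataclasses import dataclass, asdict


@dataclass
class SpatialAnchor:
    """Hypothetical shared AR anchor: a pose pinned to a real-world spot.

    All names here are illustrative assumptions, not the Spaces API.
    """
    anchor_id: str    # stable ID so other devices can resolve the same anchor
    position: tuple   # (x, y, z) in meters, relative to a shared world origin
    rotation: tuple   # orientation as a quaternion (x, y, z, w)

    def to_payload(self) -> str:
        """Serialize the anchor for sharing over the network."""
        return json.dumps(asdict(self))

    @staticmethod
    def from_payload(payload: str) -> "SpatialAnchor":
        """Reconstruct an anchor another device shared with us."""
        d = json.loads(payload)
        return SpatialAnchor(d["anchor_id"],
                             tuple(d["position"]),
                             tuple(d["rotation"]))


# Device A places a virtual object 2 m in front of it and shares the anchor.
anchor = SpatialAnchor(str(uuid.uuid4()), (0.0, 0.0, -2.0), (0.0, 0.0, 0.0, 1.0))
payload = anchor.to_payload()

# Device B receives the payload and resolves the same pose in its own view.
resolved = SpatialAnchor.from_payload(payload)
print(resolved == anchor)  # True: both devices agree on where the object sits
```

Real systems do far more (visual relocalization against a map of the room, drift correction), but the round trip above is the essential contract: two sets of glasses agreeing on one pose.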
Qualcomm's testing the software out with a few early partners ahead of launch, and Niantic (makers of Pokemon Go and other AR games) will be using Qualcomm's platform for new AR experiences via its own software development platform, called Lightship.
“We share a common vision for creating experiences that fuse the digital and real worlds,” said John Hanke, Founder and CEO of Niantic, in a quote provided by Qualcomm. “What Snapdragon Spaces will achieve for people with AR glasses indoors complements our goal for developers to build planet-scale AR applications on multiple devices and form factors.”
Niantic's already been trying to get its AR phone experiences onto other devices, and is already working on a future smart-glass design. Snapdragon Spaces looks like exactly the sort of phone-to-glasses software that phones need. But with neither Google nor Apple having made major OS-level steps yet to enable glasses to easily work with phones, Qualcomm is making the first moves. That could make 2022 a lot more interesting on this front, fast.