To most people, AR is still something quick and cool you can do for a few moments on your phone. To a company like Snap, which focuses on quick moments of social interaction, that’s been perfect. But Snap’s latest developer tools for augmented reality point to a future where people spend a lot more time in persistent virtual spaces. Will Snap succeed in a landscape where lots of other companies are chasing the same goal?
Snap already has a developer-focused pair of AR glasses it released last year. This year, the company announced a little hovering selfie drone that will also have a few AR effects, according to Qi Pan, Snap’s director of computer vision engineering.
While Snap has been experimenting with shared lenses that work between phones and glasses, the latest push is to allow for larger-scale experiences that could span entire cities. Snap’s moving away from purely downloadable lenses to ones that lean on cloud storage, unlocking a potentially unlimited amount of content in a particular lens. Snap’s starting its larger-scale experiments with London’s Zone 1, which has been scanned with 360-degree cameras to enable AR experiences to happen anywhere within the city grid.
Snap’s more granular landmark-based AR, which is activated by scanning local QR codes at landmarks or public destinations, works in a similar way. But the larger-scope, cloud-based approach means these lenses could always be available anywhere in a city grid, discoverable when you open Snap’s app or living inside third-party apps that also run Snap lenses (this is happening already with companies like Disney, for simple photo-filter lens effects). A collaboration with Lego allows multiperson brick-building projects in a persistent type of AR lens, pointing to where Snap could explore more creative, collaborative experiences.
The difference between Snap’s existing developer-created mini app-style lenses (which the company says have grown from 3.5 trillion total views around December 2022 to 5 trillion now) and what’s coming next sounds potentially profound. Snap’s looking to allow persistent worlds, almost like channels of reality. It’s similar in spirit to what Niantic’s been pursuing through its world-spanning location-based AR technology. Companies like Apple, Google and Microsoft have already explored location-based AR, and Meta’s working on AR-enabled experiences of its own as well.
“You can basically just drag and drop content into these locations, or you can programmatically create content, have Spider-Man jumping off roofs, land dragons on certain buildings,” Pan says of Snap’s city-scale AR grids.
Right now the detail level of large-scale maps isn’t always perfect, but Pan sees faster mobile speeds as improving future experiences. “As bandwidth gets higher and higher, you can start expanding the horizon of the models that you get—in the future, if you stood 2 kilometers away from like the Empire State Building, you’d still be able to get a high quality version of the mesh for the Empire State Building, even from super far away. Whereas with today’s bandwidth, probably not yet.”
The expanded and deepened lenses are part of Snap’s mission to try to create a reason for always-on AR glasses. Living maps of information, delivered through these lenses, seem like one clear solution, and an area where Snap imagines a lot of evolution. “In the future, if there are literally billions of people wearing AR glasses with cameras pointed to the world, you would be able to update maps within seconds,” says Pan.
Snap’s newly announced selfie drone isn’t entirely an augmented reality device, but it will allow some selfie content to be enhanced with effects, much like Snap’s lenses do with faces. That’s one thing AR glasses can’t do, and it points to Snap’s other AR goal: finding ways to capture people and their environment.
“If you look at what people are doing on phones today, a lot of phone AR usage is selfie,” says Pan. “With glasses, you can’t replace the selfie camera, because you can’t see yourself. Either people still need a device like a rectangle with a camera on it, or you have other cameras in the world around you. Pixy is a great example of something like that. Because it’s flying around autonomously and giving a different vantage point, in the future, for things like being able to map cities, it could be an interesting thing to help provide data as well.”