Peridot Beyond

Will AR glasses eventually replace smart phones?

Snapchat Spectacles are still at the development stage, but they hint at a time when smart glasses will be as ubiquitous as phones. We speak to Niantic Spatial’s Alicia Berry about what the future might hold, for both users and brands

Whether you love them or hate them, it’s hard to imagine life without smart phones. They regularly get a bad rap these days, with particular anxiety circling round their use by children and teens, but the possibilities they offer in terms of communication, navigation, information and entertainment are unparalleled.

Naturally their ubiquity has also led to the constant question of what might come next to revolutionise our lives. For a while VR looked like the answer, but while headsets have finally entered the mainstream and become affordable for many, their isolating nature means they have broadly remained in the realm of gaming.

Following VR came mixed-reality headsets such as Apple Vision Pro and wearable tech such as Google Glass. Both arrived with much fanfare but faded fast, due to cost, usability or other concerns. Nothing could come close to the convenience and accessibility of the phone format.

But perhaps we gave up on glasses a little too quickly. Significantly, Google announced a return to the ‘extended reality’ space at the end of last year with Android XR, due to be released later in 2025, while Meta, Microsoft and Snap are also in the race to develop a successful smart wearable product.

I had the opportunity to try out Snap’s offering, Snapchat Spectacles, recently. Being a cynical old hack, I approached the trial with a degree of skepticism but found myself surprisingly won over by the product. While heavy on the face, and currently offering only a small AR field of view, the glasses gave a glimpse of how a smart glasses future might play out, where our tech will slot even more seamlessly into our lives.

“I think you’re right that it is the first time we get a glimmer of life beyond the screen,” says Alicia Berry, who has previously worked at Meta on VR, AR and mixed-reality products and is now executive producer at Niantic Spatial. Niantic is currently developing apps and experiences for Snapchat Spectacles, including pet simulation game Peridot.

Perhaps the most significant element of Snap Spectacles is how they blend with the real world, allowing you to look at what is around you while also interacting with the apps and information in the glasses. “Unlike mixed reality, where you’re watching a video of the world, you’re actually seeing the world,” says Berry. “I think Snap was very bright when they decided to call their applications ‘lenses’.

“When we talk about the things that are in the market for mixed reality, you can run one app at a time or sometimes you can run [two apps] together but they can’t interact. So you could watch the YouTube video and do something else over here, but the thing you’re doing can’t react to the YouTube video, for example,” she continues.

“And all of a sudden Snap is giving us this opportunity to perhaps – and this is aspirational – walk around seeing the real world, seeing my friends who I am on Snapchat with, maybe with some lenses that we have co-created together, and also our Peridots playing together.

“I can see that world where it is just a series of layers on top of one another that run in real time and augment the world. Versus virtual reality, which is very much isolating, very much escape. This is something that can bring joy to something that you do every day. And the design paradigms are crazy.”

As Berry suggests, this vision of how the glasses will be used is still very much in development, though it is already possible to imagine how, instead of staring down at our phones, we will be accessing the same information (and possibly more) via glasses instead. One of the challenges for designers, however, is how UX will work in this new environment, which is fundamentally different to a phone screen.

The team at Niantic Spatial is learning through iteration, via apps such as Hello, Dot, a mixed reality experience that allows users to care for their Peridot pets. Berry and the team found that UX elements that might be familiar from phones and other screens – such as buttons – are a hindrance here. “We need to create a whole different language,” she says. “I feel like things like reading text buttons, they need to go away in this augmented reality world. Because it’s just less abstracted, it’s just your real world.”

Internal testing showed that different methods of interacting were much more successful in Hello, Dot. “You can grab an item out of a machine, you can pull a lever, you can pull a handle, but nothing like [a button].” Berry also advises taking inspiration from other areas where different ‘layers’ of information are already being used, such as navigation and mapping on the windscreens of cars. “I think that’s how we’re going to stumble into some elegant and really noteworthy UX changes,” she says.

For another AR experience, Peridot Beyond, the team at Niantic Spatial went back to basics. “We’ve removed every single UI element [and started] from scratch just because it doesn’t add to the experience,” says Berry. “So it’s really rethinking: how would a toddler interact with this? What would they want to grab? What would they want to push? It’s very action oriented … I think we’re so set in the typewriter mode. I’m excited about the day when we get past Qwerty … or whatever keyboard you’re using. It seems like those days are done. I don’t know what’s going to replace it but I feel like it’s going to come fast.”

Another significant development in the glasses space is the ability for two people to see the same AR experiences and interact with them together. This again is fairly rudimentary at this stage, and Berry stresses that really immersive experiences, where multiple users might play against one another and engage in quests together, are still “many years away”, but that more basic interactions, “like being able to pick up something and throw it, I think will come very quickly and be very ubiquitous”.

There is also an instinctive desire for this from users. “In my opinion solo hallucination is not that fun, but if multiple people can see what you’re seeing…? Every time people get into one of our experiences they ask, ‘Can you see my Peridot?’ Every single time, and I think it is human nature to want to share that, especially if they like it.”

To help designers to develop this new tech faster, Berry encourages teams to share their work with the public and get feedback. This is in part to make sure that users are comfortable with the developments too.

“We have this thing on our team, that you can’t invent the future too fast because if you don’t have one foot in the known, it will just be so foreign. In fact I was reading some of our reviews that we got from Hello, Dot and this one player was just getting so angry because they don’t understand. And I’m like … [that’s] poor design, we went too far, too fast.

“That’s why we release some things for free,” she continues. “So then we don’t have to spend a lot of money on user research. And I actually encourage teams to do this if it’s not going to hurt your brand. Try just dropping something in the market and seeing how people feel.”

So what role will advertisers and brands have in this space? Ads will inevitably form part of the AR glasses experience but Berry suggests they are best used as “a call to action” from brands as audiences pass shops or experience the brand in some way in the real world. “I think that’s going to be the next level of reaching customers – finding a way to do it, not in a spammy way, but in a helpful way. How do you bring those brands to customers?”

Again, she advises learning from some of the mistakes that were made with smart phones to avoid creating experiences that are annoying or distracting. “We’ve talked a lot about notifications,” she says, as an example, “and what is the strategy we leverage to make sure that they’re useful in a visual way.

“I don’t know if you felt this, but the early days of cellphone, everyone had their audio on and you would hear [notifications]. And now it’s very taboo to have any noise coming out of your cell phone. Maybe we’ll have a similar arc in glasses, but I feel like if we start with not being annoying, maybe we can invent something that works for people.”

nianticspatial.com; spectacles.com