Smart contact lenses could kick AR into the mainstream

Rendering of a woman using digital features in her contact lenses

Augmented reality (AR) has had a rough start to life, which is perhaps not a surprise for a technology launched into the public consciousness by Google with skydivers. Almost a decade ago, Google Glass was unveiled at the company’s I/O conference by people leaping out of a plane. Despite the flashy entrance, Google Glass stalled, held back by privacy concerns and a failure to meet expectations; rather than offering a snazzy sci-fi overlay, Glass was like trying to read Twitter from a TV on the other side of the room.

Other AR headsets have followed, from Microsoft’s HoloLens to Magic Leap. More recently, Meta partnered with Ray-Ban to launch an admittedly better-designed iteration of smart glasses. None, though, has found real mainstream success, although Glass and HoloLens are quietly proving useful in medical and enterprise settings.

Now, a new set of companies is hoping for a shot at the AR market. Mojo Vision announced earlier this year that the Mojo Lens was ready to go, awaiting only regulatory approval, while InWith unveiled its version at CES 2022.

The ideas are the same: cram contact lenses with miniaturised circuits so they can communicate with smartphones to display information. Mojo Lens is a hard contact lens, while InWith’s is a soft lens, the type most commonly worn for vision correction. With countless examples of AR devices failing to catch on, these companies are banking on smart contact lenses being the technology that sees AR winning mainstream acceptance.

How close are we to workable smart lenses?

Michael Hayes, co-founder and CEO of InWith, has been working on building flexible electronics into soft contact lenses for just shy of a decade, developing proof of concepts for lens manufacturers. The technology embeds miniature processors, communications chips, antennae, sensors, chargers, energy storage and more into a tiny lens to display images and text to the user’s eye.

The product will hopefully arrive within the next year, Hayes says. “What we’re working on right now is regulatory approval of the first version of the lenses. That’s the priority right now, and then we go to market.”

Although there’s some buzz around smart lenses, people must understand the technology’s limitations, warns Oscar Mendez, lecturer in robotics and artificial intelligence (AI) at the University of Surrey. Notably, at least initially, the field of view will be very small. “Consumers need to be aware that it’s not going to be a hugely immersive experience where even your periphery is looking at things,” he says. “It’s going to be something more targeted.”

What are the limitations of smart lenses?

Hayes says the first versions of smart lens AR will be more of a heads-up display (HUD) than a fully immersive experience. “It’s not going to be like putting on an Oculus Rift right off the bat,” he says. “You’re not going to be immersed in a 3D game experience, you’re going to have information that alerts you to things in your environment, tied to smartphone alerts and cues that you don’t currently have in your field of vision without looking at your smartphone.”

Of course, hardware is only part of the problem. Mendez is also working on camera localisation. If the AR system’s camera “knows” where it is, then virtual items can be pinned or “parked” in a spot and stay there – if you leave a virtual note for yourself as a reminder when you return home, it needs to be there when you get back, for example. “It’s no longer just a heads-up display that goes where you go, it’s grounded in the real world,” he says. “And I think that’s one of the technologies we need to crack properly before we have AR in the way the public expects or hopes.”
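The world-anchoring Mendez describes can be illustrated with a toy calculation. This is a hypothetical sketch, not any vendor’s actual system: the virtual note is stored in fixed world coordinates, and each frame the camera’s estimated pose transforms it into camera coordinates, so the note stays put in the room however the wearer moves.

```python
import math

def world_to_camera(point_world, cam_pos, cam_yaw_deg):
    """Transform a world-frame point into the camera frame, for a camera
    at cam_pos rotated cam_yaw_deg about the vertical axis (a flat,
    2D simplification of a full 6-degree-of-freedom pose)."""
    dx = point_world[0] - cam_pos[0]
    dz = point_world[1] - cam_pos[1]
    yaw = math.radians(cam_yaw_deg)
    # Rotate the world-frame offset into the camera's orientation
    x_cam = math.cos(yaw) * dx + math.sin(yaw) * dz
    z_cam = -math.sin(yaw) * dx + math.cos(yaw) * dz
    return (x_cam, z_cam)

# A virtual note "pinned" at world position (2, 5), e.g. by the front door
note = (2.0, 5.0)

# Frame 1: wearer at the origin, facing straight ahead
print(world_to_camera(note, (0.0, 0.0), 0.0))

# Frame 2: wearer has walked forward and turned 30 degrees; the note's
# world position is unchanged, only its camera-frame position differs
print(world_to_camera(note, (1.0, 3.0), 30.0))
```

The hard part in practice is not this transform but estimating `cam_pos` and the rotation reliably from the camera feed itself, which is exactly the localisation problem Mendez is working on.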

Systems like this exist and are used in larger headsets such as the HoloLens, but achieving the same in a smaller package is a challenge, Mendez explains, particularly when figuring out the tradeoff between performance and what people will accept. “Finding that sweet spot between what the human will tolerate in terms of… something moving around a bit will be interesting,” he says. “That’s more human-computer interaction than it is computer vision.”

Futurist William Higham, meanwhile, argues the problem with Google Glass wasn’t the technology, but privacy. “The big problem with Google Glass was the video camera,” he says. “It created this kind of media storm. Take the video camera off, and I think Google Glass could have succeeded.”

Hayes stresses the first iterations of InWith’s smart lenses won’t have cameras. “We don’t plan on initially introducing a lens like that,” he says. “There will be versions of it where it examines the environment and helps people that have low vision be alerted to things and see better,” he clarifies. “But incognito filming? We have no plans to make that part of our lenses in the beginning.”

Can AR ever truly break into the mainstream?

If and when such lenses are approved by regulators, then one question remains: do people want AR? Higham believes so. “Look at what happened with Pokemon Go,” he says, pointing to the AR game that exploded in popularity in 2016. “That was the perfect example that the adoption potential of AR is enormous. It’s absolutely mainstream.

“I think AR vision products are absolutely the way forward because it’s not a solution looking for a problem – it actually potentially solves loads of problems,” Higham adds. “Using a map for directions on a phone can be problematic, whether you’re walking along or driving.”

Beyond gaming, AR has a wide range of potential uses, with Hayes suggesting the technology could be used for navigation, messaging and instructions. Mojo Lens has partnerships with fitness and sports companies, and it’s no stretch to see how runners, cyclists and others might appreciate real-time activity and health data without having to pull out a phone or glance at a smartwatch. “The only two places you can put information on the human body that is useful is somewhere around the hands and somewhere around the eyes, and, to me, the eyes are much better as it’s just right there,” Higham adds.

Mendez warns, however, that although AR might seem like the best way to present information, people often prefer simpler solutions. He describes a research project that displayed instructions for operating a coffee machine on smart glasses, where one of the main challenges was ensuring virtual arrows pointed to the correct button regardless of where the user pointed their head. Plenty of effort went into solving that tracking problem, but in the end, users preferred simply watching a YouTube video and figuring out which button was which for themselves. “Sometimes the really simple solutions are more enjoyable to the user,” Mendez says.

Hayes notes that AR isn’t the only use case. InWith’s lens can also be used to fine-tune vision, in particular for people with myopia or presbyopia, and could one day let us zoom our sight as we would a camera. “This vision correction is the future as well as AR applications,” he adds. “The whole ophthalmic industry is going to be disrupted.”

Will smart lenses end the smartphone?

Mojo’s marketing line is that smart contact lenses open up a future of invisible computing, letting users glance at information discreetly rather than hold up a phone to their face or sit at a laptop. While that may one day be possible for the consumption side of computing, it’s hard to imagine anyone throwing away their monitor and keyboard in order to create – be it coding or making a PowerPoint – using the limited display in a smart lens. In the short term, at least, smartphones aren’t likely to be wholly displaced by smart lenses.

“It’s not straightforward that we’re going to use these devices in the same way we use our phones,” says Mendez. “It’s not necessarily going to replace your phone… it’s more of an extension of your phone than a replacement.”

That said, smart contact lenses overcome the friction of Google Glass and other headsets, and raise the possibility of small steps towards AR, if not yet the full metaverse. Indeed, says Higham, smart lenses may bring ideas like the metaverse into the real world. “I was always a bit wary of these avatars and virtual worlds, so I think that the fact you can do it while still living your real life is great,” he explains. “With the metaverse, people have got to dial down the sci-fi of it and see it’s a tool that’s actually going to make real life easier rather than build some other world.”