Walking down any street, it's a familiar sight: people craning their necks to stare at their phones. But in the not-too-distant future, we may instead be staring at swirling digital information layered over the world before us, a mixture of the digital and the real, all thanks to augmented reality.
When I visited Mojo Vision's office in July, I tried out the features of its augmented-reality smart contact lens by holding it an inch in front of my eye, steering a cursor around the space in front of me by moving the lens. Since I couldn't wear the lens itself, I used a VR headset to test its eye-tracking technology and experimental applications, directing a small pointer simply by moving my eyes. I could read a series of words from a digital display as I moved my eyes, and I could also look around the room to see arrows pointing north and west, designed to help users navigate the outdoors.
To click on one of the apps arranged in a circle hovering in front of me, I simply looked at a small tab next to the app for an extra second. Numbers and text then appeared in my upper field of view, showing, for example, my cycling speed, the weather, or information about an upcoming trip. To close an app, I looked away from the information for a full second.
Technologists have speculated for years about what the next computing platform will be, a decade after mobile devices replaced desktop computers as our primary gateway to the internet. Meta CEO Mark Zuckerberg is placing his bets on the metaverse, a fully immersive virtual world entered via a headset.
But I think the biggest shift will be to augmented reality, in which glasses or contact lenses display information about the world around us so we can see the online world and the real world simultaneously. If there is one thing humans love to do (albeit badly, in many cases), it is multitask. Phones will become little servers coordinating all the different devices we'll increasingly wear on our bodies: earphones, watches and, soon, glasses, the latest piece of the unseen-computing puzzle.
The Mojo Vision lenses are a marvel of engineering and perhaps one of the most ambitious hardware projects in Silicon Valley today. The company had to develop its own chemicals and plastic compounds that would allow the eyeball to breathe through a lens covered with electronics. When I held the lens in my hand, it was noticeably thick, and large enough to extend beyond the iris to cover parts of the white of the eye.
“It’s inconvenient,” said David Hobbs, the startup’s senior director of product management, who has worn several prototypes.
The lens includes nine titanium batteries of the type typically found in defibrillators, and a flexible circuit narrower than a human hair carries all the power and data. A slightly convex mirror bounces light off a small reflector, mimicking the mechanics of a telescope, to magnify pixels just a couple of microns, about 0.002 millimeters, across. From a few meters away, this tiny screen looks like a twinkle of light. But when I looked through the lens up close, I could see a video of Baby Yoda, a picture as crisp and inviting as any I’ve seen on a screen.
I can imagine people watching TikTok videos this way one day, but Mojo Vision wants the lens to have practical uses. Steve Sinclair, senior vice president of product and marketing, said the information shown on your eye should come in very brief snippets, in and out fast. The company is still figuring out “how much information is too much information,” according to Sinclair, who previously worked on the Apple Inc. product team that developed the iPhone.
Currently, Mojo Vision is working on a lens for visually impaired people that superimposes glowing digital edges on objects to make them easier to see. It’s also testing different interfaces with companies that make running, skiing and golfing apps for phones, for a new kind of hands-free activity display. Barring regulatory stumbles, Sinclair says, consumers will be able to purchase a prescription Mojo lens in as little as five years. That may be an ambitious timeline, given that other augmented reality projects have been delayed or, like Google Glass, haven’t lived up to the hype.
Google parent Alphabet Inc. also tried, and failed, to introduce a smart contact lens for medical use (1). But in general, the big tech companies have been driving much of the development around virtual and augmented reality. Bloomberg News reports that Apple is working on lightweight augmented reality glasses that it plans to release later this decade; the company is also expected to launch a mixed-reality headset, which it showed to its board of directors in May, sometime next year. Meta currently dominates sales of virtual reality devices with its Quest 2 headset, but it is also racing to launch its first augmented reality glasses in 2024, according to an April report in The Verge.
Why is augmented reality taking longer? Because it fuses digital elements with physical objects in an ever-moving scene, a complex task that requires a lot of processing power. But our desire to keep at least one foot in the real world means we are likely to spend more time in augmented reality over the long run.
The big question is how to balance being present in real life with constantly seeing digital information. Today, it takes a few seconds to pull out a phone, launch an app and perform a task on its screen. In the future, we’ll be able to open an app just by looking at it for an extra second. That will raise all kinds of thorny issues about addiction and how we interact with the world around us.
Sinclair says the same question was asked of him years ago when he was working on the iPhone. “I can’t say how we at Mojo are going to mitigate that completely,” he said. “But the trend is moving in that direction, where people will have immediate access to information.”
Whether through contact lenses or glasses, the human eye will soon take in a world swimming in more digital information than ever before. Our brains will have a lot to get used to.
More from Bloomberg Opinion:
Sorry Zuckerberg, Metaverse won’t replace Zoom: Parmy Olson
The whole world can do with an early iPhone: Tim Culpan
Who needs government to go to Venus?: Adam Minter
(1) Google announced a partnership with Novartis in 2014 to develop a glucose-sensing smart contact lens. Four years later, Alphabet’s Life Sciences division, Verily, said it called off the project.
This column does not necessarily reflect the opinion of the editorial staff or Bloomberg LP and its owners.
Parmy Olson is a columnist for Bloomberg Opinion covering technology. She is a former reporter for The Wall Street Journal and Forbes, and the author of We Are Anonymous.
More stories like this are available at bloomberg.com/opinion