Triathlete Craig Alexander wears Oakley's Radar Pace smart glasses during the Intel keynote at CES on Jan. 5, 2016. Credit: James Niccolai
Smart glasses are coming to a face near you.
Naysayers in the tech press thoroughly stigmatized Google Glass during Google's experimental Explorer program. And when Google closed its R&D program and opened its product development effort, the tech press falsely claimed that Google had killed Glass -- and with it, the nascent smart glasses industry.
In fact, smart glasses are already gaining traction in vertical enterprise, medical and military circles.
More importantly, the mainstream, everyday use of smart glasses is a near certainty. It's just a matter of time. And tech.
At CES last week, the German lens company Carl Zeiss showed off the most promising technology yet for bringing smart glasses into the mainstream.
Carl Zeiss's clear vision about consumer smart glasses
Carl Zeiss Smart Optics, a startup funded entirely by Carl Zeiss, showed elegant new smart glasses at CES. They aren't products, but proof-of-concept prototypes. (The company is working with partners to use its technology in actual products.)
The genius of Carl Zeiss's solution is that it delivers images and words to the eye not with big, bulky hardware that looks dorky and conspicuous -- the feature that made Google Glass's prototype socially unacceptable -- but with a clear lens system housed in conventional-looking eyeglass lenses. Only subtle lines are visible in the lenses, comparable to the lines you can see on bifocals.
Carl Zeiss's smart glass technology is covered by more than 250 patents. The lenses are usable with different frames, which could be designed by the same companies that design eyeglasses and sunglasses.
It's unlikely that Carl Zeiss's technology will show an image as high in quality as the one delivered by Google's prism technology.
Google Glass's prototype worked like this: A dedicated hardware "boom" sat above the right eye, in front of the prescription or sunglass lens, if there was one (you could also use Glass without conventional lenses). A tiny projector beamed the screen image through a highly specialized chunk of glass -- first through what was essentially a one-way mirror fixed at a diagonal to the direction of the beam. The image then hit the far end of the Glass prism, a concave mirror that reflected it back toward the projector. The diagonal two-way mirror, however, redirected it into the wearer's eye.
So, following the trail of light: Google Glass beamed it perpendicular to the line of sight, from right to left; the light was then reflected back from left to right before a diagonal mirror turned it directly into the right eye.
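That two-bounce light path can be sketched with the standard mirror-reflection formula, d' = d - 2(d·n)n, where d is the beam direction and n the mirror's unit normal. This is a simplified 2-D model of my own (the article doesn't give optical specifics): the concave end mirror is treated as flat, and the diagonal mirror as a 45-degree surface.

```python
import math

def reflect(d, n):
    """Reflect direction vector d off a mirror with unit normal n: d' = d - 2(d.n)n."""
    dot = d[0] * n[0] + d[1] * n[1]
    return (d[0] - 2 * dot * n[0], d[1] - 2 * dot * n[1])

s = math.sqrt(0.5)  # component of a 45-degree mirror's unit normal

# Projector beams the image right-to-left, perpendicular to the line of sight.
d = (-1.0, 0.0)

# First pass: the diagonal one-way mirror transmits the beam unchanged.

# Far end of the prism: the concave mirror (modeled flat here) sends it back.
d = reflect(d, (1.0, 0.0))   # now traveling left-to-right

# Return pass: the diagonal mirror redirects the beam into the wearer's eye.
d = reflect(d, (s, s))       # now traveling along the line of sight, into the eye

print(d)
```

Running the sketch shows the beam ending up perpendicular to its original direction, pointed at the eye, matching the path described above.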