Many of us in the tech press say we've hit a lull in technology innovation after an amazing decade-long run of truly disruptive technologies, from cloud computing to social networking, from mobile devices to voice services. Steve "Woz" Wozniak is not so sure, but he does believe that innovations can't be scheduled or even predicted with any certainty. They take off only when many factors come together.
Wozniak, of course, knows a little something about innovation. The 63-year-old engineer is a co-founder of Apple and helped invent personal computing in the forms of the Apple II and Macintosh. He has been an adviser to and sounding board for Apple over much of its history, as well as a consultant to other tech companies. He founded a company called CL 9 in 1987 that produced the first programmable universal TV remote control. He has been an elementary school teacher. Now he is chief scientist at Fusion-io and speaks on technology and innovation throughout the world; this week, he's a featured speaker at the Apps World conference in San Francisco.
There's no question that companies are doggedly pursuing the next big thing in technology, whatever that may be. For example, "everyone is talking about wearable computing. There are about 30 companies that seem to be doing the same thing. But nothing seems to be pointing to the right way," Wozniak says. One reason is simple: "You tend to deal with the past," replicating what you know in a new form. Consider the notion of computing eyewear like Google Glass: "People have been marrying eyewear with TV inputs for 20 years."
What it takes for innovation to take root
What does it take for technology innovation to flower in the same way the PC did in the mid-1980s, the Internet in the early 2000s, smartphones in the late 2000s, and cloud computing, social networking, and tablet computing in the early 2010s? For one, "the enabling technology has to become cheap enough," Wozniak says.
For example, you can buy tiny projectors today, but they're useful only if you have a blank wall to project onto, and they're basically refinements of projectors we've used for decades. But imagine if a tiny projector, perhaps built into your smartphone, could project holographic images, à la the "Star Wars" movies. That would be a big shift: you could project anywhere you are, whether to show video or to conduct a virtual face-to-face meeting. "It's too expensive to do that today — we need to wait until it becomes affordable. But you can't predict when that happens." That's why companies like Apple, Google, Microsoft, and IBM have all sorts of research projects going on: to see whether enabling technologies can be made affordable, and to pounce on them when they are.