"There's a lot of concern that we are developing the race that will replace us," adds Enderle, fears that have been articulated by scientists and others, from tech entrepreneur Elon Musk to renowned physicist Stephen Hawking. "We could create something so smart that it could think that it would be better off without us," Enderle adds. "It would see that we're not always rational and fix the problem, either by migrating off-planet as many hope, or by wiping out the human race."
Not everyone agrees with the doomsday scenario. "I am a meliora conservative regarding computer technology's future," counters Byte magazine's Helmers. (Meliora is Latin for "ever-better," and it is the motto of Helmers' alma mater, the University of Rochester.) "Given another 40 years of creative engineering minds building on the vast past achievements of prior creative minds, our technology will be meliora to the nth degree."
Either way, "The CPU is only a small part of the problem these days; it's what we do with it that's the problem," adds Bob Frankston, who co-invented VisiCalc, the first PC "killer app," in 1978.
"You will have the equivalent of Watson in your wristwatch or embedded inside you — what will you then want to do?" wonders Jonathan Schmidt, one of the designers of the Datapoint 2200. Watson is the name of the IBM artificial intelligence entity famous for winning the TV quiz show "Jeopardy!" against two human champions in 2011.
Ted Nelson, who invented the term hypertext in the 1960s and whose still-unrealized Project Xanadu has many features in common with the later World Wide Web, pretty much rejects both the past and the future. "Advances? It has all turned to crap and imprisonment," he says. As for the next four decades, "More crap, worse imprisonment." (Nelson's Xanadu would give all users file access, including editing privileges. Users on today's Web can do only what a specific site lets them do.)
Some experts have predicted, or called for, specific advances. For example, Stan Mazor, who was involved in the 8008 as a chip designer at Intel, says machine vision may be the next frontier.
"When computers can see, we will have a large leap forward in compelling computer applications," says Mazor. "Although typical multiprocessors working on a single task saturate at around 16 CPUs, if a task can be partitioned, then we might see 100,000 CPUs on a chip. Vision's scene analysis might be one of those problems suitable for large-scale parallelism."
"Why can't we use the computing power we have available today to make computers communicate with humans more efficiently, without the need for programming languages, operating systems, etc?" asks Marcian E. "Ted" Hoff, who was Mazor's boss at Intel during the 8008 project. "There has been insufficient progress in natural language processing, a disappointment I hope will be remedied."