
The 8080 chip at 40: What's next for the mighty microprocessor?

Lamont Wood | Jan. 9, 2015
The 8080 came out in 1974 and was the basis of the MITS Altair 8800, for which two guys named Bill Gates and Paul Allen wrote BASIC, and millions of people began to realize that they, too, could have their very own personal computer.

Carl Helmers, who founded Byte magazine for the PC industry in 1975, adds, "With all our modern silicon technology, we are still only implementing specific realizations of universal Turing machines, building on the now nearly 70-year-old concept of the Von Neumann architecture."

Human-digital synthesis?

How we will interface with computers in the future is of more concern to most experts than is the nature of the computers themselves.

"The last four decades were about creating the technical environment, while the next four will be about merging the human and the digital domains, merging the decision-making of the human being with the number-crunching of a machine," says Rob Enderle, an industry analyst for the past three decades.

This merging will involve people learning how to perform direct brain control of machines, much as they now learn to play musical instruments, predicts Lee Felsenstein. He helped design the Sol-20 (one of the first 8080-based hobbyist machines) and the Osborne 1, the first mass-market portable computer.

"I learned to play the recorder and could make sounds without thinking about it — a normal process that takes a period of time," he notes. Learning a computer-brain interface will likewise be a highly interactive process starting in about middle school, using systems that are initially indistinguishable from toys, he adds.

"A synthesis of people and machines will come out of it, and the results will not be governed by the machines nor by the designers of the machines. Every person and his machine will turn out a little different, and we will have to put up with that — it won't be a Big Brother, one-size-fits-all environment," Felsenstein predicts.

"An effortless interface is the way to go," counters Aaron Goldberg, who heads Content 4 IT and has been following the technology industry as an analyst since 1977. "Ideally it would understand what you are thinking and require no training," considering the computational power that should be available, he adds.

"Interaction with these devices will be less tactile and more verbal," says Andrew Seybold, also a long-time industry analyst. "We will talk to them more and they will talk back more and make more sense. That's either a good thing or a scary thing."

The dark side

Some observers believe increasingly powerful computers could bring problems.

"In the next four decades the biggest issue is what happens when devices become smarter, more capable and more knowledgeable than we are," says Goldberg. "If you follow the curve we will clearly be subordinate to the technology. The results could be terrifying, or empowering. There may always be tension between the two. Much as it has been thrilling to live in this generation, the next should be really exciting — but the problems will also be much bigger."

 
