The 8080 chip at 40: What's next for the mighty microprocessor?

Lamont Wood | Jan. 9, 2015
It came out in 1974 and was the basis of the MITS Altair 8800, for which two guys named Bill Gates and Paul Allen wrote BASIC, and millions of people began to realize that they, too, could have their very own, personal, computer.

Now, some 40 years after the debut of the Intel 8080 microprocessor, the industry can point to direct descendants of the chip that are astronomically more powerful (see sidebar, below). So what's in store for the next four decades?

For those who were involved with, or watched, the birth of the 8080 and know about the resulting PC industry and today's digital environment, escalating hardware specs aren't the concern. These industry watchers are more concerned with the decisions that the computer industry, and humanity as a whole, will face in the coming decades.

The 8080's start

While at Intel, Italian immigrant Federico Faggin designed the 8080 as an enhancement of Intel's 8008 chip — the first eight-bit microprocessor, which had debuted two years earlier. The 8008, in turn, had been a single-chip emulation of the processor in the Datapoint 2200, a desktop computer introduced by the Computer Terminal Corp. of Texas in late 1970.

Chief among the Intel 8080's many improvements was its 40-pin package, up from 18 pins on the 8008. With so few pins, some of the 8008's I/O lines had to share connections, forcing designers to add several dozen support chips just to multiplex them and making the chip impractical for many uses, especially for hobbyists.

"The 8080 opened the market suggested by the 8008," says Faggin.

As for the future, he says he hopes to see development that doesn't resemble the past. "Today's computers are no different in concept from the ones used in the early 1950s, with a processor and memory and algorithms executed in sequence," Faggin laments, and he'd like to see that change.

He holds out some hope for the work done to mimic other processes, particularly those in biology. "The way information processing is done inside a living cell is completely different from conventional computing. In living cells it's done by non-linear dynamic systems whose complexity defies the imagination — billions of parts exhibiting near-chaotic behavior. But imagine the big win when we understand the process.

"Forty years from now we will have begun to crack the nut — it will take huge computers just to do the simulations of structures with that kind of dynamic behavior," Faggin says. "Meanwhile, progress in computation will continue using the strategies we have developed."

Nick Tredennick, who in the late 1970s was a designer of the Motorola 68000 processor, later used in the original Apple Macintosh, agrees. "The big advances I see coming in the next four decades would be our understanding of what I call bio-informatics, based on biological systems," he says. "We will start to understand and copy the solutions that nature has already evolved."

 
