The 8080 chip at 40

It came out in 1974 and was the basis of the MITS Altair 8800, for which two guys named Bill Gates and Paul Allen wrote BASIC, and millions of people began to realize that they, too, could have their very own, personal, computer.


Now, some 40 years after the debut of the Intel 8080 microprocessor, the industry can point to direct descendants of the chip that are astronomically more powerful (see sidebar, below). So what's in store for the next four decades?

For those who were involved in, or watched, the birth of the 8080 and saw the PC industry and today's digital environment grow out of it, escalating hardware specs aren't the main concern. What preoccupies these industry watchers are the decisions that the computer industry, and humanity as a whole, will face in the coming decades.

The 8080's start

While at Intel, Italian immigrant Federico Faggin designed the 8080 as an enhancement of Intel's 8008 chip -- the first eight-bit microprocessor, which had debuted two years earlier. The 8008, in turn, had been a single-chip emulation of the processor in the Datapoint 2200, a desktop computer introduced by Computer Terminal Corp. of Texas in late 1970.

Chief among the Intel 8080's many improvements was its 40-pin package, versus the 8008's 18 pins. With only 18 pins, address, data and control signals had to share lines, which forced designers to surround the 8008 with several dozen support chips to latch and demultiplex those lines, making the chip impractical for many uses, especially for hobbyists.
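To make the pin-count difference concrete, here is a rough sketch in Python. It is purely illustrative, with invented addresses and memory contents, and does not model real 8008 or 8080 bus timing; it simply contrasts a narrow package that sends an address over the same lines that later carry data, and so needs external latches, with a package that has dedicated address and data pins.

# Illustrative sketch only, not real 8008/8080 bus timing: it contrasts a
# narrow, time-multiplexed bus, where external latch chips must capture the
# address before the same pins can carry data, with a package that has
# dedicated address and data pins.

MEMORY = {0x2A50: 0x3E}  # one byte of pretend memory at a 14-bit address


def read_multiplexed(address: int) -> int:
    """8008-style: eight shared lines; the address goes out in two chunks
    that external latch chips must hold before data can come back."""
    bus_low = address & 0xFF            # cycle 1: low address byte on the bus
    bus_high = (address >> 8) & 0x3F    # cycle 2: high address bits on the bus
    latched = (bus_high << 8) | bus_low   # support chips rebuild the address
    return MEMORY[latched]              # cycle 3: the same lines carry the data


def read_dedicated(address: int) -> int:
    """8080-style: separate address and data pins, so no external latching
    is needed just to present a full address to memory."""
    return MEMORY[address]


assert read_multiplexed(0x2A50) == read_dedicated(0x2A50) == 0x3E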

"The 8080 opened the market suggested by the 8008," says Faggin.

As for the future, he says he hopes to see development that doesn't resemble the past. "Today's computers are no different in concept from the ones used in the early 1950s, with a processor and memory and algorithms executed in sequence," Faggin laments, and he'd like to see that change.

He holds out some hope for the work done to mimic other processes, particularly those in biology. "The way information processing is done inside a living cell is completely different from conventional computing. In living cells it's done by non-linear dynamic systems whose complexity defies the imagination -- billions of parts exhibiting near-chaotic behavior. But imagine the big win when we understand the process.

"Forty years from now we will have begun to crack the nut -- it will take huge computers just to do the simulations of structures with that kind of dynamic behavior," Faggin says. "Meanwhile, progress in computation will continue using the strategies we have developed."

Nick Tredennick, who in the late 1970s was one of the designers of the Motorola 68000 processor later used in the original Apple Macintosh, agrees. "The big advances I see coming in the next four decades would be our understanding of what I call bio-informatics, based on biological systems," he says. "We will start to understand and copy the solutions that nature has already evolved."

Carl Helmers, who founded Byte magazine in 1975 to cover the nascent PC industry, adds, "With all our modern silicon technology, we are still only implementing specific realizations of universal Turing machines, building on the now nearly 70-year-old concept of the von Neumann architecture."
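What Faggin and Helmers are pointing at is the stored-program, fetch-and-execute cycle that virtually every commercial computer still follows. The toy machine below, written in Python with an invented three-instruction set (it is not the 8080's), is a minimal sketch of the idea: one memory holds both instructions and data, and a processor steps through them strictly in sequence.

# A minimal sketch of the stored-program (von Neumann) idea: instructions and
# data live in one memory, and the processor fetches and executes them in
# sequence. The three-instruction machine is invented for illustration.

def run(memory: list[int]) -> int:
    acc = 0  # a single accumulator register
    pc = 0   # program counter: where the next instruction sits in memory
    while True:
        opcode, operand = memory[pc], memory[pc + 1]   # fetch
        pc += 2
        if opcode == 0x01:     # LOAD an immediate value into the accumulator
            acc = operand
        elif opcode == 0x02:   # ADD the value stored at a memory address
            acc += memory[operand]
        elif opcode == 0x03:   # HALT and return the accumulator
            return acc


# Program and data share the same memory: compute 5 plus the value in cell 8.
program = [0x01, 5,    # LOAD 5
           0x02, 8,    # ADD memory[8]
           0x03, 0,    # HALT
           0, 0, 37]   # cell 8 holds the data value 37
assert run(program) == 42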

Human-digital synthesis?

How we will interface with computers in the future is of more concern to most experts than is the nature of the computers themselves.

"The last four decades were about creating the technical environment, while the next four will be about merging the human and the digital domains, merging the decision-making of the human being with the number-crunching of a machine," says Rob Enderle, an industry analyst for the past three decades.

This merging will involve people learning how to perform direct brain control of machines, much as they now learn to play musical instruments, predicts Lee Felsenstein. He helped design the Sol-20 (one of the first 8080-based hobbyist machines) and the Osborne 1, the first mass-market portable computer.

"I learned to play the recorder and could make sounds without thinking about it -- a normal process that takes a period of time," he notes. Learning a computer-brain interface will likewise be a highly interactive process starting in about middle school, using systems that are initially indistinguishable from toys, he adds.

"A synthesis of people and machines will come out of it, and the results will not be governed by the machines nor by the designers of the machines. Every person and his machine will turn out a little different, and we will have to put up with that -- it won't be a Big Brother, one-size-fits-all environment," Felsenstein predicts.

"An effortless interface is the way to go," counters Aaron Goldberg, who heads Content 4 IT and has been following the technology industry as an analyst since 1977. "Ideally it would understand what you are thinking and require no training," considering the computational power that should be available, he adds.

"Interaction with these devices will be less tactile and more verbal," says Andrew Seybold, also a long-time industry analyst. "We will talk to them more and they will talk back more and make more sense. That's either a good thing or a scary thing."

