Microprocessors change the world

December 01, 2011

Before the microprocessor, it was absurd to consider adding a computer to a product; now, in general, only the quirky build anything electronic without embedded intelligence.

I have always wished that my computer would be as easy to use as my telephone. My wish has come true. I no longer know how to use my telephone.
--Bjarne Stroustrup


Everyone knows how Intel invented the computer on a chip in 1971, introducing the 4004 in an ad in a November issue of Electronic News. But everyone might be wrong.

TI filed for a patent for a "computing systems CPU" on August 31 of that same year. It was awarded in 1973 and eventually Intel had to pay licensing fees. It's not clear when they had a functioning version of the TMS1000, but at the time TI engineers thought little of the 4004, dismissing it as "just a calculator chip" since it had been targeted to Busicom's calculators. Ironically the HP-35 calculator later used a version of the TMS1000.

But the history is even murkier. The existence of the Colossus machine was secret for almost three decades after the war, so ENIAC was incorrectly credited with being the first useful electronic digital computer. A similar parallel haunts the first microprocessor.

Grumman had contracted with Garrett AiResearch to build a chipset for the F-14A's Central Air Data Computer. Parts were delivered in 1970, and not a few historians credit the six chips that make up the MP944 as the first microprocessor. But the chips were secret until they were declassified in 1998. Others argue that the multichip MP944 shouldn't get priority over the 4004, as the latter's entire CPU fit on a single piece of silicon.

In 1969 Four-Phase Systems built a 24-bit CPU from multiple AL1 chips, each handling an 8-bit hunk, not unlike a bit-slice processor. In a patent dispute a quarter century later, proof was presented that one could implement a complete 8-bit microprocessor using just one of these chips. The battle was settled out of court, which did not settle the issue of the first micro.

Then there's Pico Electronics in Glenrothes, Scotland, which partnered with General Instrument (whose processor products were later spun off into Microchip) to build a calculator chip called the PICO1. That part reputedly debuted in 1970 and had the CPU as well as ROM and RAM on a single chip.

Clearly the microprocessor was an idea whose time had come.

The Japanese company Busicom wanted Intel to produce a dozen chips that would power a new printing calculator, but Intel was a memory company. Ted Hoff realized that a design built around a general-purpose processor would consume gobs of RAM and ROM, exactly the parts Intel wanted to sell. Thus the 4004 was born.

It was a four-bit machine packing 2,300 transistors into a 16-pin package. Why 16 pins? Because that was the only package Intel could produce at the time. Today fabrication folk are wrestling with the 22-nanometer process node. The 4004 used 10,000-nm geometry. The chip itself cost about $1,100 in today's dollars, or about half a buck per transistor. CompUSA currently lists some netbooks for about $200, or around 10 microcents per transistor. And that's ignoring the keyboard, display, 250-GB hard disk, and all the other components and software that go with the netbook.
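The per-transistor arithmetic is worth spelling out. Here's a quick sketch in Python; the 4004 figures come from the paragraph above, while the netbook's transistor count of roughly two billion is my own assumption, chosen because it's the count the 10-microcent figure implies.

```python
# Cost per transistor, 1971 vs. 2011.
# The netbook transistor count is an assumption, not a vendor figure.
cost_4004 = 1100.0           # 4004 price in 2011 dollars
transistors_4004 = 2300

cost_netbook = 200.0         # CompUSA netbook price
transistors_netbook = 2e9    # assumed: ~2 billion transistors (the count implied by 10 microcents at $200)

print(f"4004:    ${cost_4004 / transistors_4004:.2f} per transistor")               # about half a buck
print(f"Netbook: {cost_netbook / transistors_netbook * 1e8:.0f} microcents per transistor")
```

A microcent is a hundred-millionth of a dollar, so the second line works out to about 10 microcents, a price drop of nearly five million to one in 40 years.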

Though Busicom did sell some 100,000 4004-powered calculators, the part's real legacy was the birth of the age of embedded systems and the dawn of a new era of electronic design. Before the microprocessor, it was absurd to consider adding a computer to a product; now, in general, only the quirky build anything electronic without embedded intelligence.

At first even Intel didn't understand the new age they had created. In 1952 Howard Aiken figured a half-dozen mainframes would be all the country needed, and in 1971 Intel's marketing people estimated total demand for embedded micros at 2,000 chips per year. Federico Faggin used one in the 4004's production tester, which was perhaps the first commercial embedded system. About the same time the company built the first EPROM, and it wasn't long before they slapped a microprocessor into the EPROM burners. It quickly became clear that these chips might have some use after all. Indeed, Ted Hoff had one of his engineers build a video game, Space War, using the four-bitter, though management felt it was a goofy application with no market.

In parallel with the 4004's development, Intel was working with Datapoint on a computer, and in early 1970, Ted Hoff and Stanley Mazor started work on what would become the 8008 processor.

1970 was not a good year for technology; as the Apollo program wound down, many engineers lost their jobs, some pumping gas to keep their families fed. (Before microprocessors automated the pumps, gas stations had legions of attendants who filled the tank and checked the oil. They even washed windows.) Datapoint was struggling, and eventually dropped Intel's design.

In April, 1972, just months after releasing the 4004, Intel announced the 8008. It had 3,500 transistors and cost $650 in 2011 dollars. This 18-pin part was also constrained by the packages the company knew how to build, so it multiplexed data and addresses over the same connections.

A typical development platform was an Intellec 8 (a general-purpose 8008-based computer) connected to a TTY. One would laboriously put a tiny bootloader into memory by toggling front-panel switches. That would suck in a better loader from the TTY's 10-character-per-second paper tape reader. Then read in the editor and start typing code. Punch a source tape, read in the assembler. That chewed through the source code in three passes before it spit out an object tape. Load the linker, again through the tape reader. Load the object tapes, and finally the linker punched a binary. It took us three days to assemble and link a program that netted 4KB of binary. Needless to say, debugging meant patching in binary instructions, with only a very occasional rebuild.
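To get a feel for why a single clean build ate days, consider the tape handling alone. The sketch below is a back-of-envelope estimate; the tape sizes are assumptions I've picked for illustration, not measured figures from the era.

```python
# Rough time spent just pulling paper tape through a 10 cps reader for one build.
# All tape sizes below are illustrative assumptions.
READER_CPS = 10                    # Teletype reader, about 10 characters per second

source_chars = 20_000              # assumed size of the source tape
assembler_passes = 3               # the assembler read the source three times
tool_chars = 30_000                # assumed: loader + editor + assembler + linker tapes
object_chars = 12_000              # assumed: object tapes plus the final binary, re-read by the linker

total = source_chars * assembler_passes + tool_chars + object_chars
print(f"{total / READER_CPS / 3600:.1f} hours of tape reading per clean build")
```

That's nearly three hours of reading before counting tape punching, rewinding, mis-reads, and the inevitable torn tape; three days for a 4KB program stops sounding so strange.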

The world had changed. Where I worked we had been building a huge instrument that had an embedded minicomputer. The 8008 version was a tenth the price, a tenth the size, and had a market hundreds of times bigger.

 

Intel's 8008 makes computing personal

It wasn't long before the personal computer came out. In 1973 at least four 8008-based computers targeted to hobbyists appeared: the MCM-70, the R2E Micral, the Scelbi-8H, and the Mark-8. The last was designed by Jon Titus, who tells me the prototype worked the first time he turned it on. The next year Radio-Electronics published an article about the Mark-8, and several hundred circuit boards were sold. People were hungry for computers.

"Hundreds of boards" means most of the planet's billions were still computer-free. I was struck by how much things have changed when the PC in my woodworking shop died this week. I bought a used Pentium box for $60. The seller had a garage with pallets stacked high with Dells, maybe more than all of the personal computers in the world in 1973. And why have a PC in a woodworking shop? Because we live in the country and radio stations are very weak. Instead I get stations' web broadcasts. So this story, which started with the invention of the radio several issues ago, circles back on itself. Today I use many billions of transistors to emulate a four-tube radio.

By the mid-70s the history of the microprocessor becomes a mad jumble of product introductions by dozens of companies. A couple are especially notable.

Intel's 8080 was a greatly improved version of the 8008. The part was immediately popular, but so were many similar processors from other vendors. The 8080, though, spawned the first really successful personal computer, the Altair 8800. This 1975 machine used a motherboard into which various cards were inserted. One held the processor and associated circuits; others could hold memory boards, communications boards, etc. It was offered in kit form for $1,800 in today's dollars; memory was optional, and 1KB of RAM was $700. MITS expected to sell 800 a year but was flooded with orders for 1,000 in the first month.

Computers are useless without software, and not much existed for that machine. A couple of kids from New England got a copy of the 8080's datasheet and wrote a simulator that ran on a PDP-10. Using that, they wrote and tested a Basic interpreter. One flew to MITS to demonstrate the code, which worked the very first time it was tried on real hardware. Bill Gates and Paul Allen later managed to sell a bit of software for other brands of PCs.

The 8080 required three different power supplies (+5, -5 and +12) as well as a two-phase clock. A startup named Zilog improved the 8080's instruction set considerably and went to a single-supply, single-clock design. Their Z80 hugely simplified the circuits needed to support a microprocessor and was used in a stunning number of embedded systems as well as personal machines, like Radio Shack's TRS-80. CP/M ran most of the Z80 machines and was the inspiration for the x86's DOS.

But processors were expensive. The 8080 debuted at $400 ($1,700 today) just for the chip.

Then MOS Technology introduced the 6501 at a strategic price of $20 (some sources say $25, but I remember buying one for twenty bucks). The response? Motorola sued, since the pinout was identical to their 6800. A new version with scrambled pins quickly followed, and the 6502 helped launch a little startup named Apple.

Other vendors were forced to lower their prices. The result: cheap chips meant lots of computers. Cut costs more and volumes explode.

Active elements

In this series of articles, I've portrayed the history of the electronics industry as a story of the growth in use of active elements. For decades no product had more than a few tubes. Because of radar, between 1935 and 1944 some electronic devices employed hundreds. Computers drove the numbers to the thousands. In the 1950s, SAGE had 55,000 per machine. Just six years later the Stretch squeezed in 170,000 of the new active element, the transistor. In the 1960s, ICs containing from a dozen to a few hundred transistors shrank electronic products.

We embedded folk whose families are fed by Moore's Law know what has happened: some micros today contain 3 billion transistors on a single die; memory parts are even denser. And on the low end, a simple 8-bitter costs tens of cents, not bad compared with the millions of dollars needed for a machine just a few decades ago. But how often do we stand back and think about the implications of this change?

Active elements have shrunk in length by about a factor of a million, but an IC is a two-dimensional structure so the effective shrink is more like a trillion.

The cost per GFLOP has fallen by a factor of about 10 billion since 1945.

It's claimed the iPad 2 has about the compute capability of the Cray-2, 1985's leading supercomputer. The Cray cost $35 million more than the iPad. Apple's product runs 10 hours on a charge; the Cray needed 150 kW and liquid Fluorinert cooling.

My best guess pegs an iPhone at 100 billion transistors. If we built one using the ENIAC's active element technology, the phone would be about the size of 170 Vertical Assembly Buildings (the largest single-story building in the world). That would certainly discourage texting while driving. Weight? 2,500 Nimitz-class aircraft carriers. And what a power hog! Figure over a terawatt, requiring all of the output of 500 Olkiluoto power plants (the largest nuclear plant in the world). An ENIAC-technology iPhone would run a cool $50 trillion, roughly the GDP of the entire world. And that's before AT&T's monthly data-plan charges.
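Those numbers sound pulled from thin air, but they fall out of simple scaling: one vacuum tube per transistor, then multiply up ENIAC's bulk, weight, and power draw. The sketch below redoes the estimate; the ENIAC figures (about 17,500 tubes, 150 kW, 27 tons, roughly 100 cubic meters of racks) and the comparison yardsticks are round numbers I'm assuming for illustration.

```python
# ENIAC-technology iPhone: scale ENIAC by (phone transistors / ENIAC tubes).
# All figures are rough, assumed values for a back-of-envelope estimate.
phone_transistors = 100e9

eniac_tubes  = 17_500       # vacuum tubes in ENIAC
eniac_power  = 150e3        # watts
eniac_weight = 27.0         # metric tons
eniac_volume = 100.0        # cubic meters of racks

vab_volume     = 3.66e6     # Vertical Assembly Building, cubic meters
carrier_weight = 100e3      # Nimitz-class displacement, metric tons
reactor_power  = 1.7e9      # watts, one large reactor

scale = phone_transistors / eniac_tubes          # about 5.7 million ENIACs
print(f"Volume: {scale * eniac_volume / vab_volume:,.0f} VABs")
print(f"Weight: {scale * eniac_weight / carrier_weight:,.0f} carriers")
print(f"Power:  {scale * eniac_power / 1e12:.2f} TW, "
      f"about {scale * eniac_power / reactor_power:,.0f} large reactors")
```

With these assumptions the script lands within a factor of two of the figures above, which is about all a back-of-envelope estimate can promise.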

Without the microprocessor there would be no Google. No Amazon. No Wikipedia, no web, even. To fly somewhere, you'd call a travel agent, on a dumb phone. The TSA would hand-search you… uh, they still do. Cars would get 20 MPG. No smart thermostats, no remote controls, no HDTV. Vinyl disks, not MP3s. Instead of an iPad, you'd have a pad of paper. CAT scans, MRIs, PET scanners, and most of modern medicine wouldn't exist.

Software engineering would be a minor profession practiced by a relative few.

Accelerating tech

Genus Homo appeared around 2 million years ago. Perhaps our first invention was the control of fire; barbequing started around 400,000 BCE. For almost all of those millennia Homo was a hunter-gatherer, until the appearance of agriculture 10,000 years ago. After another 4k laps around the sun some genius created the wheel, and early writing came around 3,000 BCE.

Though Gutenberg invented the printing press in the 15th century, most people were illiterate until the industrial revolution. As I described in part 1 of this series, that was the time when natural philosophers started investigating electricity.

In 1866, within the lifetime of my great-grandparents, it cost $100 to send a 10-word telegram through Cyrus Field's transatlantic cable. A nice middle-class house ran $1,000. 1866 was before the invention of the phonograph. The only entertainment in the average home was the music the family made themselves. One of my great-grandfathers died in the 1930s, just a year after he first got electricity to his farm.

My grandparents were born towards the close of the 19th century. They lived much of their lives in the pre-electronic era. When I probed for some family history, my late grandmother told me that, yes, growing up in Manhattan she actually knew someone, across town, who had a telephone. That phone was surely a crude device, connected through a manual patch panel at the exchange, using no amplifiers or other active components. It probably used the same carbon transmitter Edison invented in 1877.

My parents grew up with tube radios but no other electronics.

I was born before a transistorized computer had been built. In college all of the engineering students used slide rules exclusively, just as my dad had a generation earlier at MIT. We had limited access to a single mainframe. But my kids were all required to own a laptop as they entered college, and they have grown up never knowing a life without cell phones or any of the other marvels enabled by microprocessors that we so casually take for granted.

The history of electronics spans just a flicker of the human experience. In a century we've gone from products with a single tube to those with hundreds of billions of transistors. The future is inconceivable to us, but surely the astounding will be commonplace. As it is today.

Thanks to Stan Mazor and Jon Titus for their correspondence and background information.


Jack Ganssle (jack@ganssle.com) is a lecturer and consultant specializing in embedded systems development. He has been a columnist with Embedded Systems Design and Embedded.com for over 20 years.

Happy Birthday, 4004
Jack Ganssle's series in honor of the 40th anniversary of the 4004 microprocessor.

Part 1: The microprocessor at 40: The birth of electronics. The 4004 spawned the age of ubiquitous and cheap computing.

Part 2: From light bulbs to computers. From Patent 307,031 to a computer laden with 100,000 vacuum tubes, these milestones in the first 70 years of electronics made the MCU possible.

Part 3: The semiconductor revolution. In part 3 of Jack's series honoring the 40th anniversary of the microprocessor, the minis create a new niche: the embedded system.

Part 4: Microprocessors change the world. In part 4 of Jack's series honoring the 40th anniversary of the microprocessor, embedded systems are now everywhere.

 
