I, Cringely - The Survival of the Nerdiest with Robert X. Cringely

The Pulpit


Weekly Column

The $15 PC: Computing is About to Enter a Hardware Revolution That Will Change the World as We Know It

By Robert X. Cringely
bob@cringely.com

While we are all watching the stock market and scratching our heads, getting excited about the Super Bowl or the World Wrestling Federation, time and technology march on. But it is easy to pretend that they don't, because compared to a dropped pass or an illegal wrestling hold, changes in the fundamental science of how we do stuff come very slowly. But when change finally does come, it comes with a vengeance: everything we used to know has to be thrown out of our heads and we start all over. I believe we are on the threshold of just such a technical revolution, one that is going to make last week's arguments seem puny by comparison. And it will probably put last week's technology leaders completely out of business.

But technology, having often waited so long to move forward, even then moves fitfully, and sometimes even appears to move backwards. That's why it is so easy to scoff at an advance, seeing it as a retreat, because the new stuff, as new stuff is wont to do, often doesn't work very well.

Say you were an English yeoman in the 15th century. You had your fine English longbow, made of yew wood, and you had your skills as an archer, honed over a lifetime of killing animals and enemies. Then your boss' boss up in the manor house said everyone had to trade his bow for a rifle. How did you feel? Sure, the rifle could blow a hole clean through your opponent, but it was not very accurate. Early firearms were so bad they mostly just made noise. The longbow was not only more accurate in the hands of a good archer, it had longer range. But in time, firearms met the accuracy of bows and exceeded them. A 15th century futurist would see that. The trick is in timing when to jump to the new way of doing things.

Jump too early and the new stuff not only isn't good enough to support what you want to do, but people laugh so long and hard that careers can be ruined. Jump too late and, well, it's just too late. You can always be one of those greeters at Wal-Mart. The trick is to jump early enough to be a pioneer, but late enough not to lose too much money. That kind of timing builds industries and fortunes. Henry Ford timed perfectly his jump into the auto industry, the founders of Sony embraced the transistor radio at just the right moment, and Marcel Bic fell in love with the ballpoint pen just in time to turn fountain pens into collectibles.

But this is a column primarily about computers. Most of the real progress in computers in the last quarter century has been made in software, but today I am predicting a hardware breakthrough. Our hardware advances in the same time period have been mainly incremental. My first hard disk held five megabytes and my most recent holds 36 gigabytes, but a historian 500 years from now would have a hard time seeing the difference between them. So too with silicon-on-insulator and Coppermine technology: They both help chips run faster and cooler, but a chip is still a chip. The planar process of using photolithography to etch circuits on a crystalline substrate is unchanged. And Intel's Gordon Moore thanks us for that, because the durability of the planar process made him a billionaire.

Sorry Gordon, but I think 40 years of making chips the same way is enough. I think we're about ready to get over the whole idea of these things we call chips. Just as the integrated circuit replaced discrete transistors, it is time to replace the IC with something even more highly integrated and cheaper to build. Of course, a lot of money has been lost on just this kind of technical advance. Gene and Roger Amdahl blew $200 million back in the 1980s on something called Wafer Scale Integration, where the idea was not to turn a silicon wafer into 300 microprocessors, but rather to etch on that one wafer all the circuits in an entire computer. Wafer Scale Integration failed because a few defects that might have forced Intel to throw away a few 80386 processors from the hundreds on a wafer doomed the whole wafer when the computer on it wouldn't boot.

It is ironic that 15 years later we are probably at the point where we could make Wafer Scale Integration work, but I think it is already obsolete. Instead, let's follow the lead of the printers and make a great leap forward.

From the time of Gutenberg until the mid-19th century, printing made the same sort of incremental technological improvements that we've been watching with the planar process. Paper improved, type improved, inks improved, paper handling certainly improved, but a letterpress was still a letterpress. Whether it was the 15th century or the 19th century, printers still took a cut sheet of paper, pressed it on a type form, set it out to dry, then reached for another sheet of paper. Mass production in early printing, if it existed at all, just involved doing the same things faster. Then came the Web presses of the late 19th century. Suddenly, paper came in rolls, not sheets. Instead of printing a few pages at a time, entire newspapers could be printed in a single operation, snaking their way through the giant presses and ending, cut, folded, and stacked, at the other end.

Web presses not only changed the way newspapers were printed, they changed everything about what we have come to call the media business. They brought reading to the masses and advertising to the masses, they led to the invention of the magazine, and ultimately to our consumer society. Sure, Gutenberg gave us the Bible, but Web presses gave us the Sears catalog and Dilbert.

The same revolution is about to take place in computing. A Menlo Park, California, company called Rolltronics is well along on perfecting the technology required to essentially print a whole computer on thin plastic.

Imagine making a notebook PC like a magazine. Into the giant press you load a roll of plastic film. Start the presses! As the plastic snakes its way through the press, the first "page" printed is the insulating back cover. Then come several pages of thin-film batteries. Then comes a page of interfaces (USB, video, modem, network), the microprocessor page, video page, audio page, and a bunch of memory pages (no hard disk). We're nearing the end. All that is left is the display and keyboard page, and the transparent protective final page, which includes touch screen sensors and flat panel speakers. The result, five minutes after the start of printing, is a notebook computer that is a quarter inch thick, completely silent, runs for days, and is shipped fully charged and ready to go. The first computer costs $500 million to make, but the second costs 15 bucks.
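The economics behind that punchline are simple amortization: spread the fixed tooling cost over enough units and the marginal cost dominates. A minimal sketch, using the column's own $500 million and $15 figures (the production volumes are hypothetical):

```python
# Amortized unit cost: fixed tooling cost spread over the production
# volume, plus the $15 marginal cost of actually printing each machine.
def unit_cost(volume, fixed=500_000_000, marginal=15):
    return fixed / volume + marginal

for n in (1_000_000, 10_000_000, 100_000_000):
    print(f"{n:>11,} units -> ${unit_cost(n):,.2f} each")
# At a million units each machine still carries $500 of tooling cost;
# at a hundred million, the tooling adds only $5 to the $15 print cost.
```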

We are five years from that $15 computer. Some will scoff, because like those other technologies I mentioned, this one will be slower before it is faster. On a Web press it is hard to perfectly align the colors, a problem printers call registration. You've seen newspapers that came out wrong and the color pictures looked funny. This limits the feature size possible on a printed computer. But remember that we have a lot of real estate to work with. Our microprocessor isn't some tiny silicon die — it's the size of a sheet of paper, maybe two.

This kind of thin film printing is already being used to make amorphous photovoltaic cells for energy production. The technology to make batteries and displays and processors and memories is well along at Rolltronics. And if for some reason those guys stumble, somebody else will make this happen. And just think what a happening it will be! Take two more orders of magnitude out of the cost of computing and the game is completely changed. Every student has a computer. Every country has a network. Hundreds of millions of cheap computers working together as massively parallel computers lead to scientific advances we can't even imagine. And the old-line companies like Intel and AMD, which are currently fighting over which is the superior obsolete semiconductor company, well, those outfits go out of business.
