I, Cringely - The Survival of the Nerdiest with Robert X. Cringely

The Pulpit


Weekly Column

Be Absolute for Death: Life After Moore's Law

By Robert X. Cringely
bob@cringely.com

If there was ever a creator of wealth on a fantastic scale, ever a changer of custom and social values, ever a determinant of where our culture is headed and why, it's Moore's Law. Gordon Moore, the co-founder of Intel, didn't so much invent his "law" as observe it: the apparently inexorable increase in the number of transistors that can be etched onto a silicon chip of a given size. Moore's Law is really a rule of thumb that says computers will either double in power at the same price or halve in cost for the same power every 18 months. It's an insidious effect, one that has increased silicon density by a billion times in the last 50 years, and in the process placed more computing power in my wristwatch than was used to win the Second World War. But now some people predict Moore's Law is in danger of being repealed.
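Just to give a sense of that compounding, here is a minimal sketch of the arithmetic. The 18-month doubling period comes from the rule of thumb above; the time spans are purely illustrative:

```python
# Compounding density under Moore's Law: one doubling every 18 months.
DOUBLING_PERIOD_YEARS = 1.5  # "every 18 months"

def density_multiplier(years: float) -> float:
    """Relative transistor density after `years` of Moore's Law growth."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

for years in (15, 30, 45):
    print(f"After {years} years: {density_multiplier(years):,.0f}x the density")
# After 15 years: 1,024x
# After 30 years: 1,048,576x
# After 45 years: 1,073,741,824x -- about a billion-fold
```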

We hear these arguments every few years as some writer realizes just how far we've come and decides that, for some reason, it can't go on much further. So far, they have always been wrong. The technical arguments have to do with the photolithography process used to make silicon chips, and with the fact that the lines etched by that process are becoming thinner than the wavelengths of light used to create them. Plot the curve of silicon density into the future and it just might hit a wall, we're told, around 2008 or 2010. Well, this certainly isn't Silicon Valley thinking at work, since the whole essence of that place is finding clever cheats around seemingly insurmountable problems like this one. I haven't any idea how Moore's Law will be granted a life extension, but I'm sure it will be, and I'm equally sure the extension will come at a cost.

But what if the naysayers are correct? What if Moore's Law is suddenly no longer in effect, or more correctly, is no longer in effect for those parts of society unwilling to pay $1,300 for a toilet seat or $900 for a hammer? I think it would be a wonderful thing.

Huh?

Maybe it is because this is the week of Comdex, the big computer trade show in Las Vegas where thousands of new products are being introduced, that I find myself thinking about how shortsighted we are in our addiction to all things new and techie. Moore's Law definitely has its downside, though few people want to admit it. Ever-growing computing power means ever-dropping prices, which turns Silicon Valley into a glider that descends at a constant rate and can climb only by finding an air current (gross sales) that is rising faster than individual prices are falling. In Silicon Valley, a good year is one in which sales jump 30 to 50 percent. Twenty percent growth is considered no growth at all. And flat sales, which we are presently enjoying, are a synonym for economic depression.
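To make the glider concrete, here is a rough sketch of the arithmetic. The 18-month price-halving follows the Moore's Law rule of thumb above; the unit-growth figures are assumptions chosen only for illustration:

```python
# Revenue = units sold x average price. If prices halve every 18 months,
# unit volume must grow enormously for revenue merely to stay level.
annual_price_factor = 0.5 ** (1 / 1.5)  # ~0.63: prices keep ~63% of their value each year

for unit_growth in (0.20, 0.50, 1.00):
    revenue_change = (1 + unit_growth) * annual_price_factor - 1
    print(f"Units up {unit_growth:.0%} -> revenue {revenue_change:+.0%}")
# Units up 20%  -> revenue -24%
# Units up 50%  -> revenue -6%
# Units up 100% -> revenue +26%
```

Selling half again as many boxes still leaves revenue roughly flat; that is the glider's constant rate of descent.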

What's happening is that we have a business that is nearly 30 years old, and we tell ourselves it is mature, expecting those fantastic growth rates to remain the norm from here on out. But the truth is that we are still in the early days of computing. Having survived the eras of data processing (batch computing with piles of punched cards) and interactive computing (character-based terminals connected to mainframes or minicomputers), we are now well into the era of personal computing (graphical interfaces and one box per user). That's not the future of computing, though. It just feels like the future because of our limited imaginations.

I am no smarter than you are, but my guess about the real future of computing follows the theme that the step after ubiquity is invisibility. Right now, almost everyone has a beige box on or under their desk, so we are reaching the stage of ubiquity, but are far from invisibility. The box is still there, we kick it under the desk, it makes noise, and when we leave the desk, we leave computing behind. That will change in time.

Drop by the MIT Media Lab, and they'll show you a future filled with wearable computers that go with us into the world. I'll bet that won't be the way things actually work, for a couple of reasons. For one, as my hero Larry Tesler once said, a sure way to ensure that a new technology never becomes commercially viable is to fund research on it at the MIT Media Lab. For another, the Media Lab and outfits like it are too dedicated to the idea of carrying computing with us into the world, which simply doesn't work: the economics aren't right. Turning your jacket into a computer is one way of ensuring that jackets cost $5,000, and how many people can afford that? Such jackets won't get cheaper over time, either, because once we set a price point it hardly ever changes; we just demand more power at the same price. But we also want fashion, and who is going to be happy with only one jacket?

Price points move very slowly. The price point for desktop computing stood at $3,000 for the first 15 years of personal computing, which took the U.S. to approximately 40 million PCs. In the last decade the price point has eroded to around $1,200, and we have close to 200 million computers as a result. We have ubiquity, but the jump to invisibility will require something else: a change not just in the way we package computers, but in how we use them. We have to make them not so much a part of our clothes as a part of our lives, and making that happen requires real price and feature stability; to wit, the end of Moore's Law.

In a sense, what I seek is the end of innovation so we will be allowed finally to truly innovate.

Such thinking makes me a heretic, a Luddite, a guy who would apparently turn his back on the very source of the stock options and human genomes and Game Boys that have come to define our modern life. But that's not my point at all. I just think there is much to be gained from slowing down.

For just one example, look at the role of computers in schools. A textbook has a useful life of 10 years, a desk lasts 25, but a personal computer is scrap in three years. Which of these things costs the most? No wonder we are unable to put computers to good use in schools: the economics simply don't work. But what if Moore's Law did fade from the land, and suddenly a PC could labor away for 25 years? Then every child would have a desk and a computer. There would simply be no argument about it. Perhaps the desk would BE the computer. And would technical innovation cease? No. Haiku is limited to 17 syllables, yet still there are poets.
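For what it's worth, the classroom arithmetic is easy to check. A minimal sketch, with the lifetimes (10, 25, and 3 years) taken from the paragraph above and the purchase prices assumed only for illustration:

```python
# Annualized cost = purchase price / years of useful life.
items = [
    ("textbook", 60, 10),         # (name, assumed price in $, life in years)
    ("desk", 250, 25),
    ("PC today", 1200, 3),
    ("PC, 25-yr life", 1200, 25),
]

for name, price, life in items:
    print(f"{name:15s} ${price / life:6.2f} per year of service")
# The three-year PC costs $400 a year, dwarfing the book and the desk.
# Let it labor away for 25 years and it falls to $48 a year.
```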

The best way to make computing invisible is to build it into our infrastructure. I don't mean our computing or networking infrastructure, but the real infrastructure: the roads and curbs and stoplights and buildings that make our world OUR WORLD. We long ago worked out that infrastructure begets cultural, social, and economic development, but only if that infrastructure can be amortized over a period of many years. The U.S. Interstate Highway System was the biggest public capital project in history, but in terms of return on investment, it was probably the biggest bargain, too. There is no way the Internet can compare.

We have become so used to the idea of throwing away our technical efforts every year or two that "long term" has been redefined into absurdity. If Moore's Law were gone, all that would change, and we might find ourselves building digital structures of enduring quality. We won't be wearing computers in our jackets; we'll be living in a world that has taken the true potential of computing into its heart. Rather than standing on a street corner asking my jacket for the weather forecast, I'll ask a street light or a parking meter. Anything that has power should become a networked device, and as a part of the public infrastructure, an entry point to the store of public knowledge.

Think about it. A parking meter as a networked device could be used by hundreds of people per day, which is a heck of a lot more cost-effective than wearing a computer suit. And it wouldn't even be that hard to do. But it won't work at all if we have to buy a new parking meter every 18 months. And that's why I am eager for the end of Moore's Law, for a time when we can stop building gizmos and start using digital technology in an enduring way.
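A hedged sketch of that cost-per-use claim: the "hundreds of people per day" and the $5,000 jacket come from earlier in the column, while the meter's cost and both lifetimes are assumptions for illustration.

```python
# Shared infrastructure amortizes over every user; a wearable serves one.
meter_cost, meter_life_years, users_per_day = 2_000, 25, 200
jacket_cost, jacket_life_years, wearers = 5_000, 3, 1

meter_uses = users_per_day * 365 * meter_life_years   # ~1.8 million uses
jacket_uses = wearers * 365 * jacket_life_years       # ~1,100 uses

print(f"Parking meter:   ${meter_cost / meter_uses:.4f} per use")    # ~$0.0011
print(f"Computer jacket: ${jacket_cost / jacket_uses:.2f} per use")  # ~$4.57
```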

I'm not saying we'll return to the age of building great cathedrals, but at least we'll have the right bricks and mortar should we decide to.
