Y2K: The Winter of Our Disconnect?


with Bob Cringely


It all began in the 1950s, when computers were huge, vacuum-tubed behemoths that filled entire rooms and employed dozens of technicians. In that prosperous era, "size was good." Rather than shrinking computers down to the desktop, computer specialists predicted that everyone would one day be connected to one of a handful of gigantic national computers.

Why only a handful? Because of the cost of building and maintaining these monstrosities. Computer memory was over a million times more expensive than the memory chips of today. A large computer card loaded with wires and tubes cost over $1,000 and stored only one kilobyte (1,024 bytes) of data. And those were 1950s dollars. A new car cost less.

So the programmers on these machines had to conserve as much space as possible. Everything was abbreviated. Unlike the feature-rich software we use today, applications were written in machine code and performed the bare minimum of operations. A popular space-saving trick was to abbreviate the data that had to be stored and processed.

One shrinkable piece of data was the year included in every date. Years change so slowly that they could be tracked with a smaller number, using only one or two digits, so 1972 might be stored simply as "72." Programmers weren't oblivious to the coming millennium change, but they felt comfortable knowing they could easily fix their code later; there were only a few mainframes around to fix, and the change was decades away.
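
To see why that shortcut eventually bites, here is a minimal sketch in modern Python (nothing like the machine code of the era, and the function names are invented purely for illustration). It stores only the last two digits of a year and shows how simple arithmetic on those digits goes wrong once dates cross into 2000:

    # Hypothetical illustration of two-digit year storage and its rollover problem.

    def store_year(full_year):
        # Keep just the last two digits to save space, e.g. 1972 -> 72.
        return full_year % 100

    def years_elapsed(start_yy, end_yy):
        # Naive subtraction works only while both dates fall in the same century.
        return end_yy - start_yy

    # Within the 1900s the shortcut is harmless:
    print(years_elapsed(store_year(1972), store_year(1999)))   # 27

    # But once a date crosses into 2000, "00" looks earlier than "99":
    print(years_elapsed(store_year(1999), store_year(2000)))   # -99, not 1

A program doing this kind of arithmetic has no way to tell 1900 from 2000, which is the whole Y2K problem in miniature.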

Then came the personal computer, and microchips started showing up in everything. The fix was no longer going to be simple.


 


Copyright 1999 Oregon Public Broadcasting and PBS Online