
Y2K: Without a Glitch

January 5, 2000 at 12:00 AM EST

TRANSCRIPT

GWEN IFILL: Well, the computers computed, the fireworks fired, and the Y2K bug was, apparently, debugged. So what was all that about? We get the views of Bruce McConnell, director of the International Y2K Cooperation Center, and former chief of information policy and technology at the U.S. Office of Management and Budget; Lou Marcoccio, research director at the Gartner Group, a business and technology consulting firm; David Gelernter, Professor of Computer Science at Yale University, and author of “Machine Beauty: Elegance and the Heart of Technology”; and Paul Strassmann, former chief information officer at the Pentagon and at Xerox Corporation.

He now runs the Information Economics Press, which publishes books on computer management. Mr. Marcoccio, the last time we gathered here, we were talking about dread and foreboding and all the things that could go wrong, racing to the ATM machines and stocking up on water. Three days into the new year, five days into the new year, nothing has happened. What happened?

LOU MARCOCCIO: Well, as far as public panic prior to January 1, we did see much less than we had anticipated here in the U.S., even though our surveys and polls indicated that there would be a lot more panic. I think some of the information that was put forth within the month, or within weeks or days, before January 1 made public awareness much greater and made people understand that this problem just wasn’t going to be as great, especially for key elements like infrastructure, power, banking and so forth.

GWEN IFILL: Mr. McConnell, of course there were two schools of thought on this. One had it that there was not really that big a problem at all; that’s why there wasn’t a big crisis. The other had it that extraordinary effort solved the problem before it happened.

BRUCE McCONNELL: I think a lot of work did get done. There were a lot of hard-working people who worked long hours to get it done, so there definitely was a problem. If you look at the kinds of glitches we’re seeing now, ranging from the failure of satellites to smaller problems here and there, software not functioning properly and heating systems not working, it’s very localized. And I think, in the words of the sheriff of a county in Ohio: geez, if we hadn’t done all that work, think how bad things would be. Now all he has to do is use keys to get into the cells instead of the electronic system.

GWEN IFILL: How do you gauge exactly whether it was the work that fixed it or whether there was any problem originally?

BRUCE McCONNELL: Well, I don’t think all these companies would have spent the kinds of money that they did, when they could have been making investments in more productive things, if there weren’t a real problem. And I think the kinds of glitches we’re seeing are showing just the tip of the iceberg of what could have happened.

GWEN IFILL: Mr. Strassmann, it seems in some ways that this whole issue boiled down to a series of glitches, minor glitches instead of a massive failure. Do you agree with that?

PAUL STRASSMANN: Well, I see that the issue is one of how much money was spent and how much panic was generated. I consider the Y2K experience, which was the first major crisis of the information age, as a managerial failure. It sets a very bad precedent for our ability to deal with future crises which are yet to come.

GWEN IFILL: Which managers failed?

PAUL STRASSMANN: Well, what really failed was the way each organization was forced to self-insure, all by itself, to protect itself at zero defect against all possible contingencies. The history of technology is one of society pooling risk in the form of insurance. We have had problems with technology since riverboats exploded, trains collided, automobiles wrecked and so forth.

Here is one example where the time-honored way in which society deals with risk was not followed. And there was no insurance. There were no standards. There was no testing. And everybody had to protect their own fortress on their own. We know from statistics that if you have to safeguard your house to be absolutely fireproof or earthquake-proof, you’ll end up spending an enormous amount of money, and this is exactly what happened. We overspent.

GWEN IFILL: Mr. Gelernter, you are a professor and expert in computer science at Yale. What does this episode tell us about our relationship with technology, our love-hate relationship, I guess?

DAVID GELERNTER: Well, I think everybody knew that this was going to be a non-problem; so much money and effort and time had been spent that by the time we got to the date, the story was a non-story. People were worried, and they were worried because they felt like being worried; they panicked because they wanted to be panicked. I think people are uneasy about computers for good reasons.

They rarely have an opportunity to express their uneasiness. And this gave them an opportunity. And I think there’s something much deeper here than a minor bug that was fixed.

GWEN IFILL: You’re saying it’s not politically correct to say that you’re scared by computers?

DAVID GELERNTER: Exactly. It isn’t so much being scared. It’s that people don’t like them. They don’t trust them. They don’t understand them. They think the software they’ve got is lousy. They think we’ve gone overboard in a lot of ways. One rarely has an opportunity to express that kind of thing nowadays.

GWEN IFILL: Well, Mr. McConnell, do you agree with that? Are we being held hostage to technology?

BRUCE McCONNELL: I think this did demonstrate how reliant we are on technology, the fact that this bug got into everything. It was in all kinds of infrastructures as well as in regular computers. But it also showed that we have a very resilient infrastructure and that people are able to come up with manual processes in many cases or that there are redundant back-ups to deal with things when computers do fail.

We have two countries where the customs system is not working right now, but they have gone to manual processes and goods are still getting into the country. So we’ll see that kind of thing all around. We’ll also see people coping and using the tried and true methods until they get their computers running again.

GWEN IFILL: But at what cost, Mr. McConnell, at what cost? $100 billion was spent in the United States alone to fix problems whose extent we’re still not sure of.

BRUCE McCONNELL: Well, the situation is that we fixed them, so you can’t see, you can’t quantify, what the cost would have been if we hadn’t fixed them.

But look, for example, at the failure of a couple of military satellites: they were fixed in a few hours and then finally fixed within a few days. If they had waited and not worked on them, they would still be fixing them for several months. The cost of that is hard to measure, but it was a significant threat to the capabilities of the countries involved.

GWEN IFILL: Mr. Marcoccio, I want you to respond to that, but also there’s something Mr. Strassmann pointed out, which is that we should have been able to insure ourselves against this risk instead of buying our way out of it. What do you think about that?

LOU MARCOCCIO: Well, first of all, we have quite a few Monday-morning quarterbacks who have been talking about what would have been and could have been and should have been, but we’ve watched companies throughout the world, small, medium and large corporations throughout the entire world, make tremendous progress on addressing this issue, making themselves compliant, their businesses, their systems and so forth, as we’ve reported.

On the 20th of December, in some of our press releases and so forth, we identified the fact that we would not see power failures, telephone failures and many of these different types of issues that we were all looking for on January 1. So I don’t think January 1 was a big surprise. I think that many glitches are still occurring as we go through this process within companies, but I think they’re being addressed more routinely because these companies have done a great job of preparing. Yes, it cost a lot of money, of course.

But if you talk to basically any corporation throughout the world, they will tell you that this money was very well spent, because they’ve had failures all along related to the year 2000. They know what the impact could have been. There’s not one company among the tens of thousands of companies we talk with that has indicated that, you know, this was money that wasn’t well spent and, you know, wasn’t critically important to do.

GWEN IFILL: What about that, Paul Strassmann? Were there hidden benefits in all of this preparation not only for these companies but also for the United States and global competitiveness?

PAUL STRASSMANN: I rate the missing two digits in a calendar as the most trivial example of errors in computer systems. Computer systems basically are faulty and fail all the time. The important issue is to really look at the lesson we have learned from having spent, I believe, much more than $100 billion, by the way, and that really has to do with the future of the information age.

We are, as a civilization, threatened by cyber terrorism and information warfare, and the risk is of society being damaged not by two missing zeros but by overt acts that will disable our ability to function. Now, we cannot possibly go into the future with the same kind of hysteria, panic, and just buying our way out of problems when we are going to be confronted with a much more serious onslaught on our ability to function.

So, what I would like to suggest is that the important thing to discuss is the implications and the lessons learned from the way we conducted ourselves in Y2K, rather than debating whether the people who spent the money feel that they did or didn’t spend it well. This is like asking contractors whether their prices are right.

GWEN IFILL: Well, there you go, Bruce McConnell. There’s the question. Who is accountable and how did we conduct ourselves? What do you think?

BRUCE McCONNELL: Well, I guess I agree with Paul that this is an example of a global technology problem, and that security is the next example of a global technology problem. But I disagree that we handled this in a way that’s not applicable to the security issue, because in fact here we’ve seen unprecedented cooperation between the government and the private sector and between governments all around the world. I mean, we’ve been working with 170 countries all around the world to address this, to share common solutions.

So although it wasn’t monetarily insured, there was a great deal of community effort, community insurance: people working with each other, passing on lessons learned and best practices, both public and private, and that I think does bode well for our ability to manage tough technology problems like security.

GWEN IFILL: David Gelernter, how about that, lessons learned in all of this, or was it just a total waste?

DAVID GELERNTER: It wasn’t a waste. We got the effect we wanted, which is that we weathered the so-called crisis without any major problems. I think it’s made clear to people how primitive software is. It’s a new science, it’s a new technology, it’s a new industry.

Building bug-free software is beyond us. We don’t know how to do it. Software fails all the time, every day. And this was an easy bug to fix. It was clear and simple and sharp and well defined. Most bugs are a lot harder than this.
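The flaw the panel keeps returning to is the two-digit year field. What follows is a minimal illustrative sketch, not part of the broadcast and with hypothetical function names, assuming a legacy record that stores only the last two digits of the year:

    # Date arithmetic on two-digit years breaks at the century rollover.
    def years_elapsed_two_digit(start_yy: int, end_yy: int) -> int:
        # Subtraction on two-digit years, as many legacy systems stored them.
        return end_yy - start_yy

    def years_elapsed_four_digit(start_yyyy: int, end_yyyy: int) -> int:
        # The same subtraction once the full four-digit year is kept.
        return end_yyyy - start_yyyy

    # An account opened in 1985, evaluated on January 1, 2000:
    print(years_elapsed_two_digit(85, 0))        # -85, a nonsense elapsed time
    print(years_elapsed_four_digit(1985, 2000))  # 15, the intended result

Remediation generally meant either widening stored years to four digits or "windowing" two-digit values into an assumed century range, which is part of why the fix, as noted above, was well defined.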

GWEN IFILL: The next bug everyone is talking about, of course, is the leap year bug. February 29, 2000, is something that the computers weren’t planning for. Do you think that should be taken seriously, Mr. Gelernter?

DAVID GELERNTER: Certainly. Any potential bug ought to be fixed. But I think this type of bug, a date or calendar that doesn’t do the right thing, is the simplest, the easiest thing to grasp. It’s the tip of an iceberg. Software is a lot harder than this, and the real problems we have with functionality go a lot deeper and are a lot harder to fix.
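The February 29 concern comes from a shortened version of the Gregorian leap-year rule. A minimal sketch, not part of the broadcast and with hypothetical function names, of how a program applying only "divisible by 4, except century years" mishandles the year 2000, which the divisible-by-400 exception makes a leap year:

    def is_leap_naive(year: int) -> bool:
        # Shortened rule: divisible by 4, except century years. Misses the 400-year exception.
        return year % 4 == 0 and year % 100 != 0

    def is_leap_correct(year: int) -> bool:
        # Full Gregorian rule: divisible by 4, except centuries, except every 400 years.
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    print(is_leap_naive(2000))    # False -- such a program would skip February 29, 2000
    print(is_leap_correct(2000))  # True  -- 2000 is in fact a leap year

A system using the shortened rule would reject February 29, 2000 as an invalid date or mis-sequence the days that follow it.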

GWEN IFILL: Lou Marcoccio, how much of our reaction over the past three or four years in preparing for Y2K had to do with preparing for legal eventualities as well as technological ones, that is, the liability that companies would be held to if indeed their systems went offline?

LOU MARCOCCIO: Well, I think when companies first got involved in remediation or compliance efforts, the legal threat was a big one. Individuals, managers, people in corporations and so forth were worried about being held personally responsible if they didn’t do what was defined as due diligence.

There was a lot of pressure put on early on, and I think that helped get people moving forward. I also think that the Y2K Act that was signed earlier this year had a lot to do with the reduction in people driving forward with litigation and so forth.

We’ve seen a lot less activity since that point in time. So I think it’s become a lot less of an issue during the latter half or latter three-quarters of this year but it certainly was an issue when companies got started.

GWEN IFILL: Bruce McConnell, final word. Should Americans be breathing a sigh of relief that they escaped a bullet here or bracing for the next onslaught?

BRUCE McCONNELL: I think we should all be very happy with what has happened. People will continue to encounter glitches over the weeks ahead. I think by mid-January or the third week we’ll have a better sense of how big those little glitches are, but I don’t expect them to amount to much. I’m personally not too worried about the 29th of February, but security continues to be an issue. People should be on the lookout for viruses and watch for Y2K glitches in their statements and things like that.

GWEN IFILL: And maybe keep that bottled water on hand, just in case. Thank you, gentlemen, very much.