The video for this story is not available, but you can still read the transcript below.

Dr. Bagian on Medical Errors

One of the medical organizations that has invested the most in technology aimed at reducing medical errors is the Veterans Administration. The following is an extended transcript of Dr. James Bagian, director of the National Center for Patient Safety at the VA, discussing the effort with health correspondent Susan Dentzer.

Read the Full Transcript

  • SUSAN DENTZER:

    Thanks so much for talking with us.

  • DR. BAGIAN:

    My pleasure.

  • SUSAN DENTZER:

    Let's start by talking about your former career as an astronaut. What did you do in space?

  • DR. BAGIAN:

    Well, as an astronaut, you do many things. In space, the two missions I flew on — one was where we deployed a satellite. We flew one of the first missions after the accident. We had originally been assigned to the one —

  • SUSAN DENTZER:

    This is the Challenger?

  • DR. BAGIAN:

    The Challenger — I was originally assigned to fly Challenger, on the accident crew, and then we swapped just a little before the mission. We flew and we deployed a satellite, as well as did a bunch of medical experiments and life-science experiments on orbit.

    The second one I flew was the first dedicated life-sciences Spacelab, which is sort of like a physiology lab in space to see how your heart works, how you regulate blood pressure, motion sickness. And on my first flight, I was the first one to use a treatment for motion sickness that has been the standard of care since, like, 1989. We used to have 75 percent of our folks get sick on their first flight.

    Now, with the motion sickness treatment, virtually all of them become asymptomatic, which was a good thing. And then I was one of the investigators on both the Challenger and Columbia accident investigations. And then I was a key part of designing the escape system for the Shuttle, the one that's there currently.

  • SUSAN DENTZER:

    So, how does a nice former astronaut like you get into health care and dealing with safety issues in the health care setting?

  • DR. BAGIAN:

    Well, my background — I started out as an engineer. And then I went to medical school and was a physician. And then I was called by Dr. Ken Kizer, who was the Undersecretary for Health, the person that really runs the hospital system in the VA. And he called me one day and he said, we're trying to do this thing in patient safety — and this is back in 1997, well in advance of the IOM report. And he said, we're making progress but, really, not the kind of progress I'd like. Would you chair an oversight committee to look at how we do it?

    And I said, well, why is it you want me to do it, Ken? I didn't really know Ken; I had only met him once. And he said, well, you know, you have an engineering background; you've done accident investigations; you understand safety from your time as an astronaut and working with the space program, and I'd like you to apply it here.

    So, I really was quite anxious to do it, because I had kind of chafed under the fact that, culturally, we were much different in aviation than we were in medicine. You know, certainly, the personalities of surgeons versus pilots or astronauts are almost identical. And yet the way we would approach problem-solving was much different.

    In medicine, we talk about professionalism and, you know, I'm diligent; I work hard; I try hard. If I try to be perfect and know everything, then everything will be okay.

    We feel the same way as a pilot or an astronaut, except we know I can never be perfect, and I have to design my system for those times when I'm not perfect, so the overall performance of the system is still okay.

    In medicine, we want the nurses to be perfect; the pharmacists to be perfect; the physicians to be perfect. But when they're not, in many cases, the wheels come off the cart. And that's a much different way of looking at it. So, it's more of a team thing: How do we work together as a system to deliver the good result we want? Rather than saying each person operates in their own little world, and as long as they operate in their own little world well, they'll be okay.

    Well, you know, nobody, no matter how good they are, can be perfect every day. So, to design a system — or, it's not even a system — to have a way we do business that doesn't recognize that means you're really condemning your patients to occasionally be harmed.

  • SUSAN DENTZER:

    Why do you think it is that health care took so long to figure this out, when aviation, basically aerospace, as you say, got onto this long ago?

  • DR. BAGIAN:

    Well, I think you have to look at it from an evolutionary standpoint. If you look at aviation back in the '30s and '40s, they were just where medicine is today. You know, an air mail pilot back in the '30s had a life expectancy on the job of three to four years. And it wasn't until the '50s that aviation really started looking and saying, we can't just keep building more planes when we crash them.

    It wasn't about losing pilots, it was about, we can't afford to keep building as many planes, because they're getting more complex and cost more.

    So, when they started to look at it methodically, from a systems approach, asking what can we do to make sure we don't crash the plane, then they started getting there. And you can see, like in the military, for example, the loss rate has gone down over 27-fold in the last 50 years. And that's even though we fly in a riskier environment today, because we fly faster, lower to the ground, in all weather, at night, which we didn't do before. Even in spite of that, the loss rate is still 27 times lower, which is huge.

    But it didn't happen all at once, and it came from understanding that we standardize. It's not like everybody has their own little way they want to fly the plane. We said there are certain ways to do it; that we use checklists for certain things; that, you know, you take away certain latitudes. And it's like what we're getting to in medicine now with clinical practice guidelines, where we say, okay, after a heart attack, there are certain medications most patients should be on.

    And it's not just like, well, I never used that before, I don't think I feel like using it. You know, it's like the science says this is what it is, we all should do it.

    Now, I think one of the problems we have in medicine is that in some hospitals we don't have a good electronic medical record; at the VA, we do. If you have a good electronic record, one that can remind the physician and say, hey, this patient has this diagnosis, you haven't prescribed this medication, is there a good reason you haven't?

    So you can't just happen to overlook it, because there are only so many things you can maintain in your mind. Most places are still on paper. Well, the chart doesn't talk to you. So if you overlook something, or this is the first time you're seeing a patient, or you can't even get the old chart, how do you even know?
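
    To make the idea concrete, here is a minimal sketch, in Python, of the kind of reminder rule he describes. The diagnosis and medication names and the guideline table are illustrative assumptions, not the VA's actual electronic record software.

    ```python
    # Illustrative sketch only: if a patient carries a diagnosis but lacks an
    # expected medication, prompt the clinician with a question, not a command.

    # Hypothetical guideline table: diagnosis -> medications most patients should be on.
    GUIDELINE_MEDS = {
        "myocardial_infarction": {"aspirin", "beta_blocker"},
    }

    def missing_guideline_meds(diagnoses, active_meds):
        """Return reminders for guideline medications not yet prescribed."""
        reminders = []
        for dx in diagnoses:
            for med in GUIDELINE_MEDS.get(dx, set()) - set(active_meds):
                reminders.append(
                    f"Patient has {dx} but no {med} ordered. Is there a good reason?"
                )
        return reminders

    # Example: a post-heart-attack patient with only aspirin on the med list.
    print(missing_guideline_meds(["myocardial_infarction"], ["aspirin"]))
    ```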

    So, it's really trying to make it a system that helps you be more effective and efficient. And I think medicine's starting to get there. But you have to get away from the rugged individualists that, I'm the captain of this ship and all is on my head.

    I mean, you want to be responsible, but you have to understand that we work together and that I am flawed, as everyone is flawed, and I'm going to design the system so that even when my flaws come to the fore, the patient still is not harmed. So, one thing I would say is it's not about errors. I think the IOM report really kind of put the emphasis the wrong way, quite frankly, back in '99, where they said it's about preventing errors. Because not everything that happens is viewed as an error.

    What you're really trying to stop is harming the patient. If you had Aladdin's lamp and you got to have a patient-safety wish, would you wish for no errors next year, or nobody hurt inadvertently? I think we'd all agree we don't want to hurt anybody. The patient doesn't care that there are individual errors. If it never translates to hurting them, that's all they care about. And we've seen examples where things were hiding in plain sight. Patients had been hurt in the past, or almost hurt, and nobody looked at it as an error. Therefore, nobody addressed it.

    And, yet, when we changed the lens through which we viewed this in the VA and said, you know what? It can be a close call. You know, the main thing is if you thought it could or did hurt a patient, we're going to look at it, suddenly things percolated up that had been there for years and nobody had addressed. Not just in the VA, but, anywhere. And then we fixed them.

    So, I mean, we're looking at it differently that way. It's not, let's find the guilty party. It's, let's find out where the flaws and vulnerabilities are in the system and then attack them in a methodologically sound way to fix them.

  • SUSAN DENTZER:

    And let's talk about some of the ways we do that. But, first, let me ask you, because this flows naturally out of what we've just been talking about: The cultural piece of this that emerges is really important, possibly even more important than whether you've got bar coding or anything else. Why?

  • DR. BAGIAN:

    Well, I think the culture's what it's all about. One of the things we did, when we first started, in fact, to decide how we would do this, was the first survey of patient-safety culture done in the world. We did this and we asked a lot of questions. But the one question that was most surprising was, we said, do you think patient safety is important to good patient care? That sounds like a no-brainer.

    But, in fact, we found — regardless of whether you were a nurse, physician or an administrator — that 27 percent said they strongly agreed and 24 percent said they strongly disagreed. So you had a quarter that said absolutely yes, and you had a quarter that said absolutely no.

    When we went back to follow that up to find out why, it wasn't that the quarter that said, no, didn't think it was important, but they didn't think it was important for them. It was somebody else's problem. It was that other doctor down the street. It was that hospital down the street. It was that other nursing floor, but not them.

    Well, how does safety occur? Each person makes things safe. So, if you don't think it's your problem, you're just ignoring it, right? You think somebody else is going to make it happen. And if three-quarters of your folks think somebody else is going to do it, that means it doesn't get done.

    So, we had to show people that it's not that you're a bad physician or a bad nurse or a bad pharmacist or what have you. It's the fact that bad things can happen; you have to understand what those are and go after them.

    And the culture's much different in medicine. Up until now — and I think it's changed quite a bit in the VA — what people in medicine thought is that it's always somebody else; it's not their problem. And we often say, like, in aviation: the person who doesn't think a bad thing can happen to them is the most unsafe person in the room. Everybody has to know, given the right set of circumstances, any of us can have a problem.

    And it's just either colossal ignorance or arrogance to think that it couldn't happen to you, given the right set of circumstances. And that's the mindset that has to change. And I think we're seeing that change in the VA right now, quite substantially.

  • SUSAN DENTZER:

    All right, let's talk now about how you, in effect, implement this on a system basis. And let's just start at the broad end of things, I guess; that's really where we want to look.

  • DR. BAGIAN:

    All right.

  • SUSAN DENTZER:

    We'll talk about your whole approach to sort of anticipating failures. Let's talk about that.

  • DR. BAGIAN:

    Well, the first thing: in order to be able to fix things, you have to know what's wrong. So, you have to have a system that allows people to bring things up. A reporting system, if you will, that identifies things. This isn't about counting; it's to say, where are the vulnerabilities? Where are the problems, the rough spots in the road?

    And that doesn't require having to have an accident first, or an adverse event. A close call can be enough: we almost operated on the wrong person, or we almost gave a patient the wrong medication. We don't have to wait until we give the patient a wrong medication. It's like when you drive a car. You know, a young child who was playing with a ball runs out from the side, and you jam on the brakes and, suddenly, your heart's beating fast — you drive your car differently the next time kids are playing. You didn't have to run over that kid. But, unfortunately, in most organizations, not just in medicine, when there's a close call, the person that had the close call doesn't tell everybody else they work with and share it. So, instead, we each learn by our own bitter experience.

    And, as they say, experience is the best teacher, but it's also the most expensive teacher, and who pays the tuition for us in health care? The patient. That's not the way to do business.

    So, what we do is we make it safe for the people, for the health care workers, to report, so they can bring up a problem, say this is a problem, and not be in fear that somebody's going to say, okay, let's blame you.

    Because I can tell you, you know, we have 158 hospitals and hundreds and hundreds of outpatient clinics, and people will think this thing that happened is unique to them, that it couldn't happen anywhere else. Well, I've got news for you: it happens everywhere. Not just in the VA, but outside.

    And, recognizing that, we can say, okay, what are the systems-level issues? Why was it that a nurse went to, you know, give the wrong medication? And then design systems that make it virtually impossible for that to happen. And you will see, you can go into the VA hospitals and they use bar codes for medication administration, where the patient has a bar code on their wristband, there's a bar code on the med, and it's linked by radio message to the computer.

    So, they scan it, and it tells the nurse, here are the drugs this patient may be getting right now. They pick one out, they scan it, and if it's not the right one, it tells them. So it's not like they just have to not misread it; it makes it virtually impossible to mis-medicate the patient. It's not just telling the nurse: be more careful. That's not the Nobel-Prize-winning approach, telling people to be more careful. I mean, they're already trying to be careful. So, it's understanding those kinds of things. And when people can report and not be punished, they do it.
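
    As a rough illustration of the bar-code check he describes, here is a minimal sketch in Python. The patient ID, drug names and order table are made-up assumptions, not the VA's actual bar-code medication software.

    ```python
    # Illustrative sketch: both the patient's wristband and the medication are
    # scanned, and the dose is allowed only if it matches an order due now.

    # Hypothetical active orders: patient ID -> set of (drug, dose) due right now.
    ORDERS_DUE_NOW = {
        "patient-1234": {("metoprolol", "25 mg"), ("aspirin", "81 mg")},
    }

    def verify_administration(scanned_wristband, scanned_med):
        """Return (ok, message) for a scanned patient/medication pair."""
        due = ORDERS_DUE_NOW.get(scanned_wristband, set())
        if scanned_med in due:
            return True, "Match: okay to administer and document."
        return False, "No matching order for this patient right now. Stop and check."

    print(verify_administration("patient-1234", ("metoprolol", "25 mg")))  # match
    print(verify_administration("patient-1234", ("warfarin", "5 mg")))     # blocked
    ```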

    Now, we make our safety system separate from administrative control. So supervisors still look at performance; if people are constantly having problems, you have to deal with that. But when we look at the safety side, which is separate, you don't use a safety investigation to discipline individuals. You know, the supervisors have to do their job; the administrators have to do their job. On the safety side, we look at what the systems issues are, so it has the greatest applicability to help our patients.

  • SUSAN DENTZER:

    What do you call this reporting system?

  • DR. BAGIAN:

    Well, ours is a reporting system that is our patient safety information system. But what happens is we actually have two. We have an internal system where reports come in, but we were concerned, and we said we want to have one that's outside the VA, so if people thought that they couldn't trust us — and it's a matter of trust — they could report somewhere else. For the outside one, we got NASA to do it for us. If NASA did it, they would only be able to tell us that it happened in some VA; they wouldn't tell us it was in this particular VA, which doesn't give us the laser precision to say where we need to act and learn, but it's better than not knowing at all.

    Now, if you look at aviation, that's what they did: they got tons of reports going outside the FAA into NASA. We did the same thing, and we found we get almost no reporting there, because people have such confidence that they'll be treated fairly reporting to us directly.

    We found, when we put in this new system that we designed for the VA and showed that you wouldn't be punished, you know, for a report (there are exceptions, but you won't be punished), our reporting went up 30-fold.

    We found our reporting of close-calls went up 1,000-fold and has stayed up ever since that. So, it's been over four years and it stayed there. It didn't just go up and then come down. It stayed there. So, the culture really has changed how they looked at it. And it's by making it safe.

    Now, we do not have a blame-free system. Sometimes you've heard people say that. I think that's misguided. We have a system that says what's blame-worthy. So, we say, if you commit an intentionally unsafe act, that's an act that was a criminal act; an act that involved substance or alcohol abuse by the care provider; or, an act that was purposely unsafe, that is, you knew it was going to be unsafe and you did it anyway, that doesn't go in the safety system. That goes in the administrative system, you know, because that way, if you needed to punish an individual or take administrative action, you can. But if it's not that, then we look at the system's issues, because that's where somebody was trying to do their job correctly and it didn't work out right.

    We don't punish people for trying to do a good job and a bad thing happening, not from the safety system side.

  • SUSAN DENTZER:

    Give me an example of a close-call report that you got that triggered the kinds of changes that you describe.

  • DR. BAGIAN:

    Oh, there are a myriad of them. We found, for example, a pacemaker — a device that makes the heart beat at the right rate and rhythm. It was an external one, not the ones they implant in your chest, but one they hook up temporarily while you're in the intensive care unit.

    They put it in a patient, and they went to activate it and it came up with an error message. This was like a big VCR remote control on steroids; it has a little display screen, and it said "error" and had a number after it. They tried to clear the error and they couldn't; they tried to clear it again and it wouldn't work. So, they said, well, let's turn it off and on, maybe that'll make it work, right?

    They turned it off and on, and it still didn't work. Well, this patient's not being paced, and they were about ready to arrest, go into cardiac arrest and die. At that point the nurse says, wait a minute, I have another pacemaker, let's hook that up. They hook it up, it worked. Well, in the old days, what they did was just say, well, let's send that broken one back to the bio-engineers to fix. This time, we asked, what happens when we send it back there? Because what would happen is they'd send it to the engineers; the engineers would look in the book, and it says, oh, take the battery out and put it right back in, and suddenly the error message goes away.

    And then they'd send it back to the floor, sometimes within hours. The nurses would say, wow, those engineers must be wizards, they just fixed it so readily. But they never understood what it was. Well, we went and talked to the company and to the fellow who actually was one of the designers, and he said, oh, yeah, that happens, we get calls about that all the time. But they should know that you take the battery out and put it back in.

    Well, understand, if you didn't have a spare pacemaker, this patient could very easily have died or stroked or, you know, been seriously hurt. And this is the most widely used pacemaker in the entire world. Well, because we understood this, we immediately placarded all our pacemakers to say: if error 004, take the battery out and put it back in.

    So, you could do that, like, in 10 seconds. That was — nobody got hurt at our place doing that. We learned that from a close call. That had been hiding in plain sight. And the reason was, nobody looked at that as an error. So, when you're saying report errors, they just said this is broken, it's not an error, it's just broken.

    When we said, instead, here's something that could hurt a patient, everybody understood that. This could have hurt a patient, it almost did. They looked at it and we fixed it.

    There are others. MRIs — we found this from close-call reports about magnetic resonance scanners. It's like an X-ray machine, but it's a very strong magnet. We had a patient rolled into an MRI scanner with sandbags stabilizing his arm that said MR-safe. Well, they weren't really MR-safe. They were MR-safe for the old magnetic resonance scanner, but the new one is more powerful. So this patient ended up pinned against the magnet with his arm held against it, and the magnets are always on; it's not like they just turn them on to take an image, they're on 24/7.

    He's pinned to the magnet, and it took four people to extricate him from the magnet. We went and looked, and we found this happens a lot. People just don't report it, because that kind of stuff happens all the time. Like, IV poles fly through the room. I can show you pictures of chairs stuck in the MR, IV pumps stuck in the MR; and we said, we have to do things differently.

    And we brought this up and, you know, I don't say we totally eliminated it, but we reduced the risk by making people aware of things they need to look for, and we put it on the Web for everybody to see. And that was about a year before that incident in New York where that young child was killed — the oxygen bottle, through the same causes, you know. So, that was a close call we learned from. We didn't hurt anybody in the VA doing this, but we learned from a close call.

    So, there are many; I mean, that's just some of them. We learned the same thing with correct-site surgery. We've seen, just from close calls where somebody almost had a bad thing go wrong and reported it, that we could look at the causes and, without having to have the wrong surgery, understand the causes.

    That's not to say we've never had incidents. We certainly have. But the close-calls are an additional way to learn and people are a lot more likely to be forthcoming when nobody was hurt than if somebody was hurt; they feel guilt; they may be in denial; I mean, there's all kinds of things going on. And close-calls are just a very good way to do it.

    The other advantage of looking at close calls is that close calls happen anywhere from 10 to 200 times more frequently than the event they're the precursor of. So, you can think of it this way: for every incorrect surgery that's done, there are anywhere from 10 to 200 that almost happened. Why not learn from those? Why do we have to amputate the wrong limb, or take out the wrong person's gallbladder, to learn? Couldn't we just learn when somebody gets in there and says, oh, wait a minute, this is the wrong patient? How did the wrong patient get in the room? And understand it at that level.

    So, it's sort of like, it's not no blood, no foul, which you'll hear people say sometimes. That's true for tort, if you don't hurt people, you don't have to pay. But if you're trying to learn, you want to say how did I almost have a problem and then let's fix it.

  • SUSAN DENTZER:

    One of the close-call causes, I guess or, actually, a cause of actual injury that you unearthed is misidentification of patients in surgery. Not just the wrong site, the wrong patient?

  • DR. BAGIAN:

    Right.

  • SUSAN DENTZER:

    Tell us how you found that?

  • DR. BAGIAN:

    Well, we collect a number of reports; over the last four or five years, we've had over 150,000 reports, because we tell people, no matter how insignificant it seems, report it. We prioritize them. We can't thoroughly investigate 150,000, but we have a technique that's now used, really, around the world, in many other countries and around the United States.

    It's a way to say how severe is it? How likely is it to occur? And, certainly, the more severe things or ones that could be more severe or more common, we look at first. But what we do is, we look at those, and then we say, okay, how do we learn from this.
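
    The scheme he outlines amounts to a severity-by-likelihood score used to rank incoming reports. Here is a small illustrative sketch in Python; the scales, weights and example reports are invented for illustration and are not the VA's actual scoring matrix.

    ```python
    # Illustrative prioritization sketch: score each report by how severe the
    # outcome is (or could be) and how likely it is to occur, then review the
    # highest-scoring reports first.

    SEVERITY = {"minor": 1, "moderate": 2, "major": 3, "catastrophic": 4}
    LIKELIHOOD = {"remote": 1, "uncommon": 2, "occasional": 3, "frequent": 4}

    def priority_score(severity, likelihood):
        """Higher score means investigate sooner."""
        return SEVERITY[severity] * LIKELIHOOD[likelihood]

    # Made-up example reports: (description, severity, likelihood).
    reports = [
        ("pacemaker error message", "catastrophic", "occasional"),
        ("mislabeled lab specimen", "moderate", "frequent"),
        ("delayed discharge paperwork", "minor", "frequent"),
    ]

    # Sort the close calls and events so the riskiest are reviewed first.
    for name, sev, lik in sorted(reports, key=lambda r: -priority_score(r[1], r[2])):
        print(priority_score(sev, lik), name)
    ```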

    So, with the surgery, we went and looked. And we said, what really causes these? And people were incorrectly saying, it's confusing left and right, that that was the big issue. It's an issue, but not the biggest issue —

  • SUSAN DENTZER:

    And by that we mean?

  • DR. BAGIAN:

    Looking at left versus right. We've all heard the stories where they were supposed to amputate the individual's left leg and they amputated the right leg. Well, people said, let's just mark left versus right, then we'll be okay. And we said, that's really maybe not what it is. We looked and we found out that 44 percent of incorrect surgeries — and that's what we call them; we don't call them wrong-sided, because that's not right — 44 percent were left/right foul-ups, and 36 percent were the wrong patient.

    And it could be because you and I were going to go into the OR and we're both going to have our knees scoped, you know, where they look through a little scope into your knee and maybe do an operation. And they confuse the order and they bring me in instead of you. And they already had in their mind, I'm going to do the left knee on the first patient, but they meant to do my right knee. They got us out of order and they didn't identify us correctly; they're just looking at, it's like, a knee. They do the wrong knee. But the reason they did the wrong knee was they thought I was you.

  • SUSAN DENTZER:

    And that happened 36 percent of the time?

  • DR. BAGIAN:

    Thirty-six percent of the cases were that. And we don't think it's any different in the VA than anywhere else, from the data we have available. So, we said, let's take a real systems approach and look at this. And there are five basic steps. I won't go through them all.

    But it's how you identify the patient, right? The only way that we think it's safe to identify the patient (as long as the patient can talk to you, which they can most of the time; there are exceptions) is that they should tell you their name. So, if somebody comes up to me to identify me, they'd say, hi, Jim, how are you doing? And, you know, can you tell me your name and your birthdate, please, or your Social Security number? And I would say, my name is Jim Bagian and my birthdate is, you know, whatever it is. That way, you don't confuse them. And people said, well, do you really need to do that?

    Well, we do, because there's many cases we can show you where somebody would walk up to me and say, hi, Jim, how are you, or they would say, hi, Tim, but I'm a little hard of hearing. And I go, I'm doing great. They think I'm Tim, but I'm not. You know, and sometimes people are hard of hearing and they're a little embarrassed or they say, I know the doc really knows who I am so I'm not going to correct him. I mean, I don't know how many people have children that have this problem, but I have four kids. And do I call one daughter the other's name occasionally? Sometimes, they'll correct me and sometimes they go, well, Dad, just had a slip of the lip, they don't bother correcting me. But, maybe I really did mean to talk to the other daughter.

    It's important to get that straight, especially if you're going in for a study or whatever. So, we said, you have to deliberately and in an active way identify the patient. That means make them be a part of it. Make them not just say, yes. Say their name. If they say their name and birthdate or so, the chance of misidentifying them almost goes to zero.

    We can show you a number of cases where they just had them say, yes or nod their head and they're off to the races to a bad event. So we do that. We mark all our sites.
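
    A minimal sketch, in Python, of the active identification rule he describes: the patient must state two identifiers that match the record before anything proceeds, rather than just nodding or saying yes. The name and birthdate below are placeholder values, not real patient data.

    ```python
    # Illustrative sketch: require two stated identifiers to match the chart.

    def identity_confirmed(record, stated_name, stated_birthdate):
        """True only if the patient's stated name and birthdate both match."""
        return (
            stated_name.strip().lower() == record["name"].lower()
            and stated_birthdate == record["birthdate"]
        )

    record = {"name": "Jim Bagian", "birthdate": "1950-01-01"}  # placeholder values

    print(identity_confirmed(record, "Jim Bagian", "1950-01-01"))  # True: proceed
    print(identity_confirmed(record, "Tim Bagian", "1950-01-01"))  # False: stop
    ```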

  • SUSAN DENTZER:

    As a layperson, one hears things like this and just shakes one's head and says how could this be? How could these kinds of errors happen?

  • DR. BAGIAN:

    Well, you know, I would say these bad events — I hate to call them errors. But the fact is, health care is a very complex set of tasks that have to be performed by a number of people. And when you're dealing with people, you're dealing with communication, and communication underlies about 82 percent of all the bad events that occur; communication is one of the root causes. And when you have that much communication, people don't do it just right, or, depending on their assumptions, they come in already thinking they know the answer and they hear what they want to hear. We can't stop that. I mean, we just can't. And that's just the nature of being human beings.

    So, how do we design our system to get around that? The fact is, these things happen; we have to acknowledge that they can happen and then design the system so the bad thing doesn't occur. So, make the patient say their name; don't just assume you recognize them. You know, they might not be wearing their glasses that morning, so they look different. Maybe they don't have their makeup on; maybe they have a wig and they don't have it on, so they look much different. You're going to be confused. If they say their name, you'll get it right.

    So we look at those kinds of things to try to design the system, as some people might say, to make it idiot-proof, so that even if all the things that could go wrong do go wrong, it still doesn't result in a bad event. And I think it's this acknowledgment, once again, that we're all fallible, given the right set of circumstances. So, let's design our system to recognize that we're all fallible, given the right set of circumstances, so that even when we do fall short of where we wish we were, the patient still gets a good result.

    And I think it's that acknowledgment of fallibility, that we're less than perfect, that is the key to a good culture of safety. And if you look at aviation or nuclear power, that's how they are. You look at medicine, and we haven't been that way. We weren't trained to be that way. But I think we're changing that. In fact, now that we've rolled this out in the VA, you know, we're not there yet, but I think we're way far along.

    We're now training residents, medical students, pharmacy students, nursing students, from the ground up. So, we don't teach just old dogs new tricks. Train them right from the ground up, so that they understand it's all right to question authority. It's all right to have the senior person there say, well, I think we should do this. That you can say, well, excuse me, sir, but I think it should be this, and you're not looked at like you're insolent or impudent, that, in fact, it's good for you to question and have skepticism. It doesn't mean you're right, but you should be able to voice it and have a discussion and not just be going, boy this doesn't look right to me, but I don't want to speak up.

    In fact, here's one of the things I think shows you how aviation differs from most other fields, like medicine or any other industry you'd like. In aviation, what you say is, if you're not sure it's safe, it's not safe. And if you think about that, if you're not sure it's safe, it's not safe, that means the first time you get an inkling, you speak up and say, this doesn't look right. You're not afraid of being the boy who cried wolf; that, you know, you might bring something up and it turns out it really wasn't a problem. People don't say, well, boy, Jim, why did you bring that up, you're an idiot? They go, boy, thanks for bringing it up, I'm glad we made sure that was okay.

    Whereas in medicine, people will laugh when I talk to physicians or nurses. I ask, is that how you work here? And they go, well, no. Because if you bring something up that turns out to be a false alarm, if you will, people will think less of you. They think you're an alarmist; they think you're not competent. So, you wait till you're sure. Do you think that's what the patient wants, for you to wait until you're sure bad things are happening? If you're really sure, things have probably gone terribly wrong already. It's too late. Sure, you can make your batting average as a clinician look great, because you're never wrong.

    The patient doesn't care if you have a little false alarm to begin with. I'd rather you be overly cautious than insist you're always going to be right, because if you're always right when there's a bad thing, that means you probably missed some bad things before they became bad.

  • SUSAN DENTZER:

    And one example of how you've institutionalized this approach is the time-out, pre-surgery.

  • DR. BAGIAN:

    Right.

  • SUSAN DENTZER:

    In a brief manner, describe to us what that is all about.

  • DR. BAGIAN:

    Well, what we've done with the time-out — you know, now it's used not just inside the VA, and many others had talked about doing this beforehand, like the American Academy of Orthopedics.

    But what we did was, we said, let's look at what's the last chance. You know, we've done a number of steps before that: how we do our consent forms, so that we have what the patient said in their own words; that we mark all sites, not just left/right, all sites; that we identify the patient, as I said, that they have to say their first and last name and their birthdate. And we also check, if they need certain X-rays available, that they're available before they start the surgery. But then there's the last chance before we start.

    Because many times, people start the surgery and then they look for the films and can't find them. And then they have additional anesthetic time or they don't even have the information they need to and they guess. That's not a good thing.

    So, we said, right before you start, we go around the room, between the anesthesia provider, the surgeon and the nurses, and you say, okay, this is, you know, John Jones; he's in here today to have his right knee operated on to do such and such. The films we need are in the room; are they there or not? Yes, they are. We check all those things: what position are they supposed to be in, on their left side or on their right, you know, all those things. And then, if that's all agreed on, they say, okay, let's begin. But if one of those things isn't right, we stop. We say stop immediately, let's go back and make sure we get all of those. If the films aren't there, we say, wait a minute, cool it. Let's wait and get the films before we start.

    And we even go into things like, is the right implant there? If they're going to do, say, an artificial hip, we make sure the right hip implant is there. Because it happens where they'll get in there and go, oh, we need one with this angle, and they go, oh, we don't have one. Well, now the patient's asleep, we've cut the top of their thighbone off and we don't have the implant that belongs there. So now you're calling all around the city to find out if somebody has one. Not a good thing. And that happens and has happened.
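
    Here is a small illustrative sketch, in Python, of the time-out logic he describes: every item must be affirmatively verified by the team in the room, and any missing item stops the case before the incision. The checklist wording is a paraphrase for illustration, not the VA's actual form.

    ```python
    # Illustrative sketch: a time-out proceeds only if every item is verified.

    TIME_OUT_ITEMS = [
        "patient stated full name and birthdate; matches chart and consent",
        "procedure and site agree with consent (all sites marked)",
        "required films/imaging present in the room",
        "patient positioned as planned",
        "required implant present and correct",
    ]

    def time_out(verified_items):
        """Return (ok, missing): okay to begin only if nothing is missing."""
        missing = [item for item in TIME_OUT_ITEMS if item not in verified_items]
        return (len(missing) == 0), missing

    ok, missing = time_out({
        "patient stated full name and birthdate; matches chart and consent",
        "procedure and site agree with consent (all sites marked)",
        "patient positioned as planned",
    })
    print(ok)        # False: films and implant not confirmed, so the case waits
    print(missing)
    ```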

    So, we do have to have a methodical way. Now, I must say, when we did this, many people didn't want to do it. Because surgeons said, well, it's never happened to me. And we said, do you want to wait till it happens to you, then you'll be interested? This is for the patient. In the beginning there was some reticence from many people, quite frankly.

    Now that we've been doing it, we've seen many close calls get reported where one of these steps caught it. You know, the number of bad events we've seen has gone down quite a bit. But, you know, the number of close calls has gone up, because these systems are now catching it. And we find, in our system, something like 85 percent of all the bad events that could happen get caught by three or four of the steps. That means that even if they don't do them all right, you have three or four chances to catch it. So, that's a good thing; that's redundancy.

    And we made it that way so that even when people don't do every step perfectly, you still will have a good outcome for the patient.

  • SUSAN DENTZER:

    Let's talk about root-cause analysis. What is that, why do you do it?

  • DR. BAGIAN:

    Okay, root-cause analysis is really a misnomer; we don't like the term, because it implies it's just analysis. And the root cause: really, there's usually never just one; there are many of them, and we call them root causes and contributing factors. You want to analyze what happened, understand the systems behind what happened, and then come up with corrective actions that address those causes. And the thing is, this is much different than how people think about it.

    They'll want to say, how many medication events have you had? How many incorrect surgeries have you had? How many falls have you had? Those are symptoms. The causes are what you fix. You don't fix the fall; you fix the cause. Well, I have to know the cause before I can fix it. So, it's important to understand the underlying cause. And you may find that the cause of a fall or a medication event is also the cause of some other thing. If you look at it that way, the cause makes you then take an action. And it's the action which improves the world, you hope.

    So, you understand where that cause occurs. And it might be, say, verbal communication. One thing I think we all know: if you ask people, have you ever gotten a wrong phone number when somebody gave you a phone number, because you just wrote it down? If you're really conscientious and think about it, you repeat it back. So if somebody gives you a phone number, you read it back, 555-555-5555, and then they say, that's right.

    I mean, when you call to get a pizza, you'll see they'll read your phone number back, right? Or you call a cab, they'll do it. But in medicine, we traditionally haven't done it. So, it's more important for the cab driver to get the right number than it is for us giving a medication order to the floor? But when you understand it's verbal communication, you ask, how do you make sure it's clear? And you say, here's what we'll do: for critical verbal communication, you must read back the order; that is, you write down the message and then you read it back, and the other person says, yes, that's correct. So, that's what we did; that's for medication. But we also found out it affects sending a code team, like when there's a cardiac arrest, sending them to the wrong place. There have been cases where they called up and said there's a cardiac arrest at such and such a place, and the dispatcher misheard the room number. The code team goes to the wrong place, they get there and go, where's the code? And they go, there's no code here.

    Well, then they have to call the dispatcher and they go, oh, wait a minute, it might have been this other place. Well, now, precious minutes are passing and the outcome's bad.

    Here are very different things: a medication mistake; a code team going to the wrong place. The symptoms are totally different, right? The cause was the same: failure to use read-back for instructions. Well, when you understand that, you implement read-back, and suddenly you've helped the code team go to the right place and you've helped medication errors go down.
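
    A minimal sketch, in Python, of the read-back rule he describes: the receiver repeats the critical order back, and it is acted on only if it matches what was actually said. The order text and room numbers are invented for illustration.

    ```python
    # Illustrative sketch: a critical verbal order is acted on only after the
    # sender confirms the receiver's read-back matches what was said.

    def read_back_confirmed(spoken_order, read_back):
        """True only if the read-back matches the original order."""
        return spoken_order.strip().lower() == read_back.strip().lower()

    spoken = "Code team to room 4127"
    heard  = "Code team to room 4172"   # transposed digits, a classic mishearing

    if not read_back_confirmed(spoken, heard):
        print("Read-back does not match. Repeat the order before dispatching.")
    ```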

    And, in this way, by looking at it from a systems approach, you've gotten tremendous leverage from learning from one spot. And it could have been just a close call; it didn't have to be a bad event. Whereas, otherwise, the way we used to look at it, we'd say, medication events: tell people to be careful. And we didn't really fix anything.

    So, by looking at the root causes, you understand the real underlying cause where the problems are so you can go fix them. And if we weren't able to stand up in front of everybody and say we had these problems, then how could we fix them? Because you can't have it both ways. You can't say, we don't have any problems, well, then why would you spend effort fixing them?

  • SUSAN DENTZER:

    Let's go right in on that point. You can't fix what you don't know.

  • DR. BAGIAN:

    Okay.

  • SUSAN DENTZER:

    We'll just talk very briefly about the falls — a little bit more about the falls because —

  • DR. BAGIAN:

    Okay.

  • SUSAN DENTZER:

    — you did send me some interesting data about how many falls there are and what the, you know, how that's one of your current campaigns and the falls tool-kit. And then we'll just talk about, very briefly, about what are the next frontiers in patient safety from your standpoint and that'll do it.

  • DR. BAGIAN:

    Okay.

  • SUSAN DENTZER:

    You were saying part of the reason that a reporting system is, as you say, you can't fix what you don't know…

  • DR. BAGIAN:

    Well, one of the most important things is to understand that the reporting system allows you to understand where the problems lie. You can't fix what you don't know about. But this also takes courage on the part of leadership in the organization because you have to be willing to stand up and say we have a problem or a potential problem. People don't like to do that.

    But if you have that courage, then you can deal with it. If you say everything's great, well, then, how do you justify expending resources to fix something that's not broken? So, we were very fortunate in the VA that the leadership has been, you know, absolutely unflinching in its willingness to say, in front of Congress, in front of the press, here's where we've had problems, and let us fix them. Now, in the beginning, I think that was very scary for everybody. But I think we showed that we were credible, that we didn't back away from problems we had, and people believed us, trusted us then, that we weren't just trying to whitewash things, if you will.

    And that enabled us to really deal with knotty problems and fix those. And I think that's the point of the reporting system. It's not a counting exercise, it's the thing to learn from. And we were very fortunate in that way.

    In the case of falls, as an example: Falls are a huge issue around the country and around the world, patients fall. You know, after anesthesia, they're a little woozy or they're older or there's a number of reasons. But they get hurt, a number of people get hurt.

    We found, by attacking that in a really methodical way, in a recent pilot study we did — I say pilot; we pilot everything, because you really don't know until you use it how it will work at the front line. You can think up all these great ideas in a laboratory setting or at a university, but until you actually have it at the front line, where the nurses, physicians and nursing aides are helping you, you really don't know.

    We did this, and we found we were able to really ask the question differently. It wasn't about preventing falls, because it's not the falls, it's the injuries. If somebody falls and isn't hurt, that's not a big deal; it's when they fall and are hurt. We found we couldn't reduce the falls by a lot; we could reduce them some, but we reduced the injuries due to falls by 62 percent. That's huge, I mean, absolutely huge.

    We showed that the savings just from hip fractures alone, for example, in the injuries in the 40 hospitals, was over $8 million in direct care alone, not counting quality of life, not counting that a large percentage of people who fracture their hip due to a fall die within the next year because of complications of being immobilized from a hip fracture.

    So there's a huge benefit from understanding things. Understanding, like, hip-pad protectors, which are sort of like girdle pads, like a football player wears. Well, they'd been shown in studies to work, but when people actually went to use them, nobody would use them. Why? You've got to ask; you can't just tell them to wear them. They won't wear them. Why was that?

    Well, there were a couple of reasons. One, they're warm; they were kind of like a girdle. Patients didn't want to wear them; they were too hot. Another was that if a patient was incontinent, you know, had trouble with their bowel or bladder control, they would soil them, and you'd have to have a lot available to put on, and the nurses really didn't like to have to deal with that. So they weren't really motivated to coach the patient to wear them when they didn't want to.

    And the third was, kinda funny, that the patients said they make me look fat. And that's honestly what they said. So, we went and worked with the manufacturer and said, we can make these so they're cut away. It's almost like a garter belt. The hip pad's there, but it's cool, you know. And if it's like that, they also don't get soiled very easily.

    Well, we did that, and suddenly hip-pad compliance went up and injuries went down. Now, we didn't solve that little "looking fat" problem, but the fact is, by actually looking at what the causes were, really understanding why they didn't use something that theoretically worked but nobody would use, we got them to use it, and we've seen this huge reduction. And that wasn't the only thing, but it was one of the principal things. So, understanding that and doing that really makes a difference.

    We've done the same thing with hand hygiene. It's well known that everybody should cleanse their hands; I didn't say wash their hands. Alcohol rubs are really much better than soap and water for most things. There are exceptions, but by and large, it's the alcohol rubs. Everybody knows that. You could give a pop quiz to health care professionals and say, should you cleanse your hands before you touch a patient? Everybody will say yes. But then if you go out and actually surreptitiously, you know, secretly observe them? You'll find, if you're lucky, 40 percent do it, and it really runs between 5 and 40 percent.

    Well, we went and did this and tried to understand why they don't do it. When we asked them how often they did it correctly, they said 90 percent of the time. We watched them, and it was about 40 percent of the time.

    However, when we then put this program in place to address why they didn't do it, we ended up coming up to about 80 percent doing it. They still think they do it 90 percent of the time, but actually, secretly watching them, it's 80 percent. That's, like, unparalleled. And we've kept that up for nine months; by secretly observing, that's where they are. So, it's not like it was just a flash in the pan.

    Well, if you look at it, you know, the average hospital infection, just in cost alone, is $5,000 a patient. And you can prevent this with much less input of funds. It's not about money, but it shows there's high utility to doing this. You do it, and it doesn't cost a bazillion dollars. So you can do these things. We've —

  • SUSAN DENTZER:

    And what made the difference?

  • DR. BAGIAN:

    Well, what made the difference? Several things; I mean, there are a number, but one of the main points was you had to make it very easy for the health care professionals to cleanse their hands. If the soap and the sink, or in this case really the alcohol rubs, were down the hall, they're trying to get stuff done. If they have to walk half a block to do it, they're not going to, because they're trying to get their job done.

    If you have them right at the bedside or right outside the room, or you give them these little 2-ounce bottles that you can wear around your neck or put in your pocket, suddenly it's real easy to do it. One of the biggest advantages was just making it available. And we even had protests in the beginning from some administrators, quite frankly. They said, well, I don't want to buy all these little vials; they'll take them home. I said, what are they going to do, wash their car with them?

    You know, how many drawers do they have? You can buy 10,000 of these little vials for the cost of one infection, 10,000. And we showed them that. We said, get over it. Just make them available. And we do. And we've seen just a huge reduction in the bad outcomes. And we see the compliance is way up, I mean, like double, and this isn't just one study; this is at a number of facilities that are, like, the benchmark for doing this, and people said you couldn't do it.

    But it's by understanding simple little things. It's not knowing what you want to achieve; we want them to cleanse their hands. It's how to get them to do it. And you have to make it easy. If you make it hard for people to do the right thing, they won't do it. If you make it easy, they will, because they have competing demands. Their time is valuable; they have a million patients to see; somebody's hitting a call button over here, they want to be seen. What do you do? Do you take another 50 seconds to cleanse your hands when they're saying, nurse, please come here? Well, if you make it easy for them to do it, they will. If not, they have competing priorities.

  • SUSAN DENTZER:

    How much do all of the patient safety steps that you've put in place cost, relative to overall expense within the VA system?

  • DR. BAGIAN:

    It costs almost nothing. I mean, it costs some money, but if you look at it percentage-wise, it's vanishingly small. We've actually done this utility analysis. At one of our medical centers, if you want to look at it this way, for every $100 that we spend on patient care in running a medical center, we spend 10 cents, 10 cents out of $100, for the safety system. And that includes the full-time patient safety manager and their fully loaded costs, and running our teams; those are the direct costs. You know, that's the delta, if you will, the change in cost. So, it's not a lot.

    Now, you know, for our national center, I'd have to figure out the numbers, but it was very small when you consider the budget of the entire VA; it's well less than a tenth or a hundredth of a percent of the VA. It's a small thing, but it's about having focused people whose job that is. You have full-time people doing it, not an army, I mean, a small group doing it nationally; that's their job, that's their reason to live, if you will. So they have the time to really study, for instance, the incorrect surgery and understand that it wasn't about left/right. You know, that took time to do; it took a couple of months to really figure that out. But now think of the utility.

    Everybody knows that now, for a couple of months of somebody's time. But if you did it at one facility, and it was just somebody taking a look when they had 10 minutes, 10 minutes a week every week, nothing would get done; they'd just about start opening a file and it's time to go back and do another job. But when this is their job, they can actually understand the real causes and come up with effective countermeasures, so there's a huge leverage effect there. So, the cost is very minimal. The fact is, you just have to do it.

  • SUSAN DENTZER:

    What's the next frontier in patient safety for the VA?

  • DR. BAGIAN:

    Well, there are two things we're doing right now. One is communications: problems with communications underlie most problems in safety in all industries that involve humans, and they all involve humans at some point. In about 82 percent of all the events we see, close calls or actual bad events, communications were identified as one of the major contributing factors.

    So, we're doing what we call team training now, and it's modeled in ways after the cockpit resource management that aircraft operators do, to say, how does the flight crew interact to make sure the right thing happens? We do that so you can get a challenge against the authority gradient. So, you know, somebody of lower stature, a lower pay grade, whatever, however you want to look at stature, age, professional education, can say, this doesn't look right to me; here's the issue.

    It's about, how do we use briefings? Do we actually have dedicated briefings, so that this is the way we talk about things? If I'm transferring a patient from the emergency department to the intensive care unit, here are the key pieces of information that we give every time. It's not that Dr. Jones uses this and Dr. Smith uses that; we do it the same way, because that makes sense. So, we're doing those things. That's one of the big things, and we've already seen successes from that.

    The other thing we're doing is education. We do a lot of education, but this is not just teaching old dogs new tricks. Up until now, what we've done is teach the people who are already out in health care how to do it. Well, it's a cultural change, and that's not easy, so it comes hard and it comes slowly. But we needed to kind of soften up the environment first, because we didn't want to train new nurses, new physicians, new pharmacists who come out saying, oh, it's about the system, and then run into one of the Neanderthals from the old days who says, oh, you're just trying to shirk your responsibility, you know, it's about you.

    Well, now people understand that that's not what it's about; it's much more. So now we've actually started a whole curriculum for health care professionals as they come up, and not just docs, but, you know, nurses, pharmacists and others, that is already in 40 medical schools around the country. And we're in the process of rolling that out and doing that training.

    So we made regular curriculum modules about different things: human-factors engineering, you know; how to do root-cause analysis; how to do failure mode and effects analysis. So we're going out there so they get trained, so that when they come out of medical school and come out of their residency, they won't come out and say, well, how do we do this? They'll say, hey, I know it; I don't know that there's any other way but this to do it.

    So, we think that's the next thing, because it's really the culture that gives you the sustainability. It's not just what I would call the stupid pet tricks, you know, I kind of take that off of Dave Letterman. It's not the one-trick ponies of, here, do this. "Do this" has a short shelf life; that's right for today, but what happens as things change? You don't know what the reasons were, why you changed it. Whereas if you teach them how to think about solving problems and give them tools that help them do that effectively, they can adapt to any new thing that comes down the pike. So it's sort of the difference between giving them fish or teaching them how to fish, and the whole thing's geared to teach them how to fish. That way you have sustainability in the system.

  • SUSAN DENTZER:

    How do patients in the VA System feel all of this has affected them? How do they respond to health care, differently, perhaps than before?

  • DR. BAGIAN:

    Well, I think if we look at how the patients have looked at this, we know, first of all from satisfaction surveys, that we have basically unparalleled patient satisfaction compared to the rest of the country, and loyalty. For loyalty, we ask the question, if you could be cared for anywhere else at no cost to you, would you go somewhere else? And we have the highest grades ever. People say, no, I'd stay right here.

    But more than that, we saw, for instance, with correct-site surgery, we actually give them brochures and talk about, here's what your caregiver should tell you. We work with the caregivers to say what the reasonable thing is. So you should expect your caregiver to do this, this, and this. And it's like a checklist they should have with them. If they don't tell you, you should ask them; you should say, I need to know this, this, and this.

    As well, patients have responsibilities: you should tell them this, this, and this, even if they don't ask. Well, with everybody knowing what the expectations are, they do it.

    Now, also, we have to ask patients to state their names. There was a lot of trepidation that patients would say, why are you asking me to tell you my name, you know me? And that they would feel uneasy about that. In fact, just the opposite occurred when we did this.

    We had unsolicited letters from patients that came after we started doing this, saying: I felt really safe, because when I came in, I had read all these things in the news about how patients have the wrong operation, and I was worried, with what I have, would that happen to me? And I felt so good that everybody that came up to me asked me to tell them my name and my birthdate or my Social Security number. So, just the opposite occurred. Instead of feeling worried about what's going on here, they said, wow, people are really being careful; they're concerned they're doing it right.

    So — and I didn't know that we were going to find that, but that's what actually happened. We found out that, and I'm not saying it's 100 percent, but the only letters we got were praiseworthy. They said, this is great, we really like this, thank you very much. We got zero letters of complaint, I can tell you, zero, saying they didn't like it.

    Now, does that mean nobody ever disliked it? I'm sure somebody did, but the only letters we got said, this is great. So, people actually want to do that, and they help, because the fact is, it's all about caring for the patient. And the patient should be interested, too, and be an active part of what's going on, to understand that if something doesn't look right, they should speak up, too. And we tell them, you should; that's your job, too. You know, we're all in this together; it's not like you're a passive recipient of care. You should be active, as well.
