
This aviation expert says Boeing made ‘disastrously bad decision’ on training for 737 MAX

The recent Ethiopian Airlines crash led to the grounding of Boeing’s 737 MAX planes across much of the globe. But as new details emerge about the cause of the model’s second crash within five months, questions are being raised about how the plane's safety was approved in the first place. John Yang talks to Jeff Wise, a pilot and author of a book about MH370, the flight that vanished in 2014.

Read the Full Transcript

  • Judy Woodruff:

    Investigators are still trying to figure out what caused the Ethiopian Airlines crash that led to the grounding of the Boeing 737 MAX.

    As John Yang reports, questions are now being raised about how the federal government approved the plane's safety in the first place.

  • John Yang:

    Judy, The Seattle Times reports that when the FAA approved the jetliner in 2015, the agency delegated key safety assessments to Boeing itself.

    The newspaper said that included an assessment of the automated anti-stall system, which is the leading suspect in the plane's two crashes. In addition, The Wall Street Journal reports that both the Transportation Department and federal prosecutors are looking into the approval process.

    To discuss this, we are joined by Jeff Wise. He's a science writer who specializes in aviation and psychology. He's author of "The Taking of MH370." It's about the Malaysia Airlines flight that vanished in 2014. He's also a licensed pilot.

    We should add that Boeing and the FAA both declined our invitations to appear.

    Jeff Wise, welcome.

    A lot of discussion about automation. This was an automated system that they added to this model of the aircraft. It was one of the systems that The Seattle Times reports the FAA let Boeing assess for safety itself.

    You have written a lot about automation, about autopilot. What does this increased use of automation in flying mean? Is it a good thing or a bad thing? And what does it say about training, the need for training?

  • Jeff Wise:

    Well, it's a great question.

    Broadly speaking, automation is a really good thing for aviation safety. You can hand off the more mundane tasks of flying, things a pilot would otherwise have to do, like monitoring and cross-checking systems, so the pilot can focus on the more important, more creative things when situations arise.

    Let the computer deal with the more mundane stuff. That's really been a major factor in the tremendous increase in aviation safety over the last few decades. On the other hand, we can't get too complacent.

    As flying becomes safer and safer, incidents become rarer and rarer, and you get into a situation where it's only in the real corner-of-the-envelope cases that multiple problems arise simultaneously.

    You might have an inexperienced pilot on a day with bad weather, and some other mechanical problem happens to occur at the same time. The system gets overloaded. The automation can't handle it; it either fails or turns itself off and dumps everything onto the pilot, oftentimes in a situation where the pilot is highly stressed.

    The pilot might be inexperienced. And so the weak points of automation and of human beings overlap, and there's no one to catch the falling baby, as it were.

  • John Yang:

    Well, you talk about the inexperienced pilot.

    I mean, it is said that Boeing wanted to be able to say to customers, you don't need to retrain your pilots, because they wanted it to be cost-effective. They wanted it to be a fast changeover for these airlines.

    Should they, though, have prepared pilots so that, if something went wrong while flying on autopilot, they would have known how to handle it? They say that their attitude after the Indonesia crash was, well, a pilot should know how to handle this.

  • Jeff Wise:

    Right.

    So the context of all this is that Boeing gets a big chunk of its profits from the most popular airplane it builds, the 737. It's been building them since 1967, a whole different era. For over half a century, they have been building these planes. And it's really a creature of that age.

    It's aluminum. It's got hydraulics, instead of fly-by-wire, like modern planes have. And so what they have been doing is progressively improving it to stretch out its lifespan. And the argument many are making is, look, they have just tried to stretch it out too long.

    They basically tried to add these new fuel-efficient engines onto an aging airframe. The plane wasn't designed for this kind of engine. They had to sacrifice some flight characteristics in order to get it to work.

    Because it had these aerodynamic problems, it had a sort of disturbing tendency to pitch up in certain circumstances. So they kludged it with a patch, this automation software that would kick in when the plane got into a bad corner of the flight envelope; they sort of pasted this thing on, and this thing would take over.

    But as I was saying, automation can sometimes act in surprising ways. And as part of its effort to keep the 737 fleet going, Boeing wanted to be able to tell customers, hey, listen, you can buy this, it's going to have the same commonality of parts with the rest of the 737 family, it's going to fly the same. You don't have to buy a whole bunch of new parts, like you would if we built a whole new plane.

    You don't have to retrain the pilots, like you would if we had to build a whole new plane. Just treat it like it's one of these old 737s. Just keep using it.

    Well, it turned out to be a disastrously bad decision, because, as we saw with Lion Air and probably now again with Ethiopian Airlines, you do need training. And I think, when they come back from this grounding, they're definitely going to have to teach pilots: look, this situation could arise.

    You're not going to see it in any other model, because no other model has this kind of automation feature that we have added here. And this is what you have to do.

    It's not that complicated. But, again, the kind of situation where human beings are at their worst is exactly when the automation turns itself off or fails, and you're throwing a pilot into a novel situation, asking him to figure out what an automated system he doesn't fully understand is doing.

    And that's where it all falls apart.

  • John Yang:

    Science writer Jeff Wise, thank you very much.

  • Jeff Wise:

    Thank you.
