The Boeing 737 Max and the Problems Autopilot Can’t Solve

A Southwest Boeing 737 Max 8, the newest version of Boeing's most popular jetliner. Credit Joe Raedle/Getty Images

If you’re an airline passenger, automation is your friend — setting aside the fears over its role in the crashes of two Boeing 737 Max planes in the past five months. The gradual spread of automation through the civil aircraft fleet is a primary reason the accident rate worldwide has fallen from about four accidents per million flights in 1977 to less than 0.4 today. Many modern airliners are capable of taking off, flying and landing without any human assistance. Pilots today, as one former pilot puts it, are less stick-and-rudder movers than they are overseers of systems.

Automation is not without its own hazards, though. As it has become ubiquitous in cockpits, automation-related accidents have come to make up a significant proportion of all air disasters. In 2013, the Federal Aviation Administration published a study of accidents over the previous two decades and found that reports of “unexpected or unexplained” behavior of automated systems were present in 46 percent of the accident reports and 60 percent of the major incident reports collected by the researchers.

There are two main ways in which automation can lead to catastrophe. Sometimes a malfunction causes the autopilot to go haywire and put the plane into a dangerous state. This is what seems to have happened to the Lion Air 737 Max that crashed last October shortly after takeoff from Jakarta, Indonesia. (It may have occurred in the case of Sunday’s Ethiopian Airlines crash as well; on Wednesday, the F.A.A. said that newly available satellite data “indicates some similarities” between the two accidents.)

Because the 737 Max had been outfitted with new, larger engines that could cause its nose to pitch dangerously skyward, Boeing had added the Maneuvering Characteristics Augmentation System, or M.C.A.S., which would kick in and push the nose down if necessary. But a faulty sensor fed incorrect information to the Lion Air flight’s M.C.A.S., causing it to put the plane into a steep dive. Pilots on at least two flights in the United States reported similar problems, but in those cases they were able to disengage the autopilot system and recover control of the plane.

Another way that automation can come to grief is when the system works as designed but the pilot becomes confused and does the wrong thing. In 2009, Air France 447 was over the middle of the Atlantic en route from Rio de Janeiro to Paris when its speed sensors became clogged with ice and stopped feeding data to the autopilot. The autopilot then switched itself off and handed control of the plane over to the pilots, as it was designed to do under such circumstances. The pilots, however, became disoriented and one of them inadvertently pulled the plane’s nose up, so it climbed and lost a dangerous amount of speed. Worse, the pilot didn’t realize that with the autopilot off, the plane’s computer was no longer preventing the wing from aerodynamically stalling. In less than five minutes, a perfectly good airplane went from cruising at 32,000 feet to hitting the ocean.

In both kinds of events, pilots suddenly find themselves having to figure out why a complex system is no longer behaving as expected, and then to take quick action based on that analysis. Moments of high stress are precisely those in which the human brain is least equipped to work through complex situations.

The best way to deal with emergencies is to train and practice beforehand, so that the response becomes automatic. Ironically, because the mundane stick-and-rudder aspects of flying have been handed over to computer automation, pilots are deprived of the opportunity to practice them continually. This leaves them without the mental automation that might save them in a crisis.

Unfortunately, in the case of the 737 Max, it seems that Boeing built a plane with a fundamental aeronautical issue that it thought could be resolved by adding a new automated system. The company then convinced airlines and the F.A.A. that the planes were essentially interchangeable with earlier models of the 737, and that pilots already trained on older 737s would therefore not need comprehensive additional training on the new system. The F.A.A. accepted that conclusion, which would, among other things, save money on crew training and maintenance. That has proved to be a terrible miscalculation.

In the wake of a double disaster, Boeing has announced that it will release a software upgrade for the 737 Max. The company may indeed have solved the problem. It may be that the 737 Max will return to service with the patched software and never again suffer a mishap.

The problem is that it’s hard to prove a negative. Passengers won’t have any faith in the fix until decades have passed. And given that the flaws of humans and automatic systems have a well-demonstrated tendency to amplify each other at the worst possible moment, trusting Boeing’s solution could well be a risk that few airlines, passengers or safety regulators will be willing to take.

Jeff Wise is a science journalist, private pilot and the author of “The Taking of MH370.”
