July 14, 2002

To Err Is Human

By GEORGE JOHNSON

THEY knew all along that human fallibility had contributed to the deaths of their children. There was the well-meaning tour operator in Moscow who had delivered the students to the wrong airport, causing them to miss an earlier flight to Spain. There was the Swiss air traffic controller who happened to take a break at just the wrong moment, leaving an overworked colleague struggling to guide five different planes through his small piece of sky.

Last week, the third, decisive element was revealed to grieving Russian parents: Ordered to climb higher by the electronic voice of the cockpit's automatic collision detector, the pilot of the children's plane obeyed the befuddled ground controller instead. The airliner dove head-on into a DHL cargo jet — a tragedy that might have been averted if people put more faith in machines.

That is the story pieced together from the flight recorders retrieved after the July 1 crash over southern Germany. In the aftermath, an airport safety consultant neatly described what must have been an impossible dilemma: "Pilots tend to listen to the air traffic controller because they trust a human being and know that a person wants to keep them safe."

In this case the sentiment was apparently misplaced. The computerized Traffic Alert and Collision Avoidance System, or TCAS, couldn't care one way or the other about the fate of its charges. But that was its strength. Exchanging data with its electronic counterpart on the other plane, the device coolly determined that the airliner should climb as the cargo jet descended. If human judgment hadn't intervened, the ships would have passed in the night, delivering packages to Brussels and children to Barcelona.
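
For readers who want a concrete picture of that exchange, here is a toy sketch, in Python, of how two such units might settle on complementary instructions. It is an illustration only, not the certified TCAS logic: the callsigns, transponder addresses and tie-breaking rule are invented for clarity.

```python
# Toy illustration of coordinated resolution advisories: two units,
# looking at the same conflict, independently reach complementary
# answers -- one aircraft is told to climb, the other to descend.
# This is NOT the certified TCAS II algorithm; all values and the
# tie-breaking rule below are invented for the example.

from dataclasses import dataclass

@dataclass
class Aircraft:
    callsign: str     # placeholder identifier, not a real flight number
    address: int      # transponder address, used here only as a tie-breaker
    altitude_ft: float

def coordinated_advisories(a: Aircraft, b: Aircraft) -> dict:
    """Return complementary CLIMB/DESCEND advisories for a conflicting pair."""
    if a.altitude_ft != b.altitude_ft:
        # Send the higher aircraft up and the lower one down,
        # since that opens vertical separation fastest.
        higher, lower = (a, b) if a.altitude_ft > b.altitude_ft else (b, a)
    else:
        # Same altitude: fall back to a deterministic tie-break so both
        # units, computing separately, arrive at the same split.
        higher, lower = (a, b) if a.address < b.address else (b, a)
    return {higher.callsign: "CLIMB", lower.callsign: "DESCEND"}

if __name__ == "__main__":
    airliner = Aircraft("AIRLINER", 0xA1, 36000)   # hypothetical inputs
    cargo_jet = Aircraft("CARGO", 0xB2, 36000)
    print(coordinated_advisories(airliner, cargo_jet))
    # With these invented inputs: {'AIRLINER': 'CLIMB', 'CARGO': 'DESCEND'}
```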

After the collision, the natural instinct was to blame the computer, and for days investigators puzzled over how the system — so highly regarded that pilots are trained to consider its commands almost sacrosanct — could have failed. But the malfunction, it now seems, lay not in wiring or software but in a fatal juxtaposition of two overstressed minds. The lone Swiss controller, whose center's own computerized alarm system had been shut down for maintenance, had no machine to back him up. Pushed to the limit, he made a wrong decision. Somewhere in his cranium a neuron twitched the wrong way. Then the pilot amplified the mistake.

The issue here is nothing so lofty as human versus artificial intelligence. What lay in the balance was a simple decision: up or down, 1 or 0. Believe the controller or believe the machine. Computers are routinely trusted to make a billion such binary calculations every second. It is what they do best.

The Russian schoolchildren would probably be alive if the avoidance system had been equipped with the equivalent of robot arms capable of seizing control of the plane. Of removing the human from the loop. Devoid of uncertainty, emotion, confusion — of the paralyzing regress of second-guessing yourself and then second-guessing the second-guessing — the machine would have automatically passed its fail-safe point, countermanding its superior and saving the day.

That wouldn't have made for a very heartwarming tale. The annals of fantasy and science fiction are filled with stories of automata, from magical brooms to lumbering robots, running amok — Walt Disney's "Fantasia," Stanley Kubrick's "2001," James Cameron's two "Terminator" films. Coming next summer is the sequel to the sequel: "Terminator 3: Rise of the Machines." We can already guess who wins. Humanity must triumph over mechanism.

"Fail-Safe," the 1962 novel by Eugene Burdick and Harvey Wheeler (later made into a movie), warned what could happen if the machines get the final say. Through a chain of errors, the Strategic Command and Control system automatically dispatches a squadron of nuclear-armed bombers to destroy Moscow. Past a certain point, the decision is irrevocable. No mere human can order the planes to return. To stave off total annihilation the president, in his powerlessness, agrees to balance the equation — ordering the destruction of New York. Tit for tat. The leaders have become absorbed into the relentless logic of the machine.

In the real world, human warriors remain at the top of the chain of command. This is supposed to be comforting. Adept as machines are at calculation, people are said to be imbued with something higher: judgment. There is a lot packed into that word — carefully weighing conflicting information, drawing on an accumulation of experience, learning from mistakes, tempering cold analysis with moral values, altruism and a healthy instinct for survival.

BUT the more complex a system, the more there is to go wrong. The popular notion has it that we use only 10 percent of our brain power, when in truth it seems to take 110 percent just to muddle through the day. So we supplement our minds with machines, then agonize over whether they are telling us the truth.

The suspicion, healthy to a point, arises because we know that computers, like brains, can crash when they are overloaded. Reflections of ourselves, the machines are only as good as people make them.

What happened in the cockpit was not really a matter of Us versus Them. The pilot, to the extent that he had time to think at all, was faced with weighing the reliability of the controller against that of the anonymous engineers who built the collision avoidance system.

In the end, he went with the voice that seemed closest by. The machine didn't stand a chance. For better or worse that is part of our program. It is the nature of the species to choose to make its own mistakes.

