Dancing to the Beat of Computer Designers

Trapped in the Net: The Unanticipated Consequences of Computerization by Gene Rochlin

(Princeton University Press, 1997, 293 pp., $29.95)

In July 1988, the USS Vincennes, a cruiser built to provide anti-aircraft missile defense for an aircraft carrier battle group and controlled by the advanced Aegis electronic fire-control system, was patrolling the Persian Gulf as part of a fleet enforcing the U.S. embargo of Iran.

On the morning of July 3, the Vincennes and two companion ships became embroiled in a scrap with Iranian patrol boats. Just then, Iran Air Flight 655 had the bad judgment or bad luck to fly over the battle zone.

The Vincennes's sophisticated equipment spotted the civilian plane coming. The men of the Vincennes concluded they were being attacked by an Iranian military aircraft. Two SM-2 Standard missiles were fired and hit home. Flight 655 plunged into the Persian Gulf, killing all 290 people aboard.

A case of human error in which the equipment worked to perfection? In Trapped in the Net, Gene Rochlin argues that it wasn't that simple.

It is true that the equipment on the Vincennes functioned just as it was supposed to. It is also true that there was “confusion and disorder” among the human beings on board the ship. But what else would anyone expect?

The Vincennes was trading shots with an enemy. Confusion and disorder are likely in an ongoing battle. This was the environment in which the Vincennes presumably was meant to operate. If the ship, considered as a system, then shot down an airliner—as it did—there must have been something wrong with the system, not just its human element. Smoothly functioning computerized equipment enabled human beings, acting in error, to execute a disastrous mistake.

This was neither the first nor the last time such a thing has happened in our brave new computerized world. Rochlin, a professor of energy and resources at the University of California, Berkeley, is no technophobe, but his sophisticated book does raise serious questions about the direction we are headed.

Other computer books have focused on computerization's impact on the individual; this one focuses on its impact on large organizations—business, the military, and so forth.

One question raised by the process concerns the loss of human expertise in an automated environment. “Expertise” is a product of trial and error—and computers permit no mistakes. A world without human error sounds fine, of course, except: What happens when the computers fail or suddenly confront an alarming new circumstance they haven't been programmed to handle? In such situations, what are human operators, denied the opportunity to acquire expertise, supposed to do?

Rochlin recounts cases in which experienced airline pilots—those with expertise—were able to cope with sudden mishaps, and cases in which other pilots—apparently without the same expertise—could not.

The chapter in which these tales are told begins with a quotation from an anonymous commercial airline pilot: “In the airliner of the future, the cockpit will be staffed by a crew of two—a pilot and a dog. The pilot will be there to feed the dog. The dog will be there to bite the pilot if he tries to touch anything.” To which another pilot added the imaginary explanation of an airline executive for not phasing out pilots entirely: “Because accidents will always happen, and no one would believe us if we tried to blame the dog.”

Other problems arise in automated factories and offices. Here, computers were billed as restoring the power and autonomy workers had lost to mass production and standardized office work. Owners and managers went along at first for the sake of productivity and efficiency—but only up to a point. “Control of information technology remained as central to the purposes and goals of management as ever, and it was only a matter of time and technical development before means were found to reassert it.”

A fundamental source of difficulties—perhaps the fundamental source—is that the computer revolution is being driven by the interests of designers rather than the needs of users. In one study, 75% of pilots said they didn't think the designers of aircraft took users into consideration, and 90% said they thought the logic of designers very different from their own.

The result of this designer hegemony is what Rochlin calls an “asymmetric dependency relationship”—that is to say, designers call the tune to which the rest of us are obliged to dance. “The designers and programmers are free to do what they feel is right, or necessary, and the user has little choice other than to accept it, and stay current with the latest version of software, or to reject it and drift down the irreversible path of obsolescence.”

Computers have answered many important human needs and made life more pleasant and productive for countless people. This clock neither can nor should be turned back. Rochlin asks us instead to look ahead. Computers, he writes, are “creating patterns of reliance and dependency through which our lives will be indirectly and irrevocably shaped.” Before that happens, perhaps we should pause and decide how we would like our irrevocable future to look.

Russell Shaw writes from Washington.