The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 7 Issue 19

Sunday 10 July 1988

Contents

o Iranian Airbus discussion
Philip E. Agre
Tracy Tims
Hugh Miller
o Info on RISKS (comp.risks)

Iranian Airbus discussion

Philip E. Agre <Agre@WHEATIES.AI.MIT.EDU>
Sat, 9 Jul 88 18:24 EDT
An interesting analogy connects a number of the disagreements over topics
like the shooting of the Iranian airliner.  Some people want to stick to
the technical details and leave politics out of it; others reply that the
distinction is untenable since politics is part of the reality in which
products of technology operate.  Likewise, some people want to discuss
the conduct of war as if it occurred in a reality free of politics.  The
latter was once possible but now it isn't.  But why?  Roughly speaking,
because the world is a smaller place.  For one thing, the efficiency of
modern communications media makes it possible to conduct a `political
war'.  For another thing, great increases in the velocities and ranges of
both weapons and civilian transportation make it much harder for civilian
activities to stay out of the way of `war zones'.  Yet the model of `pure
war' continues to inform the design of most computerized weapons systems.
All of the doctrines, indeed all the vocabulary, of Western warfare were
developed in the context of such well-defined, all-out wars as the major
modern European wars.  These wars started and ended at definite times,
opposed clearly defined alliances in which all relevant parties felt the
need of choosing sides, and were conducted by militaries whose only
political constraint was the necessity of winning.  Everyone understood
that civilian life simply came to a complete halt during these wars.
These episodes serve as our prototypes of a `war', a category about which
one makes generalizations by consulting a historiography of warfare,
written by modern Westerners, that concentrates on episodes that fit this
pattern.  The concept of `civilian' is simply the flip side of the
concept of `war'.  This concept of warfare is just as much a part of the
models implemented by the computers on the Vincennes as the concepts of
physics used to describe signals, trajectories, and explosions.  As we
well know, when the models underlying a computer system are wrong, the
computer will make mistakes.  Most of the systematic organized violent
conflicts in the world today are not `pure wars' but rather drawn-out
low-level conflicts in which the smallest details of military operations
are political actions organized by political considerations.  The
inappropriateness of the `pure war' model explains many recurring themes
in interviews, long before the Iran Air incident, with the military
people running the US operations in the Gulf, both the sailors on the
ships and the admirals back in Washington.  They complain bitterly, for
example, of having to ``fight a war in a lake'' and of the narrow margins
placed on their decisions by the presence of non-military planes and
boats, many of which (especially the boats) do not own or competently
operate the radio systems that permit ready discrimination in peacetime
traffic control.  Thus the naval battle that was occurring at the precise
moment when the Iran Air plane approached the `war zone' was not at all a
prerequisite for such an incident.  The ultimate questions are: As
warfare and politics blur, what should we call what's happening in the
Gulf (and two dozen other places in the world) if not a `war'?  And then,
as technical practice and politics blur, what should we call what happens
in laboratories and factories if not `technology'?


Responsibility (Iranian Airbus)

Tracy Tims <ttims%watdcsu.waterloo.edu@RELAY.CS.NET>
Sat, 9 Jul 88 18:52:39 EDT
It is true (as Henry Spencer points out) that you would have to be "misinformed
or crazy" to get on that flight as if it were a normal peacetime flight.  On the
other hand, the fact that we do not judge such people wise should not affect
the way we judge those who caused their deaths.

If a woman takes a risk (say, walking home from work through a suspect
neighborhood) that leads to her being raped, we may question her judgement
in taking the risk, but we in no way reduce the burden of responsibility on
the man who actually did the raping.  We may suggest that she avoid walking
in the area, but we know that it is not right to expect women to limit their
lives because of a danger some criminals have decided to threaten them with.
The desired situation (in fact the moral situation) would be no risk to women
of rape.  There would have been no risk of rape (and no rape) had not some man
decided to create one.  One must not fall into the trap of transferring
responsibility from the perpetrator onto the victim.

Likewise, there would be no real risk of an airliner being shot down by a
missile had not some group of people decided to create that risk.  Yes, we can
question the judgement of a group of people who decide to expose themselves to
that risk, but we cannot lessen the moral responsibility of the people who
created the risk and who performed the action.

By having a shoot-first-ask-questions-later policy, in a zone where both
military and peacetime activities co-exist (and, as I'm sure we all agree,
where only peacetime activities should be), a military that is executing policy
places the risk of executing that policy squarely on the shoulders of
potentially innocent people.  Given the fact that the U.S. chose to conduct
military operations where there were innocent bystanders, I feel strongly that
they should also be willing to accept any attendant risks.  Anything less than
that amounts to sticking other people with the bad results of their decisions.
The Navy has a moral obligation to determine whether a blip on their screen
is an attacking aircraft rather than an airliner, if there is a significant
chance that it could be an airliner.  If they cannot, they should not place
the risk of misidentification on innocent passengers.  I feel this especially in
this case, where the U.S. Navy is not fighting a war for U.S. survival against
unprovoked attack, but is implementing a peacetime foreign policy decision.

    It's too bad the Iranian airline passengers had to learn the facts of
    life the hard way, but the people who fired those missiles cannot
    really be blamed for it.
        - Henry Spencer

I think this statement shows a profound lack of empathy, but I find the last part
of it, "the people who fired those missiles cannot really be blamed for it," to
be completely absurd and dangerous.  Any atrocity can be justified using very
similar words: "it's too bad she had to learn the facts of life the hard way,
but the man who raped her cannot really be blamed for it."  (After all, she
really caused the crime, by placing herself in the position where it could
happen, right?)

The people who died in the airliner did not kill themselves.  Captain Rogers
killed them.  A dispassionate examination of the facts of the incident shows this
clearly.  At most, the passengers are guilty of stupidity, optimism and bad
judgement.  Captain Rogers is guilty of their deaths.

The problem with man-made risks in general is not so much detecting them as
finding someone or some group to actually be responsible for them.  If
the responsibility for a risk (and its consequences) is diffuse, or state
sanctioned, or complicated by the fact that the victims apparently chose to
accept the risk, then people are all too quick to deny any blame.  This is
a moral failing.  Many technological risks (from the design of user interfaces
to the existence of nuclear weapons) are orphans in this sense.

Tracy William Lewis Tims


The Iranian Airbus and following discussion

Hugh Miller <HUGH%UTORONTO.BITNET@CORNELLC.CCS.CORNELL.EDU>
Sun, 10 Jul 88 09:56:53 EDT
        I've had _numerous_ private messages & some RISKS postings responding
to my submission to RISKS 7.15 ("The target is destroyed").  A few warranted
replies, so I have tried to draw up same here.  (I have not responded to Gary
Chapman's posting in RISKS 7.17, since I agree emphatically with everything he
says.)

(1)      Michael Mauldin (RISKS 7.15) taxes me with getting the facts wrong.
         I plead guilty.  If you note the header of my message you will see
         that it was -- well, 'fired off' I guess is the right phrase -- at
         11:15 on Mon 04 July, at which time even the most elementary facts
         were in dispute.  For that matter, several of the 'facts' Mr Mauldin
         and others report have since been, uh, revised.  My main points,
         however, rely (I hope) less on facts than on possibilities.  I would
         be tempted to call them 'philosophical' did that not open them to the
         usual (and deserved) snorts of derision 'hard science' types reserve
         for contemporary so-called 'philosophy.'
                Similarly, Sue McPherson charges from Down Under "that the
         papers have told him that technology has fouled up again." I can
         assure her that I never believe what the papers tell me.  I grew up
         in Louisiana.  As for technology doing what it ought in this case,
         well, 290 dead civilians indicates otherwise to ME.  Her words
         radiate the confidence of technolatry: if we can get the facts
         straight & keep the media & the pols out of the control room we can
         fix this sucker right now so it'll never happen again.  But whether,
         in this instance, Capt Rogers made the 'right' call or not, whether
         the EW gear 'worked' or not, we still have to ask some very
         fundamental questions about technology.
                 Let's not be under any illusions about whether we will ever
         get the "real facts," the nitty-gritty technical details, of the
         Flight 655 tragedy.  (Perhaps 12 months from now one of you reading
         this will get hired by the Pentagon to write some code for the AEGIS
         system to prevent such-and-such a, purely hypothetical you
         understand, 'problem' from occurring.  I would like to think you
         would do the right thing and tell us, but doubtless you will be sworn
         to infinite secrecy.)  In Montreal, where I used to live, complaints
         against the Police de la Communaute Urbaine de Montreal were
         investigated by -- the Police de la Communaute Urbaine de Montreal.
         Needless to say, such complaints were invariably dismissed as
         groundless.  With the stakes stupendously higher, what reason have we
         to believe that the US government (to say nothing of our lickspittle
         press) will behave in any less self-serving a fashion?  290 dead
         innocent airline passengers, 66 children among them, is, to put it
         crudely, one hell of a spin-control problem.

(2)      Bob Estell (RISKS 7.15) knew Capt Will Rogers before the Navy & finds
         it hard to believe he (Rogers) would have behaved in any other way
         save the honorable.  Given the alternative, I hope for Capt Rogers's
         sake he is right.  But military training of any sort changes a
         person, in my thin experience in the matter; it is intended to; and
         the results are for the 'good' only when that 'good' is evaluated from
         the standpoint of the profession of arms.  At any rate, Capt Rogers
         either made a bad judgment on the basis of good evidence, a good
         judgment on the basis of bad evidence, or a bad judgment on the basis
         of bad evidence.  I cannot bring myself to believe Capt Rogers, any
         more than Capt Brindel before him, was capable of the first, and I
         hope none of us can.  (That may not keep him from being scapegoated.)
         If versions 2 or 3 are correct, then the particular technology, to
         quote George Bush, is "in deep doodoo."  And in any event technology
         in the broadest sense is what gave us the "Roger-willco" attitude
         that got us embroiled in this hellacious war in the first place.

(3)      Jim Anderson (RISKS 7.16) faults Iran Air for its imprudence, at
         best, in sending a commercial airliner over a combat-engaged US AEGIS
         cruiser.  IF indeed the facts are as my government represents them (I
         am, by the way, a native-born US citizen and a landed immigrant in
         Canada), then I might fault the people at Iran Air for poor judgment;
         perhaps I might even go so far as to hold them technically liable at
         law for criminal negligence indirectly causing death.  As to the true
         facts of the engagement, well, as of today (9 July) the US has, ahem,
         changed its story a few times.  As a highly interested party its
         account will always be suspect.  But even if the facts are as we
         represent them, since the US is not officially at war with Iran (at
         least as far as Congress, which under the Constitution has the
         exclusive power to declare war, is concerned) we had and have no
         right to be where we are in the first place, under strict
         interpretation of international law and maritime convention, to say
         nothing of good old practical political reasoning.  (Henry Spencer,
         RISKS 7.17, please take note.  This is not 'our' war.)  Let me pose
         the following 'scenario', as the gamers say: suppose you are a
         citizen of, say, France, in the year 2000.  Margaret Thatcher, now in
         her 18th term of office, declares war on your country.  The USSR, for
         10 years thanks to _perestroika_ engorged on Western high technology
         and a big consumer of oil from the North Sea, decides to protect its
         supply by sending in a huge naval tactical group, some of which stays
         outside the North Sea & English Channel, but much of which goes in to
         reflag tankers with the Soviet standard & generally harass & fire on
         French vessels while ignoring English ships.  In short order their EW
         technology, still at US 1988 levels, leads them to shoot down an
         Airbus A920.  The Soviets say the French were to blame; some Russians
         claim fanatical French deconstructionists had sent the airliner on a
         deliberate kamikaze mission, or had tucked a MiG-99 behind it, or AT
         BEST just should have known better than to tempt the wrath of the
         Bear.  Describe your feelings, as a French citizen.  (For fun,
         imagine what the Pentagon would be thinking.)  This was my point: we
         are in the Gulf only partly because of greed and the normal,
         predictable imperialist tendencies we have exhibited for over a
         century.  REAL prudence would counsel us not to be there, and
         statesmen of an earlier day would probably have heeded that counsel.
         But the Mephistopheles of technology, to whom we have sold our soul,
         remember, whispers: "We _can_ do it.  If we _can_ do it, we _should_
         do it.  If we _should_ do it, we _must_ do it."  Technology is much
         more than just a tool.  It is a world, a universe of discourse, a
         mythology (as PGN put it elsewhere in the issue), a devil of a
         weirdly affectless sort, but persuasive as any tempter.  He has
         entered into us like the demon into the Gadarene swine, and driven us
         headlong, not this time into the Sea of Galilee, but into the Persian
         Gulf.
                Mr Anderson's final point is thus the most disturbing, the
         more so for its offhanded, sensible flavor.  "Let's get the forum
         back to technical risks," he urges, "and off of the political beat."
         Again, this is my point: the two are as inseparable as the faces of
         Janus.  The flashpoint of their union is the armed forces of the US
         and NATO, and to a much lesser extent those of the Soviet Union, the
         Warsaw Pact, and other nations.  Technology, or rather its
         apologists, dissembles this; part of the hoodwink consists in its
         claiming to be just a means, completely separate from any
         consideration of ends.  We are more than happy to go along, to let
         ourselves be deceived, for the sake of the lovely, tangible, short-
         term amenities and conveniences it supplies.  (Anyone who has doubts
         about America's capacity for self-deception must have been on Mars
         for the past 8 years.)  Technology is, rather, for us, an end in
         itself as well as the means thereto, THE end-in-itself _par
         excellence_.  (The end, PERIOD, I'm tempted to say.)  This, to my
         mind, is where a more fundamental consideration of the "RISKS" posed
         by technology must begin.
                Such a reconsideration, it seems to me, would have to go far
         beyond the received wisdom.  It would have to question what we take
         completely for granted, and when one does this one always runs the
         risk of being deemed insane, reactionary, or Luddite, no matter how
         much one loves one's children & the future.  Is, for instance, the
         entire modern project of unlimited progress through the conquest of
         nature, of which technology is the articulation, the Unqualified Good
         we assume it to be?  The project of the conquest of nature seems
         itself founded upon still deeper assumptions, such as the mechanical
         character of human and nonhuman nature, the total freedom of human
         cognition and valuation, the denial of transcendence, etc.  But the
         most important such assumption seems to be that compassion for the
         lot of one's suffering fellow human beings must override all other
         practical and theoretical considerations.  In Feuerbach's words,
         "compassion must precede thought."  Are we prepared, in the light of
         the chaos into which our 'compassionate' and 'thoughtful' technology
         is about to precipitate us, to rethink even these assumptions?
                Turning back to bits and bytes (permanently) means tuning out
         the deeper issues.  I can play with the details as well as anybody, I
         suppose, but just as in a corporation you'll never get promoted to
         CEO if all you want to do is sit at a terminal all day and code, so
         we cannot be free men and women, "The People" to whom our
         Constitution makes constant reference, if we do not undertake to
         THINK about What Gives Here Anyway?

       If I may be permitted a personal point, at the risk of making this
sound like some maudlin Lance Morrow "Time Essay": Someone may object, "Well,
for an anti-technologist you seem to have no problems with the computer, the
pre-eminent technology."  All right.  But with me the issue of technology is
much more painful to think through than whether or not I am prepared to go
back to the typewriter or even the quill pen.  My adorable one-year-old
recently underwent a balloon valvuloplasty for a blocked heart valve.  Without
the operation the prognosis was death by heart attack or congestive heart
failure by six months of age.  Today he is well and will lead an utterly
normal life, provided he does not fly on an Iranian airliner near an AEGIS
ship.  For me, to think about technology at a fundamental level means to come
face to face with the bitter possibility of my own son's certain death.
Unless we are prepared to think at that level we will go on killing, puzzled,
wishing we didn't have to, but hopelessly going on and on.  The great English
poet Stevie Smith, writing about theology in her poem "How Do You See?" has
words that the priests and priestesses of the new religion of technology would
do well to heed:

        I do not think we shall be able to bear much longer the dishonesty
        Of clinging for comfort to beliefs we do not believe in,
        For comfort, and to be comfortably free of the fear
        Of diminishing good, as if truth were a convenience.
        I think if we do not learn quickly, and learn to teach children,
        To be good without enchantment, without the help
        Of beautiful painted fairy stories pretending to be true,
        Then I think it will be too much for us, the dishonesty,
        And, armed as we are now, we shall kill everybody,
        It will be too much for us, we shall kill everybody.

        Questioning technology as profoundly as we must is so painful and
vertiginous I doubt we can do it.  I hope we can; but I doubt it.
        That's all for now.

Hugh Miller  University of Toronto   (416)536-4441   <HUGH@UTORONTO.BITNET> 
