The RISKS Digest
Volume 7 Issue 17

Friday, 8th July 1988

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Politics and Risk
Gary Chapman
Iranian Airbus ([mis]quotation from the SFO Chronicle)
David Parnas
Re: Iranian Airbus and the "facts"
Sue McPherson
Threshold probability for declaring a radar blip "hostile"
Clifford Johnson
Iran Airline Incident and meaningful real-time data
Chris McDonald
A320 Airbus: Air conditioning; monitoring traffic control; F-14s
Steve Philipson
Iranian Airbus Blame?
Chaz Heritage
Re: "The target is destroyed."
Henry Spencer
An epilogue to this issue
PGN
Info on RISKS (comp.risks)

Politics and Risk

Gary Chapman <chapman@csli.stanford.edu>
Thu, 7 Jul 88 16:27:06 PDT
I would like to respond to the contributors who suggested that we "leave
politics out of" this forum, and stick to "technical" subjects.  These comments
were in response to a contribution from someone at the University of Toronto,
who was highly critical of the use of computer-based electronic systems like
the one that was involved in the Iranian airliner tragedy. I for one found his
first RISKS posting on that subject very cogent and appropriate. 

If this forum is to address the risks of technical systems, it seems highly
artificial and misleading to eliminate consideration of the political
environment the potentially risky system may be in.  RISKS readers should
certainly be spared pure political harangues or ideological tirades, but there
is no reason why we cannot consider a technical system's political environment
part of the mix of ingredients that may make it risky.  Indeed, for some
systems it is almost exclusively the combination of the technical character of
the system and the political environment within which it is meant to work that
constitutes the risk.  The Strategic Defense Initiative would pose considerably
less risk, perhaps even close to zero, if it did not have to cope with hostile
nuclear warheads (and of course the entire raison d'etre of this system is
bound up with politics).  The technical reliability of Navy shipboard radars
and other sensors is of paramount importance precisely because they are
operating in a combat zone.  If the system's reliability, or unreliability, has
large and grave implications for the political environment in which it is
working, it should be within the purview of engineers and technologists to
question the wisdom and prudence of introducing the system into such a
political context.  This is simply analogous to a maker of matches suggesting
that his product not be used around gasoline or dynamite without due caution.

Extrapolating from that, it is certainly part of a comprehensive assessment of
risk to ask questions about why we want whole classes of technical systems,
such as increasingly automated weapons, weapons that destroy things faster or
more thoroughly, etc., when these technical capabilities entail not only risk
in the conventional sense of malfunction, but "social risk" in helping to usher
in a nightmarish world that our children will probably regret.  It is
legitimate consideration of this "social risk" that many engineers and
technologists avoid like the plague--it is labelled "emotional," "irrational,"
"not technical," etc.  It is astonishing that this is so widespread among
technical professionals when the "social risk" of so many technologies is so
readily apparent in our age, and appears to be getting generally worse instead
of better.

During the war in Vietnam, the United States Air Force had virtually
unchallenged air superiority, and we dropped three times more bomb tonnage on
Southeast Asia than was used by all the powers in all the theaters of World War
II.  This almost unimaginable destruction didn't do much of anything in terms
of winning the war or even furthering short-term American military objectives.
The North Vietnamese still won the war, despite devastation the contemplation
of which would make many people permanently catatonic (for a description of
this madness, read James William Gibson, *The Perfect War:  TechnoWar in
Vietnam*, Atlantic Monthly Press, 1986, Part III, "Death From Above").  The
Pentagon Papers revealed that the Air Force knew the saturation bombing was
having no effect on the level of resistance of the enemy.  The Air Force knew
this, and yet recommended *escalation* of the bombing throughout the remainder
of the war.

Was this a technical problem of not hitting the right targets, or not working
out the right pattern of bombing sorties?  Was this the "correct" decision by
Air Force commanders who were given a job to do with a certain tool?  As the
saying goes, "if the only tool you have is a hammer, all problems begin to look
like nails."  If we give Air Force commanders lots of big B-52s and lots and
lots of 500 and 1,000 pound bombs, should we blame them for attempting to turn
most of Southeast Asia into a parking lot?

Or was this a "social risk" of the technology, an atrocity in which all people
are implicated, technologists and "end users" alike?  Can the technology itself
have a role in creating the political context in which the technology itself
rockets risk over any scale we had previously imagined?  This is what happened
with nuclear weapons, and now it's happening with a whole spectrum of
technologies that are becoming increasingly refined and increasingly deadly.

It's pointless and even distasteful to address these issues in purely technical
terms.  To address risk in that fashion is to limit oneself to "tweaking"
systems that may be fundamentally wrongheaded to begin with, technological
systems that, at their core, are bad--not just risky--for humanity.  Some may
think that this adds up to an "emotional" appeal, but there's nothing
inherently wrong with that.  It takes people with emotion, and not just a
facility for algorithms, to recognize risk and to do something effective about
it.

Gary Chapman, Executive Director
Computer Professionals for Social Responsibility      chapman@csli.stanford.edu


Iranian Airbus ([mis]quotation from the SFO Chronicle)

<parnas%QUCIS.BITNET@CORNELLC.CCS.CORNELL.EDU>
Thu, 7 Jul 88 12:42:50 EDT
One of the pleasures of not reading all the major U.S. papers is that I don't
usually see articles that misquote me.  The text by David Perlman that you cite
is quite inaccurate.  The first statement is a paraphrase: it captures my
thoughts, but not my words.  I said that it was ludicrous to see people who are
unable to tell a U.S.-made F-14 from a French-made Airbus claiming that they
will be able to write trustworthy software that will discriminate between
Soviet warheads and Soviet decoys designed to look like warheads.  The final
quote in the article is not even an accurate paraphrase.  I said that a system
designed primarily as a defense against low-flying, high-speed missiles should
not be expected to discriminate between different kinds of aircraft.  If the
crew used the system in that way, i.e., if they assumed that the warning "threat"
meant that the system "knew" it was tracking a military aircraft, then the
crew made an error.  I also told that reporter that we had far too little
information to make any judgement on the failure of such systems.  We do not
know which data were used by the crew in making their decision.  The Aegis
missile defense software might not even have been involved.

Dave


Re: Iranian Airbus (RISKS-7.16) and the "facts"

Sue McPherson <munnari!murdu.oz.au!sue@uunet.UU.NET>
Fri, 8 Jul 88 15:14:23 EST
Over the last few days we've had a lot of emotional discussion about the
shooting down of the Iranian Passenger Plane. All in all it has been about as
informative as the discussions last year on KAL007 - that is - not very.

On the one hand we have Hugh Miller, who believes that the papers have told him
that technology has fouled up again.  On the other we have Michael Mauldin,
who thinks that the Captain made an understandable mistake and it was the
Iranians' fault.  Then there is Bob Estell, who thinks his old buddy is one of
the good guys, so it must be the Iranians' fault.  But I can't help thinking that
the Navy doesn't have such a good reputation these days for employing honest
and reliable people.  Finally we have Jim Anderson, who thinks we should
nuke the lot of them - er, sorry, I mean shoot anyone within range.

Each of these people seems to be quite sure that they have the "facts", yet
the "facts" seem to be quite contradictory.  Just how trustworthy is the
"Pittsburg Post"?  (Who got it from the "New York Times", who got it from
.....)  It seems that we are very reliant on the media (newspapers & TV), yet
we have no way of authenticating the information presented, or even penalising
them when they make a mistake.  Interpretation of the "facts" is another risky
area.  A good example is the reported "fact" that the 707 has a cruising speed
of 325 knots; while this may be the truth, it is not the "whole truth", as
there are other factors (such as wind speed) which could enable the plane to
travel at a much greater ground speed (i.e., the speed that would be observed
by the US ship).
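
To make the wind arithmetic concrete, here is a minimal sketch (the 60-knot
tailwind is purely hypothetical, chosen for illustration) of how the wind
component along the aircraft's track inflates the ground speed a surface
observer would measure:

    # Illustrative sketch: ground speed is true airspeed plus the wind
    # component along the aircraft's track (all numbers hypothetical).
    import math

    def ground_speed(true_airspeed_kt, wind_speed_kt, wind_angle_deg):
        # wind_angle_deg is the angle between the wind vector and the
        # aircraft's track: 0 degrees means a direct tailwind.
        return true_airspeed_kt + \
               wind_speed_kt * math.cos(math.radians(wind_angle_deg))

    # A 707 cruising at 325 knots true airspeed with a direct 60-knot
    # tailwind crosses a surface radar screen at 385 knots.
    print(ground_speed(325, 60, 0))   # 385.0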

Both the "shootdown" and the ensuing discussions both prove the point that 
the biggest RISK we take, is in believing what we are told - whether the
information comes from the latest/biggest/most expensive radar system or the
"Pittsburg Post" there are no 100% guarantees that it is correct or even
complete.

Sue McPherson                    sue@murdu.mu.oz


Threshold probability for declaring a radar blip "hostile"

Clifford Johnson <GA.CJJ@Forsythe.Stanford.EDU>
Thu, 7 Jul 88 00:54:41 PDT
I have a couple of observations re the Iran shootdown, and the
big question, did Captain Will Rogers have sufficient cause to
order it, as JCS Chairman Crowe loyally contends?

My first observation is that this claim seems premature, since as yet no one
knows what information Will Rogers based his so-called decision (or should it
be called a computer-prompted reflex?) upon.  Suffice it to say that if, as
is reported, he did not bother to monitor the airfield control tower; and
if, as is reported, Mode C (civilian) transponder returns were received by the
ship; and if, as reported, the plane was flying at the speed of a commercial
flight; and if, as reported, an F-14 is in any case no real threat to his
ship; then on its face he was taking an unnecessary 50-50-ish gamble (not
unreasonably characterized as a "panic") as to whether the radar blip was
really hostile or innocent.  Also, was it not negligent not to monitor the
control tower, as is reportedly standard practice for non-U.S. ships in the
Gulf?

But the question I really want to raise is, at what perceived "odds" — 50-50,
60-40, 90-10? — does a commander have "sufficient cause" to "declare" a
radar blip, that might be hostile or might be a commercial flight, officially
"hostile," so as to shoot it down?  Judging by Admiral Crowe's immediate
approbation, it seems he thinks that almost *any possibility* that a flight
might be hostile is sufficient to order a shootdown, by virtue of the
commander's "heavy obligation" to protect his "personnel and equipment."
Judging by Will Rogers' simple explanation that he thought it was a hostile
F-14, vague odds above 50-50 seem to be enough.

I would have thought that moral considerations would lead the military to
value civilian lives above their own (isn't that what they are supposed to be
guardians of?), and so the chances would have to be way *over* 50-50 to be
sufficient to order a shootdown?  Does anyone on the net think that odds of
51-49 or less are sufficient?  A natural follow-on question is this:  given
the shortness of the response time, what could be the best odds attainable in
a realistic attack scenario, even assuming the best computer technology the
United States could field?  Perhaps the degree of certainty simply cannot be
high enough to justify a shootdown in any circumstances?
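
To put the threshold question in formal terms: a textbook expected-cost
decision rule (offered purely as an illustration; nothing here is attributed
to Aegis, to Navy doctrine, or to any actual rules of engagement, and all cost
figures are hypothetical) fires only when the estimated probability of
hostility exceeds a break-even point fixed by the relative costs of the two
possible errors:

    # Textbook expected-cost decision rule; all costs hypothetical.
    # Fire iff  p * cost_miss > (1 - p) * cost_false_alarm,
    # i.e. iff  p > cost_false_alarm / (cost_false_alarm + cost_miss).

    def threshold(cost_miss, cost_false_alarm):
        # Break-even probability above which firing has lower expected
        # cost than holding fire.
        return cost_false_alarm / (cost_false_alarm + cost_miss)

    # Equal costs reproduce the 50-50 threshold implicit in the quoted
    # explanations; valuing the downing of an innocent airliner at ten
    # times the cost of absorbing an attack demands p > 0.909.
    print(threshold(cost_miss=1.0, cost_false_alarm=1.0))    # 0.5
    print(threshold(cost_miss=1.0, cost_false_alarm=10.0))   # 0.909...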


Iran Airline Incident and meaningful real-time data

Chris McDonald STEWS-SD 678-2814 <cmcdonal@wsmr10.ARPA>
Thu, 7 Jul 88 8:01:15 MDT
It seems to me that Martin Minow's comments on the danger of drawing
conclusions in the absence of facts hit the mark.  If I might pursue that line
a little farther, I wonder if the Commander of the US ship was not forced to
make a decision because he ultimately did not receive the most accurate or
meaningful data.  For example, neither the news media nor any official
commentator has mentioned what data the AWACS planes, which stage out of
Saudi Arabia, did or did not see regarding the Iranian aircraft.  From all
published reports and publicity releases it seems likely--assuming that an
AWACS was airborne at the time of the incident--that the AWACS detected the
takeoff of the aircraft and may also have monitored signal communications
between the aircraft and ground controllers.  It also seems likely, given the
intelligence collection capabilities of several countries in the Gulf, that
other sources would have recorded the same information.

If these sources were available, then it seems logical to ask why one could not
discriminate between a fighter aircraft and a commercial airliner.

If these sources were not available, then it seems reasonable to ask why not.


A320 Airbus: Air conditioning; monitoring traffic control; F-14s

Steve Philipson <steve@aurora.arc.nasa.gov>
Wed, 6 Jul 88 21:00:43 PDT
Munnari!mulga.oz.au!lee@uunet.UU.NET (Lee Naish) asks:

> Though the A320 Airbus has redundant computer systems, they all use the
> same air conditioning system.  Does anyone know what the expected
> failure rate of that system is, or how critical a failure would be?

   Jet aircraft air conditioning is derived from the jet turbine's
high-pressure bleed-air system and/or auxiliary power unit (APU).  Air
conditioning is thus available whenever an engine is running.  If both engines
fail, the crew may be able to use the APU in flight (some aircraft can only
operate the APU on the ground), but this is probably not significant vis-a-vis
instrument overheating, as the aircraft probably won't be flying long enough
for the equipment to overheat.  Computer system fans can fail, but there are
usually several of these per machine, and they are not highly critical items.
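
Lee Naish's redundancy question can also be made quantitative.  A minimal
sketch (all failure rates are hypothetical, invented for illustration; they
are not A320 figures) of how a shared cooling dependency caps the benefit of
triplicated computers:

    # Hypothetical per-flight-hour failure probabilities, for illustration.
    p_computer = 1e-4   # one independent computer channel failing
    p_cooling  = 1e-6   # the shared air-conditioning source failing

    # Three independent channels: all must fail to lose the function.
    p_triplex = p_computer ** 3          # 1e-12

    # A common cooling failure takes out all three channels at once, so
    # the shared dependency dominates the combined failure probability.
    p_system = p_triplex + p_cooling     # about 1e-6

    print(p_triplex, p_system)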


minow%thundr.DEC@decwrl.dec.com (Martin Minow THUNDR::MINOW ML3-5/U26 223-9922)
writes:

> that nobody on the Vincennes was monitoring tower-plane radio communications.
>(And the vague suspicion that there wasn't anyone on the ship fluent in Farsi.)

   You don't really propose that every ship monitor every aviation and
marine frequency within strike distance, do you?  That would be at least
dozens and perhaps hundreds of frequencies.  Also, it is very unlikely that
tower transmissions would carry 20 miles to a receiver on the surface.

   One should note that the international language for air traffic control is
English.


Jonathan Crone <CRONEJP%UREGINA1.BITNET@CORNELLC.CCS.CORNELL.EDU> writes:

>    ...  Grumman designed the F-14 to
> support the Navy's requirement for a powerful Air Defense Fighter.

> ... the F-14 has very limited Air to Ground capabilities...

> So perhaps the big question is, why are they saying that they
> were worried about the possibility of an attack from an F-14?

   Iran wasn't supposed to have Silkworm missiles either.  It must have
really surprised the first few captains whose vessels were hit by them.
Although the Iranians aren't exactly technical wizards, it is possible
that they could have configured an F-14 for air-to-surface operations.
Would you risk your crew to save one belligerent opponent who has 
equipment of unknown capabilities?

   Fred Arnold, a WWII P-38 pilot and author, noted that he always gave a wide
berth to Allied ships, as the policy was that it was better to shoot down one
friendly aircraft by mistake than to lose a ship.
It seems that the policy is still in force today.


Iranian Airbus Blame?

<"chaz_heritage.WGC1RX"@Xerox.COM>
7 Jul 88 09:31:26 PDT (Thursday)
RISKS Digest 7.16 consists almost entirely of speculation and expressions of
personal view about the shooting down of an Iranian civil airliner by the USS
Vincennes.

Where are the hard facts? Where are the Flight Data Recorders? Where is the
Cockpit Voice Recorder? Where is the data logger tape of the activity of the
Aegis system aboard USS Vincennes at the time of the incident? When will the Log
of the USS Vincennes be produced? Who will make known the Rules of Engagement
under which the USS Vincennes engaged its target? Who drafted those Rules of
Engagement? Which authority in Iran is responsible for permitting the aircraft
to fly through the area from a military base during a period of military action?
Who in Iran is prepared to deny that the incident, tragic though it may be in
human terms, is politically extremely convenient for Iran? Who in Iran is
prepared to deny that fundamentalist Islam might well regard the 'martyrdom' of
the Airbus passengers as justified if it facilitates political victory over 'the
Great Satan'?

It must be entirely unjust to assign blame to any party (or system) in the
absence of admissible evidence. The fact that the unfortunate Captain of the USS
Vincennes felt it necessary to make an immediate personal statement accepting
full responsibility for the incident strongly suggests to me that he does not
expect such evidence to be forthcoming. It is shameful that this officer, who
has clearly done nothing but his duty, should have been placed in such an
invidious position so soon after the incident in question.

Still more shameful is the clearly preprogrammed response of the broadcast
media. We now have another addition to the 'Us and Them' Glossary:

USSR shoots down airliner = 'massacre'
USA shoots down airliner = 'tragedy'

Under such circumstances further Iranian political gain seems inevitable.

Chaz Heritage


Re: "The target is destroyed."

<mnetor!utzoo!henry@uunet.UU.NET>
Thu, 7 Jul 88 15:14:11 EDT
> "...So from now on it's hair-trigger 24 hours a day...  Shoot first and
> ask questions later.  The hell if I'm gonna be the next one to lose his 
> Florida retirement condo to keep Marconi's rep clean."  I can't find it in
> my heart to blame the man, either...

Nor can I, for a different reason.  We're seeing yet another manifestation
of the everything-should-be-absolutely-safe-and-if-it's-not-then-somebody-
has-been-negligent syndrome.  For heaven's sake, has everyone forgotten that
(ignoring the political hairsplitting and sticking to the pragmatic facts
of the situation) there is a *WAR* underway in that airspace?!?

In a war zone, a bias toward shooting first and asking questions later is
normal for anybody with any desire to survive.  Wars are confused.  The
"to shoot or not to shoot" decision often has to be made with inadequate
information.  A "wait and see" decision is *very* hazardous to your health.
"Own goals" — shooting down your friends — are normal in a war; the most
one can do is try to reduce the frequency.

The fact is, when you take an airline flight through an area where a
missile war is in progress, nobody in his right mind is going to expect
that flight to be risk-free.  You won't find me in an airliner anywhere
near the Persian Gulf without an awfully good reason.  Anyone who got on
that flight as if it were a normal peacetime flight was either misinformed
or crazy.  Trying to be an innocent bystander to a war while standing in
the middle of it is a damned risky project.  The Stark incident evidently
made the US warship captains aware of this.  It's too bad the Iranian
airline passengers had to learn the facts of life the hard way, but the
people who fired those missiles cannot really be blamed for it.

Henry Spencer @ U of Toronto Zoology   {ihnp4,decvax,uunet!mnetor}!utzoo!henry


An epilogue to this issue

Peter G. Neumann <neumann@csl.sri.com>
Fri, 8 Jul 88 15:42:11 PDT
For those of you who have waded through the past three issues of RISKS, I
received MANY MORE contributions on this subject that have not been included
here.  This subject has generated a high level of interest, despite the lack
of hard facts — or perhaps precisely because of that lack.  Yes, we should
try to avoid rampant speculation, although it is important that we try to
understand the conflicting factors in the absence of any definitive reports.
(Several messages with rather wild rumors and speculations are omitted.)
But the awaited ``definitive reports'' often turn out to be less than
satisfactorily definitive (as in the case of KAL 007).  Difficult
situations are in general not black and white, and the evidence is often not
very precise — if present at all.

I make no claims of infallibility.  I sometimes err on the side of openness
by including questionable material in RISKS.  I prefer that to cloture.
However, I think Gary's point at the top of this issue is very significant
-- on the inherent difficulties in decoupling politics and technology, and
indeed the dangers in trying to do so.  

I sometimes also err by rejecting a contribution that deserves to be heard,
but then I usually succumb to appeal.  The Iranian Airbus incident has
overwhelmed me with highly overlapping and speculative material.  If one of
your contributions overlaps one that is here but makes an important point
still unsaid, please excerpt, rewrite, and resubmit.
