The RISKS Digest
Volume 6 Issue 76

Tuesday, 3rd May 1988

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator



o Supporting data for Hersh's explanation of the KAL007 incident
Nancy Leveson
o KAL007
Steve Philipson
o USS Stark
o Ada in strategic weapon systems including nuclear attack warning
Jon Jacky
o Re: Virus protection
David Collier-Brown
o To speak of the disease is to invoke it? (Viruses)
William Hugh Murray
o Re: To speak of the disease is to invoke it? (Viruses)
Henry Spencer
o Detectability of viruses
Fred Cohen
o Info on RISKS (comp.risks)

Supporting data for Hersh's explanation of the KAL007 incident

Nancy Leveson <>
Mon, 02 May 88 19:19:11 -0700
It is interesting to consider how Hersh's explanation of the KAL007
navigation error as accidental stacks up against other experience with
navigation errors in commercial aircraft.  Hersh claims that pilot
navigation error is the most likely explanation for the KAL incident.

In a magazine called Flight Crew (Fall 1979), Arnold Reiner wrote an article
called "Preventing Navigation Errors During Ocean Crossings," in which he
reports that such errors are common.  He states:

  "During the first six months of 1978, the International Air Transport 
   Association (IATA) reported that 49 North Atlantic flights were observed
   off track in excess of 24 nautical miles. [A "gross navigation error" is
   defined as a cross track error exceeding 24 miles and must be reported
   and the pilot held accountable if observed.] ... The number of navigation 
   errors is assuredly greater than IATA statistics indicate, because at jet 
   cruising levels, VOR reception often exceeds the range of coastal radars, 
   thus permitting errant crews to regain track undetected.... During the first
   six months of 1978, 16 flights were observed off track by more than 50
   miles, while eight were spotted by coastal radars 100 miles or more off
   track.  The three greatest cross track errors were 180, 400, and 700 miles.
   Averaging the number of observed gross navigation errors into the number
   of days in the first half of 1978 yields one gross navigation error each
   3.6 days."
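As a quick sanity check on the quoted rate (my own arithmetic, not from the
article): the first half of 1978 has 181 days, so 49 observed gross errors
works out to one roughly every 3.7 days, close to the article's 3.6 (the
small difference is presumably rounding).

```python
# Sanity check of the quoted IATA figure: 49 observed gross
# navigation errors in the first half of 1978.
days_in_first_half_1978 = 31 + 28 + 31 + 30 + 31 + 30  # Jan-Jun 1978 = 181
errors_observed = 49

days_per_error = days_in_first_half_1978 / errors_observed
print(round(days_per_error, 1))  # about 3.7 days per observed gross error
```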

I believe the KAL007 flight was 250 miles off track, which is within the bounds
of previous incidents that were assuredly accidental.  I have no data to
determine whether navigation errors are more or less frequent or have a
different average size over the North Pacific as opposed to the North Atlantic.

The reasons involving pilot error given in the article (which is written as
a warning to pilots on how to avoid such problems) are of general interest
with respect to decreasing the risk of navigation errors.  They include:

  - Multiple copies of computerized flight plans (e.g., an enroute
    reclearance entered on one copy but not on the copy used to extract
    waypoint information).
  - "Present position" loading errors.  Many inertial or Omega navigation
    systems will accept a present position substantially distant from the
    aircraft's actual position without triggering a malfunction code or
    other warning.  Hersh describes a relatively common practice of
    downloading the inputs from one of the redundant computers to the other
    to save time, instead of loading each independently so that input
    errors can be detected.
  - Erroneous loading of enroute waypoints (e.g., forgetting to load tenths
    of minutes, which can produce errors of tens of miles; forgetting to
    advance the waypoint selector to the next waypoint and loading a new
    waypoint on top of one previously loaded; loading the wrong hemisphere;
    copying waypoints onto a slip of paper first and then transposing the
    digits when loading them).
  - Crews not monitoring present position or track frequently enough to
    detect significant track deviations.
  - Autopilot problems (e.g., temporarily disconnecting the autopilot to
    manually circumvent things like thunderstorms, returning to track, and
    then forgetting to reengage the autopilot Nav mode).
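The "present position" hazard noted above invites exactly the kind of
plausibility check these systems apparently lacked.  The sketch below is
purely illustrative (the function names and the 60-nautical-mile threshold
are my own assumptions, not taken from any avionics specification):

```python
import math

EARTH_RADIUS_NM = 3440.1  # mean Earth radius in nautical miles

def great_circle_nm(lat1, lon1, lat2, lon2):
    """Haversine distance between two lat/lon points, in nautical miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_NM * math.asin(math.sqrt(a))

def plausible_present_position(entered, last_known, max_jump_nm=60.0):
    """Reject an entered present position implausibly far from the last
    known position -- the warning many INS units of the era did not give."""
    return great_circle_nm(*entered, *last_known) <= max_jump_nm

# A one-degree longitude slip near Anchorage's latitude is under 30 nm
# and passes; a ten-degree slip is nearly 300 nm and is flagged.
anchorage = (61.17, -150.0)
print(plausible_present_position((61.17, -151.0), anchorage))  # True
print(plausible_present_position((61.17, -160.0), anchorage))  # False
```

Even a crude threshold like this would catch gross entry errors that
displace the position by tens or hundreds of miles.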

Although Reiner's article was written some time ago, more recent stories I
have heard do not suggest that these problems have since been eliminated.
Several of the possible explanations based on pilot error given by Hersh are
very close to those noted above as having been responsible for similar
incidents (over a different ocean).  Note that there was a recent incident
where a Continental plane was far off track over the Atlantic (and nearly hit
another plane).  It does not appear that the Continental pilot was warned by
ground controllers of his wayward course.

       [Reference: Seymour M. Hersh, "The Target Is Destroyed", 1986.  PGN]

Re: KAL007 (RISKS-6.75)

Steve Philipson <>
Mon, 2 May 88 20:10:23 PDT
    ......                                                    By the way,
   one difficulty with trying to prove a conspiracy theory is that everyone on
   the inside will deny it (which may thus seem credible), whether or not the
   theory is true.  So, you are ALWAYS AT RISK, period.  PGN]

  Really?  Given what we've been talking about with whistle-blowers, 
don't you think that the truth will leak out eventually?  At least sometimes?

>  ...             But the prima facie conclusion, in the absence of such an
> explanation, is that considerations other than safety lead the authorities to
> blame the pilot, who can not speak for himself."

   It could also be that fatal accidents are more often due to bad judgment
than non-fatal accidents.  A high percentage of "fatals" are due to the
classic "continued VFR into IMC", which translates into challenging mother
nature by scud running (trying to sneak under the clouds) and losing the
challenge.  Another major killer is what I call "gross stupidity":  flying
while drunk or on drugs, buzzing your neighbor's house, low level
aerobatics, etc.  A favorite adage of mine is as follows:

   A superior pilot uses superior judgment to avoid using superior skill.

  The worst pilot error is the one that gets you into a situation you can't
fly out of.  Maybe that's why more fatals are classified that way.

Re: Laying conspiracy theories to rest

Peter G. Neumann <Neumann@KL.SRI.COM>
Tue 3 May 88 16:20:05-PDT
With respect to whether whistle-blowers do get the true story out, it is
intriguing to consider the article by Eliot Marshall in the 22 April 1988
issue of SCIENCE — "Sverdlovsk: Anthrax Capital" — which reconsiders the
April 1979 deaths in Sverdlovsk.  The Soviet explanation involved tainted meat
resulting from anthrax in the grain feed — although official Soviet secrecy
certainly fueled the alternative theories.  According to Marshall,
``Sverdlovsk's "mystery epidemic" of 1979 lost much of its mystery this month
when a group of Soviet doctors came to the United States and met with
scientists and reporters to give a firsthand account of what happened.''  They
seem to have convinced their American counterparts that this explanation is
indeed justified.  However, Marshall quotes US Government sources that they
still believe that a germ warfare experiment was involved.  Thus, nine years
later this case is still subject to uncertainty.  [If another explanation is
in fact the correct one, it has remained hidden — at least in unclassified
sources.  PGN]

USS Stark

Tue, 3 May 88 07:31 MST
The US Congress has decided to convene hearings on the Stark incident and
possible performance failures of its computerized air-search radars.

Ada in strategic weapon systems including nuclear attack warning

Jon Jacky <>
Mon, 02 May 88 20:43:43 PDT
The following appears in Darryl K. Taft, "Ada problems attributed to
management, not language," GOVERNMENT COMPUTER NEWS, April 29, 1988 p. 55:

"The Air Force has about 34 programs using Ada (Maj. Gen. Eric B.) Nelson
said.  Among those Nelson listed the Advanced Tactical Fighter, the small
Intercontinental Ballistic Missile, the Milstar Satellite Mission Control
System and the Command Center Processing Display System Replacement program.

This last system, being developed at (Electronic Systems Division (ESD) at
Hanscom Air Force Base, Bedford Mass.) "accomplishes tactical warning and
attack assessment for this nation," Nelson said.  "Information on ballistic
missile activity headed for the United States is sent to the leaders that
make the big decisions.  Based on that system this country decides whether
to retaliate or not with our own nuclear forces," he said.

- Jon Jacky, University of Washington

Re: Virus protection

David Collier-Brown <geac!daveb@uunet.UU.NET>
3 May 88 17:07:58 GMT
| Somebody (I forget who) said,
|| To suggest that [write-protection] is 100% effective against a virus is to
|| overstate.  Studies in biology suggest that a virus can thrive even in a
|| population in which a large percentage of the members are immune, if there
|| is sufficient commerce among the non-immune members...

|    Now, think about that for 2 or 3 seconds.  If you turn on your machine,
| write-protect all the drives, run a virus unknowingly, and turn off your
| machine, you will NOT be infected by any possible virus.

I'm sorry, but you've misunderstood the statement.  The virus thrives on
other people's unprotected disks, and runs in your unprotected memory,
attempting to "infect" your machine.  If your machine is never
    1) connected to another machine, or
    2) running an unprotected disk
at the same time you use your normal disk (i.e., unprotect it to do
some work), then you are safe.  As you suggest.

But if there's a virus thriving nearby, it gets multiple tries to infect
your machine. You have to be **perfectly** consistent in protecting your
disk...  Which tends to be difficult, unless you only use a few, pre-virus
programs on a standalone machine.
  That's the point of the biological analogue.
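The "multiple tries" point is just the arithmetic of repeated exposure.  A
toy calculation (the 2% per-exposure figure is an arbitrary assumption of
mine, chosen only for illustration):

```python
# If each exposure finds your disk unprotected with probability p,
# the chance of infection over n exposures is 1 - (1 - p)**n.
p = 0.02  # assumed chance one exposure catches the disk unprotected

for n in (10, 100, 500):
    infected = 1 - (1 - p) ** n
    print(f"{n:4d} exposures -> {infected:.0%} chance of infection")
```

Even at 2% per exposure, infection becomes nearly certain over a few hundred
exposures — which is why the protection has to be perfectly consistent.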

David Collier-Brown, Geac Computers International Inc., 350 Steelcase Road,
Markham, Ontario, CANADA, L3R 1B3 (416) 475-0525 x3279 

To speak of the disease is to invoke it? (Viruses)

Tue, 3 May 88 11:03 EDT
In RISKS-6.75, Fred Cohen begins:

  >In WHMurray's recent article to this bboard, I hear the same sounds
  >I have heard for years when attempting to discuss computer viruses
  >in an open forum. To speak of the disease is to invoke it.

I admit to a certain amount of ambivalence on this issue.  I believe that there
is some risk of turning a vulnerability into a problem by talking about it too
much.  There is an undeniable phenomenon of copy-catism in society.  Serial
killers clump in time.  So do teen suicides.  There is also a tendency in our
society to glorify the perpetrator of a crime and stigmatize the victim.

The computer virus is different from the natural virus.  The incidences of
natural viruses are independent of what we say about them; the incidences of
computer viruses are not.

Now I make my living advising my clients on how to keep the computer safe, how
to use it to protect its contents, and how to use it safely.  I have a
responsibility to them and to the public at large to understand the nature and
size of this risk and to advise them accordingly.  I also have a responsibility
not to make the problem worse.

I am caught in a double bind.  We are collectively caught in a double bind.  To
deny the vulnerability may make the problem worse; to talk about it may make it
worse as well.

All that having been said, I come down on the side of truth telling.
Collectively we have made that decision.  We call the decision democracy.  It
is the decision that given the truth, collectively and most of the time, we
will make the correct judgements, and at least collectively, behave in our own
self interest.  So far it seems to have worked, even in the face of lies and
liars (of which viruses and their perpetrators may be among the more benign).

Specifically, I support the right and responsibility of Fred Cohen to speak on
this subject in public forums, however his opinions may agree or differ from my
own.  I oppose the kind of protective government, however well intentioned,
that believes that bureaucrats have the responsibility or the ability, to
protect us from our own errors.

My perception of the truth is that, so far, we have a vulnerability rather than
a problem.  It is the threat to public confidence, rather than the threat to
individual systems, that is the issue.  The perpetrators of viruses are, at
best, experimenting and, at worst, playing with powers beyond their ken or
control.

William Hugh Murray, Fellow, Information System Security, Ernst & Whinney
2000 National City Center Cleveland, Ohio 44114                          
21 Locust Avenue, Suite 2D, New Canaan, Connecticut 06840                

Re: To speak of the disease is to invoke it? (Viruses)

Tue, 3 May 88 13:58:17 EDT
> ... Imagine how bad the
> virus situation would be 20 years from now if we didn't find out about
> it now! We would have cars that could be infected, automated airliners
> waiting for an accident to happen, automated defense systems that
> could strike individuals dead directly from space, all existing in an
> environment without integrity.

Mmm, I would be inclined to consider this an example of the "Floppy Disk
Fallacy" ("my PC uses floppy disks, so obviously professional programmers
working on Crays must use floppy disks").  Not everyone is as casual about
security as the PC crowd.  Although there are reasons to worry about the
safety of automated airliners and military systems, virus infection is
not plausibly one of them.  In the aerospace-software community, I am told,
it is not unheard-of to verify the *binaries* manually to make sure they do
the right thing, because the compilers are not fully trusted.  Although
these folks are thinking about programming errors rather than viruses, they
already care seriously about integrity.  (Whether they care *enough*,
especially when commercial pressures get serious, is a different issue.)

People doing life-critical work probably should take some precautions.
But quivering in fear that MSDOS viruses will infect airliners is like
quivering in fear of hackers dialing up NORAD's computers and starting
World War III (when in fact NORAD's computers simply do not *have* dialup
access, because those people take security seriously and always have).

Henry Spencer @ U of Toronto Zoology {ihnp4,decvax,uunet!mnetor}!utzoo!henry

Detectability of viruses

Fred Cohen <>
3 May 88 00:20:33 EDT (Tue)
I am Fred Cohen, and I said it is undecidable whether or not a program is
a virus, and that it is therefore impossible to detect all viruses and not
detect any non-viruses in finite time with a computer that obeys the Turing
model of computation.  I did not say I could "detect" all viruses, but that if
we decided that all programs were suspect, we could surely detect all viruses.
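Both halves of that claim can be sketched.  The following is my own
illustration of the standard argument (modeled on the halting-problem
diagonal), not code from Cohen's papers:

```python
def paranoid_detector(program: str) -> bool:
    """Cohen's 'all programs are suspect' policy: flag everything.
    It detects every virus (no false negatives) at the cost of
    flagging every harmless program (maximal false positives)."""
    return True

# Why a perfect detector cannot exist: suppose is_virus(p) always
# answered correctly in finite time.  Then the program
#
#     def contrary():
#         if is_virus(contrary_source):
#             pass        # behaves benignly
#         else:
#             spread()    # behaves as a virus
#
# is a virus exactly when is_virus says it is not -- a contradiction.
# So any finite-time detector must err in one direction or the other.

print(paranoid_detector("print('hello')"))            # True: false positive
print(paranoid_detector("copy self to other disks"))  # True: true positive
```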

P.S. Write protecting hard disks only protects them from modification and thus
infection over the period of their write protection. It does not prevent other
infections that may occur to other parts of the world that can remember. - FC

Detectability of viruses

Peter G. Neumann <>
Tue 3 May 88 11:30:40-PDT
By the way, Fred's message in RISKS-6.58 begins, 

  "We can detect all viruses, but cannot decide whether or not a program is
   a virus."

Although I don't think either one of us misled anyone, I'm sorry for any
confusion.  PGN
