The RISKS Digest
Volume 7 Issue 21

Tuesday, 12th July 1988

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

$54.1 million embezzlement foiled
Dave Curry
Aegis
Dave Curry
Iran Air Incident
Bob McKay
"Binary thinking" misses a lot
Bob Estell
Automatic Air Traffic Control
Eldred
Aviation units of measure
Joe Morris
Mouse trap
James H. Coombs
Threshold probability for declaring a radar blip "hostile"
Mike Wellman
Clifford Johnson
Info on RISKS (comp.risks)

$54.1 million embezzlement foiled

Dave Curry <davy@intrepid.ecn.purdue.edu>
Tue, 12 Jul 88 11:22:11 EST
System Crash Foils Swiss Bank Theft     (Information Week, July 11, 1988)

   Only a chance system crash prevented an attempted computer crime from
becoming Britain's, and possibly Europe's, largest recorded theft.  A
manager at the London branch of the Union Bank of Switzerland issued an
instruction to transfer 82 million Swiss francs ($54.1 million) to a branch
of Credit Suisse in Nyon, a small town near Lausanne.  The payment
instruction was sent via the Swift international interbank network, which
handles nearly a million payment messages per day.
   A computer breakdown at the Swiss end apparently forced the bank staff to
make manual checks of payment instructions that would normally be processed
automatically.  Suspicions were aroused, and Swiss police were waiting to
pounce on the man who arrived to collect the cash.  Two men have been
arrested in Switzerland, in addition to the London-based, British employee
of UBS.
   Swift officials in Brussels emphasized that the security of the network
had not been compromised, and that what happened was not strictly a computer
crime.  Genuine computer crimes, in which a secure operating system is
breached by an outsider to effect fraudulent transactions, are thought to be
rare.  [sounds like a rather "convenient" definition to me... -Dave]
   More common, and more worrisome to large financial service companies, are
cases in which fraudulent transaction instructions received on paper are
entered into systems as if they were genuine.  The Swift network then has no
way of knowing it is carrying a fraudulent transaction.
   Protection against these crimes relies on the fact that each transaction
is supposed to be ratified by a number of people in different locations.
Collusion among several managers would be required for these frauds to
succeed.  In this case, however, it appears that security at UBS was
inadequate: It should not have been possible for one man to enter a
fraudulent transaction of such high value [meaning that he should have
been able to enter one of lesser value? -Dave] into the Swift network even if
it appeared to have come from a genuine telegram ordering payment.

                    - Philip Hunter, London

--Dave Curry, Purdue University


Aegis

Dave Curry <davy@intrepid.ecn.purdue.edu>
Tue, 12 Jul 88 11:22:11 EST
Aegis System At The Heart Of Vincennes Investigation
(Information Week, July 11, 1988)

   Unanswered questions about the Persian Gulf engagement in which the U.S.
Navy cruiser Vincennes shot down an Iranian jetliner have focused attention
on the Navy's Aegis automated weapon system.  Aegis is widely considered one
of the most sophisticated uses of automation in the armed forces.  Vice
Admiral Joseph Metcalf III, Deputy Chief of Naval Operations, has referred
to it as "Star Wars at sea."
   Part of Aegis' uniqueness is that, like SDI, Aegis is more a concept than
a specific weapon system.  It was developed by RCA's Missile and Surface
Radar unit under a contract to synthesize a weapon system capable of
fighting air, surface, and submarine threats simultaneously.
   Aegis combines input from four phased-array radars (which employ hundreds
of tiny radar beams to scan the entire horizon without causing the gaps in
surveillance created by rotating dish radars) - using UYK-7 computers
manufactured by Unisys - to create a graphic display of all air, surface,
and subsurface targets on video screens in the ship's combat information
center (CIC), including information on each target's speed and direction.
The radars scan an area that reaches out in all directions from the ship in
the shape of a bowl or shield, hence the name Aegis.
   The ship's sensors also track whether the target is friend or foe,
neutral, or assumed friend or foe, based on visual identification or
encrypted electronic signals.  All of this information - tracks, data,
displays - and almost all communications in or out of the ship's CIC is
automatically recorded in Aegis' computer.
   This is the data Navy investigators will examine when they attempt to
discover what went wrong on the Vincennes.  Although Secretary of the Navy
John Lehman has referred to Aegis as "the most carefully tested combat
system ever built," the system was deliberately designed for human input -
VISUAL IDENTIFICATION OF TARGETS, OPERATOR ASSIGNED FRIEND-OR-FOE STATUS
[emphasis mine], and operator selected targets - and is, thus, subject to
human error.
                    - Christopher Hord


Iran Air Incident

Bob McKay <munnari!cs.adfa.oz.au!rim@uunet.UU.NET>
Tue, 12 Jul 88 13:25:50 EST
Much of the commentary on this incident treats it as 'a land far away in a time
long ago'; it's not - it's an immediate question not just for Iran, but the
whole of SE Asia and Australasia: ALL the traffic between here and Europe
overflies the Gulf, much of it staging through Bahrein or Dubai. Until now that
hasn't worried me much - a 747 at 25,000 plus feet ought to be safe, so I
thought.  But now I wonder - could a DC10 on its final descent into Dubai on an
unpredicted track - perhaps avoiding a storm - look like an attack on US
vessels? What if an Iranian Mirage had accidentally crossed its track earlier?
Even scarier, what if that Mirage deliberately followed it in? Seems like a
reasonable way to get closer to the task force vessels. Improbable? So is the
present story on mistaking an Airbus for an F14.  The point is, this was not
just an unlucky worst case event; it was actually one of the better possible
scenarios from the US point of view - imagine if it had been an Air India or
Qantas jet that was downed.  The US forces in the Gulf seem to be in a
virtually impossible situation.  They cannot afford to assume the best - that a
target is innocuous until proven otherwise; they also cannot afford to assume
the worst - that a target should be blasted unless proven friendly. On the
other hand, the airlines involved don't have much alternative either as there
are no other corridors available. Further disasters don't seem unlikely from
here.


"Binary thinking" misses a lot

"FIDLER::ESTELL" <estell%fidler.decnet@nwc.arpa>
12 Jul 88 08:04:00 PDT
I believe we've previously discussed the "RISKS of 'binary' thinking" in
this forum; i.e., recognizing only dichotomies - one/zero, right/wrong ...  
Our computer thinking - which we sometimes use to debug code - 
encourages that; ditto our legal system.

In defense of CAPT Will Rogers, I spoke of tragedy; I'd bet that Will
agrees with that conclusion.  I'd also guess that he would agree with
George Will ["This Week with David Brinkley", ABC, Sunday 10 Jul 88]
that his actions were "... morally defensible under the circumstances;"
BUT that he would not dispute Sam Donaldson's point [ibid.] that the
outcome was NOT morally desirable.

Some readers of this journal did not see me blame the USN, so they assumed
that I found the "fault" with the Iranians.  In fact, I found no fault;
if I had, I think I would have suggested we debate the methods [at least,
and perhaps motives, too] of those who put USN warships in "harm's way"
without bothering to declare war.  If others want to pursue that debate,
then I suggest we move it to ARMS-D.  [arms-d@xx.lcs.mit.edu]

In RISKS, I suggest we have a hard look at the generic problem of doing systems
- offensive, defensive, process monitoring, banking ... that have
historically used "binary logic" when they probably ought to evolve
into "expert systems" that use NOT ONLY classical logic [e.g., Boolean] BUT
ALSO [occasionally] "fuzzy logic."  This discussion might quickly spread to
include "neural network" kinds of logic, in which the failure of several bits
[gates] has only slight impact on the outcome, perhaps not even noticeable.
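
To make the contrast concrete, here is a minimal sketch of the two styles of
reasoning.  The language is Python, and the threshold, membership bounds, and
speeds are all invented for this illustration; no real threat-assessment
system is described.

# Illustrative only: a "binary" rule versus a graded "fuzzy" one.
def binary_hostile(closing_speed_kts):
    # Classical dichotomy: one hard threshold, and the answer is yes or no.
    return closing_speed_kts > 400

def fuzzy_hostile(closing_speed_kts):
    # Fuzzy membership: a degree of "hostility" in [0, 1] that rises
    # smoothly between two soft bounds instead of snapping at one value.
    lo, hi = 250.0, 550.0
    if closing_speed_kts <= lo:
        return 0.0
    if closing_speed_kts >= hi:
        return 1.0
    return (closing_speed_kts - lo) / (hi - lo)

for v in (200, 390, 410, 600):
    print(v, binary_hostile(v), round(fuzzy_hostile(v), 2))
# The binary rule flips abruptly between 390 and 410 kts; the fuzzy grade
# changes gradually, so an expert system could weigh it against other graded
# evidence before committing to a one/zero decision.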

Bob


Automatic Air Traffic Control

<eldred@apollo>
Mon, 11 Jul 88 20:52:37 EDT
Considering the discussion about the Iranian A300 incident, I wondered about
the implications of current efforts to automate air traffic control.  If the
Iranian ATC and the US Navy Aegis system had been fully automated, might the
chances of such an unfortunate incident have been different?
This article tells of current US plans in that area:

FAA BEGINS PLAN TO FULLY AUTOMATE ATC FUNCTIONS
[ from Aviation Week & Space Technology, June 27, 1988, p 73 ]
[ used without permission ]

SALT LAKE CITY--FAA Administrator Allan McArtor has begun a three-phase plan
to progressively automate air traffic control, eventually handling most
functions automatically and limiting human controllers to supervisory and
emergency roles.

In a speech here dedicating the last of the FAA's 20 Host ATC computer
systems, McArtor said the automated network would rely on computers and
satellite tracking to choose the best, safest and most fuel- and
time-efficient routes for aircraft.  It would also space them more
efficiently than human controllers can.

The agency already is working on Phase 1 of the automated en route air
traffic control system (AERA) plan, McArtor said.  When functional, this
computer software upgrade will allow controllers to evaluate routes
requested by pilots for potential conflicts with other aircraft, prohibited
airspace and flow control restrictions.

The second phase, scheduled to be operational by the late 1990s, would give
controllers several solutions to traffic problems.  Any route chosen by the
controller would be communicated automatically to the aircraft by digital
data link.

The final part of the AERA plan--and admittedly the most ambitious, McArtor
said--would upgrade air traffic control software to allow totally automatic
air traffic operations.  Computers would detect and resolve traffic control
problems, make decisions, and offer clearances to aircraft without human
intervention.  However, air traffic still would be supervised by humans, 
McArtor said....

Satellite tracking and communications technology will be key to the AERA
effort and future ATC system modernization, McArtor said.


Aviation units of measure

Joe Morris (jcmorris@mitre.arpa) <jcmorris@mitre.arpa>
Mon, 11 Jul 88 21:32:15 EDT
Discussing the question of the altimeter display in the Airbus involved in the
flyby crash in France in RISKS 7:20, Henry Spencer comments:

> Does the Airbus model in question display altitude in feet or meters?
>    [Question raised regarding whether the Air France Airbus was at 30
>    feet or 30 meters...]
>
>Unless I am greatly mistaken, in feet.  ...

At the risk of a case of foot-in-mouth disease (since I have no experience
flying as a crewmember in Europe), my Jeppesen manuals (flying charts)
seem to contradict Henry's comment.  The following is extracted from the
Jeppesen J-Aid, "Tables and Codes" section, pp. 27-30 (dated 26 February 88):

UNITS OF MEASUREMENT TO BE USED IN AIR AND GROUND OPERATIONS

So that there would be no misunderstanding as to what units of measure (such
as metres or feet) were used in each country, the International Civil Aviation
Organization (ICAO) provided a recommended table from which countries could
choose either the table labeled "ICAO" or the table labeled "Blue".

In 1979, ICAO revised Annex 5 and replaced the "ICAO or Blue" choice with
"International Standard" (SI) and non-SI.  
[...]
[excerpts from the definition matrix:]

  Measurement of:           ICAO             Blue                SI      non-SI
  ---------------           ----             ----                --      ------

  Distances used in         Nautical miles   Nautical miles      km       nm
  navigation reports        and tenths       and tenths

  Relatively short          metres           metres              m
  distances

  Altitudes, elevations     metres           feet                m        ft
  and heights

[...]
Dimensional units to be used in air/ground communications applicable for
the following countries or FIRS: [excerpts]

France:         ICAO (8) (60)
Switzerland:    SI/non-SI
United Kingdom: Blue (63)
United States:  Blue (33)

[Relevant footnotes:]

(8)  Altitudes and heights on IAL charts in feet
(33) Relatively short distances in feet [...]
(60) [...]
(63) [...]

There are presently 69 footnotes explaining non-standardized measurements.
The international aviation community, in other words, doesn't have the
universal system of measurements which would be nice for everybody, and it's
not at all unlikely that some pilots could become mixed up over a readout.

On the other hand, anyone who would fly a transport with passengers aboard
on a low pass without significant experience at the controls of that 
make-and-model...

Joe Morris (jcmorris@mitre.arpa)


Mouse trap

"James H. Coombs" <JAZBO%BROWNVM.BITNET@MITVMA.MIT.EDU>
Tue, 12 Jul 88 14:19:49 EDT
A user posted this notice on our bulletin board:

    Subject:  Mouse Injury
    Category:  Computer hazard
    Text:  Add another to the list of computer-induced medical problems.
    In frantic, last 8 days before publication haste I used the mouse
    madly in formatting the handbook for employees, and managed to
    inflame the joint of my index finger so thoroughly that I'm now in a
    splint and taking nasty pills.  Had I known this could cause a
    problem, I'd have used keyboard commands whenever possible, as I
    normally do.  [...]


Threshold probability for declaring a radar blip "hostile"

Mike Wellman <MPW@ZERMATT.LCS.MIT.EDU>
Sat, 9 Jul 88 12:05 EDT
Clifford Johnson asks,

    ...at what perceived "odds" -- 50-50, 60-40, 90-10 ? -- does a
    commander have "sufficient cause" to "declare" a radar blip, that
    might be hostile or might be a commercial flight, officially
    "hostile," so as to shoot it down?

The short answer is that there is no such threshold probability.  The
question presumes that the commander faces a one-shot shoot/no-shoot
decision to be taken on the basis of information assessed at an instant
in time.  More realistically, the commander's options at any instant t
are (1) shoot, and (2) wait an increment delta-t and decide again at
t + delta-t.  During that interval, the ship is either attacked or it
gains further information about the blip (the mere absence of an attack
counts as information).  The "shooting threshold" depends on the time t,
the likelihood of attack during the next delta-t, and the prospects for
collecting further information in the subsequent time intervals.  In the
general case, the threshold can behave arbitrarily over time.
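
For concreteness, a toy version of that sequential decision follows, in
Python.  Every number in it - the utilities, the per-interval attack
probability, the sensor likelihoods - is invented for illustration; the only
point is the qualitative one above, that the shooting threshold moves with
the time remaining.

# Toy sequential shoot/wait model; all numbers are invented.
U_SHOOT_F14, U_SHOOT_CIV = 0.0, -100.0   # shoot at an F14 / at a civilian
U_ATTACKED = -150.0                       # waited, and the ship was hit
U_NOSHOOT_CIV = 10.0                      # waited out a civilian safely
P_ATTACK = 0.3               # chance an F14 attacks during one interval
LIK_F14, LIK_CIV = 0.8, 0.3  # P("hostile-looking" reading | F14, civilian)

def shoot_value(p):
    # Expected utility of shooting now, with belief p that the blip is an F14.
    return p * U_SHOOT_F14 + (1 - p) * U_SHOOT_CIV

def value(p, steps_left):
    # Maximum expected utility of acting optimally from here on.
    shoot = shoot_value(p)
    if steps_left == 0:
        # Deadline: shoot now, or accept whatever the blip turns out to be.
        return max(shoot, p * U_ATTACKED + (1 - p) * U_NOSHOOT_CIV)
    # Waiting one interval: an F14 may attack; otherwise the mere absence
    # of an attack, plus one noisy sensor reading, updates the belief.
    p_attacked = p * P_ATTACK
    p_survive = 1 - p_attacked
    p1 = p * (1 - P_ATTACK) / p_survive           # belief after no attack
    p_hostile_obs = p1 * LIK_F14 + (1 - p1) * LIK_CIV
    p_if_hostile = p1 * LIK_F14 / p_hostile_obs   # Bayes' rule per reading
    p_if_benign = p1 * (1 - LIK_F14) / (1 - p_hostile_obs)
    wait = p_attacked * U_ATTACKED + p_survive * (
        p_hostile_obs * value(p_if_hostile, steps_left - 1)
        + (1 - p_hostile_obs) * value(p_if_benign, steps_left - 1))
    return max(shoot, wait)

# The "shooting threshold" is not one number: find the smallest belief p
# at which shooting is optimal, for various amounts of time remaining.
for steps in (0, 1, 3, 5):
    p_star = next(p / 100 for p in range(101)
                  if shoot_value(p / 100) >= value(p / 100, steps))
    print(steps, "intervals left -> shoot above p =", p_star)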

If we insist on framing this as a one-shot decision, the commander has
two options and there are two basic states of nature distinguished by
whether the blip is an F14 or a civilian aircraft.  Thus, there are four
possible consequences: 

C1 - (shoot, F14)
C2 - (shoot, civilian)
C3 - (no shoot, F14)
C4 - (no shoot, civilian)

I suspect that even ranking these consequences by desirability will be
controversial, except that C1 and C4 are obviously preferred to C2 and
C3.  Let du(F14) be the difference in utility between the better and
worse actions given the blip is an F14.  That is, 

  du(F14) = u(C1) - u(C3), and similarly let
  du(civ) = u(C4) - u(C2).

The one-shot decision recommended by this simple model is to shoot iff
the probability that the blip is an F14 is greater than p*, where

  p* = du(civ) / [ du(civ) + du(F14) ].

Note that p* = 1/2 exactly if du(civ) = du(F14), that is if the
differences between the "wrong" and "right" actions are the same for
both possible states.  The threshold is greater or less than 1/2 as the
civilian or F14 consequences are considered relatively more or less
significant.  It should also be emphasized that these terms are not
simply "the value of civilian versus military lives."

Again, this model is outrageously simplistic because it entirely ignores
the dynamic nature of the actual decision and the important role of
prospective information.

Johnson's second question is

    given the shortness of the response time, what could be the best
    odds attainable in a realistic attack scenario, even assuming the
    best computer technology the United States could field?

There are no general limits to the extremity of the posterior odds,
regardless of technology, simply because the priors can be arbitrarily
extreme.  In particular situations, however, limitations of the sensing
technology do bound the final assessment.

--Mike Wellman.


Threshold probability for declaring radar blip "hostile"

Clifford Johnson <GA.CJJ@Forsythe.Stanford.EDU>
Sun, 10 Jul 88 13:44:59 PDT
>      ...at what perceived "odds" -- 50-50, 60-40, 90-10 ? -- does a
>  The short answer is that there is no such threshold probability...

Yes and no, mostly no.  Time passes, and in all circumstances what you say
holds true only until a "use them or don't use them" decision deadline dictated
by the particular threat perceived.  After this time, the intended defense is
ineffective.  Curiously, in the Iran shootdown it has been reported that the
decision came a little late, which perhaps suggests that the Captain panicked
just when the flight turned towards its center corridor, which happened to be
in the direction of the ship.  This would mean that the flight was shot down
*because it responded* to the strident warnings.  If the threat of missile
attack had been real (supposing arguendo that F-14's could deliver missiles),
the plane would have been hit only after it had already released its missiles.

I don't contest your game theory -- such utilities are increasingly being
incorporated into online military battle managers.  In the envisaged naval
battle management system, the decision would presumably be recommended in such
utilitarian terms to the Captain or remote Admiral.  This means that such
values as we have speculated about really are being written into military
hardware that actually or virtually executes its own Rules of Engagement.  I
shudder.

>      given the shortness of the response time, what could be the best
>      odds attainable in a realistic attack scenario, ...
>  There are no general limits to the extremity of the posterior odds, ...
>  In particular situations, however, limitations of the sensing technology
>  do bound the final assessment.

The final point counts, I agree, but an uncontrolled risk-amplification may in
general occur in the real-time guesstimation of the background or a priori
probabilities of a threat.  Calling an alert, or issuing a strategic warning,
which is what the Gulf forces got over the July weekend, does exactly that: it
vastly distorts a priori probabilities.  And I say "distort" with academic
rigor, for it is a well-proven tenet of war that a strategic warning is highly
unreliable but easy to believe.  Just as a sale isn't final until a check is
signed, or rather, until the computers say so, so hostilities may not occur
until a first shot is fired, or rather, until the computers say so.  The
firing of the first shot then places a priori odds beyond reason, permitting a
commander to see threats even from directions from which shots have not been
fired.
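
That distortion is easy to exhibit with Bayes' rule.  In this minimal sketch
(the sensor likelihoods and priors are invented for illustration), the same
"hostile-looking" reading is nearly harmless under a peacetime prior and
damning under an alert-inflated one.

# How an inflated prior swamps the evidence: Bayes' rule with a sensor
# of fixed quality.  All numbers are invented for illustration.
def posterior(prior, lik_hostile=0.8, lik_civil=0.3):
    # P(hostile | one "hostile-looking" reading), for the given prior.
    return prior * lik_hostile / (prior * lik_hostile
                                  + (1 - prior) * lik_civil)

for prior in (0.01, 0.1, 0.5):
    print(prior, "->", round(posterior(prior), 2))
# prior 0.01 -> 0.03; prior 0.1 -> 0.23; prior 0.5 -> 0.73.  The same
# reading that is nearly ignorable under peacetime priors becomes "likely
# hostile" once an alert has pushed the prior toward even odds.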

Clearly, it is in everyone's interests that a priori probabilities of conflict
are not unreasonably figured.  Without questioning good military intentions, I
am concerned that the likelihood of civilian deaths features insufficiently in
the U.S.  Rules of Engagement.  Witness the destruction of a mental hospital in
Grenada, of many civilians in the Libyan raid, and now of a civilian jet -- and
so far, the military has not been faulted for any of these mishaps, on the
basic grounds that civilian deaths were due to tragic technical glitches
tolerable in the circumstances.

I am most concerned that the nuclear SIOP implicitly contains a priori
estimates of threat probabilities which are unreasonably boosted to protect the
Air Force and the defense establishment.  This in effect devalues civilian
consequences, and heightens the danger.  And yet the nuclear hair-trigger is so
unopposed that the Strategic Air Command is *celebrating* 1988 as "The Year of
the SAC Alert Force," in commemoration of the 30-year-old alert called for
SAC's bombers on October 1, 1957.
