The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 13 Issue 6

Friday 24 January 1992

Contents

o A320
Peter Mellor
T.C.Bennett
Ken Tindell
o Computerized Chauvinism
Brian Randell
o "Desert Storm" viral myths
Rob Slade
o "Designed-in Hardware Viruses" in the movies: "GOG"
Lauren Weinstein
o Sharing Idle Time with Linda
David C Lawrence
o Software Safety Correction
Tony Zawilski
o Re: Ohio justices fight over computer snooping
Christopher Stacy
o Info on RISKS (comp.risks)

A320 IT5148: Don't worry - just pilot error

p mellor <pm@cs.city.ac.uk>
Wed, 22 Jan 92 16:48:30 GMT
At around 1945 local time (1845 GMT) on Monday 20th January, Air Inter flight
IT5148 from Lyon to Strasbourg vanished from radar and radio contact. (In some
reports the time is given as 1920 local time.) Five hours later, rescue teams
arrived at the crash site on Mont Sainte-Odile (2496 feet) in the Vosges, about
30 miles south of Strasbourg. The aircraft was around 75 minutes into its
flight, and about 7 minutes flying time from Strasbourg airport.

Of 90 passengers and 6 crew, 87 died and 9 survived. All of the survivors were
in the rear section of the aircraft. Ten casualties were reported to have
survived the impact, but to have frozen to death while awaiting rescue. The
temperature on the ground was around -10 degrees C, there was thick fog or
cloud clinging to the mountains, and deep snow. The nearby village of Barr was
used as a clearing station for the casualties.

No emergency message was received from the cockpit before impact. Survivors
report that there was no prior warning, and that nothing seemed to be wrong
until the impact occurred.

The emergency beacon which should have guided rescuers did not operate.
According to one report, it was destroyed by the crash.

At that point on that approach to Strasbourg, the recommended height is 9000
feet, and the minimum height for a safe approach is around 4700 feet. On
impact, the aircraft was below 2500 feet. Strasbourg airport is equipped
with a directional beam for use by aircraft instrument landing systems (ILS).
For some reason, IT5148 was not following the beam. Just before impact, the
aircraft was descending at 2300 ft/min, instead of 800 ft/min (DGAC spokesman).
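
As a back-of-the-envelope check on the figures quoted above, the difference
between the two descent rates is easy to quantify. This is a sketch, not an
accident analysis; the 9000 ft, 2500 ft and ft/min values are simply those
given in the press reports:

```python
# Back-of-the-envelope arithmetic on the descent rates quoted above.
# Figures (9000 ft recommended, ~2500 ft at impact, 2300 vs 800 ft/min)
# are those given in the press reports; this is arithmetic, not analysis.

def minutes_to_descend(feet: float, rate_ft_per_min: float) -> float:
    """Time in minutes to lose `feet` of altitude at a constant rate."""
    return feet / rate_ft_per_min

drop = 9000 - 2500  # ft, from recommended height down to impact altitude

print(f"At 2300 ft/min: {minutes_to_descend(drop, 2300):.2f} min")
print(f"At  800 ft/min: {minutes_to_descend(drop, 800):.2f} min")
```

At the observed rate the aircraft would lose that height in under three
minutes, against roughly eight minutes at the normal rate.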

The flight data recorders, i.e., Cockpit Voice Recorder (CVR) and Digital
Flight Data Recorder (DFDR), had been found by Tuesday night, but (according
to one report) are damaged. They have been taken to Paris for examination.
According to one report, they were accompanied all the way by the local
examining magistrate in person.

A Commission of Enquiry has been appointed, and an interim report is expected
within a month.

The pilot, Christian Hecquet, and his co-pilot had 14000 flying hours between
them, but had only recently switched to the A320 from flying Caravelles. The
aircraft which crashed went into service in 1988, and had a maintenance check
on Monday morning.

This information has been extracted from various reports in The Guardian,
The Times, The Daily Telegraph and the Daily Mail, 21st and 22nd Jan.

      **** Above are the facts, below is the speculation. ****

The crash bears a certain resemblance to the previous two A320 accidents, at
Mulhouse in 1988, and Bangalore in 1990. In all three accidents, the pilots
seemed to think that the aircraft was higher than in fact it was.

According to the DGAC spokesman, four possible causes are being investigated:-

- "altitude computer" (FMGC? - see below) failure,
- engine cut-out caused by bad weather,
- ice build-up on the wings,
- human error (in the cockpit, maintenance or ATC)

The latest crash is a serious embarrassment for Airbus Industrie, and already
mutterings of "pilot error" are being heard. This is most obvious in
The Times of 22nd Jan. The front page headline is:-

"Experts suspect pilot error. Crash Airbus `programmed to fly too low' "

The article (by Harvey Elliott, Air Correspondent, and our foreign staff)
begins:

"The pilots of an Airbus jet that crashed into a French mountain killing 87
 people probably programmed the aircraft to fly too low.

 As a five-man commission began its enquiry into Monday night's accident,
 safety experts tried to recreate on simulators the last minutes of the Air
 Inter flight from Lyons to Strasbourg. Their efforts suggest that the A320's
 "fly-by-wire" technology was not to blame."

Later in the same article:-

"Computers are capable of operating all flight controls on the Airbus, other
 than altitude, but David Velupillai of the manufacturer, Airbus Industrie,
 said: "If you program it to fly into a mountain, it will." The aircraft will
 prevent the pilot making a manoeuvre outside of its built-in "safety
 envelope", but it cannot tell the pilot that he is heading for a mountain
 until a few seconds before impact, when lights and buzzers alert him that he
 is close to the ground.

 The Air Inter jet should have been at about 9,000 ft as it approached
 Strasbourg airport. The minimum altitude for any aircraft in that area is
 4,700 ft, but the Airbus crashed into the mountainside at no more than 2,500
 ft. Experts working on simulators yesterday believe that the pilot may have
 thought he was nearer the runway than he was, and pushed the "open descent"
 button that would take the aircraft to a pre-programmed altitude. Otherwise,
 he might have forgotten about the peaks and programmed a "normal descent"
 putting him on a crash course."
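
The "open descent" scenario in the quoted article can be caricatured in a few
lines: an altitude-hold autopilot that descends toward whatever target is
dialled in, with no knowledge of the terrain beneath it. This is a toy model,
not A320 logic, and every number in it is illustrative:

```python
# Toy model of the "descent to a pre-programmed altitude" scenario in the
# article. The autopilot here knows nothing about terrain; it simply
# descends toward whatever target is dialled in. Illustrative numbers only.

def descend_profile(start_ft, target_ft, rate_ft_per_min, terrain_ft, minutes):
    """Altitude after `minutes` of constant-rate descent toward target_ft,
    or terrain_ft if the aircraft would have descended through the terrain."""
    alt = max(target_ft, start_ft - rate_ft_per_min * minutes)
    return terrain_ft if alt <= terrain_ft else alt

# Target dialled in below the ridge line: the autopilot flies into it.
assert descend_profile(9000, 1000, 2300, 2500, 5) == 2500
# Same geometry with a safe target: the aircraft levels off above the peaks.
assert descend_profile(9000, 5000, 800, 2500, 5) == 5000
```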

On page 9 of The Times, 22nd Jan., the headline reads:-

"Computer error by pilots suspected" by Harvey Elliott, Air Correspondent.

and the article goes as follows:-

"A fatal programming error by the pilots of the Airbus A320 jet which smashed
 into a French mountainside was emerging last night as the most likely cause of
 the crash which killed all but nine of those on board.

 Safety experts, anxious to discover if anything had gone horribly wrong with
 one of the world's most advanced passenger aircraft, took over simulators from
 Airbus customers around the world to try to recreate the last minutes of the
 flight IT5148 as it approached Strasbourg airport. Slowly, although with no
 real proof that their theories were right, they began to build up a picture of
 confusion in the cockpit.

[Stuff about minimum altitudes omitted.]

 Height is the one parameter not controlled by computer in the Airbus. It is up
 to the pilots to tell the aircraft's five computers what height they want to
 fly at by dialling in a particular altitude.

 The track the aircraft flies can be programmed in before take-off and the
 computers then automatically follow it."


My own initial reaction to this is:-

1. The extension of the concept of "pilot error" to "pilot computer error" is
   interesting. (Not only can those dumbos not fly a 'plane, they can't even
   program a computer! :-)

2. Height is not a "parameter" which can be controlled directly, in the same
   way that pitch, roll, yaw and thrust are governed by the Electronic Flight
   Control System (EFCS) and the Full Authority Digital Engine Control (FADEC).
   The "five computers" referred to in the Times article above are presumably
   the three Spoiler and Elevator Computers (SEC) and two Elevator and Aileron
   Computers (ELAC) which together make up the EFCS. The pilots do not "tell"
   these computers anything about altitude. The Flight Management and Guidance
   Computer (FMGC) performs the autopilot function, and interfaces with the
   EFCS to cause the aircraft to follow a pre-programmed course. The FMGC *can*
   control altitude, by manipulating the EFCS. In some circumstances, the FMGC
   can select "open descent" mode automatically, or, on approach to an airport,
   the pilot can select it manually.

3. Unless we assume that two experienced pilots simply forgot that there are
   a few mountains in the way on the Strasbourg run, they would only have
   selected a descent mode prematurely if they did not know where they were.
   If they didn't, why not?

4. The aircraft did not "smash into a mountainside". It crashed into trees in
   an area where there is a fair amount of reasonably level terrain, as shown
   by the fact that the rear section was slowed down by the tail catching in
   trees. Close to impact, the altitude alarm should have started to give
   audible warnings at 200 ft.

Not everyone is so keen to accept the "pilot error" theory. One source was
quoted as saying that, in an aircraft like the A320, pilot error lies in the
"hiatus between the pilot and the computer" (lovely phrase!).

In the Times article, we read:-

"Jean-Paul Maurel, general secretary of the French pilots' union, said the
 aircraft had been on a normal approach path, well above the Vosges peaks when
 it suddenly plunged and hit the ground in less than a minute."

After 4 years in service, and 600,000 flying hours, the A320 has scored 3
major accidents and 177 lives lost. In 11 years, the Boeing 757 has flown
4 million hours with no fatalities.

As the headline to the Daily Mail feature article which quoted these statistics
asks: "Is this aircraft too clever for its own good?"

Peter Mellor, Centre for Software Reliability, City University, Northampton
Sq., London EC1V 0HB +44(0)71-253-4399 Ext. 4162/3/1 p.mellor@city.ac.uk

       [Also noted by  Jonathan.Bowen@prg.oxford.ac.uk . ]


Re: Another A-320 crash in France (RISKS-13.05)

<T.C.Bennett@syse.salford.ac.uk>
Thu,23 Jan 92 11:48:08 GMT
In Risks 13.05 Romain Kroes, Secretary General of SPAC is quoted as saying "...
it has been clear to us that the crews were caught out by cockpit layout"

     Surely this statement implies that there is a problem with training
rather than the software or the crew.

PGN notes an article in the San Fran Chron saying there was an abrupt drop in
altitude of 2000 feet in approaching the airfield.

     Since the previous two crashes of this type of aircraft seemed to be
related to the altimeter it would imply that a specialised set of conditions
that arises very infrequently could cause incorrect altimeter readings that
indicate to the pilot that he is too high. Since both flight recorders are
reported damaged we probably won't find out on this one though....

Thomas

Most system flaws can be attributed to unwarrantedly anthropomorphizing the
user....

BITNET  : ua0019%uk.ac.salford.syse@uk.ac
     or : ua0019%uk.ac.salford.syse%ukacrl.bitnet@cunyvm.cuny.edu


Speculation on latest A-320 crash: why?

<ken@minster.york.ac.uk>
23 Jan 1992 15:04:47 GMT
Before we speculate over the cause of the crash we ought to bear in mind that
there are vested interests in the A320. Airbus Industrie is knocking spots off
Boeing and MD in sales. The US FAA can easily take the side of US
manufacturers, using "software safety" as an excuse.  Similarly, the French
Government, which will investigate the accident, owns a share in Airbus
Industrie (via Aerospatiale), and has a motive to find the accident was caused
by "pilot error". There is nationalism bound up in this: Europeans are proud of
the achievements of `home grown' industry, Americans are proud of their home
aircraft industry.  The news reports want a sensational accident to report -
"Computer kills 80" instead of "Man bites Dog".  Safety Critical Software
researchers want to get on TV telling the world how bad computers are and
"incidentally, funding of research in safety is far too low..." We should also
bear in mind that the pilots' unions are concerned over the manning of modern
aircraft: the 747-400 `loses' an engineer. Pilots are also concerned over the
`loss of things to do'.

We should also be aware that the A320 is far from unique in having computer
control. Many many commercial aircraft have computers controlling some or all
aircraft subsystems. I spent a long time a couple of years ago going through
microfilm of The Times checking up on aircraft accidents. Way before the
Habsheim crash I found that a DC-9 had crashed after its slat/flap computer
was incorrectly operated, an accident deemed Pilot Error (August 17th 1987).
The 747-400
manifested a very serious auto-throttle bug, causing loss of engine power (See
RISKS 10.04).

Now, there is the very serious problem of HCI in these aircraft - the "glass
cockpit" problem. The Bangalore crash (See RISKS 10.48) was blamed on pilot
error, but it could be argued that the pilots were lulled into a false sense of
security by the autopilot. It could also be argued that the training of pilots
when flying in "glass cockpits" is inadequate. However, the A320 is not unique
in having these problems.

Before any hysteria breaks out over computer control in aircraft (e.g.  the FAA
revoking the A320 license, the European aviation authorities responding by
banning the 747-400, etc), we must consider what things were like before
computers. In RISKS 12.72 and 13.01 the problem of the A320 Fuel Monitoring
system was discussed. In olden days the re-fuelling of aircraft was a real
problem. Different airports have different fuel densities, qualities, etc.
Certain aircraft can only run on certain grades of fuel, and whenever
refuelling the technicians must know how much fuel is left and what grade it
is, how the new fuel will mix with the old, and how much the new fuel will
weigh (the fuel tanks have a fixed volume capacity, whereas fuels of different
density will weigh different amounts). There is an (apocryphal) story of a jet
being trapped at an African airport because the fuel quality was too low. The
density of fuel will change the way the aircraft is trimmed, and will change
the range of the aircraft. Do RISKS readers seriously believe that ground crews
with clipboards and pocket calculators made fewer mistakes than the A320 fuel
control system?
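
The ground crew's sums can be sketched as follows. The densities (in kg per
litre) and volumes here are illustrative, not figures for any particular
aircraft or fuel grade:

```python
# A sketch of the refuelling sums described above. Given the fuel already
# in the tanks and the uplift, each at its own density, what will the load
# weigh, and does it fit? Densities (kg/litre) and volumes illustrative only.

def refuel(old_l, old_rho, new_l, new_rho, tank_capacity_l):
    """Return (total mass in kg, mixed density in kg/l) after the uplift."""
    total_l = old_l + new_l
    if total_l > tank_capacity_l:
        raise ValueError("uplift exceeds tank volume capacity")
    total_kg = old_l * old_rho + new_l * new_rho
    return total_kg, total_kg / total_l

# 5000 l of denser fuel left on board, 12000 l of lighter fuel uplifted:
kg, rho = refuel(5000, 0.81, 12000, 0.78, 24000)
print(f"{kg:.0f} kg at a mixed density of {rho:.3f} kg/l")
```

The mixed density then feeds into the trim and range calculations mentioned
above, which is exactly where a clipboard error would propagate.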

I should point out that I am a European, and that my research (into Distributed
Hard Real Time Systems) is being funded by a member of the Airbus consortium,
and that my opinions are mine and no-one else's. You are free to ask whether I
too have a vested interest in computer control of commercial aircraft.

Finally, I would like to quote an `expert' on aviation who in a recent TV
interview said that "the common theme to all three of the A320 crashes is
lack of altitude".

Ken Tindell, Computer Science Dept., York University, YO1 5DD, UK
+44-904-433244   ..!mcsun!uknet!minster!ken


Computerized Chauvinism

<Brian.Randell@newcastle.ac.uk>
Tue, 21 Jan 92 18:10:47 GMT
This brief item appeared in the Jan 19 issue of The Observer, one of the
"quality" national Sunday newspapers here in the UK. As a Welshman, though to
my regret not Welsh-speaking, I take particular and personal exception to this
example of what I would term computerized chauvinism, though I am sure that
many similar examples have been perpetrated elsewhere.
                                                            Brian Randell

                            DoT HANDICAP

The Department of Transport has explained that applicants wanting driving
tests in the Welsh language have been labelled 'disabled' because the
computer system only has space at present under that heading. In a letter
to Gwyneth County Council from its Manchester base, the Department said:
'In no way does the Department consider any person whose first language is
not English as disabled.'

Computing Laboratory, The University, Newcastle upon Tyne, NE1 7RU, UK
EMAIL = Brian.Randell@newcastle.ac.uk   PHONE = +44 91 222 7923


"Desert Storm" viral myths

Rob Slade <p1@arkham.wimsey.bc.ca>
Wed, 22 Jan 92 15:15:06 PST
The recent spate of reports of a virus which shut down Iraq's air defence
system during "Desert Shield/Storm" seems to have started with the series
"Triumph Without Victory: The Unreported History of the Persian Gulf War" by U.
S. News and World Report.  The articles are being rerun in many papers (as
well, apparently, as CNN and ABC Nightline), and the article on the virus that
ran in my local paper is specifically credited to USN&WR.  The bare bones of the
article are that a French printer was to be smuggled into Iraq through Jordan,
that US agents intercepted the printer, replaced a microchip in the printer
with one reprogrammed by the NSA, that a virus on the reprogrammed chip invaded
the air defence network to which the printer was connected and erased
information on display screens when "windows" were opened for additional
information on aircraft.

The first question is: could a chip in a printer send a virus?  Doesn't a
printer just accept data?

Both parallel/Centronics and serial RS-232 ports are bidirectional.  (Cabling
is not always, and I well remember having to deal, in the early days of PCs,
with serial ports which had been used as printer ports, and could not be used
as modem ports because the "return" pin had been sheared off, a common practice
to "fix" balky printers.)  However, the "information" which comes back over the
line is concerned strictly with whether or not the printer is ready to accept
more data.  It is never accepted as a program by the "host".
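
The point can be caricatured in a few lines: the host polls a handful of
status flags for flow control and never treats the returned byte as code.
The flag names below loosely follow the Centronics status lines; this is a
toy model, not a driver:

```python
# Toy model of the point above: the bytes a printer sends back are status
# flags the host inspects for flow control, never code it executes. Flag
# names loosely follow the Centronics status lines; a sketch, not a driver.

BUSY, ACK, PAPER_OUT, FAULT = 0x1, 0x2, 0x4, 0x8

def host_send(data, read_status):
    """Send bytes, pausing while the printer reports BUSY. The status byte
    is only ever tested as flags; it is never handed to the CPU to run."""
    sent = 0
    for _ in data:
        while read_status() & BUSY:
            pass  # flow control only: wait for the printer to catch up
        sent += 1
    return sent

# A printer that is always ready accepts the whole buffer:
assert host_send(b"hello", lambda: ACK) == 5
```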

The case of "network" printers is somewhat more complex.  There are two
possible cases, network print servers and "network printers" (such as the
Mac Laserwriters), and they are quite distinct.  The print server (on, say,
DECnet) is actually a networked computer acting as a print server, accepting
files from other network sources and spooling them to a printer.  True, this
computer/printer combo is often referred to simply as a printer, but it would
not, in any case, be able to submit programs to other hosts on the net.  The
Mac case is substantially different, since the Mac laser printers are attached
as "peers".  Mac Laserwriters, at least, do have the ability to submit
programs to other computers on the network, and one Mac virus uses the
Laserwriter as a vector.  However, it is unlikely that the Iraqi air defence
system was Mac based, and few other systems see printers as peers.

Second question: if it *was* possible to send some kind of program from the
printer to the computer system/network, was it a virus?

Given the scenario, of a new printer coming into an existing system, any
damaging program would pretty much have had to have been a virus.  In a
situation like that, the first thing to do when the system malfunctions after a
new piece of equipment has been added is to take out the new part.  Unless the
"chip" could send out a program which could survive, in the network or system,
by itself, the removal of the printer would solve the problem.

Third question:  could a virus, installed on a chip, and entered into
the air defence computer system, have done what it was credited with?

Coming from the popular press, "chip" could mean pretty much anything, so my
initial reaction that the program couldn't be large enough to do much damage
means little.  However, the programming task involved would be substantial.
The program would first have to run on the printer/server/peripheral, in order
to get itself transferred to the host.  The article mentions that a peripheral
was used in order to circumvent normal security measures, but all systems have
internal security measures as well in order to prevent a printer from "bringing
down" the net.  The program would have to be able to run/compile or be
interpreted on the host, and would thus have to know what the host was, and how
it was configured.  The program would then have to know exactly what the air
defence software was, and how it was set up to display the information.  It
would also have to be sophisticated enough in avoiding detection that it could
masquerade as a "bug" in the software, and persistent enough that it could
avoid elimination by the reloading of software which would immediately take
place in such a situation.

The Infoworld AF/91 prank article has been mentioned as the "source" for the
USN&WR virus article.  There was, however, another article, quite seriously
presented in a French military aerospace magazine in February (which possibly
prompted the Infoworld joke.)  This earlier article stated that a virus had
been developed which would prevent Exocet missiles, which the French had sold
to Iraq, from impacting on French ships in the area.  The author used a mix of
technobabble and unrelated facts, somehow inferring from the downloading of
weather data at the last minute before launch, the programmability of targets
on certain missiles and the radio destruct sequences used in testing that such
a "virus" was possible.

It has also been rumoured, and by sources who should know, that the US military
has sent out an RFP on the use of computer viruses as computer weapons.  Although
I have not seen the request, I *do* believe it went out, and we have
confirmation in the report of a contract being awarded for further study in
that area.  I *don't* believe in the USN&WR report.

copyright Robert M. Slade, 1992   DEFMTH7.CVP   920115

PS - I have only received *one* report of the Mac Laserwriter virus, so
don't take it as gospel.  Laserwriters *are*, however, peers on an
Appletalk net.  (None of this is to be confused with the
Laserwriter/Postscript password trojan/virus.)

Vancouver      p1@arkham.wimsey.bc.ca   | "A ship in a harbour
Institute for  Robert_Slade@sfu.ca      |  is safe, but that is
Research into  CyberStore Dpac 85301030 |  not what ships are
User           rslade@cue.bc.ca         |  built for."
Security       Canada V7K 2G6           |           John Parks


"Designed-in Hardware Viruses" in the movies: "GOG".

Lauren Weinstein <lauren@vortex.com>
Wed, 22 Jan 92 13:31:39 PST
On the topic of how it would (obviously) be much easier to get your adversary's
system to accept erroneous commands if you had "designed-in" such abilities,
the film-conscious reader might wish to check out the science-fiction film
"Gog" (1954).

In this actually above-average presentation, a top-secret U.S.  space
research/defense facility is plagued by a series of inexplicable "accidents".
These include deaths due to runaway centrifuges, malfunctions of a giant solar
mirror (ZAP!), and attacks by two utility robots ("Gog" and "Magog") which
almost destroy the facility, among other major problems.

It is eventually determined that the various incidents have all been triggered
by enemy forces, via a very stealthy high-altitude jet, sending signals to
special receivers which were embedded within the facility's central computer by
enemy agents during the computer's construction in Germany.

So keep your eyes on those printers!
                                                  --Lauren--


Sharing Idle Time with Linda

David C Lawrence <tale@cs.rpi.edu>
Thu, 23 Jan 92 09:37:15 EST
_The_New_York_Times_ led its January 19, 1992, Business section with "David
Gelernter's Romance with Linda", an article which discussed the massive amounts
of free computing cycles available at any given moment.  The thrust of the
article was that these cycles could be harnessed to speed up compute intensive
jobs, a general concept with which computer scientists have been working for
years.  Early on, however, the article alluded to sights being set not only
on machines arranged to do the work by groups of co-operating researchers,
but indeed on any machine the software can access.

Several columns in, it was said without even the blink of an eye:

  Mr. Gelernter visualizes all these computer networks linked together
  --- along with all the desktop computers that are not now linked to
  anything.  When that happens, his piece de resistance will go to
  work:  a software program that constantly goes from computer to
  computer seeking out idle computer power and putting it to work.
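
For readers unfamiliar with Linda, its coordination primitives can be
sketched as a toy tuple space. This is a single-process caricature: a real
Linda runtime distributes the space across machines and blocks `in` until a
matching tuple appears, neither of which is modelled here:

```python
# A single-process caricature of Linda's coordination primitives: out()
# deposits a tuple into the shared space, rd() reads a matching tuple,
# in_() withdraws one (`in` is a Python keyword). A real Linda runtime
# distributes the space across machines and blocks in() until a match
# arrives; neither is modelled here.

class TupleSpace:
    def __init__(self):
        self.space = []

    def out(self, *tup):
        self.space.append(tup)

    def _match(self, pattern):
        # None acts as a wildcard ("formal") in a pattern position.
        for tup in self.space:
            if len(tup) == len(pattern) and all(
                p is None or p == v for p, v in zip(pattern, tup)
            ):
                return tup
        return None

    def rd(self, *pattern):
        return self._match(pattern)

    def in_(self, *pattern):
        tup = self._match(pattern)
        if tup is not None:
            self.space.remove(tup)
        return tup

ts = TupleSpace()
ts.out("task", 42)                           # work deposited by one process
assert ts.rd("task", None) == ("task", 42)   # read, tuple stays put
assert ts.in_("task", None) == ("task", 42)  # withdrawn by a worker
assert ts.rd("task", None) is None           # the space is now empty
```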

I was a bit amazed at how directly this was offered, suggesting mostly
that the only hurdles to overcome were technical ones.  This even
after they had already discussed the great secrecy with which some
Wall Street firms are using the technology.  At least several
paragraphs later they got to "At Issue: Free Choice":

  What is to keep Piranha Linda or its descendants from being
  subverted by someone who wants to tamper with another computer or
  steal information?  And what if an individual doesn't want to share
  a computer?  Indeed, a generation of computer users embraced desktop
  technology in the 1980's precisely because they were suddenly freed
  from sharing a single mainframe computer with hundreds or thousands
  of others. [...]

  Privacy experts say the issue is a broader one:  being able to choose
  whether to participate at all.

  "The critical test for any technology is whether it leaves you the
  ability to retreat into a private sphere," said Mark Rotenberg,
  Washington director for the Computer Professionals for Social
  Responsibility.  "If you can't turn the system off, you're trapped."

Good!  At least they seem to be aware of some of the risks and social
obstacles they beget.  But wait, there's more ...

  But trends already taking hold in the computer industry are likely
  to SWEEP ASIDE or OVERRUN such concerns.  The growth of networks is
  expected to continue as more and more corporate data processing
  executives turn to Mr. Gelernter's ideas about parallel computing.

[Emphasis mine.]

Now, I don't really have a problem with these executives using
machines like this within their own networks, but I do have a problem
with the technology being used more widely.  I am unaware of exactly what
these care-free "trends in the computer industry" are, but if they do exist
I hope other RISKS contributors can point them out.  I had been under
the impression that, with the founding of organizations like the EFF
and CPSR, trends within the computer community were actually improving
on issues like this.

Perhaps I am over-reading the scope with which "a computer program
which goes from computer to computer and network to network" was
offered, but even so, there seem to be many potential pitfalls which
must be guarded against to prevent such a helpful migrant worker from
being mutated into a rogue virus with millions of possible victims.

I am quite confident that Mr. Gelernter and the others working on this
project only have the best intentions in mind, but I do hope to see
better how they are addressing the risks.


Software Safety Correction

Tony Zawilski <m16143@mwvm.mitre.org>
Thursday, 23 Jan 1992 08:06:57 EST
   [long message to say given email address should have been  ]
         zawilski@mitre.ORG   not .com


Re: Ohio justices fight over computer snooping (Harding, RISKS-13.04)

Christopher Stacy <CStacy@STONY-BROOK.SCRC.Symbolics.COM>
Wed, 22 Jan 1992 15:06-0500
Bob Frankston tells about the good old days of Multics, where the system
by default enforced the convention that you don't have access to what's in
your neighbor's desk.

It's worth noting that, at the same place and time, another laboratory across
campus operated under a different set of social conventions.

The ITS operating system was implemented without any security.  Any user could
access another person's files or mailbox, or view another user's screen in real
time, and in fact these practices were considered socially acceptable.  In
certain cases, this limited the utility of the system.  For example, grading
information was not usually kept on the machine.  However, this open attitude
and policy was generally felt to be more desirable than a secure situation, and
there's no doubt that it contributed substantially to the techno-social
environment there, and to the success of the laboratory.

I am mentioning this here to remind us that computer environments interact
with, and partially redefine, the social situations that they are intended to
support.  (Of course, most computer systems don't support the subtle
intricacies of secrets and privacy in natural social settings.  Even flexible
ones like Multics are often considered by users to be too difficult to figure
out how to use with the desired level of finesse.)

Computer privacy ("security") systems need to be flexible, human engineered,
understood by their users, and have their policies advertised and in
conformance with the social setting in which they are used.  It's very easy for
counter-productive security measures to infect a group's thinking - a real case
of a "computer virus" infecting people!  :)
