The Risks Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 3 Issue 79

Sunday, 12 October 1986


o China Air incident... the real story
Peter G. Trei
o Air-Traffic Control Spoof
Peter G. Neumann
o Aviation Accidents and Following Procedures (RISKS-3.77)
Matthew Waugh
o DC-9 crash again
Peter Ladkin
o Info on RISKS (comp.risks)

China Air incident... the real story

Peter G. Trei
Mon 13 Oct 86 01:04:22-EDT
Excerpted from 'Tumbledown Jumbo', an article in the Oct '86 issue of
FLYING magazine, concerning the China Airlines 006 incident of Feb '85.

Ellipses and additions in [square brackets] are mine.
   At one point the autothrottle brought the engines back to about zero
thrust. When the throttles came forward again, the number-four engine did
not respond. The flight engineer ... told the captain that the engine
had flamed out.
   Maximum restart altitude is 30,000 feet [the plane started at 41,000].
The captain told the first officer to request a lower altitude. He then
told the engineer to attempt a relight, even though the plane ... was still
at 41,000. The restart attempt was unsuccessful.
   The captain ... released the speed and altitude hold on the autopilot. The
autopilot was now programmed to maintain pitch attitude and ground track. The
airplane continued to lose speed gradually ... and the captain eventually 
disengaged the autopilot completely and pushed the nose down.
   At the same moment, the airplane yawed and rolled to the right. The
captain's attitude indicator appeared to tumble [as did two backups].
The airplane had now entered the clouds. At the same time ... the other three
engines quit.

[paragraph omitted, describing speed varying between Mach .92 and 80 knots,
as crew attempts recovery under up to 5G accelerations.]

   After ... more than two minutes, the 747 emerged from the clouds at 11,000
feet and the captain was able to level it by outside reference. Coincidentally,
he felt that the attitude indicators 'came back in' at this point. [engines
1,2, & 3 restart themselves, and 4 responds to a checklist restart].

Initially the captain decided to continue ... [but it was noticed
that] the landing gear was down and one hydraulic system had lost all
its fluid. ... the captain decided to land at San Francisco. The plane operated
normally during descent, approach and landing.

    [Later analysis showed that engine four had NOT flamed out, but just stuck
at low thrust due to a worn part. The others were also responding to the 
throttles very slowly, a common problem at 41,000 feet. The NTSB inquiry
concluded that...] the captain had become so preoccupied with the dwindling
airspeed that he failed to note that the autopilot, which relied on ailerons
only, not the rudder, to maintain heading, was using the maximum left control-
wheel deflection available to it to overcome the thrust asymmetry due to the 
hung outboard engine. When the right wing nevertheless began to drop, ...
the captain didn't notice the bank on the attitude indicator ... . When he
did notice it, he refused to believe what he saw. At this point, ... the
upset had begun and the captain and first officer were both spatially
disoriented.

    Once the erroneous diagnosis of a flameout had been announced, ... the
captain placed excessive reliance on the autopilot.... When he finally
disengaged it, and put himself 'back into the feedback loop' it was at a
critical moment, and he could not adjust quickly enough to the unexpected
combination of control feel and instrument indications to prevent the upset.


     The rest of the article is devoted to RISKS-style analysis of the use
of automatic systems. To give a more down-to-earth (pun intended)
analogy, suppose your car were equipped with an AI 'driver's assistant'
which handled all normal highway driving. Suppose further that, at night,
with you drowsy and at 60 mph, the right front tire blows out. The AI
blasts the horn to alert you, and applies substantial left torque to
the steering wheel to keep the car straight. You realize you're in trouble,
grab the wheel, and turn off the AI. The wheel immediately jumps out
of your hands to the right (you didn't know how much torque the AI was
applying), and the car swerves off the road...

    The use of automated systems to handle routine operations of critical
systems, with dangerous situations suddenly dumped into the hands of human
operators, presents a new risk: the operators may not fully understand the
ramifications of the problem during the critical transition time.
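The hidden-state hazard described above can be sketched in a few lines of
code. This is a purely illustrative toy, not drawn from any real flight
system; all names and numbers are hypothetical. An autopilot quietly trims
against a growing asymmetry until it saturates at its authority limit, and
the human who disengages it inherits the accumulated correction all at once.

```python
def autopilot_correction(asymmetry, max_authority=1.0):
    """Correction the autopilot applies, silently clipped to its
    authority limit -- the operator gets no alert when it saturates."""
    return max(-max_authority, min(max_authority, -asymmetry))

def simulate_handoff(asymmetry_per_step=0.3, steps=5):
    """Grow a thrust-style asymmetry while the autopilot compensates,
    then report what the pilot inherits on disengagement."""
    asymmetry = 0.0
    for _ in range(steps):
        asymmetry += asymmetry_per_step
        correction = autopilot_correction(asymmetry)
    residual = asymmetry + correction  # nonzero once saturated
    # On disengage, the pilot suddenly holds the full asymmetry,
    # having had no feedback while the residual quietly grew.
    return asymmetry, correction, residual

total, held, residual = simulate_handoff()
```

In this toy run the asymmetry (1.5 units) exceeds the autopilot's authority
(1.0 units), so a 0.5-unit residual has been building with no indication to
the operator. The 747 case differs in every technical detail, but the shape
of the surprise at handoff is the same.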

    A co-worker of mine who has worked in both the Navy and civilian
nuclear programs tells me that Navy reactor systems are designed to keep
humans in the loop. The only thing the automated systems can do without
a person is 'scram', or shut down, the reactor. Changes in power level,
opening and shutting valves, pulling control rods, operating pumps, etc.,
must be performed by the people constantly tending the reactor. Thus, the
system can't very easily spring surprises on the operators.
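The scram-only policy described above amounts to a simple interlock: the
automated protection system is authorized to take exactly one action on its
own. A minimal sketch, assuming a made-up control interface (nothing here
reflects any real reactor system):

```python
class ReactorControls:
    """Toy model of a 'humans in the loop' policy: automation may only
    scram; every other action requires a human operator."""

    def __init__(self):
        self.scrammed = False
        self.power = 50  # percent, purely illustrative

    def request(self, action, by_human):
        """Gate every requested action through the policy."""
        if action == "scram":          # always permitted, human or not
            self.scrammed = True
            self.power = 0
            return True
        if not by_human:               # automation may do nothing else
            return False
        if action == "raise_power" and not self.scrammed:
            self.power += 10
            return True
        return False

r = ReactorControls()
r.request("raise_power", by_human=False)  # refused: automation
r.request("raise_power", by_human=True)   # allowed: operator
r.request("scram", by_human=False)        # allowed: the one exception
```

The point of this structure is exactly the one the co-worker makes: since
automation acting alone can only fail toward shutdown, it cannot silently
reconfigure the plant out from under its operators.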

Air-Traffic Control Spoof

Peter G. Neumann <Neumann@CSL.SRI.COM>
Sat 11 Oct 86 20:03:57-PDT
Some of you may have missed a recent set of rather serious breaches of the
integrity of the air-traffic control system.  It is another important
instance of a masquerading spoof attack typified by the Captain Midnight
case (although via voice rather than digital signals).  [Again see the
October 86 issue of Mother Jones, which notes similar vulnerabilities and
the ease of mounting such attacks.]

  Washington Post, 8 October 1986

  MIAMI -- A radio operator with a ``bizarre sense of humor'' is posing as
  an air traffic controller and transmitting potentially dangerous flight
  instructions to airliners, and pilots have been warned about it, a
  Federal Aviation Administration spokesman said.  Two fake transmissions
  have occurred in the last week, and one caused a premature descent, said
  Jack Barker of the FAA's southern region in Atlanta.  ``There have been no
  dangerous incidents, but the potential for danger is there.  It's more an
  annoyance than a safety problem,'' Barker said from an FAA meeting in
  Washington.  Barker said the operator uses two frequencies that air
  traffic controllers use to tell pilots how to approach Miami International
  Airport.  The transmissions began Sept. 25, and the last was Friday [3
  Oct], he said.

Aviation Accidents and Following Procedures (RISKS-3.77)

Matthew Waugh
Fri, 10 Oct 86 11:50:44 PDT
The accident report involving a British Airtours 737 at Manchester Airport
was released recently. The aircraft suffered an engine compressor failure on
take-off. The aircraft instruments indicated something else (I'm a little
hazy about exactly what; I think it was a tire burst), and standard
operating procedure was to turn clear of the runway, basically, I believe, to
clear the runway for other traffic. This the pilots did, bringing the wind,
which had been dead ahead, to blow from the now-burning engine and wing onto
the fuselage. Multiple lives were lost, etc.

It would appear from this that had the pilots performed an abort and
remained on the runway (all that was required for safety reasons), the
deaths could have been reduced or avoided. However, the operating procedure,
mandated for operational (not safety) reasons, dictated otherwise and
worsened an already pretty terrible situation.

UUCP   : {ihnp4|mtuxo}!naples!mjw   Matthew Waugh
ATTMAIL: attmail!mjw            AT&T IS, Lincroft, N.J.
                                Telephone : (201) 576-3362

DC-9 crash again

Peter Ladkin <ladkin@kestrel.ARPA>
Fri, 10 Oct 86 14:50:49 pdt
Danny Cohen's point about accuracy is well taken. The incident I was trying
to refer to was the crash of Eastern 212, a DC-9, in Charlotte, N.C. I
apologise to Risks readers for not confirming this before posting.

Danny and I have exchanged letters on the issue of *deliberate override*.
Danny considers the action of turning off the LAAS to be both non-deliberate
and not an override.  I still consider it both deliberate and an override.
It seems to hinge on whether habitual actions can be described as
deliberate, and on whether not following prescribed procedure upon receipt
of a warning can be considered an override.

Peter Ladkin
