The RISKS Digest
Volume 5 Issue 31

Friday, 21st August 1987

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

"Computer Failed to Warn Jet Crew"
PGN
Risks to Privacy
Jerome H. Saltzer
ATM features
Jack Holleran
Licensing software engineers
Frank Houston
Dave Benson
Re: Risks of automating production
Henry Spencer
Re: Automated environment control
Robert Stanley
Brian Douglass
Trusting Computers
Marcus Hall
Info on RISKS (comp.risks)

"Computer Failed to Warn Jet Crew"

Peter G. Neumann <Neumann@csl.sri.com>
Fri 21 Aug 87 08:34:09-PDT
The front page of the Washington Post this morning described the current
hypotheses on the crash of the MD80, Northwest Flight 255, in Detroit on 17
August 1987.  The flight recorder indicates that the flaps were not set for
takeoff, although another Northwest pilot reported to the contrary.  As most
of you now know, the pilot and copilot apparently omitted the checklist
procedure for setting the flaps.  Today's addition to the emerging picture
is that, when the attempted takeoff began, the computerized warning system
failed to announce (by simulated voice) that they had neglected to set the
flaps.  It was supposed to.  However, a subsequent computerized warning did
indicate the final impending STALL, which indicates that the computer was at
least working, contraindicating speculation that the warning system might
have failed because of the circuit breaker being turned off.  This leaves
open the possibilities of sensor failure, faulty wiring, and computer
problems — hardware failure, software misdesign, etc.  (A flap indicator
switch had failed on that plane in January and had been replaced, but this
was apparently discounted in importance because of redundancy.)  Another
possibility being considered is that the flap and slat controls could have
been properly set, but failed to deploy...


Risks to Privacy (re: RISKS 5.30)

Jerome H. Saltzer <Saltzer@ATHENA.MIT.EDU>
Thu, 20 Aug 87 10:14:37 EDT
Today's (Thursday, August 20, 1987) Eastern Edition of the Wall Street
Journal carries a page-one leader article on risks to privacy of government
data banks and cross-agency record matching.  Most of its material is likely
to be familiar to RISKS readers, though it includes a couple of incidents I
hadn't heard reported before.  
                                                Jerry
    [Jerry pointed out western and eastern editions may differ.] 


ATM features

Jack Holleran <Holleran@DOCKMASTER.ARPA>
Thu, 20 Aug 87 12:47 EDT
Reprinted from Readers Digest (page 111, August 1987) (without permission)

GUARD YOUR CARD

  Many BANK CUSTOMERS do not realize that their automatic-teller machine
(ATM) cards are not protected by the federal laws that cover credit cards 
like Visa and MasterCard (which have a $50 maximum liability if lost or 
stolen).  ATM liability is also $50, but *only* if the card is reported 
missing within two days.  After that, your liability rises to $500, and
after 60 days the amount you could get stuck for is *unlimited*.  So
guard your ATM card and personal identification number (PIN), or you could 
lose a small fortune.

                                 ---Molly Sinclair in *Family Circle*

No wonder banks, etc. want us to have ATM cards.
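
To make the tiers concrete, the schedule above reduces to a simple rule of
thumb.  A minimal sketch follows (hypothetical code, not any bank's or
regulator's actual formula):

    # Liability tiers for a lost ATM card, as quoted above: $50 if reported
    # within two days, $500 within sixty days, unlimited after that.
    # Illustrative only; the real rules live in the regulations, not here.
    def atm_liability(days_until_reported, amount_stolen):
        if days_until_reported <= 2:
            return min(amount_stolen, 50)
        elif days_until_reported <= 60:
            return min(amount_stolen, 500)
        else:
            return amount_stolen

    # Example: a card reported missing after a week, with $2,000 withdrawn.
    print(atm_liability(7, 2000))    # prints 500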


Licensing software engineers

Frank Houston <houston@nrl-csr.arpa>
Thu, 20 Aug 87 10:10:21 edt
The following does not reflect my opinion on professional licensing, which I
believe benefits both the licensee and the public, but rather reflects my
concern about the amount of influence a license may carry.

While I am generally neutral about professional licensing, I am concerned
about the practice of making a licensed individual "responsible" for safety,
security or quality, just as I am concerned about assigning such
"responsibility" to a specific department in any organization.  Too often
everyone in the company or project assumes that the "responsible" person or
group will catch all problems; that the rest of the process can be a little
sloppy, "..and besides there is a schedule to meet."

Of course the responsibles do not completely catch anything except the blame
for what went wrong (which they try to shift to Murphy).  Businessmen
operate by the modern equivalent of the "Code of Hammurabi," but it is
impractical to fire a whole engineering group or even a senior designer
(expensive to replace); so the quality assurance department is the target of
opportunity (low pay, low prestige).  I'll wager that the average senior
designer has twice the seniority of the average senior QA engineer.

The point that I am striving for is that assigning to SOMEBODY the
responsibility for safety, quality or security, whether by licensing, title
or regulation, is not only insufficient to solve the problem but may also be
detrimental in the long run because it promotes complacency in the rest of
the team.

In the quality assurance literature, anecdotes report that the most
successful quality programs involve every person in the company, while the
quality assurance department becomes more a scorekeeper than a line of error
defense.  The aphorism "quality is everybody's job" may be a cliche, but it
holds much truth.  I would say that quality, safety and security are
everybody's jobs.

Everybody working on critical systems must consider himself/herself
responsible for the quality, safety and security of the system; and until
that sort of attitude takes root neither licensed engineers nor automated
tools nor process standards will make the slightest dent in the risks we face.


Regarding the certification of software engineers

<benson%cs1.wsu.edu@RELAY.CS.NET>
Tue, 18 Aug 87 17:00:11 PDT
Nancy Leveson wrote:

> Am I wrong in my observation that under-qualified people
> are sometimes doing critical jobs and making important decisions?

Your observation appears correct to me.

> Do you agree that some type of certification is necessary?

I would rather say that certification is highly desirable.  Necessity is
a most difficult concept after one moves beyond the basics of air, water
and food.

> If so, there are still many open issues...

Yes, yes.  But the basic issue is to obtain or create some certification
agency.  My attempt to interest the Society of Professional Engineers met
with silence.  My attempt to kindle some interest in this matter on RISKS
about 15 months ago received no support, and at least one negative response.
I conclude that it will take at least one major accident with significant
loss of life, this accident attributable to software failure, before there
will be sufficient interest to establish a certification program.

                David B. Benson


Re: Risks of automating production

<mnetor!utzoo!henry@uunet.UU.NET>
Sat, 15 Aug 87 21:23:56 EDT
>   From: Richard A. Cowan <COWAN@XX.LCS.MIT.EDU>
>   ... In the end, I believe that
>   automation will cause incredibly disruption and suffering unless there is
>   also a dramatic shortening of the work week.

Much depends on what else is going on in the economy at the same time.  By
far the biggest example of technological unemployment in history (to date)
is the mechanization of farming, which does not seem to have caused trouble
on such a scale.  This was relatively recent, too.  As late as 1918, farming
was so manpower-intensive that my grandfather missed combat service in WW1
because he (and thousands of others) got sent home on "harvest leave".  I
don't know what the numbers were like then, but over a longer time scale
the percentage of farmers in the population has gone from >90% to <10%.
Studying this enormous transition might tell us something about handling
the advent of automation.

                Henry Spencer @ U of Toronto Zoology
                {allegra,ihnp4,decvax,pyramid}!utzoo!henry

     [At the rate we are going, by the year 2001 there will be more 
     computer users than people (if you will pardon reuse of an old joke).
     But, no matter how many million programmers we have, the GOOD ones who
     are capable of disciplined work are always going to be at a premium.  PGN]


Re: Automated environment control (RISKS 5.24)

Robert Stanley <roberts%cognos%math.waterloo.edu@RELAY.CS.NET>
20 Aug 87 00:21:40 GMT
In the early 1970's a (possibly apocryphal) story was doing the rounds.  It
concerned an installation of IBM's where the new physical plant had a
significant amount of its environmental controls in the hands of a computer.
Unfortunately, a serious bug emerged during testing: it activated the
fire-fighting system, which immediately powered down the systems (the
electronic equivalent of hitting the crash stop), eliminating all traces of
the misbehaving software.  After two repetitions of this, the story goes,
the project was abandoned.

I never troubled to track this one down, but it has stuck in the memory.  The
second story I can vouch for because I hired the senior system programmer
shortly thereafter and he gave me the details.  Unfortunately I am not at
liberty to reveal the installation concerned, which probably invalidates this
posting.  [Not necessarily.  We just have to take it for what it is.  PGN]

A huge dedicated computer environment was constructed by one of the largest
computer users in the UK.  The main floor was so large, and the halon fire
suppression on such a hair-trigger, that the operators had to practice gas-mask
drills when the alarm sounded.  By the same token, the exits were sufficiently
distant that reaching them in a smoke/gas filled atmosphere could be a serious
navigational problem.  To help out, illuminated arrows were set into the floor
panels, and a computer was supposed to light these to point the way to the
nearest exit.  Needless to say, this extremely costly system totally failed its
first test because the primary control software shut down all systems,
including the light control system, on detection of a fire/smoke problem.

I do not know how to cure this class of problem, which is akin to the
self-powered emergency light for power failure.  The reason for shutting down
control systems in a fire emergency is obvious, but the idea of umpteen
self-contained micro-processor-based emergency systems is equally horrifying.
Imagine the software update problem, should the installation be sufficiently
advanced in its thinking to implement flexible response!  By way of comparison,
consider the Brown's Ferry near-disaster (We Almost Lost Detroit), where a fire
in the (one and only) cable duct under the control room floor rapidly severed
the control centre from everything over which the operators were supposed to
have control.  No matter that the fire started in an extremely bizarre fashion,
the issue is how do you protect vital communication links in physically
hazardous circumstances.

Robert Stanley           Compuserve: 76174,3024        Cognos Incorporated
 uucp: decvax!utzoo!dciem!nrcaer!cognos!roberts        3755 Riverside Drive 
                   or  ...nrcaer!uottawa!robs          Ottawa, Ontario
Voice: (613) 738-1440 - Tuesdays only (don't ask)      CANADA  K1G 3N3

  [For RISKS it's been halon cats and dogs lately; this one gets through.  PGN]


Automated control stability and sabotage

<asci!brian@ucbvax.Berkeley.EDU>
13 Aug 87 11:56:51 PDT (Thu)
I write fiction on the side.  Watching the discussion on automated control
systems for such systems as air traffic control and power plants, I was
reminded of a short story I wrote a few years ago that used a possible world
economic collapse as a plot device (though the theme was dramatically
different).  I've always been suspicious of Wall Street (the buying and
selling of money for the sake of making money just strikes me as wrong; I
think of investing as buying and staying).  In my story I skirted the issue
of how the collapse could be brought about, but intimated that someone
wanted to do so by sabotaging a computer system somewhere.  Recently I've
been thinking such a collapse is now possible by the deliberate sabotage of
a computer system used by brokerage houses.

As I understand it, these systems watch the difference in say the price of
company A and a bond from company B.  When the price of the bond becomes
attractive for reasons I don't understand, the computer starts dumping hundreds
of thousands of shares in company A and buying in company B.  Another computer
sees the drop in company A and starts selling to cut its losses.  Another
computer watches the drop, and starts buying in A when it thinks it is a
good value.  TIME had an issue about the electronic market and talked about a
program called Firedown that did something like this.
        [And RISKS has had considerable discussion in past volumes on 
        instabilities that can result from closed-loop computer models. PGN]

My question is what if somebody sabotaged such a system or group of systems,
and they started selling everything and bought into say gold.  Is there a
risk that an "electronic" panic could be started with all of these machines
selling and selling, generating an avalanche effect?  It seems from media
reports that the buying and selling happens much too fast for human
intervention, and the programs are DESIGNED to work without human
intervention.  What prevents such an occurrence?  If sabotage is possible,
and an avalanche could be generated, what precautions could be taken to stop
it (maybe a government computer monitoring the whole thing that steps in and
starts buying everything up to slow and eventually stop the avalanche, later
conducting an orderly sale of everything it bought)?
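
To make the avalanche scenario concrete, here is a toy sketch (all triggers,
holdings, and price-impact numbers are invented; this resembles no actual
trading program): a handful of automated sellers, each reacting only to the
falling price, can turn a single sabotaged sell order into a cascade.

    # Toy model of an "electronic panic".  Each automated trader dumps its
    # holdings once the price falls to its own trigger, and every sale
    # depresses the price further, possibly tripping the next trader.
    def run_cascade(price, traders, impact_per_share=0.0001):
        """traders: list of (trigger_price, shares_held); returns price history."""
        history = [price]
        sold = [False] * len(traders)
        while True:
            fired = False
            for i, (trigger, shares) in enumerate(traders):
                if not sold[i] and price <= trigger:
                    price -= shares * impact_per_share  # selling pushes the price down
                    sold[i] = True
                    fired = True
            history.append(price)
            if not fired:               # nobody else triggered; the cascade is over
                return history

    # A sabotaged order knocks the price to the first trigger; the rest follows
    # mechanically, each round of selling setting off the next seller in line.
    traders = [(96.0, 20000), (98.0, 12000), (99.0, 8000), (99.5, 5000)]
    print(run_cascade(99.0, traders))

A real market is of course vastly more complicated, but the structural point
stands: once every participant's response is automatic and immediate, there
is no obvious place for a human to break the loop.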

With what appears to be agreement that fully automated control systems have
enormous dangers and that human intervention is demanded, what about these
Wall Street beasts?  I don't know, but as a writer I'm very curious what
others might know.

Brian Douglass, Applied Systems Consultants, Inc. (ASCI), P.O. Box 13301,
Las Vegas, NV 89103   Office: (702) 733-6761   Home: (702) 871-8182
UUCP:    {mirror,sdcrdcf}!otto!jimi!asci!brian

    [Certain safeguards are clearly desirable, both preventative and
    real-time monitoring.  But we learn from such situations that it is
    essentially impossible to anticipate every possible instability mode.  
    The ARPANET collapse of 1979 was an example of a seemingly impossible
    event happening.  PGN]


Trusting Computers (Re: RISKS 5.27)

ihnp4!illusion!marcus@ucbvax.Berkeley.EDU <Marcus Hall>
Fri Aug 14 08:14:53 1987
> Her ... cash card had been used to steal 250 pounds (the daily limit ...
>
> Elsewhere, Abbey National claims to process 5.7M transactions per year ...

>    The moral is:  don't expect computers to perform all the routine,
>boring tasks that they do so much better than people.
>
>                Andy Walker, Maths Dept, Nottm Univ, UK

No, the problem just has to be looked at differently.  To enforce the
250 pound per day limit, the computer has to have fields in its customer
records that indicate the date of the last withdrawal and how much has
been withdrawn on that date so far (or some such information).  To enforce
the "3 consecutive day" rule, merely add another two bit field that indicates
how many consecutive days the account has hit the max withdrawal limit.  Of
course, the easy way around this is to withdraw 249 pounds per day, so this
probably isn't what you want to be looking for, but the scheme can be adapted
to many different trigger conditions.
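
A rough sketch of that bookkeeping follows (the 250-pound limit comes from
the discussion above; the field names and everything else are invented for
illustration, not Abbey National's actual record layout):

    # Per-account fields for enforcing the daily limit and flagging an
    # account that hits the limit three days in a row.  Illustrative only.
    from datetime import date, timedelta

    DAILY_LIMIT = 250   # pounds

    class Account:
        def __init__(self):
            self.last_withdrawal_date = None  # date of most recent withdrawal
            self.withdrawn_today = 0          # amount withdrawn on that date
            self.days_at_limit = 0            # consecutive days the limit was hit

        def withdraw(self, amount, today):
            """Record a withdrawal; return False if it exceeds the daily limit."""
            if today != self.last_withdrawal_date:
                consecutive = (self.last_withdrawal_date is not None and
                               today == self.last_withdrawal_date + timedelta(days=1))
                # The streak survives only if yesterday ended at the limit.
                if not (consecutive and self.withdrawn_today >= DAILY_LIMIT):
                    self.days_at_limit = 0
                self.last_withdrawal_date = today
                self.withdrawn_today = 0
            if self.withdrawn_today + amount > DAILY_LIMIT:
                return False
            self.withdrawn_today += amount
            if self.withdrawn_today == DAILY_LIMIT:
                self.days_at_limit += 1
            return True

        def looks_stolen(self):
            """The trigger condition: maxed out three consecutive days."""
            return self.days_at_limit >= 3

    # Three days of maxed-out withdrawals trip the flag.
    acct = Account()
    for i in range(3):
        acct.withdraw(DAILY_LIMIT, date(1987, 8, 1) + timedelta(days=i))
    print(acct.looks_stolen())   # prints True

And, as noted above, a thief withdrawing 249 pounds a day never trips this
particular flag, so the same fields would have to feed other trigger
conditions as well.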

The moral here is:  Don't be blinded into believing that the obvious
solution is the only solution.

I wonder how much computing power is wasted in the world because poor
algorithms are implemented and give acceptable enough performance to just
get by?
                              Marcus Hall
