The RISKS Digest
Volume 5 Issue 14

Wednesday, 22nd July 1987

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

FAA absolves Delta in 2 close calls, ATC problems blamed in one
PGN
Origin of term "intelligent machine"
Jon Jacky
robocop
Lou Steinberg
Nuclear power plants
Alex Bangs
Nancy Leveson
Reminder about alarms
Eugene Miya
FCC computer fees
Alex Bangs
Risks of exporting technology
Clint Wong
Electronic Cash Registers
William Daul
Brief book review of the Hacker's Handbook
John Gilmore
Re: Credit card risks
Amos Shapir
Info on RISKS (comp.risks)

FAA absolves Delta in 2 close calls, ATC problems blamed in one

Peter G. Neumann <Neumann@csl.sri.com>
Wed 22 Jul 87 11:17:40-PDT
There were two near-misses last weekend — two Delta jets over Virginia both
bound for Dallas-Fort Worth, and a Delta jet and a single-engine Cessna
approaching Sacramento.  In both cases the FAA reported that the Delta jets
were "just following instructions" from the air traffic controllers.  In the
first case, the ATC center in Atlanta was having computer problems
and the FAA placed the blame there — on the system or on the controllers.
In the Sacramento case, it appears the private pilot was at fault.  (Delta
has been having an incredibly Murphian run of bad luck lately, with about
ten different incidents in the past two weeks.)  [Source:  SF Chronicle, 22
July 1987]

Incidentally, the July/August 1987 issue of the Common Cause Magazine has an
article by Jonathan King entitled "Blind Spots", "another reason to fear
flying: faulty air traffic equipment and not enough manpower to keep it
working."  It is a reasonably sensible analysis of past troubles.


Origin of term "intelligent machine" - press probably not responsible

Jon Jacky <jon@june.cs.washington.edu>
Wed, 22 Jul 87 09:28:40 PDT
> baldwin@cs.rochester.edu writes:
> The term "intelligent machine" is a lasting disservice to our discipline
> by the press of the 1940's and 1950's..."

I'm not sure what the original source of this phrase was, but the term
"artificial intelligence" was originated in the 1950's by John McCarthy,
generally regarded as one of the most important computer scientists (he
invented LISP, among other things).  The story goes that he created the term
in a grant application in order to kindle funders' interest in topics like 
symbolic logic which otherwise seemed rather esoteric and impractical.

I am getting tired of people blaming "the press" for "sensationalizing"
items that were in fact originally sensationalized by the technical
community itself.  In fact the most mind-boggling and incredible claims
about computing often originate from some scientists, and are if anything
underreported by the press.  For example some quite well-regarded computer
scientists are said to believe it will be feasible to load a runnable copy
of a human intelligence into a computer in the reasonably near future, so we
won't have to die anymore.  What has the NATIONAL ENQUIRER got that can beat
that?  A particularly galling example to me is DARPA people complaining about
the "misconception" that Strategic Computing is promoting "killer robots"
when in fact their reports picture autonomous flying vehicles dropping bombs
on things.
                                  - Jon Jacky


robocop

Lou Steinberg <lou@aramis.rutgers.edu>
Wed, 22 Jul 87 11:25:17 EDT
I think there may be some overreaction here to the comments on a machine
blindly following instructions as a form of humor.  In fact, *people* doing
this kind of thing, especially taking things literally that were meant
figuratively, is a classic form of humor.  Gracie Allen was a master at this.

      [Yes, even for those who remember Gracie, let's go slow here on people
      blindly following instructions, unless there is a RISKS connection.  PGN]


Nuclear power plants

Alex Bangs <bangs%husc8@harvard.harvard.edu>
Tue, 21 Jul 87 15:54:25 edt
I am very interested in nuclear power plant control. I was wondering if there
are any books/articles that people might recommend on the subject. It is my
opinion (and this ought to get some people going) that if they could actually
build one of the plants without construction corruption, we might be able
to have a nuclear industry. It is also my belief that some intelligent
(see: Robocop) control systems ought to be able to keep a plant running safely,
given mechanical backup.

Alex Bangs, Harvard Robotics Lab, bangs@metatron.harvard.edu


Nuclear power plants

Nancy Leveson <nancy%murphy.UCI.EDU@ROME.UCI.EDU> [at PGN's request]
Tue, 21 Jul 87 21:20:44 -0700
Alex Bangs brings up two points with respect to nuclear power plant control:

  >It is my opinion ... that if they could actually build one of the plants 
  >without construction corruption, we might be able to have a nuclear industry

Although there have been construction foulups, I doubt that they can all
(or even most) be tied to corruption as opposed to simple mistakes.  The
reasons for the nuclear power industry problems go way beyond construction
difficulties and include economics, poor management, limitations in our
basic engineering capabilities, the impossibility of building any complex 
systems that need to guarantee an extremely low failure probability, etc.
For an excellent discussion of these problems with respect to the
nuclear power industry and other complex systems, I highly recommend
a book by Charles Perrow called:  "Normal Accidents: Living with High-Risk
Technology" and published by Basic Books in 1984.

  >It is also my belief that some intelligent (see: Robocop) control systems
  >ought to be able to keep a plant running safely, given mechanical backup.

Computers are currently used in nuclear power plant control, but I assume
Alex is suggesting that computers be given complete responsibility for safety.
I have not seen Robocop, but from the descriptions in Risks of the robot
following orders and killing someone inappropriately, I would hate to think
that this is the way we would want to build a nuclear power plant.  But
more seriously, it is important not to confuse Hollywood with reality.
AI software has not proven to be any more reliable than any other
software.  Programming is programming whether the application is supposedly
intelligent or not.  I will try to avoid controversy by refraining from
commenting on the capabilities of supposedly "intelligent" systems, but
it seems reasonable that if we were able to guarantee perfect software that
way, all software would immediately be written using AI techniques (or
more cynically, labelled as "intelligent") and we could all stop writing 
into and reading the Risks bulletin board.  

Considering everything, it does not appear to be reasonable or responsible
to depend on software to guarantee safety in any system where the consequences 
of failure are as extreme as in nuclear power plants.  Note that this does 
not mean that computers cannot be used in these systems — they currently are.
It only means that we cannot expect computers to eliminate the danger of
nuclear power plants or any other complex, potentially unsafe system.  The 
systems with computers are likely to be as dangerous as those without them, and
because of the well-known difficulty of building highly reliable software, 
they may be LESS safe with computers.  Whether we can in the future learn how
to use computers to control safety-critical systems with the same or less 
danger than without them is still unknown, but there are no simple answers.

Nancy Leveson     (nancy@ics.uci.edu)
Information and Computer Science, University of California, Irvine


Reminder about alarms

<eugene@ames-nas.arpa>
21 Jul 87 09:58:09 PDT (Tue)
    [If any of you wonder, "What has all this to do with computers and related 
    systems?", the answer by now should be obvious...  Alarms were ignored,
    bypassed, misinterpreted, or otherwise mishandled in many cases such as
    the Stark, Three Mile Island, Chernobyl, Therac 25...  PGN]

Addendum, especially in the case of Chernobyl and Brian's Computers and Society
comment later in the same issue.  We, computer people, are frequently
accused of a binary mentality.  This is a good case where we have a minimum
of three states: true alarm, false alarm, and testing/practice.  The
last case is important because it allows consideration of further
contingencies, but it carries the added danger of a true alarm arriving
during testing/practice, as well as of known false alarms.  The complexity
can get worse, so I won't go into it.  The question is how to provide for
testing.
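
To make the three-state point concrete, here is a minimal sketch (in
Python; the state names and the escalation policy are my own illustration,
not taken from any real alarm system) of a classifier facing the awkward
case of a genuine signal arriving in test mode:

    # Illustrative sketch only: states and policy are hypothetical.
    from enum import Enum

    class AlarmState(Enum):
        TRUE_ALARM = 1     # a real hazard has been detected
        FALSE_ALARM = 2    # known sensor fault or spurious trigger
        TEST = 3           # drill/practice; operators expect signals

    def classify(signal_present, known_fault, in_test_mode,
                 injected_by_test=False):
        """Conservative policy: a signal that is neither a known fault
        nor one the test itself injected is treated as real, even
        mid-drill; otherwise the drill masks the real thing."""
        if signal_present and not known_fault and not injected_by_test:
            return AlarmState.TRUE_ALARM
        if in_test_mode:
            return AlarmState.TEST
        return AlarmState.FALSE_ALARM if signal_present else None

The conservative policy buys more false escalations during drills in
exchange for never masking a true alarm, which is precisely the tradeoff
that gets inverted once operators learn to discount alarms in test mode.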

I hope that the situation will not get worse before it gets better.
Like "Robocop,"  I have to remain skeptical.  We (computists) are the
guardians for the general public at this time.
                                                    --eugene miya


FCC computer fees

Alex Bangs <bangs%husc8@harvard.harvard.edu>
Tue, 21 Jul 87 15:40:20 edt
I can see the point that the FCC should be able to tax an industry that is no
longer completely weak, but there is still a problem. First, the people who
are probably going to get taxed are the users, not the company. Perhaps if they
drop their prices or lose users, their profits will drop. Otherwise, the
government will just be making more money while Telnet continues to make big
profits. The other problem I see is that this system really hurts the small-
time home user of these services. I really cannot afford to use CompuServe at
more than $6 per hour for fun. In a way, this tax is still risking damage to
the home computer culture.

Alex L. Bangs, Harvard Robotics Lab, bangs@metatron.harvard.edu


Risks of exporting technology

Clint Wong <cgwong%orchid.waterloo.edu@RELAY.CS.NET>
Wed, 22 Jul 87 10:40:45 EDT
July '87  IEEE Spectrum - Newslog

Norway will tighten export controls on technology.  The Defense Minister
announced this decision following disclosures that the state owned 
Kongsberg Vapenfabrik sold metalworking machines and related software to 
the Soviet Union.  The equipment was sold in partnership with Toshiba 
Machine Co. of Japan and enabled the Soviets to build quieter submarines 
using advanced propeller blades.

    [Some of you have seen Monday's full-page ad from Toshiba apologizing
    for this event, and promising never to do it again.  This case reminds
    us of some critical risks in technology outflux and some other critical 
    risks when there is not relatively equal distribution of technology...
    Evidently this is a case in which the President's offer to share (e.g.,
    the Star Wars technology!) is not applicable — it seems to have had a 
    very profound effect.  (There is the old joke about certain computer
    systems and programming languages that should be given to our
    adversaries in order to set THEM back many years.  On the other hand,
    nuclear weapons capabilities are a case in which maintaining parity
    among the main parties does not help stave off developments in other
    nations.)  (End of ramble.)  PGN]


Electronic Cash Registers

William Daul / McDonnell-Douglas / APD-ASD <WBD.MDC@OFFICE-1.ARPA>
21 Jul 87 18:29 PDT
I forgot to send this comment to RISKS a few months ago.  I purchased some 
goods at an electronic register...the type that tells you how much change the
customer is to receive.  When I started looking at the facts (cost of item, 
amount tendered, and change), I realized that the register had the wrong amount
displayed.
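
The arithmetic involved is trivial, which is what makes the failure
notable.  A minimal sketch of the computation (in Python; the amounts
below are invented, not the ones from my purchase):

    # Illustrative only.  Working in integer cents sidesteps binary
    # floating-point rounding, one plausible (unconfirmed) source of a
    # wrongly displayed change amount.
    def change_due(cost_cents, tendered_cents):
        if tendered_cents < cost_cents:
            raise ValueError("insufficient payment")
        return tendered_cents - cost_cents

    assert change_due(cost_cents=1437, tendered_cents=2000) == 563  # $5.63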

Can someone tell me how common a problem this is?  Have others run into it?  It
makes me a bit cautious now.  Any comments or pointers to further information
would be appreciated.
                          [Responses to Bill, please.  PGN]


Brief book review of the Hacker's Handbook

John Gilmore <hoptoad.UUCP!gnu@cgl.ucsf.edu>
Wed, 22 Jul 87 18:50:28 PDT
The mention of the Hacker's Handbook reminded me of this book.

    The Hacker's Handbook, by Hugo Cornwall.  1986 edition.  Published
    by E. Arthur Brown Company, 3404 Pawnee Drive, Alexandria, Minnesota,
    USA 56308; phone +1 612 762 8847.  ISBN 0-912579-06-4.

I found it last year and really liked it.  The book is written from a
European perspective (it was originally published in England).  It
starts from basics of computers, modems, and security, mentioning a lot
of things that RISKS readers already know very well.  But it includes
numerous examples of hacking European data services, e.g. Prestel,
British Telecom, various X.25 networks, the British MI5 intelligence
service, as well as radio data hacking.  There was sufficient "new"
information to keep me avidly reading all the way through.  The
author's writing is chatty and informative, very easy to read.  I
recommend this book for the Risks Library.


Re: Credit card risks (RISKS-5.13) [plus a robopoem]

Amos Shapir <nsc!nsta!nsta.UUCP!amos@Sun.COM>
22 Jul 87 14:57:41 GMT
wittenberg%ultra.DEC@decwrl.dec.com  (David 'Witt' Wittenberg) writes:
>AT&T phone credit cards use a credit card number that consists (in most cases)
>of your phone number followed by four (presumably somewhat random) digits.

When I realized that, and that the only purpose of the card was to remember
the number, I memorized the last 4 digits and destroyed the card.  The
possibility that someone who knows my name (e.g. in the office) would look
over my shoulder while I was using it was just too great.
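
The exposure is easy to quantify: given the structure quoted above,
anyone who already knows your phone number is left guessing only the
four-digit suffix.  A back-of-the-envelope sketch (in Python; it assumes
a 10-digit phone number, and the figures are purely illustrative):

    # Illustrative only: assumes the card format described above,
    # i.e., 10 phone-number digits followed by a 4-digit suffix.
    phone_digits = 10   # public: anyone who knows your number has these
    secret_digits = 4   # all that a shoulder-surfer still needs

    print(10 ** (phone_digits + secret_digits))  # nominal space: 10**14
    print(10 ** secret_digits)                   # effective space: 10,000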

Even now that the cards are magnetic there's not much point in keeping
them, since there are just as many systems that accept regular credit cards
wherever the AT&T machines are.

   - - - - - -

Subject: Re: Robocop and following instructions (RISKS-5.12)
Brian Gordon <gordon%cae780.cae.tek.com@RELAY.CS.NET> writes:
>  >From: baldwin@cs.rochester.edu
>  >"I think there's something basically funny about a machine ... 
>  > blindly following instructions in the face of logic" 

          I really hate this damn machine,
          I wish that they would sell it;
          It doesn't do quite what I want -
          Only what I tell it!

          — Funny, but also a Major Truth.

Amos Shapir, National Semiconductor (Israel)
6 Maskit st. P.O.B. 3007, Herzlia 46104, Israel  Tel. (972)52-522261
amos%nsta@nsc.com @{hplabs,pyramid,sun,decwrl} 34 48 E / 32 10 N
