The RISKS Digest
Volume 5 Issue 25

Sunday, 9th August 1987

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Computer Error Opened Flood Gates of Alta Dam
Haavard Hegna
Heating up planning discussions ...
Robert Slade
Re: Faults in 911 system caused by software bug?
Paul Garnett
"It must work, the contract says so"
Henry Spencer
Separation of Duty and Computer Systems
Howard Israel
Optical Disks Raising Old Legal Issue
Leff
AAAS Colloquium Notice
Stan Rifkin
Secrecy About Risks of Secrecy Vulnerabilities and Attacks?
Peter J. Denning
Another electronic mail risk
Doug Mosher
Risks TO computer users (US Sprint)
James H. Coombs
Computer Safety and System Safety
Al Watters
Computers in nuclear power plants
Frederick Wamsley
Autoteller problems
Alex Colvin
Info on RISKS (comp.risks)

Computer Error Opened Flood Gates of Alta Dam (Norway)

Haavard Hegna <hegna%vax.nr.uninett@NTA-VAX.ARPA>
Fri, 7 Aug 87 16:33:28 +0200
This excerpt (slightly edited) is from Dagbladet, Oslo, Norway,
Friday, August 7, 1987:

COMPUTER ERROR OPENED ALTA (power station) FLOOD GATES.  While all
(Norwegian) electrical power supply directors were on visit at the very
controversial and newly opened Alta Power Station (in Finnmark, northern
Norway), the catastrophe happened. While engineers were trying to correct a
computer error, two of the flood gates of the Alta Dam accidentally opened.

A three-meter high flood wave ran down the narrow river canyon at a speed of
approx. 20 km/h.  A full police rescue alarm was called, local radios gave
warnings to campers etc., a helicopter was sent out to locate possible
victims of the flood.  The river rose about 1 meter during the twenty minutes
that went by before the dam engineers were able to close the gates
manually.  Two million cubic meters (about 70 million cubic feet) of water was lost.

As far as the local police know, no one was hurt, but 5-10 small
river boats were crushed.  The power station expects some claims 
for compensation.  [The Alta river is well known for its salmon]

The Alta Power Station has been operative since May this year.  "The paradox
is that this very autumn we were going to install a valve that would prevent
accidental opening of the flood gates", the dam administration says.

The Alta Dam and Power Station was much in the news a few years ago. It lies
in the Norwegian part of Lapland. Long and very violent demonstrations by
Lapps and others (including some very prominent Norwegians) nearly stopped
the construction work.

Haavard Hegna, Norwegian Computing Center, Oslo, Norway


Heating up planning discussions ...

<Robert_Slade%UBC.MAILNET@umix.cc.umich.edu>
Fri, 7 Aug 87 06:27:04 PDT
     Regarding Burch Seymour's note in RISKS-5.20 about a CPU overheating: a
certain charitable organization recently purchased computer equipment to
assist with their operations. Bad planning contributed largely to the fact
that it has not been of any help to them. One of the concerns that they
failed to address was the physical placement of the machines with respect to heat.

     A "Baby 36", a PC, terminals and printers for both and a printer for
the new phone system were all put into an eight-by-ten-foot room. As the
contractor hired to perform the initial installation of software, and
training for the staff, I pointed out that this was less than desirable.
However, it was only when, in an attempt to make the room habitable, I
turned the thermostat off that I found out that it controlled not merely
that room but the entire back half of the building.

     With the assistance of the materials manager (after failing to convince
the executive director to make changes in room assignments) I completely
blocked the heating duct into the computer room, and made a direct fan vent
to the outside. In spite of this, through one of the coldest winters we've
ever had in Vancouver, the temperature *never* fell below seventy-two degrees
Fahrenheit (most often it hovered in the eighties), and the clerical staff
froze their fingers off.

     (After three months of complaints of heatstroke and frostbite [one
typist took to wearing gloves all day!] the computer was moved ... to an
area with insufficient power supply!)


Re: Faults in 911 system caused by software bug?

<pgarnet@nswc-wo.ARPA>
Fri, 7 Aug 87 14:52:23 edt
Instead of being critical of the bug and the 30 minutes of downtime

  >... it would then be interesting to learn 'who's responsible' should
  >the faulty system have resulted in some tragedy.

we should find it refreshing to see that someone has learned the lesson of
keeping the old system on line while the new one is tested *IN THE REAL
WORLD*.  So many times people disconnect the old system and depend entirely
on a new, incompletely tested system.
                                                 Paul Garnett


"It must work, the contract says so"

<mnetor!utzoo!henry@uunet.UU.NET>
Fri, 7 Aug 87 17:20:51 EDT
Mark Day's comments re the Eastport report bring to mind a related issue, which
has some relevance to Risks:  how management responds to reports of problems.  
I highly recommend "Illusions of Choice", by Robert F. Coulam, published (I
think) by Princeton U. Press, dated late 70s (I can dig out accurate info if
needed; my copy isn't handy).  The ostensible subject of the book is the
development and procurement of the infamous F-111 fighter-bomber, but it is
also a fascinating look at bureaucratic management and the mistakes it makes.

The title of the book comes from the persistent belief of the senior DoD
people that they could decide the project's fate at various milestones
during development, when in fact the project was merrily proceeding on its
pre-ordained path without waiting for permission to do so.  This was not
because the people in question lacked authority:  the F-111 had so much
trouble that the Secretary of Defense was personally involved.

The part of the book that specifically came to mind was Coulam's comments
on the F-111's engine problems:  DoD repeatedly took it for granted that
the engine problems would be solved, simply because the contract with the
engine manufacturer said so.  The engine problems *were* eventually pretty
much solved, but only at the cost of significant changes to air intakes and
so forth.  Those changes were incorporated only in later F-111's, because
the early ones were too far down the production line by then:  aircraft
production had happily continued, in sublime confidence that the engine
problems would go away!

This sounds a whole lot like the contractor attitudes that the Eastport
group criticized.  I would conjecture that the software development was
going to be done by someone other than the prime contractor for the
hardware.  Of course the software would work on the hardware as supplied;
the software contracts would insist on that!

Henry Spencer @ U of Toronto Zoology {allegra,ihnp4,decvax,pyramid}!utzoo!henry


Separation of Duty and Computer Systems (re: Pat Farrell, RISKS-5.19)

Howard Israel <HIsrael@DOCKMASTER.ARPA>
Sat, 8 Aug 87 18:00 EDT
Just as a point of fact, the COMPUSEC community has picked up on this.  Any
VAX/VMS system since V4.0 (I believe; V4.3 definitely) has the capability to
enforce two passwords on a given account.  This allows the system
administrator to set up privileged accounts (or semi-privileged accounts, or
just regular accounts) with 2 passwords, one user for each of these
passwords.  Activating the account then requires both users to be present
to log in.  Once logged in, the user can do what he wishes (exactly
analogous to the safe-deposit-box arrangement at a bank).  Even the
system administrator's account can be set up this way.

Of course, one does not necessarily have to use two people; just one
person who knows both passwords would get in also (I guess this is
appropriate for especially paranoid people).
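
A minimal sketch of the two-person idea in Python (hypothetical; the real
VMS feature is configured through the AUTHORIZE utility, not application
code): the account unlocks only when both password holders supply their
own secrets.

    # Sketch of two-person control for account activation.  The account
    # name and passwords here are invented for illustration.
    import hashlib

    def digest(pw):
        return hashlib.sha256(pw.encode()).hexdigest()

    # Each protected account stores two independent password digests.
    ACCOUNTS = {"SYSADMIN": (digest("alpha-half"), digest("bravo-half"))}

    def login(account, pw1, pw2):
        """Both holders must be present and enter their own password."""
        stored = ACCOUNTS.get(account)
        return (stored is not None and
                digest(pw1) == stored[0] and digest(pw2) == stored[1])

    assert login("SYSADMIN", "alpha-half", "bravo-half")
    assert not login("SYSADMIN", "alpha-half", "wrong")

As with the safe deposit box, compromise of either secret alone gains
nothing; both halves must be presented together.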

---Howard Israel, AT&T Bell Labs


Optical Disks Raising Old Legal Issue

Leff (Southern Methodist University) <E1AR0002%smuvm1.bitnet@RELAY.CS.NET>
Fri, 7 Aug 1987 01:54 CST

Summary of "Optical Disks Raising Old Legal Issue", in Digital News, 3 Aug 1987

When microfilm came out, Congress passed the Uniform Paperwork Act which
covers "microfilm, microfiche, photocopies and some electronically
transmitted paperwork, such as facsimiles."  There is a book out called the
"Legality of Optical Storage" which was prepared with the assistance of a law
firm.  No specific state or federal laws cover the technology as yet.  Robert 
S. Williams, president of Cohasset Associates in Chicago, a record management 
consulting firm, says that the existing laws will suffice.  New Jersey has a
bill pending in the legislature to specifically cover optical storage.  It
is geared to the state government's needs rather than private industry.

The IRS has already introduced a trial program in which optical disk jukebox
storage is used by personnel to answer questions from customers or in
preparation for audit.  However, the IRS still keeps the original forms,
although both the IRS legal counsel and the Department of Justice say that
the optically stored copies are as good as the originals in court.  The
National Archives has given the IRS permission to destroy the originals, but
the IRS is still storing them.


AAAS Colloquium Notice

Stan Rifkin <rifkin@brillig.umd.edu>
Fri, 7 Aug 87 12:29:41 EST
The American Association for the Advancement of Science (AAAS) is holding
its Second Annual Colloquium on Science, Arms Control, and National
Security, September 28-29, at the Hyatt Regency Crystal City, just outside
of Washington DC.  The reason RISKS Forum readers should be interested
is that this Colloquium represents one of the few occasions when
policy-makers meet scientists to share concerns; both sides speak frankly.
For example, at last year's Colloquium, Kenneth Adelman, Director of the
US Arms Control and Disarmament Agency, said that nuclear weapons
were safer now than at any other time in history. The Colloquium
provided the opportunity to ask him what made him believe that. Last 
year Gen. Abrahamson, Director of SDIO, made clear that the first step 
in SDI was to protect military targets, not civilian populations. He 
also provided a list of technologies that needed to be developed in 
order for SDI to succeed. High among those was battle management 
software. He said that computing power alone can tip the strategic 
balance between the super-powers.

Speakers at this year's Colloquium will include Robert Dean, Special
Assistant to the President and Senior Director for International Programs
and Technology Affairs; Lewis Branscomb, Director of the Science,
Technology, and Public Policy Program at Harvard, and formerly the Chief
Scientist of IBM; Robert Cliff Duncan, Director of DARPA; John Deutch,
Provost of MIT; Paul Nitze, Special Advisor to the President and Secretary
of State for Arms Control Matters; Jonathan Dean, Arms Control Advisor,
Union of Concerned Scientists; Amoretta Hoeber, Director for Planning and
Development, TRW; Lt. Gen. John Wall, Commander, US Army Strategic Defense
Command; Charles Zraket, President, Mitre; and Sidney Graybeal, VP of
Systems Planning Corp.  Registration is $160 (incl. meals) for the two days.
For further information, call AAAS at (202) 326-6490.

- Stan Rifkin


Secrecy About Risks of Secrecy Vulnerabilities and Attacks?

Peter J. Denning <pjd@riacs.edu>
Fri, 7 Aug 87 13:08:34 pdt
A recent item in RISKS called attention to Ian Witten's article in the
summer 1987 issue of ABACUS.  I found the article fascinating reading and
filled with insight.  It is crammed with detail on the weakness of systems
to password attacks, Trojan horse attacks, and virus attacks, and it even
elucidates the lacunae of Ken Thompson's Turing Lecture on invisible,
self-replicating compiler viruses.

I would like to invite a discussion in RISKS on the question of publishing
graphic detail on security weaknesses of systems.  As Editor-in-Chief of the
ACM Communications, I am sometimes faced with conflicting reviews on such
materials — some reviewers will say it is inappropriate to publish such
lucid detail when there is no good defense against the attacks, some will
say that this type of information is capable of inspiring a lot of mischief
and making life unpleasant for numerous innocent people, while others will
say that these things need to be discussed so that people of good will will
know what they're up against.  It would be helpful to me and other editors
to have a better idea of where the community stands on this sort of thing.
Should we encourage publication of this kind of material?  Discourage it?
Publish general overviews of attacks and weaknesses but not give enough
detail to permit others to replicate the attacks?  Should there be an ACM
policy on this question?  If so, what might that policy be?

Many thanks to all.   Peter Denning

    [This topic has arisen regularly in RISKS.  The last time we took a
    crack at it we still seemed to believe that vulnerability lists were
    somewhat sensitive, but the pretense that nobody knew what they might
    contain was likely to get you into much greater trouble than if you
    assumed that the vulnerabilities were widely known and likely to be
    exploited.  If a system is fundamentally flawed, it should not be used
    in critical applications where those flaws could result in serious
    compromise.  Publishing the security weaknesses can only improve things
    in the long run — although it may lead to some frantic moments in the
    short run.  The real problem is the tendency to put all of one's eggs in
    a single ostrich basket — and then stick your head in the sand.  Passwords
    are a fine example.  If there is ONLY ONE PROTECTION WALL, then anyone who
    penetrates it has everything.  The obvious strategy is to use multiple
    controls — with passwords (or perhaps better authenticators) as a first 
    line of defense, sound operating systems and applications as a second, and 
    careful auditing in real-time or after-the-fact as the last resort...  PGN]


Another electronic mail risk

Doug Mosher <SPGDCM%cmsa.Berkeley.EDU@Berkeley.EDU>
07 Aug 87 17:55:16 PDT
I want to point out a hazard in electronic mail systems that can easily
be overlooked until one gets caught by it.

The risks normally discussed include: the mail gets seen by unauthorized
readers; is lost; is sent but the archive copy is later lost; the ID or
contents are changed; or the recent and humorous example of "archive saved
after all" when it wasn't wanted, in the Reagan NSC's use of PROFS.

The additional hazard I want to point out is the problem of inadvertently
sending a note or a copy to the wrong person(s) oneself.

I have used various forms of e-mail at several universities over a 13 year
period, and perhaps the most embarrassing incidents in such use have been
when I mis-sent notes.

Typical sensitive contents are things relating to criticism of employees or
peers or managers, or possibly semi-secrets, usually related to early
discussions of hirings, firings, reorganizations, office moves, and management
shifts. All of these matters can be written about to specific individuals,
and the mis-sending of the note can be very hurtful; in some cases if
mis-sent to specific persons, in other cases if mis-sent to almost anyone.

Here's a brief list of ways such mis-sending can occur:

1. similar names or nicknames or userids.

2. absentmindedness, along with interruptions; returning from a telephone
interruption and sending a note to the proper recipients of the previous
topic or the next topic.

3. the manual or automatic inclusion of others on a cc list for something to
which you are replying.

4. mistakes or misunderstandings or similar names/nicknames/userids in
group or system nickname files.

5. actual failure of an electronic mail system (I have found this rarer than
the above risks, though when I was developing some of my own tools once,
I had this type of failure; the form of it was to mail the current note
to the recipients of the last note I sent).

Electronic mail is riskier in this regard than manual mail, for several
reasons, including:

Brief, cryptic, and/or indirectly indexed names for addressees; unusually fast 
and simple actions to prepare and send the notes, which avoid the normal
delays with real mail (but those delays added extra chances for review by
oneself and others, or chances to intercept and cancel); plus the normal
risks of new or unfamiliar systems and the risks of computerized systems.

One defense is a broad one against the dangers of many new systems: use
the new facility only for the least critical matters in the early stages
of either the system's development or your learning of it.
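
Another, more mechanical, supplement is to have the mailer itself flag
addressees that look confusably close to other known nicknames before the
note goes out, which addresses risks 1 and 4 above.  A sketch (the userids
and similarity threshold are invented for the example):

    # Sketch of a pre-send recipient check for a hypothetical mailer
    # front end.  It flags addressees whose userids are confusably
    # similar to other known ones, so the sender must look twice.
    from difflib import SequenceMatcher

    KNOWN_USERS = ["jsmith", "jsmyth", "dmosher", "dmoshier", "pgn"]

    def confusable(addr, threshold=0.8):
        """Return known userids dangerously similar to addr."""
        return [u for u in KNOWN_USERS
                if u != addr
                and SequenceMatcher(None, u, addr).ratio() >= threshold]

    def confirm_recipients(recipients):
        for r in recipients:
            near = confusable(r)
            if near:
                print("Warning: '%s' resembles %s -- intended?" % (r, near))

    confirm_recipients(["jsmith", "dmoshier"])

Risks 2 and 3 still require the reviewing discipline described in the
moderator's note below.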

Doug Mosher, MVS/Tandem Systems Manager  (415)642-5823, 
Evans 257, Univ. of California, Berkeley, CA 94720

    [Ironically, one solution here is related to a solution to Peter
    Denning's query: assume that everything might as well be public
    anyway (since the privacy and integrity in most mail systems and the
    networks they use are minimal anyway) — and don't use a computer system
    to say anything that you would not say publicly.

    There are also some "features" of mail systems under which the current
    pointer does not get moved as you might think it would.  You can easily
    wind up answering the wrong message.  The solution there is always to
    review a message — including its recipients' addresses — before you 
    commit to sending it off.  PGN]


Risks TO computer users (US Sprint)

"James H. Coombs" <JAZBO%BROWNVM.BITNET@wiscvm.wisc.edu>
Sat, 08 Aug 87 15:32:16 EDT
Well, in addition to the risks to the innocent public from the use and
misuse of computers, I now see that we computer users suffer because of
the ignorance of those who apparently do not use computers or, at least,
those who use them mindlessly.  US Sprint has just dumped two different
kinds of cards on me, neither of which works with my software.

I have Sprint's local basic service, but the quality is not good enough
for telecommunications.  Their "dial a local number and input a code"
connection has always been better than AT&T or anything else that I have
had locally.  Now, however, they have numbers that are too long for
ProComm.  In addition, they have reversed the sequence, so that one now
inputs the telephone number before the travel code.  That means that
none of the popular communications software will work properly with
Sprint's new system.  Finally, to demonstrate their complete ignorance
of the needs of telecommunicators, they ask us to "listen for the
recording" before inputting the "14 digit FONCARD number."  It will
probably take the ProComm authors a little while to come out with a
speech recognition module!
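
For readers who have not scripted this, the old sequence worked because
everything fit into a single Hayes-style dial string, where each comma is a
fixed pause (typically two seconds); the new one cannot, since no blind
pause can "listen for the recording."  A sketch, with all numbers invented:

    # Sketch of the dial strings involved (all numbers are invented).
    local_access = "5551234"          # local Sprint access number
    travel_code  = "12345678901234"   # 14-digit FONCARD number
    destination  = "12125556789"      # number being called

    # Old order: access number, pause, travel code, pause, destination.
    old_dial = "ATDT" + local_access + ",,," + travel_code + ",,," + destination

    # New order: destination before the code, after a voice prompt
    # that fixed pauses cannot reliably wait out.
    new_dial = "ATDT" + local_access + ",,," + destination + ",,," + travel_code

    print(old_dial)
    print(new_dial)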

This change is going to cost Sprint a lot of revenue in the next few
months.  It's also going to cost many of us a lot of aggravation.  The
software developers will have to rework their programs and negotiate
with Sprint (or we will find different carriers).  And, at least in the
meantime, there will probably be a lot of people hacking together
interim solutions.  How could Sprint be so naive?
                                                      --Jim

P.S.  If they are trying to solve problems with illegal access, they
are on the wrong track.  The thieves will revise their programs very
quickly; the legitimate customers will be the ones who are locked out.


Computer Safety and System Safety (Re: RISKS 5.22)

<SAC.96BMW-SE@E.ISI.EDU>
9 Aug 1987 10:57-CDT
   REF. YOUR CONTRIBUTORS WHO TALK ABOUT THE "SAFETY" OF THE COMPUTER.  AS
ONE WHO WORKS IN THE "SAFETY" CAREER FIELD I REALIZE THAT THERE ARE MANY
FORMS OF "SAFETY", AND THERE IS A DEFINITE NEED TO MAKE COMPUTERS "SAFE" FOR
USE.  I FIND HOWEVER THAT IN READING THE PAGES OVER THE PAST YEAR AND A HALF
THAT MANY OF YOUR CONTRIBUTORS ARE CONFUSING "SAFETY" FOR "SECURITY" AND
SHOULD INSTEAD BE TALKING ABOUT COMPUTER SYSTEM SECURITY RATHER THAN
COMPUTER SYSTEM SAFETY.  (THERE IS AN ENTIRE SUB-GROUP OF THE SAFETY FIELD
THAT DEALS STRICTLY WITH SYSTEMS SAFETY.)

AL WATTERS,               SAC - Safety, Always Caring



Computers in nuclear power plants

Frederick Wamsley <well!alcmist@cogsci.berkeley.edu>
Sat, 8 Aug 87 23:20:11 pdt
Much has been said about the risks of trusting computers too much.  It can
also be a problem to trust them too little.  This was a problem at Three
Mile Island, where the human operators turned off a cooling system that the
automation had turned on.

Good user interfaces can help.  It should be clear to a human supervising
an automated system *why* it's doing whatever it's doing.  Observers on 
the presidential commission that investigated TMI said nasty things
about the design of the plant's control panel.
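
One concrete way to surface the *why* is for the automation to attach its
triggering rule to every action it takes, so the operator sees the reason
alongside the event.  A minimal sketch, with sensor names and setpoints
invented for illustration:

    # Sketch: an automated action log that records *why* each action
    # was taken, so a supervising human can judge before overriding.
    def control_step(readings, log):
        if readings["coolant_temp_C"] > 320:
            log.append(("START emergency cooling",
                        "coolant_temp_C=%s exceeds 320 C limit"
                        % readings["coolant_temp_C"]))

    log = []
    control_step({"coolant_temp_C": 335}, log)
    for action, reason in log:
        print("%s  [because: %s]" % (action, reason))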

By the way, anyone interested in the political aspects of nuclear power should
check out Petr Beckmann's book, "The Health Hazards of Not Going Nuclear".


Autoteller problems (Re: RISKS 5.22)

Alex Colvin <mac@uvacs.CS.VIRGINIA.EDU>
7 Aug 87 18:21:54 GMT
I've had a problem with my PIN on an autoteller before.  In this case it
was clear that the problem lay in the PIN keypad.  Some keys wouldn't
register, others bounced.  After several attempts at typing very
carefully, the machine kept my card. 

Customer Service had noticed an unusual number of cards taken over the
weekend, but were dubious of my explanation until several people in the
customer service line behind me made the same complaint.  They had me
talk to the service people, because I was able to explain the problem in
technical terms ("keys", "bounce"). 

It's possible the BofA problem was also a simple mechanical failure, but
bank staff tend to treat the machines with reverence not accorded ordinary
household appliances.  They don't expect them to get stuck and break.

        [Several other ATM submissions are not included here.  This one is 
        borderline, illustrating again the deification of technology.  PGN]
