The RISKS Digest
Volume 5 Issue 26

Monday, 10th August 1987

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Secrecy About Risks of Secrecy
Jerome H. Saltzer
Maj. Doug Hardie
Separation of Duty and Computer Systems
Willis Ware
NASA Computers Not All Wet
Mike McLaughlin
Computer Error Opened Flood Gates of Alta Dam
Henry Spencer
Amos Shapir
Re: Another electronic mail risk
Prentiss Riddle
Info on RISKS (comp.risks)

Secrecy About Risks of Secrecy (RISKS 5.25)

Jerome H. Saltzer <Saltzer@ATHENA.MIT.EDU>
Mon, 10 Aug 87 15:42:01 EDT
The desirability of public discussion of computer security flaws comes up
frequently in RISKS as well as in every mailing list and forum I've seen
that touches on security.  My opinion is very strongly in favor of openness
and public discussion of computer security problems, right down to the
details of how to break into specific operating system releases.

The background for my strong opinion is that I've been working with computer
security, protection, and privacy since 1961.  In that time I've seen both
secrecy and openness used at different times, and it is clear (to me) that
whatever arguments seem to apply at the time, in practice openness works and
secrecy doesn't.  Whenever someone succeeds in arguing that a security problem
shouldn't be noised about, the result is usually several of the following:

     1.  Fixing of the specific problem drops in priority, sometimes
     to the point that it doesn't get fixed at all, leaving the
     system vulnerable to attack for an extended time.

     2.  Others with similar systems often don't learn about the
     problem, leaving their systems vulnerable.

     3.  Others with dissimilar systems almost never hear about the
     problem and therefore don't realize that they should look for
     analogous problems in their systems.

     4.  People developing new systems design the same problem into
     their new system.

In contrast, when rapid public discussion takes place following
discovery of a security problem, the original system gets fixed very
quickly (sometimes it is temporarily operated in a drastically
curtailed mode while waiting for the fix); other users of similar
systems start evaluating their vulnerability, the community
immediately starts to look for analogs of the problem, and as a
general rule the state of the art steps forward a notch and that
class of problem has a somewhat lower chance of reappearing in future
system designs.  (That is not to say that all system designers bother
to look around for lessons they might apply to their systems!)

My conclusion is:  the argument that one enhances security by not
talking about weaknesses is fallacious.  In practice, trying to hide
security problems reduces security, certainly in the long run,
usually in the medium run, and often even in the short run.

When the proprietor of a site or an operating system vendor makes a
potent case for not revealing a security weakness, my usual proposal
in response is that any veiling of a weakness should have a time
limit, after which it will be discussed publicly.  That approach
focuses the argument on the right question, namely what is a
reasonable time limit; it keeps the pressure up to fix the problem,
and it tends to move in the direction of getting the information out
to those people in the world who really need it.  Most important, it
recognizes that the information will eventually circulate among
system attackers, so it must be gotten to system designers, too.

To say more strongly what Grampp and Morris wrote in 1984: the bad
guys know this stuff already, or will discover it soon; the good guys
need all the help they can get.
                    Jerry

   [As an example, we have the note from BJORNDKG in RISKS-5.24 that
   "If you forget your PIN number, a new envelope with a new PIN is sent 
   to you, because no one can access the actual PIN, only the computer 
   knows what it is."  But anyone who reads RISKS should realize how untrue
   this conclusion is, even if the PIN is stored encrypted!  PGN]
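
The note above rests on a simple technical point: a four-digit PIN has only
10,000 possible values, so anyone who can read the stored (encrypted or
hashed) form can simply try them all.  Here is a minimal sketch in Python;
the hash function, salt, and sample values are assumptions for illustration,
not a description of how any real banking system stores PINs:

import hashlib

def hash_pin(pin, salt):
    # Stand-in for whatever one-way transform the bank actually applies.
    return hashlib.sha256((salt + pin).encode()).hexdigest()

def recover_pin(stored_hash, salt):
    # Exhaustive search over the entire four-digit PIN space.
    for candidate in range(10000):
        pin = "%04d" % candidate
        if hash_pin(pin, salt) == stored_hash:
            return pin
    return None

salt = "account-1234"              # assumed per-account salt
stored = hash_pin("5309", salt)    # what the database actually holds
print(recover_pin(stored, salt))   # recovers 5309 almost instantly

The point is not the particular hash chosen but the size of the search
space: an insider with read access to the stored values needs no special
insight to recover every customer's PIN.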


Secrecy About Risks of Secrecy: Vulnerabilities and Attacks?

"Maj. Doug Hardie" <Hardie@DOCKMASTER.ARPA>
Mon, 10 Aug 87 09:41 EDT
> Should we encourage publication of this kind of material?  Discourage it?
> Publish general overviews of attacks and weaknesses but not give enough
> detail to permit others to replicate the attacks? 

I am not convinced that you can publish overviews that do not provide
any thoughtful person with enough insight to fairly easily complete
the details of the attack.  I believe that it is the initial
identification of a flaw that is difficult, not the exploitation of it.

>    [... If a system is fundamentally flawed, it should not be used
>    in critical applications where those flaws could result in serious
>    compromise...  PGN]

While this argument appears logical from the academic view, it has
some interesting ramifications in real life.  Since the introduction
of digital computers, the federal government has been one of the
biggest users.  Their usefulness has led to all sorts of sensitive
information being stored and manipulated in an amazing variety of
different machines.  I believe it is fair to state that these machines
were not designed with security as a major consideration.  I suspect
that removing all sensitive information from  critical applications
would cripple the government.  Also, there is no way to replace all
those systems with secure systems in the near future.  Even if we only
buy secure systems for new applications, it will be several decades
before the fundamentally flawed systems are gone.

>    [...  Publishing the security weaknesses can only improve things
>    in the long run — although it may lead to some frantic moments in the
>    short run...  PGN]

Unless those frantic moments include the USSR concluding from compromised
information that they are able to successfully attack the US.  In that case,
I claim that regardless of the outcome of the attack, publishing the
security weakness did not improve things in any sense.  — Doug


Re: Separation of Duty and Computer Systems (RISKS-5.25)

<willis@rand-unix.ARPA>
Mon, 10 Aug 87 10:17:38 PDT
Just a clarifying comment on Howard Israel's observation about VAX/VMS.
It may be that the tekkies have provided for the option of twin passwords
but government policy statements and administrative directives have not
directed that such safeguard procedures be used.  Until system
implementers and system owners/operators are compelled to install
split-duty provisions, the best intentions of the technical designers will
only have provided a necessary, but not sufficient condition for improving
security.  We will have happenstance usage of double passwords depending
on the conscience of system managers and their perception of threat.

And in some sense, as Howard Israel points out, the safeguard as now
implemented isn't all that great since one person, knowing both, can type
them in sequentially — which makes it no better than a single password
and certainly more trying to one's memory.  A stronger procedure would
require concurrent entry (within a prescribed time limit) and from
different, physically well-separated terminals.  Even that can be
circumvented as Richard Pryor once demonstrated in some movie.
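
As a rough sketch of the concurrent-entry rule just described (the names,
passwords, and 30-second window below are invented for illustration; this is
not how any fielded system does it), the check might look something like
this in Python:

import time

WINDOW_SECONDS = 30      # assumed time limit for the second entry
pending = {}             # action -> (first user, first terminal, timestamp)

def check_password(user, password):
    # Stand-in for the real credential check.
    return password == "demo-" + user

def authorize(action, user, terminal, password, now=None):
    # True only when two different users, at two different terminals,
    # confirm the same action within WINDOW_SECONDS of each other.
    now = time.time() if now is None else now
    if not check_password(user, password):
        return False
    first = pending.get(action)
    if first is None or now - first[2] > WINDOW_SECONDS:
        pending[action] = (user, terminal, now)   # record the first half
        return False
    first_user, first_terminal, _ = first
    if user == first_user or terminal == first_terminal:
        return False                              # same person or same seat
    del pending[action]
    return True                                   # both halves arrived in time

Under such a rule, one person who happens to know both passwords still
cannot release the action alone, since the second entry must come from a
different user at a different terminal within the time limit.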

The importance of such intricate safeguards of course depends markedly on
what the threat is believed to be, which implies that double passwords
will probably be used primarily in defense systems and occasionally in
commercial systems, e.g., perhaps where public safety or intense
industrial competition might be involved.

                 Willis Ware,  Santa Monica, CA


NASA Computers Not All Wet

Mike McLaughlin <mikemcl@nrl-csr.arpa>
Mon, 10 Aug 87 08:44:57 edt
A recent news item noted the flooding of a NASA control room during a
simulated Space Shuttle flight.  A longer piece on NPR stated that pipes in
the ceiling of the control room had burst, and that the control room crew
had quickly put covers over their equipment.  NASA personnel considered it
"routine" to have equipment covers handy for just such an occasion.  While
it is easy to second-guess the installers, the NASA computer folks should be
complimented for having their covers ready.  This is a simple, non-technical
precaution.  I'm surprised that it has not already been discussed in Risks.

    [It has, but way back in RISKS-1 or -2.  By the way, there was a CDC 
    computer in a basement that got flooded by the automatic sprinklers,
    shortly after I arrived at SRI.  By the by-the-way, I heard a story
    today about a corporate building power failure that was accompanied by
    a fire; it turns out that the obligatory "EXIT" sign came on in the
    dark with a short in its circuitry, and the emergency light caught fire.
    (The bearer of this news wanted to keep his company's good name out of
    RISKS, so this appears here anonymously.)  PGN]

[Can't cite my sources - I was lazy, hoping someone else would enter a
comment about this one, and I have forgotten the exact details, such as
which center, what date, etc.  - Mike]


Computer Error Opened Flood Gates of Alta Dam

<mnetor!utzoo!henry@uunet.UU.NET>
Mon, 10 Aug 87 14:14:58 EDT
Dept. of amusing juxtapositions:  in 5.24 we have Dave Benson saying
"We must recognize that... little computer related RISK is attached to
other generation methods or to conservation...", while in 5.25 we have:

> COMPUTER ERROR OPENED ALTA (power station) FLOOD GATES...
> ...three-meter high flood wave... full police rescue alarm...
> warnings to campers etc...

Persons interested in risks of power generation, in a broad sense, might also
wish to read Tom Clancy's "Red Storm Rising".  The first few chapters are a
graphic account of how centralized computer control of oil refineries permits 
one-man sabotage on a staggering scale.  (The rest of the book, which is also
quite good but less relevant, is about the Third World War that results when 
such a single act of sabotage severely disrupts Soviet oil production.  This
is a risk on a scale that dwarfs most of the RISKS discussions!)

I would also observe that one aspect of energy conservation is efficient,
often computerized, management of energy use in large buildings etc.
I don't recall hearing of any serious problems with such systems, but
surely there must have been some, especially early on?  Anybody know?

Henry Spencer @ U of Toronto Zoology {allegra,ihnp4,decvax,pyramid}!utzoo!henry


Re: Computer Error Opened Flood Gates of Alta Dam (Norway)

Amos Shapir <nsc!nsta!nsta.UUCP!amos@Sun.COM>
Haavard Hegna <hegna%vax.nr.uninett@NTA-VAX.ARPA> writes:
>The Alta Power Station has been operative since May this year.  "The paradox
>is that this very autumn we were going to install a valve that will prohibit
>the accidental opening of the flood gates", the dam administration says.

Installing the safety features a few months *after* the system becomes
operational may be called a 'paradox' in Norwegian; the English term is,
if I am not mistaken, 'incompetence'. (In Russian it's Chernobyl, of course).

Amos Shapir, National Semiconductor (Israel)
6 Maskit st. P.O.B. 3007, Herzlia 46104, Israel  Tel. +972 52 522261
amos%nsta@nsc.com @{hplabs,pyramid,sun,decwrl} 34 48 E / 32 10 N


Re: Another electronic mail risk (Doug Mosher)

Prentiss Riddle <ut-sally!im4u!woton!riddle@seismo.CSS.GOV>
10 Aug 87 15:37:20 GMT
I just wanted to point out that the risk of sending messages to the wrong
recipients can apply to paper mail as well as electronic mail.  My father
claims to have once known a manager who forbade the use of paper clips in his
office.  It seems that he had once lost a job because a memo critical of
one of his superiors got snagged in the paper clip of another memo addressed
to the person he was criticizing!  :-)

--- Prentiss Riddle
--- Opinions expressed are not necessarily those of Shriners Burns Institute.
--- riddle@woton.UUCP {ihnp4,harvard,seismo}!ut-sally!im4u!woton!riddle
