The RISKS Digest
Volume 3 Issue 65

Wednesday, 1st October 1986

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

UNIX and network security again
Andy Freeman
F-16 software
Wayne Throop
NYT feature article on SDI software
Hal Perkins
Autonomous widgets
Mike McLaughlin
Robottle Management Software?
PGN
Info on RISKS (comp.risks)

UNIX and network security again

Andy Freeman <ANDY@SUSHI.STANFORD.EDU>
Mon 22 Sep 86 17:09:27-PDT
preece%mycroft@gswd-vms.ARPA (Scott E. Preece) writes:

    If you can't trust the code running on another machine on your
    ethernet, then you can't believe that it is the machine it says it is,
    which violates the most basic principles of the NCSC model.

That's why electronic signatures are a good thing.
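
A minimal sketch of what signatures buy you here, using a modern library and
made-up message contents purely for illustration (nothing like this was
available on 1986 hosts): the receiver verifies a signature against the
claimed sender's public key instead of believing the source address on the
wire.

    from cryptography.hazmat.primitives.asymmetric import ed25519
    from cryptography.exceptions import InvalidSignature

    # Each host holds a private key; its peers hold the matching public key.
    private_key = ed25519.Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    message = b"rlogin request claiming to come from hostA"
    signature = private_key.sign(message)

    # The receiver checks the signature against hostA's known public key,
    # rather than trusting the address on the packet.
    try:
        public_key.verify(signature, message)
        print("signature valid: sender holds hostA's key")
    except InvalidSignature:
        print("signature invalid: do not trust the claimed origin")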

    I wrote (andy@sushi):
    >   Then NCSC certification means nothing in many (most?) situations.

    Well, most sites aren't required to have certified systems (yet?).  If
    they were, they wouldn't be allowed to have non-complying systems.

The designers of the Ford Pinto were told by the US DOT to use $x as a
cost-benefit tradeoff point for rear end collisions.  Ford was still
liable.  I'd be surprised if NCSC certification protected a company
from liability.  (In other words, being right can be more important
than complying.)

       [This case was cited again by Peter Browne (from old Ralph Nader
        materials?), at a Conference on Risk Analysis at NBS 15 September
        1986:  Ford estimated that the Pinto gas tank would take $11 each to
        fix in 400,000 cars, totalling $4.4M.  They estimated 6 people might
        be killed as a result, at $400,000 each (the going rate for lawsuits
        at the time?), totalling $2.4M.  PGN]
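
Restating the arithmetic in those figures, nothing added, just the numbers as
quoted worked through:

    # Figures exactly as quoted above (Browne, via PGN).
    cars = 400_000
    fix_per_car = 11                       # dollars
    cost_to_fix = cars * fix_per_car       # $4,400,000

    expected_deaths = 6
    settlement_each = 400_000              # dollars
    expected_liability = expected_deaths * settlement_each   # $2,400,000

    # The fix was reckoned to cost about $2M more than the expected payouts.
    print(cost_to_fix - expected_liability)                  # 2,000,000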

    Absolutely, network access should be as secure as phone access, IF YOU
    CHOOSE TO WORK IN THAT MODE.  Our links to the outside world are as
    tightly restricted as our dialins.  The Berkeley networking software
    is set up to support a much more integrated kind of network, where the
    network is treated as a single system.  For our development
    environment that is much more effective.  You should never allow that
    kind of access to a machine you don't control.  Never.  My
    interpretation of the original note was that the author's net
    contained machines with trusted-host access which should not have had
    such access; I contend that that represents NOT a failing of the
    software, but a failing of the administration of the network.

My interpretation of Brian's original message is that he didn't have a
choice; Berkeley network software trusts hosts on the local net.  If
that's true, then the administrators didn't have a chance to fail; the
software's designers had done it for them.  (I repeated all of Scott's
paragraph because I agree with most of what he had to say.)

-andy

    [I think the implications are clear.  The network software is weak.  
     Administrators are often unaware of the risks.  Not all hosts are
     trustworthy.  The world is full of exciting challenges for attackers.  
     All sorts of unrealistic simplifying assumptions are generally made.  
     Passwords are typically stored or transmitted in the clear and easily
     readable or obtained — or else commonly known.  Encryption is still
     vulnerable if the keys can be compromised (flawed key distribution,
     unprotected or subject to bribable couriers) or if the algorithm is 
     weak.  There are lots of equally devastating additional vulnerabilities
     waiting to be exercised, particularly in vanilla UNIX systems and 
     networks thereof.  Remember all of our previous discussions about not
     trying to put the blame in ONE PLACE.  PGN]
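
The mechanism at issue is the trusted-host facility of the Berkeley
r-commands (rlogin, rsh, rcp): a connection arriving from a host named in
/etc/hosts.equiv (or in a user's ~/.rhosts) is let in without a password, on
the strength of the source address alone.  A deliberately simplified sketch
of that decision (host names made up; the real daemons also insist on a
privileged source port):

    # Simplified model of the BSD trusted-host check.  Host names are
    # made up; real rlogind/rshd also consult ~/.rhosts and require a
    # privileged source port.  The essential point stands: the "identity"
    # being trusted is only the network source address.
    HOSTS_EQUIV = {"hostA", "hostB"}       # stand-in for /etc/hosts.equiv

    def allow_without_password(source_host, remote_user, local_user):
        return source_host in HOSTS_EQUIV and remote_user == local_user

    # Any machine able to claim hostA's address on the local net gets in.
    print(allow_without_password("hostA", "andy", "andy"))    # True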


F-16 software

<rti-sel!dg_rtp!throopw%mcnc.csnet@CSNET-RELAY.ARPA>
Tue, 23 Sep 86 19:12:33 edt
  > I spoke to an F-16 flight instructor about this business concerning bomb
  > release when the plane is upside down.  He said the software OUGHT to
  > prevent such an occurrence.  When the plane is not at the right angle of
  > attack into the air stream, toss-bombing can result in the bomb being 
  > thrown back into the airplane.  

Hmpf.  *I* spoke to an ex-Air Force pilot.  He said if *any* restriction on
bomb release is incorporated, it should be to prevent it when the plane (or
more specifically, the bomb itself... there *is* a difference, and you had
better realize it!) is pulling negative G's.  This was my original point...
"upside down" or "inverted" isn't the correct thing to worry about; it is
the wrong mindset entirely, too simple a notion.

He went on to back up this assertion by pointing out that there is a
common (well... well-known anyhow) bombing technique, called "over the
shoulder" bombing, that requires release while inverted.  Consider the
following diagram.  (Note that the trajectory shapes are unrealistic and
the scales are exaggerated.  Limitations of the terminal, don't y'know.)
                                 _
                               /   \
                              /      \
             ________________________ |
            <               /        \r
                           /          \
                          |            |
                          v            /
B >___________________________________/
                          T

Now, we have bomber B, release of bomb r, and target T.  The bomber makes a
fast, low-level run over the target (to avoid radar, and to let the
bombsight get a good look).  Then, soon after the overfly, pulls sharply up
and over, and *while* *inverted* releases the bomb.  The bomb lofts high
into the air over the target whilst the plane scoots for home (rolling out
of the inversion, presumably but not necessarily), and the bomb eventually
lands splat on the target.
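
A toy projectile calculation (no drag, flat earth, every number assumed)
shows why a bomb released while the aircraft is pulling up past the vertical
comes back down on a target that has already been overflown:

    import math

    g = 9.81                      # m/s^2
    speed = 200.0                 # release speed, m/s (assumed)
    angle = math.radians(110)     # measured from the original run-in heading:
                                  # past vertical, so up and back toward T
    vx = speed * math.cos(angle)  # negative: back toward the target
    vy = speed * math.sin(angle)  # positive: upward
    h = 1500.0                    # release altitude above the target, m (assumed)

    # Time to fall back to target altitude: h + vy*t - 0.5*g*t^2 = 0
    t = (vy + math.sqrt(vy * vy + 2 * g * h)) / g
    back_along_run_in = abs(vx) * t
    print(f"{t:.0f} s of flight, lands {back_along_run_in:.0f} m back along the run-in")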

Basically, if you want the flight computer to wet-nurse the pilot at all in
this regard, it ought to have a sensor to detect strain on the bomb
restraints, and refuse to release them if the bomb isn't currently "trying"
to "fall" away from the aircraft.  (Even this isn't foolproof, of course,
but it comes close.)  Tying this into the *attitude* of the *aircraft*
*itself* is *WRONG* *WRONG* *WRONG*, and is, as I said before, an
archetypal computer risk, in that it is an overly simple and misleading
model of the situation.
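
A sketch of the interlock being argued for, with a hypothetical sensor
reading and threshold (this is not any real stores-management logic): the
release decision keys on what the weapon is doing relative to the aircraft,
not on the aircraft's attitude.

    def release_permitted(restraint_strain):
        # Strain on the bomb restraints, positive when the bomb is pressing
        # outward, i.e. it would fall cleanly away if released.  The
        # threshold (hypothetical) guards against sensor noise near zero G.
        MIN_OUTWARD_STRAIN = 50.0
        return restraint_strain > MIN_OUTWARD_STRAIN

    # Over-the-shoulder release: the aircraft is inverted but pulling
    # positive G through the loop, so the bomb still bears outward on its
    # rack and release is permitted.
    print(release_permitted(900.0))     # True

    # Negative-G pushover: the bomb is floating against its restraints the
    # wrong way; releasing it here risks tossing it back into the airframe.
    print(release_permitted(-200.0))    # False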

The conversation I had with my friend makes a lot of sense to me, and the
above somewhat vague stuff about the angle of attack does not.  It could be
I'm just missing something obvious, but I stand by my earlier position.

   The desire for safety stands against every great and noble enterprise.
                                --- Tacitus


NYT feature article on SDI software

Hal Perkins <hal@gvax.cs.cornell.edu>
Wed, 24 Sep 86 11:32:59 EDT
The science section of last Tuesday's New York Times (16 Sept 1986) had a
feature article on the SDI software problem starting on page C1.  The
headline is

    Software Seen As Obstacle In Developing 'Star Wars'
    Serious problems have forced dramatic changes in planning.

    by Philip M. Boffey

The article is much too long to type in — anyone interested can easily
find a copy.  The author has done his homework.  He gives a good
overview of the problems and of the issues in the SDI software debate
and seems to have talked to the main people involved, several of whom
are quoted.  There's not much here that will be new to computer people
who have been following the debate, but it's definitely worth reading.

Hal Perkins, Cornell CS


Autonomous widgets

Mike McLaughlin <mikemcl@nrl-csr>
Wed, 24 Sep 86 10:32:29 edt
The discussion of Autonomous Weapons should be expanded, considerably.
Consider the following devices, soon to be found at your local dealer:

    Autonomous Lumberjack - locates and cuts down designated
trees (pulp, hardwood, diseased... )

    Autonomous Booter - identifies automobiles with more than 
n dollars in overdue tickets.  

    Autonomous Streetsweeper - clears your street of any immobile
object other than licensed vehicles (see A. Booter, above).

    Autonomous NightWatchman - the passive model notifies authorities,
the active model counteracts intruders.

N.B.:  My "passive autonomous nightwatchman" is available at your friendly
Heath/Zenith store _now_!  Sorry, don't have a catalog at hand, or I'd
provide ordering information.
                                    Mike McLaughlin

                                     [Mike, Now that it is FALL, you must be 
                                      feeling AUTUMNMATED.  Autonomous Bosh]


Robottle Management Software? (Wine nought?)

Peter G. Neumann <Neumann@CSL.SRI.COM>
Wed 24 Sep 86 06:57:03-PDT
The following news item appeared in the 15 Sept 1986 issue of Digital Review, 
roundabout from the 26 June 1985 issue of the Halifax Gazette.  But it is
RISKY enough to report here.

    EDINBURGH (Reuters) — A robot dressed in a black hat and bow tie
    appeared in court on Tuesday after running amok in a restaurant
    where it was employed to serve wine. 

         Within its first hour on the job, the secondhand robot became
    uncontrollable, knocking over furniture, frightening customers and 
    spilling a glass of wine, the court was told.  The following day, 
    the robot, exhibited Tuesday in the court, was still incapable of 
    controlling the wine glasses, testimony said.  Eventually its head
    fell into a customer's lap.

A tipsy-turvy robot?  Did the firsthand know what the secondhand was doing?
Asimov's Nth Law of Robotics might read, "A robot must not spill wine on the
customers unless enforcing this Law would conflict with Laws 1, 2, and 3."
But maybe the program instructed the robot to put on "glasses" (ambiguously)
so it could see better.  Punishment: Send the robot to an OENAL COLONY?
[Apologies in advance.  I've been up too late recently.]  Peter
