The RISKS Digest
Volume 6 Issue 59

Tuesday, 12th April 1988

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Robot suicide
Tom Slone
Computer Risks? UUCP map entries?
Comment on "Diving Risks" — Fail Safe Design?
Mark W. Eichin
``How Computers Get Your Goat''
Kevin B. Kenny
Should You Trust Security Patches?
Steve Bellovin
Race?
John Macdonald
A Cray-ving for RISK prevention
Matt Fichtenbaum
Re: What happened to personal responsibility?
Henry Spencer
Discrimination
John Lavagnino
Darin McGrew
Nonviral biological analogies — a reference
Eugene Miya
New constituency for RISKS (Soviets embrace UNIX)
Jon Jacky
Vendor speak with "functioned" tongue!
Chris McDonald
Info on RISKS (comp.risks)

Robot suicide

Tom Slone <potency@violet.Berkeley.EDU>
Tue, 12 Apr 88 11:41:26 PDT
"A Budd Company assembly robot has apparently committed suicide.  The
robot was programmed to apply a complex bead of fluid adhesive, but the
robot 'ignored the glue, picked up a fistful of highly-active solvent,
and shot itself in its electronics-packed chest.'"
--Motor Trend, 11/86
                                    [Inspired by Budd's McFrenzy?  PGN]


Computer Risks? UUCP map entries?

<[Anonymously Contributed]>
Sun Apr 10 13:34:33 1988
I was just going through the UUCP map entries, and noticed quite a few "home
systems" mentioned. Did it ever occur to these people that the UUCP map entries
make a great shopping list for burglars? "Lemme see now, IBM PC/AT, nahhhhhh, I
hates them segment registers, SUN 3/50, nah, m'az well steal a VT-100, ahhhhhh
SUN 3/280-LS/MFT, big disk, just what I need for doing the floor plan of First
Federal..." I just finished creating a map entry for my home system, and I
stopped to think, "would I put a sign on the front of my home saying I have a
few thousand dollars worth of computer equipment inside". I doubt it very much.
But people (me included, I guess!) routinely post map entries for the (netnews)
world. Am I being excessively paranoid, or is it a healthy mistrust of my
fellow creatures? I realize the possibility of a Bad Person using the maps for
"shopping" was probably unlikely a few (2? 3?) years ago, but with the
proliferation of netnews systems, especially "public" netnews systems, I'm sure
the probability went up.
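
For anyone who has not looked at one lately, here is the sort of entry I
mean.  The site and every field value below are invented, but the format is
the real one from the published maps; note how the #S (system), #P (postal
address), and #T (telephone) lines together make a tidy shopping list:

    #N  myhome
    #S  Sun 3/280; SunOS 3.5; big disk
    #O  private home system
    #C  J. Random Hacker
    #E  myhome!jrh
    #T  +1 555 555 1212
    #P  123 Elm St., Anytown, XX 99999
    #R  home system; up evenings only
    #W  myhome!jrh (J. Random Hacker); Sun Apr 10 13:34:33 1988
    #
    myhome  othersite(EVENING)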

   [Anonymouse traps waiting to spring?  No, this is just the old inference
   problem, which has been discussed here amply, and which is clearly
   exacerbated by the networking of databases.  PGN]


Comment on "Diving Risks" — Fail Safe Design?

Mark W. Eichin <eichin@ATHENA.MIT.EDU>
Fri, 8 Apr 88 00:42:25 EST
Re: diving ascent computer: Does the version with a flashing LED as warning
ALSO have a test button (or some other test) to see if the LED has failed?
If not, divers could grow to trust it; then if (when!) the LED fails, they
would be in danger of an accident...
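
A power-on lamp test is the classic defense: light the warning LED and make
the diver confirm it before the unit arms.  A minimal sketch of the idea in
C follows; the hardware interface here is invented (a real dive computer's
would differ), with keyboard input standing in for the TEST button:

    #include <stdio.h>

    /* Stand-ins for the real hardware: led_set() drives the warning
       LED, button_pressed() reports whether the diver pushed TEST
       (simulated here by typing 'y'). */
    static void led_set(int on)
    {
        printf(on ? "[LED ON]\n" : "[LED off]\n");
    }

    static int button_pressed(void)
    {
        return getchar() == 'y';
    }

    /* Returns 0 only if the diver actually saw the LED light up. */
    static int self_test(void)
    {
        led_set(1);                /* force the warning LED on         */
        if (!button_pressed())     /* diver must acknowledge seeing it */
            return -1;             /* no confirmation: refuse to arm   */
        led_set(0);
        return 0;
    }

    int main(void)
    {
        if (self_test() != 0) {
            printf("LED test FAILED -- do not dive\n");
            return 1;
        }
        printf("LED test passed\n");
        return 0;
    }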


``How Computers Get Your Goat'' (RISKS-6.54)

Kevin B. Kenny <kenny@b.cs.uiuc.edu>
Mon, 11 Apr 88 12:45:46 CST
  : ...  The researcher, Jan L. Guynes, used psychological tests to classify 86
  : volunteers as either Type A or Type B personalities...  She found that a 
  : slow unpredictable computer increased anxiety in both groups equally...

I read a study several years back which, while not classifying Type A vs. Type
B subjects, studied psychological response to response time.  The results of
the study were that the VARIANCE in the response time was significant; the
mean was much less so.  The conclusion could be that `unpredictable' is the
key word in the preceding paragraph.

See Harold Sackman, Man-Computer Problem Solving, Auerbach, Princeton NJ, 1970.
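
To see concretely why the spread should matter more than the average,
consider two machines with identical mean response time (the numbers are
invented for illustration):

    #include <stdio.h>
    #include <math.h>

    /* Print mean and standard deviation of a set of response times. */
    static void stats(const char *name, const double *t, int n)
    {
        double sum = 0.0, sumsq = 0.0, mean;
        int i;

        for (i = 0; i < n; i++)
            sum += t[i];
        mean = sum / n;
        for (i = 0; i < n; i++)
            sumsq += (t[i] - mean) * (t[i] - mean);
        printf("%s: mean %.1f s, std. dev. %.1f s\n",
               name, mean, sqrt(sumsq / n));
    }

    int main(void)
    {
        double steady[]  = { 2.0, 2.1, 1.9, 2.0, 2.0 };
        double erratic[] = { 0.2, 5.8, 0.1, 3.9, 0.0 };

        stats("steady ", steady, 5);     /* mean 2.0 s, std. dev. 0.1 s */
        stats("erratic", erratic, 5);    /* mean 2.0 s, std. dev. 2.4 s */
        return 0;
    }

By the mean alone the two machines are indistinguishable; only the variance
tells you which one will get your goat.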

                                                 Kevin


Should You Trust Security Patches? (Re: RISKS-6.58)

<smb@research.att.com>
Tue, 12 Apr 88 10:27:15 EDT
These wonderful new security patches that were sent out without
publicity — how do you know the fix really came from DEC?
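
Even a checksum posted along with the patch settles little.  Below is my
rendering of the sort of 16-bit rotating checksum that the BSD sum(1)
command computes (a sketch, not DEC's or Berkeley's code).  It catches
transmission errors, but anyone who can forge the patch can just as easily
post a matching checksum, so it establishes integrity, not origin:

    #include <stdio.h>

    /* Compute a BSD-sum-style 16-bit rotating checksum of stdin. */
    int main(void)
    {
        unsigned long ck = 0;
        int c;

        while ((c = getchar()) != EOF) {
            ck = (ck >> 1) + ((ck & 1) << 15);     /* rotate right a bit */
            ck = (ck + (unsigned long)c) & 0xffff; /* add the next byte  */
        }
        printf("%05lu\n", ck);
        return 0;
    }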

Just a thought to keep you really paranoid...
                                                  --Steve Bellovin


Race? (Re: RISKS-6.55)

John Macdonald <harvard!linus!utzoo!spectrix!John_M@rutgers.edu>
Mon Apr 11 18:54:37 1988
I would have thought that the appropriate answer to the question "Race:" on a
driving license application would be "never" or "Formula One" or any similar
experience.  It is a quite reasonable question for them to be asking :-).

      [A grammatically correct answer to "Race?" would be "No (I don't)."  PGN]


A Cray-ving for RISK prevention (Re: RISKS-6.55)

Matt Fichtenbaum <mlf@genrad.com>
Mon, 11 Apr 88 09:14:30 edt
>CRAY - A traditional Shinto ceremony was performed at Cray's systems check-out
>building in Chippewa Falls to introduce a protective spirit into a new X-MP/24

Quite a feat of Cray, eh?


Re: What happened to personal responsibility?

<mnetor!utzoo!henry@uunet.UU.NET>
Tue, 12 Apr 88 14:57:31 EDT
> ... To sit in a 30mph steam train was not only a joy, you placed
> your life in the hands of engineers who were ultimately accountable. To
> sit in a 125mph bullet train or a high-speed local subway is no longer
> quite so joyful. You *still* place your life in the hands of the company,
> but is it the Engineers, software or otherwise that carry the can?

Why, nobody, of course.

If you want a good example of what I'm talking about, consider the Challenger
disaster.  I think there is little doubt that specific people could plausibly
be held responsible for it, although there might be some debate about exactly
who.  Now, look at the aftermath.  How many people have been arrested on
criminal charges as a result?  None.  How many people have been fired in
disgrace as a result?  None.  (A few have run into trouble for talking too
much about the incident, but not for causing it!)  How many companies have
been debarred from government business as a result?  None.  What penalties
were assessed against Morton Thiokol?  Well, after a long debate it was
agreed that ten million dollars would be deducted from payments on their
SRB contracts.  (Note that (a) the replacement value of a shuttle orbiter
is approximately two *billion* dollars, (b) both NASA and its customers
have been hard-hit by the long hiatus in spaceflight and other side effects
of the disaster, (c) Morton Thiokol has received many millions of dollars in
fix-the-SRBs contracts, and (d) the issue of an alternate source for SRBs,
a major worry to M-T, has been postponed for some years.)

To avoid a repetition of the Challenger disaster, people need an incentive
to avoid one.  For the lawyers and MBAs who run most aerospace companies,
that means a financial incentive.  Only if technical disaster translates
into financial disaster will the bean-counters see to it that the whole
company has a firm commitment to avoiding it.  Only then will a "no" from
the engineers be backed up by the management, even if it hurts.  So how much
of a financial disaster has Morton Thiokol undergone?  None!

Look at the results, not the rhetoric.  Who was responsible for Challenger?

Nobody.

Henry Spencer @ U of Toronto Zoology {allegra,ihnp4,decvax,pyramid}!utzoo!henry


Re: Discrimination and careless arguments

John Lavagnino <LAV%BRANDEIS.BITNET@MITVMA.MIT.EDU>
Tue, 12 Apr 88 11:46 EST
David Thomasson writes:

> Lavagnino confuses two separate actions: gathering information,
> and misusing information.

Can we believe in this separation after reading the accounts of actual
practice that appear in RISKS?  And can we believe in Thomasson's (unstated)
assumption that the various bureaus of our government have no connection with
each other?  I'm afraid I can't.  His analysis of Earnest's story reduces it
to a mere fallacy by throwing out all evidence of the meaning of race in that
place and time; he dismisses that evidence as just a bunch of anecdotes
because he assumes there are no connections, but to me it is clearly what
leads to Earnest's reaction to the license application.  Thomasson's
conclusion is further based on his (unstated) opinion that no objection to
governmental activities may be made without irrefutable evidence of
misbehavior — which is a reasonable opinion, but it's an opinion all the
same, and there are others on the matter, such as Earnest's.

This method amounts to throwing out all the evidence and assuming that
you haven't thereby distorted the problem you set out to study; again,
think about that procedure from a RISKS point of view.

John Lavagnino, Department of English and American Literature, Brandeis Univ.


Discrimination

Darin McGrew <ibmuupa!mcgrew@ucbvax.Berkeley.EDU>
Mon, 11 Apr 88 15:57:24 PST
In RISKS 6.55, David Thomasson <ST401405%BROWNVM.BITNET@MITVMA.MIT.EDU> says:
> If one thinks it is a simple matter of separating the "bad" kinds of
> discrimination from the "good" (or "acceptable") kinds, try phrasing a
> general principle that will make that distinction.

This is rather off the subject of computer risks, but it shows a related
problem.  "Bad discrimination" is that which is based on qualities that should
be irrelevant to the choice being made.  "Good discrimination" is that which
is based on qualities that are relevant.

The problem comes from the decision of what qualities are relevant to a given
decision.  When we disagree about the relevance of certain qualities, my right
to be considered apart from "irrelevant" qualities will conflict with your
right to consider all my "relevant" qualities.  Problems also arise when I
perceive that you considered irrelevant qualities when you didn't.

This problem shows up with computer systems when information is considered
relevant by one person, and not by another.  This causes people to ignore
warning indicators because they learn that the engineer considered a lot of
"irrelevant" information important.  It also causes hidden failures (eg, of
failsafe systems) because the engineer didn't consider something important to
be "relevant."

Darin McGrew        ucbvax!ibmuupa!mcgrew
I speak for myself, not for my employer.


Nonviral biological analogies — a reference

Eugene Miya <eugene@ames-nas.arpa>
Fri, 8 Apr 88 21:51:44 PDT
Since we are talking about the biological analogy of computer viruses, I
would like to call attention to a book that continues the (non-viral)
biological analogies.  The author would like to get people thinking about them:

%A B. A. Huberman, ed.
%T The Ecology of Computation
%I North-Holland
%D 1988

It does not deal with viruses per se, but it does consider distributed
systems in an ecological context.
                                          --eugene miya


New constituency for RISKS (Soviets embrace UNIX)

Jon Jacky <jon@june.cs.washington.edu>
Thu, 24 Mar 88 09:22:45 PST
From Electronic Engineering Times, March 7, 1988

UNIX POPULARITY EXTENDS INTO USSR by Ray Weiss

Unix popularity is spreading.  It has even reached the Soviet Union, where
Unix classes will be held this summer.  

A series of one-week classes will be taught in English by instructors from
an American company, Lurnix.  The classes, to be held in Pereslavl, some 60
miles north of Moscow, will be open to both Soviets and foreigners.  In fact,
Lurnix is setting up a tour for Americans who would like to combine travel
to the USSR with a study of the operating system.

One hangup is current export policy: Unix object code may be exported, but
Unix source code is embargoed.  Without source code, Unix cannot be easily
adapted to different host computers or special peripherals.
Consequently, the classes will concentrate on Unix system administration and
programming under the Unix operating system. ...

The last project Lurnix worked on was a study that explored networking 
between grade schools and its effect on learning.  The study was funded
by the Carnegie Corp.

The new classes are part of an effort to establish Unix as a standard in
the country's schools.


Vendor speak with "functioned" tongue!

Chris McDonald STEWS-SD 678-2814 <cmcdonal@wsmr10.ARPA>
Tue, 12 Apr 88 15:30:47 MST
We recently received a quantity of Unisys terminals.  In the operator's manual
I was surprised to read the following on the subject of function keys.  You can
define the keys "to do such things as: Transmit a special password or
instruction to the host..."

I find it curious that a firm that has indicated its intention to build
"trusted systems" against the National Computer Security Center's Orange Book
criteria should use such an example.  
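
For readers who have not met the feature: on terminals of this general
class, the host (or anything else that can write to the terminal) loads a
function key with an arbitrary string via an escape sequence, and a single
press of the key then replays the string.  A hypothetical VT220-style
example using DECUDK (details vary by terminal, and I have not checked the
Unisys syntax):

    ESC P 1 ; 0 | 17 / 70617373776F7264 ESC \

This would load key F6 with the hex-encoded string "password".  The
definition then sits unprotected in terminal memory, where anyone who sits
down at the keyboard can transmit it to the host with one keystroke.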
