The RISKS Digest
Volume 7 Issue 94

Thursday, 15th December 1988

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator



o Vincennes: conclusively, a computer-related error
Clifford Johnson
o Ethics
Dennis G. Rears
o "It's already in the computer"
David Sherman
o RISKS of Tightening Security
F. Baube
o Info on RISKS (comp.risks)

Vincennes: conclusively, a computer-related error

"Clifford Johnson" <GA.CJJ@Forsythe.Stanford.EDU>
Wed, 14 Dec 88 21:07:08 PST
>  From: (Randall Davis)
>  When [Vincennes] first happened we saw a remarkable flood of
>  messages to Risks speculating about the role of advanced
>  automation in general and computers in particular as central
>  to this disaster....
>  there is, with only a few exceptions, clearly a strong
>  desire in the Risks community to believe otherwise.  Perhaps
>  that's worth a few minutes reflection.

I reflect that *all* the information that panicked the Vincennes crew and
captain came from the computers.  The captain was not faulted for trusting his
AA officer, the AA officer was not faulted for misreading (or not reading) his
console, and the officers who reported to the AA officer were not found at
fault.  The fault was found to lie largely with the computer's initial
classification of the flight as hostile, and the computers' subsequent unclear
albeit correct presentation of the ascent data.  The actions taken to remedy
the deficiencies are improvements to the computer display/human interface.
This is a classic case of computer *related* error: unobvious and secondary
display of critical data.

What the Pentagon has more or less overtly ruled is that its
most competent, trained, and alert officers cannot be blamed for
mistakenly reading and acting on deadly computer displays,
especially not in combat, i.e. when they're actually used.
Replacing alphanumerics with an up/down arrow is the planned
solution to the Vincennes problem.  (Who will be accountable if
the arrow is misread, or if it points the wrong way owing to a
subtle bug - again the computerization?)   As the OTA reported with
respect to nuclear launch under attack (i.e. on warning):
"The risk of error for an LUA system would seem highest when the
human being's ability to make highly structured errors combines with
the machine's limited ability to correct [for] them."  (1981, MX
Missile Basing.)
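The worry about the arrow pointing the wrong way can be made concrete with a
sketch (hypothetical code, not the actual Aegis display software): deriving a
climb/descent indicator from two consecutive altitude readings, where the
"subtle bug" need be no larger than a single flipped comparison.

```python
# Hypothetical sketch (NOT the actual Aegis display code): deriving
# an up/down trend arrow from two consecutive altitude readings.
def trend_arrow(prev_alt_ft, curr_alt_ft):
    """Return '^' for ascent, 'v' for descent, '-' for level flight."""
    delta = curr_alt_ft - prev_alt_ft
    if delta > 0:
        return '^'   # climbing
    if delta < 0:
        return 'v'   # descending
    return '-'

# A climbing airliner between two radar sweeps:
print(trend_arrow(7000, 9000))  # -> '^'
```

A bug of the kind worried about above would be as small as swapping the two
comparisons: every climbing aircraft would then be shown as descending, and
nothing on the display would look wrong.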

I saw a front-page report of the approbatory celebration that greeted the
Vincennes Captain and crew on their return to port.  Garlanded and with a huge
grin on his face, having been exonerated by the official inquiry, Captain
Rogers stated he knew he'd done the right thing in the circumstances.  I
chillingly wonder whether the sister ship, which correctly identified the
Iranian jet as commercial, will receive equally loud applause.


Ethics

"Dennis G. Rears (FSAC)" <drears@ARDEC.ARPA>
Wed, 14 Dec 88 12:37:49 EST
    Several articles in Risks and other USENET groups have commented on the
need for ethics course(s) for "computer people".  I feel there is a need for
them; however, I question the content.  It is my belief that what is and is not
"ethical behaviour" is not clearly defined.  There are some areas agreed to be
off limits, such as "destroying data", and some which are not, such as
"perusing files".  In my mind I don't have a clear guide.  It's like the
Supreme Court Justice who said "I can't define pornography but I know it when
I see it".  I think that describes computer ethics right now.
    Also on a similar theme, the system admins who were complaining about the
lost time involved in fighting the worm should have had their systems right in
the first place.  I view it as negligent to have programs like uucp & sendmail
on systems unless the admin is aware of all the ramifications.  They didn't
deserve to have their systems broken into, but they didn't really do their job
in the first place.  I must add this does *not* apply to victims of the worm.
    The risk I see is blind trust in computer jocks (wizards, gurus, experts,
etc.), the trust being mainly in technical competence.  I have seen some
sysadmins who are nothing more than operators (some even less).  However, at
many places non-computer people believe "oh, the problem is so technical I
won't explain it to you because you won't understand it".  As part of being
professionals we must spread our knowledge and give others a deeper
understanding of what we know and do.
                                Dennis Rears

"It's already in the computer"

David Sherman <>
13 Dec 88 10:01:21 EST (Tue)
The other day I parked in a multi-story parking lot while going to
the doctor.  "Parking rates: $1.00 per half-hour".  Well, it took
several minutes to get to a parking space on the 6th floor, and about
10 minutes to drive down to the cashier when I came out; this included
a good 7-8 minutes waiting in the line of cars to pay the (single) cashier.

My in-ticket was stamped 15:09.  While waiting in the line of cars
to leave, around 16:08, I decided that as a matter of principle
I shouldn't pay for more than an hour, since I was parked for a fair
bit less than an hour.  I make it to the cashier (where they have a
"5 minutes grace" notice), and it's 16:15 when he punches in my ticket.
I hand him $2, and he insists I owe him another dollar.  I say no,
pointing out the amount of time I was waiting in the line to pay.

The RISKS interest lies in his response at this point.
Before computer control of parking cashiers, he could no doubt have
waved me off and accepted the $2.  Now, though, "it's already in the
computer", and if he doesn't get $3, he tells me, it'll come out of
his pocket.  I hung tight, on principle, and told him to take my number
and have his supervisor call me if he liked (he didn't like).  Since I
was blocking the only exit to the lot, cars were backing up behind me and
people were getting annoyed; eventually he gave up and raised the gate.

So who controls how much you owe at a parking exit?  People don't
matter.  It's "the computer".
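The disputed dollar comes down to which timestamp the billing computer uses.
Here is a sketch of the rate rule as I understand it from the posted signs
(hypothetical code; the "per half-hour or part thereof" reading and the way
the grace period is applied are my assumptions):

```python
# Hypothetical sketch of the lot's billing rule: $1.00 per half-hour
# or part thereof, with 5 minutes' grace deducted first (assumed).
import math

def parking_fee(minutes_parked, rate_per_half_hour=1.00, grace_min=5):
    billable = max(0, minutes_parked - grace_min)
    return math.ceil(billable / 30) * rate_per_half_hour

# Stamped in at 15:09; the cashier punches the ticket at 16:15.
print(parking_fee(66))  # 61 billable minutes -> 3 half-hours -> $3.00
# Had the clock stopped when the car joined the queue, around 16:08:
print(parking_fee(59))  # 54 billable minutes -> 2 half-hours -> $2.00
```

The code, of course, has no input for "minutes spent queuing to pay" - which
is precisely the point of the anecdote.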

David Sherman, Toronto      attcan!lsuc!

RISKS of Tightening Security

"F.Baube" <>
Wed, 14 Dec 88 10:12:14 -0500
The _City Paper_ of Washington DC, December 9, reports that the _Washington
Times_ was stricken by a "computer catastrophe" last week.  No, not a mainframe
felled by a virus infection.

"The powers-that-be at the _Times_ disabled the computer system's powerful
"RODI" command.

The RODI command ("Read Only Direct") was much loved by _Times_ reporters and
editors because it permitted anyone with a computer logon to examine
practically any file - stories or notes - in the ... system.  Only files that
had been secured with the system's "lock" command were safe from the eyes of
the RODI cognoscenti.  Because the lock command is so cumbersome, 99.9% of the
paper's files were left unguarded.

"Everyone thought it was their personal little secret," one computer
investigative ace says. "On Wednesday [Nov. 30] you could hear an audible wave
of despair washing over the newsroom ... One _Times_ reporter ... lamented
aloud: "How am I supposed to know what's going on around here without RODI?"

Nosy _Times_ reporters aren't the only ones crushed by the loss of RODI.
_Times_ sports fans are despondent, too, because RODI allowed them to freely
scroll the sports wire for scores and news.

One _Times_ employee blames this column for RODI's termination, saying, "I'm
sure the editors got tired of seeing their memos printed in _City Paper_."

#include <disclaimer.h>
