The RISKS Digest
Volume 7 Issue 86

Saturday, 3rd December 1988

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


o Mix-up Impedes Romance
Kevyn Collins-Thompson
o California Lotto computer crash
Rodney Hoffman
o Telecommunications, Data Entry, ... - and "Security"
Henry Schaffer
o Re: Toll Road information collection
Dave Nedde
o Manufacturers' responsibilities for security
Keith Hanlan
o Computer Malpractice
David J. Farber
o Interesting Sidebar on worm and liability
Charles J. Wertz
o Unfortunate Use of Term "cracker"
T. Andrews
o Re: "crackers" and "Crackers", " 'jackers", and "snackers"
o Info on RISKS (comp.risks)

Mix-up Impedes Romance

Kevyn Collins-Thompson <>
Fri, 2 Dec 88 12:02:38 EST
One day, after I logged in to my CMS account here, I discovered that new mail
was waiting for me in my reader.  The lengthy message was prefaced by the following:

     "From: Mailer@<machine>: Your message could not be sent ..etc"
     "Reason:  Address unknown..."

Upon scanning this returned letter, I discovered that it had not been
written by me at all, and that the intended recipient and sender were
thousands of miles away, apparently the unfortunate victims of a random 
mailer screw-up.  The first sentence of that letter, though, I will
always remember:

     "My dearest Janice:
      At last, we have a method of non-verbal communication which is
      completely private..."

California Lotto computer crash

Hoffman.ElSegundo@Xerox.COM <Rodney Hoffman>
2 Dec 88 07:47:47 PST (Friday)
From two stories by Dan Morain in the 'Los Angeles Times' on Tuesday, Nov.
29 and Thursday, Dec. 1:

  The California Lottery will fine GTECH Corp. $208,500 for a weekend computer
  crash that left two-thirds of the Lotto terminals in Southern California
  unable to accept wagers.  All 4,375 terminals in Southern California stopped
  working for 14 minutes in the peak betting period Saturday night.  Two-thirds
  of the terminals remained down for the rest of the night.

  A newly installed telecommunications program for the main Southern California
  lotto computer malfunctioned.  The problem was exacerbated by a GTECH
  operator who subsequently installed the wrong back-up program.  The new
  program was designed to improve system reliability.  It has been removed for
  testing.  "There's little doubt that the error was caused by GTECH software,
  compounded by GTECH operator error," said a senior vice president for the company.

  The state's contract with GTECH allows it to charge the company $4000 for
  each minute that the system is not working, and $1000 a minute when it is
  unacceptably slow.  Lottery officials say that in the last year, the computer
  system has been inoperable or unacceptably slow for 779 minutes, or 0.2% of
  the time.
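
The contract terms quoted above can be expressed as a simple penalty formula. This is a minimal sketch of the arithmetic only; the rates come from the article, while the function name and example figures are illustrative, not GTECH's actual billing:

```python
# Penalty formula as described in the story: $4,000 for each minute the
# system is inoperable, $1,000 for each minute it is unacceptably slow.
DOWN_RATE = 4000   # dollars per minute inoperable
SLOW_RATE = 1000   # dollars per minute unacceptably slow

def contract_penalty(down_minutes, slow_minutes):
    """Dollars owed under the contract terms quoted in the story."""
    return down_minutes * DOWN_RATE + slow_minutes * SLOW_RATE

# The 14-minute state-wide outage alone would cost:
print(contract_penalty(14, 0))   # 56000
```

Note that the $208,500 fine is larger than 14 minutes at the full-outage rate, consistent with the report that two-thirds of the terminals stayed down for the rest of the night.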

Telecommunications, Data Entry, ... - and "Security"

Henry Schaffer <> (N.C. State Univ.)
Fri, 2 Dec 88 15:39:08 est
Re: the quote from "Optical Information Systems Update," Dec 1, 1988, p.8.  

  ...  two-way transmission provides complete document control and security
  because the forms never leave the customer[']s office.  ...

Of course, if one is concerned about the security of the *information*,
that is a different matter.

Re: Toll Road information collection

Dave Nedde <>
Mon, 28 Nov 88 13:00:21 EST
>From: oster@dewey.soe.Berkeley.EDU (David Phillip Oster)
>Is it fair to also stamp the tickets with the time of issue, so if the
>distance traveled divided by the time elapsed is greater than the average
>speed limit the toll taker can hand you a speeding ticket at the same time?
>An appropriate computer would help the toll taker in this task.

Alas, as a Massachusetts police officer pointed out in an interview, you have to
catch someone *in the act* of speeding to get them for it.  Probably something
to do with that annoying Bill of Rights...
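
The check Oster proposes is easy to state as code. A minimal sketch, in which the function names and the 55 mph limit are assumptions for illustration, not anything from an actual toll system:

```python
# Average-speed check from a time-stamped toll ticket: if the distance
# between entry and exit plazas divided by the elapsed time exceeds the
# limit, the driver must have been speeding somewhere en route.
SPEED_LIMIT_MPH = 55   # hypothetical limit

def average_speed(miles_traveled, hours_elapsed):
    return miles_traveled / hours_elapsed

def exceeded_limit(miles_traveled, hours_elapsed, limit=SPEED_LIMIT_MPH):
    # True means the driver averaged more than the limit -- evidence of
    # past speeding, not of being caught "in the act".
    return average_speed(miles_traveled, hours_elapsed) > limit

# 120 miles in 1.5 hours is an 80 mph average:
print(exceeded_limit(120, 1.5))   # True
```

As the reply above notes, such a computation proves only that speeding occurred at some point between plazas, which is exactly the legal objection raised.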

Manufacturers' responsibilities for security

Keith Hanlan <keithh%tartarus%bnr-fos@sri-unix.UUCP>
Fri, 2 Dec 88 16:13:14 EST
         Vendors should provide proper tools for security

In RISKS 7.84, Brandon S. Allbery (allbery@ncoast.UUCP) explains that "Vendors
have some blame, but their [naive] customers and [ignorant] salespeople have
even more." This thesis is based on his observation and his experience that a
great many small-scale customers have no inclination to incur the overhead of a
'secure' system.

    From this, and from the context of the article as a whole, I infer
that Mr. Allbery feels that vendors are thus catering to a lowest
common denominator and, perhaps in keeping with the spirit of Unix,
leaving the details and deficiencies to those with specific requirements.
I agree that this is most likely what has happened, even though it is
not a publicly advertised tenet of any product developer.

    However, I feel that the vendors' laxity cannot be excused in
the least. In fact, as the professionals who understand the
implications of security, or the lack thereof, it is incumbent on them to
produce a secure product that is still easy to install, maintain, and
use. Proper tools could reduce the confusion and inconvenience
that drive so many customers to take short-cuts, and would also
enhance their product in all market areas.

    In the wake of the Internet Worm we have seen claims that UNIX is
an intrinsically insecure system and that this fact casts a pall on
UNIX's current rise in popularity. (I personally think UNIX casts a
pall on computing as a whole, but that is another issue. :-) ) However,
I still maintain that proper maintenance tools will go a long way
toward producing secure computer networks.

    I have used several varieties of UNIX, and the vendors are always
very quick to advertise their value-added features and embellishments.
But how often, when porting or re-writing an operating system, do
vendors take the opportunity to fix glaring bugs and deficiencies?
"Compatibility!" you cry? What bug cannot be made an option left to the
user's discretion? ("-B switch for historical reasons.")

    I hope this current wave of concern will encourage the vendors to
re-think their development strategies. Bug fixes are not that difficult.
It is time for the Unix operating system developers to doff their
hacker's capes and stop reveling in the vagaries of Unix.

Keith Hanlan, Bell-Northern Research

Computer Malpractice

"David J. Farber" <>
Fri, 2 Dec 88 22:03:08 EST
The network worm (sometimes called virus) affair raises issues that are very
important to our field.  Both the BITNET Board of Trustees and the CSNET
Executive Committee have been struck by the fact that many public comments on
the event have contained statements such as, "We learned from it," "We will
make sure technically it will not happen again," or "He did us a favor by
showing...," unaccompanied by expressions of ethical concern.

We have succeeded as a profession technically in creating facilities — the
BITNET, CSNET and other components of the national research network — which
are now critical to the conduct of science and engineering in our nation's
academic, industrial, and government research laboratories.  Further, this
technology has spread within our nation's commercial research and development
organizations and even into their manufacturing and marketing.

Just as medical malpractice can have a serious effect on an individual's
health, one of the costs of our success is that we are now in a position where
misuse of our national and private computer networks can have as serious an
effect on the nation's economic, defense, and social health.  Yet while almost
every medical college has at least one course on medical ethics and insists on
the observance of ethical guidelines during practice, computer scientists seem
to avoid such non-scientific issues.

The worm "experiment" caused a major disruption in the research community.
Among other points of attack, the worm exploited a trapdoor that had been
distributed as a software "feature".  Many hours of talent were wasted finding
and curing the problems raised by this "game".  Many additional hours were lost
when researchers were unable to access supercomputers and mail systems due to
system overload and network shutdown.

We condemn the perpetration of such "experiments", "games", or "features" by
workers in our field, be they students, faculty, researchers or providers.  We
are especially worried about widespread tendencies to justify, ignore, or
perpetuate such breaches.  We must behave as do our fellow scientists who have
organized around comparable issues to enforce strong ethical practices in the
conduct of experiments.

We propose to join with the relevant professional societies and the national
research networks to form a Joint Ethics Committee charged with examining
existing statements of professional ethics and modifying them as necessary in
order to create a strong statement of networking ethics and recommendations for
appropriate enforcement procedures.

Interesting Sidebar on worm and liability

Charles J. Wertz, Buffalo State College <WERTZCJ@SNYBUFVA.BITNET>
Sat, 3 Dec 88 09:12 EDT
Here is an extract of an interesting comment sent to BUG-LAN@SUVM
by magill@ENIAC.SEAS.UPENN.EDU (William Magill at Univ. of Pa.):
  "..the reason that security policy procedures are important is an
  issue of LIABILITY."

  "The recent Internet worm was a case where KNOWN security holes
  were exploited. While what was done 'wasn't nice', it was
  indefensible from a point of view of liability. Put another way,
  had data been compromised, the fact that known security holes
  were not 'plugged' would have rendered the University/Hospital
  defenseless in a liability case."

Unfortunate Use of Term "cracker" in RISKS-7.84

Thu Dec 1 20:56:25 1988
[RISKS-7.84] referred to the common practice among the semi-literate of
trusting to God that "crackers" will not invade or damage their new computer.

As a native of God's Own Country, I must object to this use of the term
"cracker" to refer to computer vandals and burglars.  I suspect that our
neighbours to the north (also known as crackers) would also object.

        Dr. T. Andrews, Systems,    CompuData, Inc.  DeLand

Re: "crackers" and "Crackers", " 'jackers", and "snackers"

Peter Neumann <>
Sat, 3 Dec 1988 16:23:12 PST
With initial caps, "Cracker" (as used in Florida or Georgia) is a proper noun,
as opposed to "cracker" (in the sense of a malevolent hacker).  But in
spoken English, the subtlety is certainly lost.

But we do have a problem.  We desperately need a convenient term like
"cracker", because the nonpejorative primary meaning of "hacker" needs to be
defended vigorously against misuse by the press and others.  Perhaps we could
try using "jacker" (or " 'jacker", short for hijacker) for someone who breaks
into computer systems and subverts them.

How about "snacker" for a nonmalicious but exploratory hacker?  When Bob
Morris (the elder) was visiting Berkeley from Bell
Labs for the year (around 1967?), he might have been classified as a snacker:
he seemed to nibble at the edges of the Berkeley time-sharing system more than
anyone else.  In fact, whenever he walked into the terminal pool room, others
would log out — because the system tended to crash more often when Bob was
logged in.  (He stumbled onto quite a bunch of hitherto undetected bugs.)
[Joe Bftsplk at Berkeley?]
