The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 7 Issue 92

Monday 12 December 1988


o Glass cockpits
Randall Davis
o "Proper British Programs"
Steve Philipson
o Information available for a price
Curtis Keller and Bruce O'Neel
o Toll Road information collection
Steve Philipson
o Big Brother and Computer Risks
Dennis L. Mumaugh
o Re: Computer Virus Eradication Act of 1988
Jonathan Sweedler
Vince Manis
o Re: Vendor Liability and "Plain Vanilla" configurations
Andy Goldstein
o Re: "Hackers", "crackers", "snackers", and ethics
Andy Goldstein
o Hackers
o Info on RISKS (comp.risks)

Glass cockpits (Rodney Hoffman, RISKS-7.90)

Randall Davis <>
Sun, 11 Dec 88 16:23:32 EST
The article [Peril for Pilots] raises two interesting issues:

The larger issue is what kinds of, and how much, remote sensing to use, and what
kinds of, and how much, automation to introduce in any situation.  Note, though,
that the problem of information overload and believing sensors (including your
own eyes) has been with us for quite some time and exists /independent of/
automation in general and computers in particular.  Both of those can easily
ADD to the problem, but that's different from being the SOURCE of it.

Second, it is another example of the remarkably unsuccessful attempt to cast
technology as the primary heavy in the Vincennes incident.  When it first
happened we saw a remarkable flood of messages to Risks speculating about the
role of advanced automation in general and computers in particular as central
to this disaster.  When the report came out indicating that the system had
supplied accurate information, the silence was deafening.

And now this (continuing from the same msg above):

    "The anti-air warfare officer made no attempt to confirm the reports [...]
    Instead, this "experienced and highly qualified officer [...] relied on
    the judgment of one or two second-class petty officers, buttressed by
    his own preconceived perception of the threat, and made an erroneous
    assessment to his commanding officer."

In other words, if only the AA Officer had >>paid attention to the system and
believed it instead of or along with his ``one or two ... officers''<< the
tragedy would have been avoided.

Can automation and reliance on remote sensing be overdone?  Of course.  Is
this an example of it, or an example of the opposite?

The point is not that technology is blameless, nor that this particular
technology is faultless, nor that the sole issue is the effectiveness of the
technology (political, ethical, military and other considerations all play a
role in this).

The point is twofold: 
first, within almost any reasonable definition, the system worked to supply
accurate and useful information in a form available in a ``quick reference'';
every report that comes out continues to make that clear.

second, there is, with only a few exceptions, clearly a strong desire in the
Risks community to believe otherwise.  Perhaps that's worth a few minutes of reflection.

"Proper British Programs" (Nancy Leveson, RISKS-7.91)

Steve Philipson <>
Mon, 12 Dec 88 13:15:22 PST
>the ICAO must certify these systems before they are used, the standard has
>teeth and could be enforced quite strictly (unlikely, but ...). The following 
>are some interesting excerpts:

>  "... [Software] must be developed systematically in such a way that its
>   behavior, under all possible conditions, can be established by logical
>   reasoning (to a level of formality appropriate to the application).

   It's my opinion that strict enforcement of the above requirement simply
makes the developer liable for errors, but doesn't do much for actually
improving software reliability.  It is unlikely that "all possible conditions"
can be foreseen, let alone provided for.  The problem becomes bigger as the
complexity of the system increases, to the point where exhaustive analysis of a
system could take centuries to perform.

   The requirement is essentially that systems be perfect.  That goal has
proven elusive (unattainable?) in all areas of human endeavor.  Extensive
formalism and verification should be required of critical systems, but
requirements for perfect function are inane.  A better approach would be to
require independent performance monitoring and evaluation as part of the
complete system.  This is the approach we often take with non-computer based
systems.  It seems reasonable to make it part of our embedded computer systems
as well.
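The independent-monitoring approach suggested above can be sketched in a few lines: a separately written plausibility check stands between the primary (possibly unverifiable) system and anything that acts on its output.  Everything in this sketch -- the names, the altitude domain, the limits -- is invented for illustration and drawn from no certification standard:

```python
# Sketch of independent performance monitoring: a simple, separately
# developed check validates each output of the primary system before it
# is used.  All names and limits are invented for illustration.

def primary_altitude_estimate(sensor_reading):
    """Stand-in for the complex primary system we cannot exhaustively verify."""
    return sensor_reading * 1.02  # placeholder for real processing

def independent_monitor(value, low=0.0, high=50000.0):
    """Independently written plausibility check -- deliberately trivial to verify."""
    return low <= value <= high

def safe_estimate(sensor_reading):
    value = primary_altitude_estimate(sensor_reading)
    if not independent_monitor(value):
        raise RuntimeError("monitor rejected implausible output: %r" % value)
    return value
```

The monitor cannot make the primary system correct; it only bounds the damage an undetected fault can do -- the weaker but achievable requirement argued for here.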

Toll Road information collection (John Sullivan, RISKS-7.91)

Steve Philipson <>
Mon, 12 Dec 88 13:15:22 PST
John Sullivan (sullivan@fine.Princeton.EDU) writes about fines for speeding 
based on computed speed from toll booth timing:

>            ...          I'm no
>lawyer--but I see no reason not to pass a new law to make exiting with too
>little elapsed time a new crime.  The owner of a car can be held responsible
>for parking tickets even if someone else parked the car, so I see no reason
>that whoever drives through the exit booth can't be held responsible for the
>average speed.

   A parking violation does not go on a person's driving record.  It does not
matter who was cited as long as the bill is paid.  So, if you lend your car to
a friend who then gets a parking ticket, you can collect from him with no
negative impact on your record.  However, a speeding ticket does end up on a
specific person's record, and thus can result in the suspension of that
person's driving privileges, and can have a significant impact on his/her
insurance rates, etc.  This makes it quite important to assign the ticket to
the driver and not just the vehicle.  This scheme does not address this
concern, hence it is unreasonable.
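The arithmetic behind such a scheme is trivial, which is part of what makes it tempting.  A sketch (the distance, limit, and timestamps are invented for illustration) also shows exactly what the computation knows and does not know -- it identifies a vehicle, never a driver:

```python
# Sketch of the toll-timing speed check: average speed follows directly
# from the booth-to-booth distance and the two timestamps.  All figures
# here are invented for illustration.

def average_speed_mph(distance_miles, entry_epoch_s, exit_epoch_s):
    """Average speed over the whole stretch, in miles per hour."""
    hours = (exit_epoch_s - entry_epoch_s) / 3600.0
    return distance_miles / hours

DISTANCE_MILES = 60.0
SPEED_LIMIT_MPH = 55.0

# A car covering 60 miles in 50 minutes averaged 72 mph somewhere along
# the way -- but the record names only the plate at the exit booth,
# which is precisely the objection raised above.
speed = average_speed_mph(DISTANCE_MILES, 0, 50 * 60)
ticket_the_plate = speed > SPEED_LIMIT_MPH
```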
                        Steve Philipson

Big Brother and Computer Risks

In reference to RISKS DIGEST 7.91, Robert Steven Glickstein
<> and others have been discussing tollbooths
and other risks of automated monitoring.

This subject has been extensively treated by Science Fiction writers,
especially Mack Reynolds, who postulated the Corporate State with a cashless
society.  All transactions were done with a single money card.

One especially instructive story involved a "criminal".  He tried to rob a
person, and it was pointed out that the card was useless without the owner, as
personal identification [e.g. retinal prints, etc.] was necessary to use the
card.  Our protagonist grabs the person, etc.

The police give chase, tracking the crook by observing the use of
the card and where it is used.  Later they watch him use the mass-transit
system and track him to the gate and car.

The end of the story reveals that the "crook" is a cracker testing how easy
it is to break the system and how long it would take for the police to catch
such a person.

Interesting points were that the tracking mechanism was built into the computer
systems and could be activated on a widespread basis by a simple command from a
computer.  Our risk is that such capabilities could be designed into any new
cashless machine, either as a conscious feature or as a debug switch left in for testing.

I suspect that the case of life imitating art is close at hand, and readers of
the RISKS list ought to go back and check out some early science fiction as
well as the latest computer stories, including the so-called cyber-punk genre.

=Dennis L. Mumaugh
 Lisle, IL       ...!{att,lll-crg}!cuuxb!dlm  OR cuuxb!

Information available for a price

Curtis Keller <>
Wed Dec 7 20:29:48 1988
I received a postcard in the paper mail from a company called Credit Checker
and Nationwide SS#-Locate.  Apparently anyone can --

  o Take a lot of risk out of doing business.
  o Check the credit of anyone, anywhere in the United States
  o Pull Automobile Drivers License information from 49 states
  o Trace people by their Social Security Number

By "Using ANY computer with a modem!"

To subscribe to this unique 24-hour on-line network call 1-800-255-6643.

Hmmm, I wonder if my neighbor with the new 928 can really afford it,
       and how many traffic tickets he has....

Curtis Keller 

     [Also noted on 12 Dec 88 by Bruce O'Neel <XRBEO@SCFVM.GSFC.NASA.GOV>]

Re: Computer Virus Eradication Act of 1988

Jonathan Sweedler <cjosta@taux01.UUCP>
12 Dec 88 07:42:38 GMT
From what I have read of the laws and bills dealing with computer
viruses and computer trespassers, it seems that someone can only be
prosecuted if they "cause harm" to a computer.  Even the new Computer
Virus Eradication Act of 1988 doesn't seem to apply to a person who
enters a computer without authorization just to browse through
directories.  Even part two of the below quote from the act may be
circumvented by simply announcing that you have entered the computer!
It seems that Robert Morris Jr. would not have done anything illegal
(even under these new bills) if his virus had worked as it was designed
to work: to propagate quietly from machine to machine.

>7         "(a) Whoever knowingly-
>8             "(1) inserts into a program for a computer infor-
>9           mation or commands, knowing or having reason to be-
>10          lieve that such information or commands will cause
>11          loss to users of a computer on which such program is
>12          run or to those who rely on information processed on
>13          such computer; and
>14            "(2) provides such a program to others in circum-
>15          stances in which those others do not know of the inser-
>16          tion or its effects;

In other words, can I break into any computer I want to and look at
whatever files I want to, as long as I announce I'm there and don't
cause any harm?  Why do these laws center on causing harm to computers
and not just illegal/unauthorized entry?

Jonathan Sweedler  ===  National Semiconductor Israel

Computer Virus Eradication Act of 1988 (RISKS-7.91 from VIRUS-L)

Vince Manis <>
Mon, 12 Dec 88 12:22:54 PST
>From: Don Alvarez <>
>Subject: Computer Virus Eradication Act of 1988
>2          (a) IN GENERAL.- Chapter 65 (relating to malicious
>3    mischief) of title 18, United States Code, is amended by
>4    adding at the end the following:
>5    "S 1368.  Disseminating computer viruses and other harm-
>6              ful computer programs
>7         "(a) Whoever knowingly-
>8             "(1) inserts into a program for a computer infor-
>9           mation or commands, knowing or having reason to be-
>10          lieve that such information or commands will cause
>11          loss to users of a computer on which such program is
>12          run or to those who rely on information processed on
>13          such computer; and
>14            "(2) provides such a program to others in circum-
>15          stances in which those others do not know of the inser-
>16          tion or its effects;
>17   or attempts to do so, shall if any such conduct affects
>18   interstate or foreign commerce, be fined under this title or
>19   imprisoned not more than 10 years, or both.

The idea sounds great to me.  However, the wording has problems. I'm
not a lawyer, but the text above appears to make it illegal to provide
a delete file routine in an operating system; it also makes the GNU
Emacs 'dissociated-press' function illegal.

What seems to be missing here is a proper definition of the terms
'virus' and 'worm'. Presumably, the problems with such programs
are that: (a) they install themselves into computer systems surreptitiously,
(b) they operate without the user asking them to do so, and (c) [viruses
only] they damage user data. I don't think the above wording addresses these
issues. There's also a question of intentionality: if a system includes a
`Grim Reaper', which deletes all files not referenced within a certain
period of time, and a user does not know about the G.R., are the
implementors of the system, or the operations staff on that machine,
responsible for the disappearance of files?
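The hypothetical `Grim Reaper' can be sketched in a few lines; the directory layout, retention period, and dry-run switch are all invented for illustration.  Nothing in the code itself distinguishes routine housekeeping from "causing loss" under the bill -- that turns entirely on what the user was told:

```python
# Hedged sketch of a `Grim Reaper': a housekeeping job that deletes
# files not accessed within a retention period.  All details here are
# invented for illustration.

import os
import time

RETENTION_DAYS = 90  # files untouched this long become deletion candidates

def reap(directory, now=None, dry_run=True):
    """Return (and, unless dry_run, delete) files unreferenced for RETENTION_DAYS."""
    now = time.time() if now is None else now
    cutoff = now - RETENTION_DAYS * 86400
    doomed = []
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        if os.path.isfile(path) and os.path.getatime(path) < cutoff:
            doomed.append(path)
            if not dry_run:
                os.remove(path)  # the step an uninformed user never sees coming
    return doomed
```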

I consider it a RISK when legislators introduce bills on technological
matters which may not really address the issue at hand. I remember Rep. Jack
Brooks writing letters to Sigplan Notices about 20 years ago on why PL/I was
a poorer language than Fortran and Cobol, for example.  It is our job as
technical people to advise legislators on such issues.

Of course, as a Canadian, why should I care...? :-)

Re: Vendor Liability and "Plain Vanilla" configurations

Andy Goldstein <>
12 Dec 88 14:02
> From: "FIDLER::ESTELL" <>
> [...] By analogy, DEC could ship VMS with all the passwords "expiring" most
> ESPECIALLY those on "privileged" accounts [e.g., System, Operator], and then 
> go into a "closed loop" that could be exited only after the "user" [system, 
> or operator, in this case] selected and installed a *computer generated* 
> password.  ONLY then could the installation be completed ....

We've been doing that for a couple of years. All the standard passwords are
set up pre-expired, and the installation prompts for new passwords on all
the standard accounts (and rejects the standard values). The only thing we
don't do is to generate the passwords; the password generator was considered
too controversial. (The French really hate it because the letter frequency
is all wrong.) The biggest problem is that once you've installed a system,
there are ways of cloning the system disk that circumvent the standard
installation procedure. I feel it's still the most effective thing we've
ever done for system security.
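The installation behaviour Goldstein describes can be sketched abstractly.  The account names, shipped defaults, and the length rule below are invented stand-ins, not the actual VMS account list or DCL procedure (which, as he notes, also supports generated passwords):

```python
# Sketch of pre-expired installation passwords: standard accounts ship
# with known defaults marked expired, and setup refuses to complete
# until every one is replaced with a non-default value.  All names and
# rules here are invented for illustration.

STANDARD_ACCOUNTS = {"SYSTEM": "MANAGER", "FIELD": "SERVICE"}  # shipped pre-expired

def accept_new_password(account, candidate):
    """Reject the shipped default (and trivially short choices)."""
    if candidate.upper() == STANDARD_ACCOUNTS[account]:
        return False
    return len(candidate) >= 8

def complete_installation(chosen):
    """Installation succeeds only when every standard account is re-keyed."""
    return all(accept_new_password(acct, chosen.get(acct, ""))
               for acct in STANDARD_ACCOUNTS)

print(complete_installation({"SYSTEM": "MANAGER", "FIELD": "xyzzy123"}))   # False
print(complete_installation({"SYSTEM": "correct-horse", "FIELD": "battery9"}))  # True
```

As the message notes, cloning a system disk after installation bypasses any such check -- the procedure guards first installation, not every path to a running system.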

There are some other things that aren't as tight as we would like them in
the out-of-the-box system; we're working on those. Thanx for your kind
words; we keep trying.
                    - Andy Goldstein  VMS Development

Re: "Hackers", "crackers", "snackers", and ethics

Andy Goldstein <>
12 Dec 88 11:48
Douglas Jones points out some experiences with benign hacking in the
1970's, in which the efforts of friendly hackers helped improve the
overall system. He expresses regret at the current attitude of treating
all hackers as criminals.

I have had productive working relationships with hackers in the past,
much to the benefit of my company's products, and I continue to maintain
some of these relationships. However, this only works in some environments.
Constructive hacking makes sense in universities and some development
environments, where the hackers belong to the organization that operates
the computer systems, giving them a certain level of trustworthiness.
In addition, the data in such systems is not terribly valuable, and
the occasional disruptions in service are a nuisance and no more.

With the majority of hacking nowadays, things are quite different.
Many of the computer systems involved are crucial to a business's
operation; some are critical to human life. The potential (and in some
cases the actuality) is there for major losses from disruption of
service and theft. The hackers are unknown outsiders in whom a serious
organization can place no trust whatsoever. Today's hackers do not
report what they find. Rather, they steal an organization's data and
services. They leave trap doors for themselves so they can re-enter
the system after it has been ostensibly secured. They are, simply put,
electronic joyriders and vandals.

I am all in favor of constructive hacking, but it should be confined to safe
places. Those who enter systems used for sensitive purposes should be
prosecuted for the trespass they have committed. [Easier said than done, of
course, which is one of our biggest problems.]
                                Andy Goldstein, VMS Development


Hackers

Shatter <unido!altger!Shatter@uunet.UU.NET>
11 Dec 88 19:24:12 MEZ (Sun)
Well, it is nice to at last see a responsible and intelligent attitude
to hackers in RISKS (thnx Kenny), but I feel that it is time that an
active hacker had some form of input into the current debate.

Before I get down to my arguments about hackers and hacking, perhaps I
should say a few words about myself and why I feel qualified to make my
views known.  [I expect I will get flamed a lot after this.]

Some of you may have already heard of me via articles in the WALL STREET
JOURNAL, NEW YORK DAILY NEWS, etc., but for those of you who don't read or have
access to copies of these newspapers, I am a hacker of over 10 years' activity
who is based near Nottingham, England.  My speciality is the various packet-
switched networks around the world, such as PSS, Telepac, Transpac, etc., with
various forays into UN*X, NOS/VE, VMS, VM/SP CMS (HPO), etc.  [By the way, I
apologise for any spelling mistakes, but my spelling is very bad as I am dyslexic.]

I feel that as a hacker with so much activity and experience I am qualified to
make the following points on behalf of the whole hacking community.

Hackers are not the vandals and common criminals you all think we are.  In fact,
most of the "TRUE" hackers around have a genuine respect and love for all forms
of computers and the data that they contain, and we are as a community very
responsible and dedicated to the whole idea of IT.  But we also have a strong
dislike of the abuse of IT that is perpetrated by various governments and
organisations, either directly or indirectly.  There is of course a small
minority of so-called hackers who do cause trouble and crash systems or steal
money, etc., but these people on the whole are dealt with by other hackers in a
way that most of you could not even think of, and most never repeat their "crimes".

In RISKS recently you have all been very busy discussing what names to use for
hackers, and you all seem to be missing the point.  The term "HACKER" is still
one to be very proud of, and I am sure that in your younger days you were all
called hackers and were very proud of the fact that someone felt that you had a
great technical expertise that warranted the use of the term.  But you all
suffer from the standard problem that nearly all people involved within IT
have, and that is non-communication.  You never pass on the information that
you pick up and learn to others within IT [American government organisations
and educational institutes are among the greatest offenders], and this allows
the hacking community [who do communicate] to be at least one step ahead of the
system administrators when it comes to finding security problems and finding
the cause and fix for the problem.

A case in point is the recent ARPANET worm and the FTP bug.  Both these
problems have been known for many months, if not years, but when I talked to
various system administrators recently, not one of them had been informed about
them, and this left their systems wide open even though they had done all they
could to secure them with the information they had.  [An interesting piece of
information is that hackers in England knew about Morris's worm at least 12
hours before it became public knowledge, and although England was not able to
be infected due to the hardware in use, we were able to inform the relevant
people and patrol Internet-to-JANET gateways to look for any occurrence of the
worm; we therefore performed a valuable service to the computing community in
England -- although we did not get any thanks or acknowledgement for this
service.]

[But I am straying.]  Hackers should be nurtured and helped to perform what
they consider a hobby [you may do a crossword as an intellectual challenge --
I study computers and learn about how things interact together to function
correctly (or incorrectly, as the case may be)], and the use of a group of
hackers in a "HACK ATTACK" ((c) Kenny 1988) can perform a valuable service and
find problems that most of you could not even start to think of, or would even
have the inclination to look for.

So please don't treat us like lepers and paupers.  Find yourself a "TAME"
hacker and show him the respect he deserves, and he will perform a valuable
service for you.  And above all, COMMUNICATE with each other; don't keep
information to yourselves.  If you have found it, the chances are that so has
someone else, and, horror upon horror, it may be a HACKER.

Bst Rgrds 
