The RISKS Digest
Volume 6 Issue 55

Tuesday, 5th April 1988

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Battle of the Virus Hunter
Amos Shapir
Software & War
Chief Dan Roth
A new RISK prevention scheme?
Eric Haines
not John Saponara
Yet Another UnTimely Risk
Paul Cudney
Olde Virus Shoppe
Barry Hayes
Douglas Jones
Re: (c) Brain VIRUS
Chief Dan Roth
Re: Risks in diving computers
Rich Sands
RISKS in philosophyland
David Thomasson
Risks of NOT giving race/ethnicity
David Rogers
Re: More On Race and Ethnicity Questions...
Henry Spencer
April Forgeries
Charles Daffinger
Rahul Dhesi
Info on RISKS (comp.risks)

Battle of the Virus Hunter

Amos Shapir <nsc!taux01!taux01.UUCP!amos@Sun.COM>
4 Apr 88 21:35:56 GMT
An interesting wager was made in a live TV interview here this afternoon: A
software house has announced a product that can warn users of the presence of a
virus - any virus - on their PC. It was written by the guy who discovered the
'Israeli Virus'.  Another software house, which produces an 'inoculation'
program against that virus, claims that detection of every type of virus by a
single program is impossible, and offered a 10,000-shekel bet (about $6200)
against it, which was promptly accepted.  They will have a showdown within two
weeks - this is going to be interesting to watch!

Amos Shapir  National Semiconductor (Israel)
6 Maskit st. P.O.B. 3007, Herzlia 46104, Israel  Tel. +972 52 522261


Software & War

Chief Dan Roth <chiefdan@vax1.acs.udel.edu>
5 Apr 88 01:07:31 GMT
According to an article in the Christian Science Monitor, the communist rebels
in the Philippines are being hampered by a virus (CSM's terminology) which was
meant as part of a software protection scheme.

The rebels have imported a large number of Casio-manufactured laptop computers
for coding communications and other "on-the-run" uses in the guerrilla war.
However, they have also been using pirated software.  The software has a
built-in "feature" which the rebels are finding quite a disadvantage: the
pirated copies work fine for a while, but then suddenly an "anti-piracy
feature" erases all the files on the disk.  Not exactly helpful in a battle
situation.

(The original article was in the Thursday, March 31st CSM.)


A new RISK prevention scheme?

John Saponara <saponara@tcgould.tn.cornell.edu>
Tue, 5 Apr 88 11:01:10 EDT
Thought you might like this one.  I found it on the net, but don't recall
where it was.  The source is some Cray related magazine, I believe.

CRAY - A traditional Shinto ceremony was performed at Cray's systems 
    check-out building in Chippewa Falls to introduce a protective
    spirit into a new X-MP/24.  The ceremony was requested by Century
    Research Center Corp. (CRC), the Japanese service bureau that 
    ordered the system.

      There were two purposes to the ceremony:  to help protect the
    system during shipping and to ensure that it will run smoothly once
    it is installed on site.  "The ceremony places a spirit in the
    computer that ensures the company will prosper as customers use
    the system." (Tony Hagiwara, manager of the shipping firm that
    will deliver the system to Japan.)

      The Shinto ceremony traditionally is performed in Japan as a kind of
    blessing for significant events, such as an important purchase or
    the dedication of a new building.  

      Ceremony participants included:  the President of Cray Research Japan, 
    Cray's Coordinator of country services, Cray's Director of marketing 
    support, the President of CRC, the Director of CRC's systems 
    engineering division, and an employee of the shipping firm.  
    Each participant laid an evergreen branch on the table before the 
    computer system, bowed, clapped hands twice, and bowed again.  When 
    all had finished, the six participants each had a cup of sake and the 
    brief ceremony was over.

      An easel with the names of Cray Research and CRC written in Japanese
    was brought from Japan for the ceremony.  It will return to Japan
    with the Cray system and sit near the system once it is installed,
    so that the spirit will continue to guard the system, helping keep
    its operation trouble-free.

Eric Haines
(not John Saponara, no matter what the header of this mail says!)


Yet Another UnTimely Risk

Paul Cudney <cudney@sm.unisys.com>
Tue, 5 Apr 88 00:06:55 PST
Before the beginning of computing was time, and with time was change.  You
might think we would all be familiar enough with calendar time to cope with
it easily, even to the point of designing systems to accommodate both the
predictable and the politically inevitable.  Within the space of two months
we have been surprised by reports of problems handling this most pervasive
measure of our existence.  We have been further surprised by the difficulty
in making what most would consider a minor change to a system, such as
changing the date Daylight Savings Time goes into effect (or not), as you
can see from the following notice.  Perhaps the design of a calendar-watch
algorithm should be required in every software engineering course.

I see several lessons here for RISKS readers.  Care to volunteer a few?

> From sysadmin Mon Apr  4 20:20:53 PST 1988
> Subject: R&D is off by an hour
> Date: 4 Apr 88 23:57:58 GMT
> Organization: System Development Group, Santa Monica
>
> The R&D clock has been manually adjusted by an hour to partially
> compensate for a software bug.  The problem is that R&D's release of
> Berkeley Unix predates the US Congress's decision last year to move up
> transition to daylight savings time from the end of April to the beginning
> of April.
>
> A fix isn't easy because it relies on relinking every piece of software
> that prints human-readable dates.
>
> Here's an example session that illustrates the problem.
>
>         R&D-1% date
>         Mon Apr  4 15:57:37 PST 1988
>         R&D-2% date -u
>         Mon Apr  4 23:57:37 GMT 1988
>
> It's 3:57pm PDT, even though 'date' says "PST"; the GMT time is off by an
> hour because it's really 22:57:37 GMT.
>
> If your software communicates with the outside world or otherwise relies
> on an accurate clock, you should take this into account.  For example, the
> GMT Date: headers at the start of news articles posted at R&D are all
> off by an hour; see the header of this article for an example.
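
The rule change itself is small; the pain is in where it lives.  As a
sketch (mine, not part of the notice above), a few lines of C bracket the
window in which binaries built with the old rule are an hour off: the old
law started DST on the last Sunday in April, the new one on the first.

    /* Sketch (not from the notice above): compute the first and last
     * Sundays of April for a given year.  Binaries compiled with the
     * old last-Sunday rule print PST instead of PDT, and so are an
     * hour off, between these two dates. */
    #include <stdio.h>
    #include <time.h>

    static int weekday(int year, int month, int day)   /* 0 = Sunday */
    {
        struct tm t = {0};
        t.tm_year  = year - 1900;
        t.tm_mon   = month - 1;
        t.tm_mday  = day;
        t.tm_hour  = 12;        /* midday avoids DST-boundary folds */
        t.tm_isdst = -1;        /* let mktime decide */
        mktime(&t);             /* normalizes the struct, filling tm_wday */
        return t.tm_wday;
    }

    int main(void)
    {
        int year  = 1988;
        int first = 1 + (7 - weekday(year, 4, 1)) % 7;  /* first Sunday */
        int last  = 30 - weekday(year, 4, 30);          /* last Sunday */
        printf("%d: DST now starts April %d; the old rule said April %d.\n",
               year, first, last);
        printf("Old binaries print the wrong zone from April %d to %d.\n",
               first, last);
        return 0;
    }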


Olde Virus Shoppe

Barry Hayes <bhayes@cascade.stanford.edu>
4 Apr 1988 1352-MST (Monday)
Way way back in, I think, 1978 or so, we created a bug on the time-sharing
system at Dartmouth, DTSS.  Not really a classical virus, but a fun bug anyway.

There was a kind of file protection called "slave trap programs".  You could
set up a file so that whenever a program would open that file, a slave trap
program would run and its termination status would give the access rights
allowed to the program trying the open.

Well, one day we played with the consequences of this scheme and wrote a
program which, when used as its own slave trap, would change its own name and
then terminate.  The end result was a file, usually called ELUDE-23 since the
length of the program was 23 words, which, when you tried to open it, would
change its name to ELUDE-NN, where NN would be the seconds in the time of day.
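
For flavor, here is a rough present-day analogue in C (a sketch of mine;
DTSS slave traps were a different beast entirely, since the rename there
was triggered by the open itself rather than by running the program):

    /* Sketch only: a crude Unix-flavored analogue of ELUDE-23.  The
     * program renames its own on-disk file to ELUDE-NN, where NN is
     * the current seconds.  Run it as ./ELUDE-NN so that argv[0] is a
     * usable path; on DTSS none of this machinery was needed. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    int main(int argc, char **argv)
    {
        char newname[32];
        time_t now = time(NULL);
        struct tm *t = localtime(&now);

        (void)argc;
        snprintf(newname, sizeof newname, "ELUDE-%02d", t->tm_sec);
        if (rename(argv[0], newname) != 0) {
            perror("rename");
            return EXIT_FAILURE;
        }
        return EXIT_SUCCESS;
    }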

This confused people, of course, but also caused problems for a few programs.
There was a program which would go into every directory on the system and copy
over fragmented files, for instance.  It issued a system call that would open
every file at once to avoid overhead.  The result from this call was, for each
file, a directory entry, a status return for an open, and a file descriptor.
It wasn't very happy when it started getting "file not found" status.

By the way, you out there somewhere Steve?


Olde Virus Shoppe

Douglas Jones <jones%pyrite.CS%cs.uiowa.edu@RELAY.CS.NET>
Mon, 4 Apr 88 13:59:20 mst
One of the classic ways to crash a UNIX system is to create an executable
file, call it virus, containing code such as the following:

     echo "virus" & virus & virus

When run, this shell script prints "virus" on the terminal in parallel with
starting two more copies of itself.  The resulting proliferation of processes
quickly fills the process tables of the system.  I have used this for years
as a test of UNIX systems, since it is faster to type it in than to read the
manuals to find out how they manage resource exhaustion.

On older UNIX systems (with no per-user resource limits), this would crash
the system as soon as an essential system process could not be created.
On newer systems, this effectively disables the user who starts it, and it
is hard to kill.  The mess can be ended by renaming or deleting the
file, at which point the remaining processes will be unable to create new ones;
killing individual processes rarely has a useful effect.
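
On those newer systems, the per-user limit can also be set explicitly
before experimenting.  A sketch (assuming a BSD-style setrlimit() with an
RLIMIT_NPROC limit, which not every UNIX provides):

    /* Sketch: cap our process count before running anything
     * self-replicating, so that exhaustion stays confined to this
     * shell and its children.  Assumes RLIMIT_NPROC, a BSD-style
     * extension that is not universal. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <sys/resource.h>

    int main(void)
    {
        struct rlimit rl;

        rl.rlim_cur = 50;       /* soft limit: at most 50 processes */
        rl.rlim_max = 50;       /* hard limit: cannot be raised again */
        if (setrlimit(RLIMIT_NPROC, &rl) != 0) {
            perror("setrlimit");
            return EXIT_FAILURE;
        }
        /* A fork bomb started from this shell now dies at 50 processes. */
        execl("/bin/sh", "sh", (char *)NULL);
        perror("execl");
        return EXIT_FAILURE;
    }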

I encountered a related problem in an advanced course on fault tolerant
computing last spring.  We have an Encore multiprocessor at Iowa largely
dedicated to running student programming assignments.  I assigned a project
in which students were to write fault tolerant code on the Encore.
The people at the computer center began to notice some very unusual loading
on the machine soon after students began working on this project.
The skeleton of a typical fault tolerant program is outlined below:

    loop { one iteration is completed for each failure }
         if fork = 0 { fork is a UNIX system call to create a process }
             then { child process begins executing here }
                  loop { until failure is detected in parent }
                       -- code to restore key variables from checkpoint
                  endloop
             else { parent process continues executing here }
                  loop { until failure is detected in child }
                       -- code doing useful job and being monitored by child
                       -- code to checkpoint key variables in stable storage
                  endloop
         endif
    endloop
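
A minimal C rendering of that skeleton (my reconstruction, not the actual
assignment code; the child treats adoption by init, getppid() == 1 on
classic UNIX, as evidence that the parent has failed):

    /* Reconstruction of the skeleton above, not the course's code.
     * The parent does the work and checkpoints; the child watches the
     * parent and takes over when it dies.  Note that, like the student
     * programs described below, it contains no termination code. */
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    static void work_and_checkpoint(void)
    {
        sleep(1);       /* stand-in for useful work plus checkpointing */
    }

    int main(void)
    {
        for (;;) {                          /* one iteration per failure */
            pid_t child = fork();
            if (child == 0) {
                /* Child: wait for the parent to fail, restoring key
                 * variables from the checkpoint as needed. */
                while (getppid() != 1)
                    sleep(1);
                /* Parent is gone: loop around, fork a new monitor,
                 * and become the worker. */
            } else {
                /* Parent: do the real job until the monitor itself dies. */
                while (waitpid(child, (int *)0, WNOHANG) != child)
                    work_and_checkpoint();
            }
        }
    }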

The most obvious problems with this arose when the code to await a failure was
wrong and always terminated the loop in question.  In this case, huge groups
of processes were rapidly created, much like the shell script above.

A more subtle problem arose when students got working programs but forgot to
include any code to terminate the program when they were done testing.
At least one student didn't realize that his processes weren't going away when
he was done, and each of his experimental sessions created another fault
tolerant team of CPU bound processes.  When the system operators noticed these
accumulating, they set out to kill them and got quite frustrated when
replacements appeared for each process they killed.
                                        Douglas W. Jones


Re: (c) Brain VIRUS in RISKS DIGEST 6.52

Chief Dan Roth <chiefdan@vax1.acs.udel.edu>
5 Apr 88 00:57:38 GMT
The "(c) Brain" virus is not a new virus.

It is a basically harmless virus which first emerged here at the University of
Delaware early last fall.  I say *basically* harmless because (unless it's been
modified) it doesn't attempt to do any harm to the disks.  However, those with
a better understanding of DOS on the IBM-PC tell me that in certain very
specific cases (I believe involving non-standard data formats) some data could
be lost.


Re: Risks in diving computers

Rich Sands <rms@gubba.SPDCC.COM>
4 Apr 88 15:04:34 GMT
J.M.Hicks comments on dive computers:
>   Poor human interfaces have been discussed in this forum many times, but
>what opinions do people have of users' behaviour when a simple system is
>replaced by a complicated system that they do not understand and they
>can probably ignore because it takes a conservative view?

As a regular diver and user of an Orca 'EDGE' dive computer, I think
that these devices are a perfect example of how computer technology can
dramatically REDUCE the risk of an inherently risky activity. Using the
old-style Navy dive tables is tricky, requires substantial training, and
can be fouled up even by experienced divers. The dive computer's user
interface is MUCH easier to use and understand than the manual tables.
In this case, computers replace a very complicated system with a much
simpler system, not the other way around.

Sport divers go through a certification program that emphasizes safety,
graphically explains what can happen to you if you ignore the rules, and
in general produces a very safety-conscious diver. They know that dive
computers can keep them safe only if they heed the computer's warnings
and understand its operation. I think that most divers would pay a lot
of attention to the 'ASCEND MORE SLOWLY' message that their computer
flashes at them.  There are other risks in using the EDGE that are much
more serious than this warning message, such as the tendency of the
on/off switch to get caught on things, shutting the computer off and
losing the accumulated nitrogen-absorption data.

There will always be people who do not heed safety rules, either on
purpose, or from ignorance.  The former will abuse dive computers just
as surely as they abuse the tables now.  The latter will find the
computers much less intimidating and understandable than the tables,
making them safer. 

The newest computer by Orca, called the 'Skinny Dipper', replaces the
'ASCEND MORE SLOWLY' message with a red flashing LED.  The current depth
is no longer obscured, and the warning is much more noticeable, since
there is almost no red light underwater and anything bright red really
gets the diver's attention. It also has a locking on/off switch.

Why worry now about the risks of slight imperfections in an otherwise
risk-reducing technology? Worry instead about making this excellent
safety device inexpensive enough to be in the hands of all sport divers,
THEN worry about the details!
                                               —  rms

UUCP: {ihnp4,harvard,husc6,linus,ima,bbn,m2c}!spdcc!gubba!rms 
Compuserve: 71360,1067  BIX: richsands 


RISKS in philosophyland

David Thomasson <ST401405%BROWNVM.BITNET@MITVMA.MIT.EDU>
Tue, 05 Apr 88 00:10:38 EDT
Several recent items in RISKS maintained a tenuous connection with
computers while discussing the more humanistic issue of discrimination.
One writer went at his subject with such vigor that he rounded
things off with a battle cry:

  >Get out there and challenge the bigots! Both you and the society will grow.

The same writer then sounded a cautionary note:

  >I've often wondered *why* the same person who will not accept or tolerate
  >shoddy work or thinking on the job, will choose to ignore or tolerate or
  >accept or embrace any shoddy societal norm.

Although I'm happy to see RISKS extending its content to include
philosophical issues, I continue to blanch at some of the arguments
and assertions that are made. For example, the same writer who issued
the above-quoted caveat told of being invited to join an officers'
club. In a moment of dudgeon, the writer replied to the club:

  >I TAKE OFFENSE AT AN INVITATION TO JOIN ANY ORGANIZATION WHICH
  >DISCRIMINATES IN ANY WAY, ...AND DISCRIMINATION BY RANK OR PAY
  >IS DISCRIMINATION JUST AS SURELY AS DISCRIMINATION BY COLOR, AGE,
  >ETHNICITY, GENDER OR RELIGION.

Why would — or should — one disapprove of *any* kind of discrimination?
The implicit claim here is that discrimination in any form whatever is
morally wrong. And I cannot see how this assumption can be exempted from a
charge of shoddy thinking about morality, the same kind of shoddiness that
the writer wonders and warns about. Are youth clubs morally suspect because
they restrict membership to *youths*? Is Phi Beta Kappa culpable for
excluding stupid people? Are ballet companies open to reproach for
discriminating against clumsy oafs?  (And by the way, what *is* so morally
offensive about a club for officers??)
    I am not trivializing the issue here.  If you think it is a simple
matter of separating the "bad" kinds of discrimination from the "good" (or
"acceptable") kinds, try phrasing a general principle that will make that
distinction. I find it more than a little disturbing when people who are
obviously very bright and extremely competent in their fields (computer
science and related fields) burst onto the philosophical scene and start
shooting out the lights.
   Consider another recent RISKS item about discrimination. The writer
says he applied for a driver's license and noticed that the application
asked for his race. "It seemed to me that my race had nothing to do with
driving a car, so I left it blank." By the same reasoning, one might just
as well refuse to give one's name, sex and address, since they too have
nothing to do with driving a car. Perhaps — *perhaps* — including such
information on a driver's license could be justified on some ground
other than driving competence. Perhaps?
   I am not out to toss cold water on RISKS' recent ventures into such issues
as discrimination. By all means, challenge bigots. But for God's sake get down
off old Rosinante and do it with a little more style and intelligence.


Risks of NOT giving race/ethnicity

David Rogers <drogers@riacs.edu>
4 Apr 88 20:12:39 GMT
Most financial aid forms ask for ethnicity (the modern way to phrase `race'
questions).  When I was at Berkeley, I, in my fervor, refused to answer such
questions, or at the minimum, checked OTHER.  (At least they gave an OTHER box!)
They would happily accept any such forms, probably because they were tired of
arguing with students like me.

The risk?  I later asked an aid officer what they did with these forms:  they
just assign all such people to the `white' group for the purposes of
calculating aid, since that is the `least desirable' ethnicity when it comes to
calculating aid.

Open conflict about the answers to questions on forms is usually better than
this far more insidious procedure: assigning the user an `answer' which is
(usually) the least desirable of the options.  The use of computers will make
this `when in doubt, assume the worst' type of defaulting even more common,
and nearly impossible to detect.
                                                                David Rogers


Re: More On Race and Ethnicity Questions...

<mnetor!utzoo!henry@uunet.UU.NET>
Tue, 5 Apr 88 13:12:16 EDT
> ... If you *really, really* think about it, there is no way to justify a
> RACE or ETHNICITY question, unless you accept the notions of quotas...

In fairness, it should be mentioned that in a community with discrimination
problems, security-clearance forms (and no others) might have real reason to
ask such a question, for the same reason that they have legitimate reason to
ask about unorthodox sexual habits:  blackmail potential.  Mind you, I admit
that (a) it's harder to get a good blackmail threat out of racial issues, (b)
if we assume, for example, the southern US some decades ago, such a question
really ought to be something like "any Negro ancestry?"  rather than just
"race?", and (c) fortunately, this sort of nonsense isn't much of an issue any
more.  But in the wrong place at the wrong time, I can see how a real security
issue could arise.  It's not inherently ridiculous, although in the examples
cited it certainly is silly.

Henry Spencer @ U of Toronto Zoology {allegra,ihnp4,decvax,pyramid}!utzoo!henry


April Forgeries (Re: RISKS-6.52)

Charles Daffinger <cdaf@iuvax.cs.indiana.edu>
Mon, 4 Apr 88 23:17:09 EST
     [Most of you have chortled appropriately at the Spafford Spoof.  Charles'
     message is apparently intended for those of you who need more explicit
     references to the self-referential evidence left by the forged forgery
     warning.  By the way, Charles neglected to remark that RISKS-6.52 was 
     not put out on 1 April either.  PGN]

Here's the article warning about forgeries:  Note the strange date, note
that spaf's message is dated *after* the message it was enclosed in, and
note a couple of self-references in the posting!  Enjoy!

In article <12386860573.13.NEUMANN@KL.SRI.COM> you write:
>RISKS-LIST: RISKS-FORUM Digest   Friday 1 April 1988   Volume 6 : Issue 52
>

>Contents:
>  April Fool's warning from Usenet (Gene Spafford via Cliff Stoll)
>----------------------------------------------------------------------
>
>Date:     Thu, 31 Mar 88 12:17:48 PST
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
>From: cliff@Csa5.LBL.Gov (Cliff Stoll)
>Subject:  April Fool's warning from Usenet
>
>Here's the warning from USENET's  news.announce.important:
>
>From: spaf@cs.purdue.EDU (Gene Spafford)
       ==================================
>Subject: Warning: April Fools Time again (forged messages on the loose!)
>Date: 1 Apr 88 00:00:00 GMT
       ^^^^^^^^^^^^^^^^^^^^^
>Organization: Dept. of Computer Sciences, Purdue Univ.
>
>Warning: April 1 is rapidly approaching, and with it comes a USENET
>tradition. On April Fools day comes a series of forged, tongue-in-cheek
>messages, either from non-existent sites or using the name of a Well Known
                                             ==============================
>USENET person. In general, these messages are harmless and meant as a joke,
=======
[...]
>
>        o Posted dates. Almost invariably, the date of the posting is forged
                                             =================================
>          to be April 1.
           =============


April Forgeries (Re: RISKS-6.52)

Rahul Dhesi <iuvax!bsu-cs!dhesi@rutgers.edu>
Mon, 4 Apr 88 23:25:38 EST
... Of course, it's possible that it was a double-forgery, i.e., that Gene 
Spafford forged it himself.  — Rahul Dhesi  
                                                  [Sorry.  Not THIS TIME.  PGN]
