The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 6 Issue 24

Wednesday 10 February 1988


o Alarming Wenches and Risks of Lojack
Alex Colvin
Scott A. Norton
o Re: Software theft
Roy Smith
o Interleaving of Early Warning Systems
Ronni Rosenberg
o Shuttle Security
Jan Wolitzky
o Risk Study Centers
Curtis C. Galloway
o Legal Software testing
David Lesher
o Re: risks of helpful usenet software
David Herron
o Grants-chaos
F.H.D. van Batenburg
o Re: viruses
Chaz Heritage
o CompuServe virus - more details et cetera
David HM Spector
o Info on RISKS (comp.risks)

Alarming Wenches (RISKS-6.23)

Alex Colvin <>
Wed, 10 Feb 88 10:28:02 EST
  > ... One of my bosses had his brand new, fully alarmed, 1986 Toyota Celica
  > removed from his driveway in Beacon Hill by a wench equipped truck in
  > the wee hours of the morning.

That's the most dangerous kind.  Especially in the wee hours.

                        [Actually I noticed the typo, but liked it so 
                        much I left it as is.  Sic (sic) it to me.  PGN]

Re: Hub auto-theft lessons; $$$ risks of Lojack

Tue, 09 Feb 88 21:06:52 PST
  > ... He made it out the door only to hear the periodic beep
  >of his pendulum alarm muffled from inside a large van ... 

The real point of this message:  Notice how the thieves negated most
of the value of the alarm by putting the car inside a van.  Although
the owner seemed to hear the siren, the thieves could drive through town
without too much attention being drawn to them.  If the van had
been RF shielded, Lojack would have been defeated, too.

What does Lojack use for an antenna in the protected car, anyway?  If it
shared the radio antenna, or had its own, a simple snip could also disable
the protection.

I'm not impressed by the security it provides, and of course there
is the privacy risk to the owner originally mentioned.

LT Scott A. Norton, USN, Naval Postgraduate School, Monterey, CA 93943-5018

      [Scott also asked for the name of the wench.  PGN]

Re: Software theft

Roy Smith <roy%phri@uunet.UU.NET>
10 Feb 88 15:32:54 GMT
  > it is extraordinarily bad practice to fire someone and then not change
  > all relevant passwords, revoke their privileges, etc.

Actually, I would quibble with the order of operations.  Change the passwords
first, *then* fire the person.  In the past five or so years, we have had
occasion to fire two people who had access to sensitive material.  In both
cases, accounts were zapped and appropriate passwords were being changed while
that person was in the office getting the bad news.  It doesn't take long for
a disgruntled person to do serious damage with a quick "rm -rf *".
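Roy's order of operations can be put in a toy sketch. This is not any real site's tooling: the account store, the fields, and the notify hook are all placeholders, but the point survives: access is revoked before the person is told.

```python
# Toy sketch of the revoke-first, fire-second order of operations.
# The account dictionary and the notify callback stand in for whatever
# a real site would use (password files, group databases, HR, ...).

def revoke_access(accounts, user):
    """Disable the account and invalidate its credentials."""
    acct = accounts[user]
    acct["locked"] = True          # no new logins
    acct["password"] = None        # old password no longer works
    acct["privileges"] = set()     # drop all special rights

def terminate(accounts, user, notify):
    """Zap the account while the person is still in the office."""
    revoke_access(accounts, user)  # step 1: close the door
    notify(user)                   # step 2: deliver the bad news

accounts = {"alice": {"locked": False, "password": "s3cret",
                      "privileges": {"admin"}}}
events = []
terminate(accounts, "alice", events.append)

print(accounts["alice"]["locked"])   # True -- access gone before notification
print(events)                        # ['alice']
```

Reversing the two calls in `terminate` is exactly the window Roy warns about: a still-valid login and a motive.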

Roy Smith, {allegra,cmcl2,philabs}!phri!roy
System Administrator, Public Health Research Institute
455 First Avenue, New York, NY 10016

Interleaving of Early Warning Systems

Ronni Rosenberg <ronni@CCA.CCA.COM>
Wed, 10 Feb 88 11:41:00 EST
In RISKS 6.22, Ronald Wanttaja discusses a scenario in which "The Soviets
blind most of the US Early Warning satellites. ...  The U.S. immediately goes
to high DEFCON. ...  The Soviets do *nothing*."

I believe that if the U.S. goes to a high DEFCON, the Soviets automatically
go to a higher state of alert.  Part of the danger of such situations is that
the two countries' alert systems are tightly interconnected and responsive to
each other.  This can have the effect of ratcheting the alert status ever
higher and increasing tension, which greatly increases the risk that an
inappropriate decision will be made.

Shuttle Security

Wed, 10 Feb 88 17:22 EST
The subject of the self-destruct mechanism used to prevent runaway rockets
(including the space shuttle's boosters) from wreaking havoc was discussed
previously in this discussion group.  One very knowledgeable contributor posted
interesting details of the mechanism, including descriptions of the radio link,
with assurances that the high security of the system, including classification
of the frequencies used, greatly reduced the possibility of inadvertently
blowing up a rocket.

Now, according to the AP, a NASA security audit conducted in September found
serious security violations at NASA's Marshall Space Flight Center in
Huntsville, AL.  The wire service story, of course, focuses on such hijinks as
a safe for classified documents being used to store coffee money, but it also
reports that 7 packages of microfilm classified "Confidential" were left
unsecured for 8 months.  Each package of microfilm contained 181 sheets,
listing 4,205 confidential radio frequencies (personally, I'm always suspicious
of such precise figures).  The information belonged to various of the armed
services, CIA, and NSA.  The MSFC is responsible for processing the shuttle's
solid rocket boosters, which include the self-destruct mechanism.

What does this do to a risk analysis of shuttle safety?  In general, how many
points do you take off for each month the key to your system is lying around
unprotected?  When things like this happen, do people really sit down and redo
those calculations, or do they just run around covering themselves and hope the
same numbers as before still apply?
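One toy way to put a number on Jan's question, subject to exactly the objection that the assumptions are questionable: if one guesses a per-month probability q that an unsecured key is actually copied, the chance of at least one compromise over m months of exposure is 1 - (1 - q)^m. The 5% figure below is made up purely for illustration.

```python
# Toy exposure-time arithmetic: the probability of at least one
# compromise over m months, given an *assumed* per-month probability q.
# The whole difficulty, of course, is that q is a guess.

def p_compromise(q_per_month, months):
    """Probability of at least one compromise during the exposure period."""
    return 1.0 - (1.0 - q_per_month) ** months

# Even a modest 5%-per-month guess grows quickly over 8 months of exposure:
print(round(p_compromise(0.05, 8), 3))   # 0.337
```

The model says nothing about whether anyone actually walked off with the microfilm; it only shows how fast a small monthly risk compounds, which is why months of unprotected exposure matter.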

Jan Wolitzky, AT&T Bell Labs, Murray Hill, NJ; 201 582-2998; mhuxd!wolit
(Affiliation given for identification purposes only)

    [Quantitative risk analysis is always dangerous -- particularly
    if the assumptions are questionable.  The existence of a serious 
    flaw may kill you, or it may lie lurking.  Probabilities are not
    very interesting when you are dead.  PGN]

Risk Study Centers

"Curtis C. Galloway" <>
Wed, 10 Feb 88 15:23:01 -0500 (EST)
From the Carnegie Mellon office of public relations:

  "Carnegie Mellon University has received a $1.2 million grant from the
  National Science Foundation (NSF) to help fund its new Center for Risk
  Perception and Communication, aimed at improving how companies, workers,
  the public and regulatory agencies communicate about and deal with
  significant health and safety factors.

  "The center's experts in engineering, psychology and economics will do
  basic research on risk communication. They will focus on danger areas whose
  hazards have been studied, including radon in homes, highway safety
  associated with seatbelts, dam safety, the potential for birth defects and
  cancer from power lines, and cancer risks from sunlight and chemicals in
  the environment."  

I wonder if they will include in their research the risks to the public in
computers and related systems...  Have "hazards been studied" in this
"danger area?"  It seems to me that there is a distinct lack of
communication about the risks of using computers (with the exception of the
RISKS digest, of course!).

Curt Galloway                 ARPA:
UUCP: ...!{seismo, ucbvax, harvard}!!cg13+

Legal Software testing (Re: RISKS-6.22)

David Lesher <netsys!>
10 Feb 88 03:57:34 GMT
Ms. Leveson neglected to mention the big problem with the ABA testing
program. They charge many thousands of dollars for such an approval, and
many small vendors can't or won't pay up.  Hence, only large, well-funded
companies offer 'approved' products.

Re: risks of helpful usenet software

David Herron -- Resident E-mail Hack <>
9 Feb 88 18:04:55 GMT
Henry's comment about new vs. old usenet software hit home very strongly
with me.  I made a posting a couple of weeks ago advertising that we had
perl available, and I cross-posted it to comp.sources.d, uk.wanted,
ky.general and uk.general.  Ever since I've been getting mail from machines
all over the net which thought that one of those newsgroups was moderated.
I've probably gotten over a hundred by now.

Each of these machines is an "older" one from back when the rules were a
little bit different, and there were some hard-wired newsgroup names which
were moderated.  Or rather, their news software is "older" software... :-)

David Herron -- The E-Mail guy            <>
or:                {rutgers,uunet,cbosgd}!ukma!david, david@UKMA.BITNET


Grants-chaos

"F.H.D. van Batenburg" <>
Wed, 10 Feb 88 14:31 N
In the Netherlands, students are supported by the government with a small
grant to live on, augmented with a low-interest loan that must be paid back
later.  The amount of money depends upon the wealth of one's parents, study
results, and many more factors.

In fact, this legislation was so complex that the brochures distributed by
the government to the universities covered only the simplest cases.  After
heated complaints from the universities, the government finally produced and
distributed an MS-DOS program to assist the information officers at the
universities.  However, this program gave correct information on only one
out of six questions (NRC 14/8/87), so it was soon nicknamed the "Deet-flop"
(Deetman being the responsible minister, and "flop" having the connotation
of failure).  Clearly this program was of debatable value, so desperate
universities appointed a number of students to assist the information desks,
and some of those students eventually produced, in their spare time, a much
better program than the Deet-flop.  This is in use now in the universities.

However, the real pain in the neck was not the governmental information, but the
department responsible for the actual distribution of loans and grants itself.

* R. Schipper, one of my students, showed me a letter which cut him off from
  any funds because the department assumed he had earned the ridiculously huge
  sum of f 756025.00 (about $400000) instead of f 756.25 in July alone.

* Another student was cut off from funds because her father was too rich last
  year.  The fact that he had recently gone broke and was now virtually
  penniless did not change anything.

* Another two students told me they had simply reported a change of address.
  This resulted in a temporary stop of payments (nine months, for one of them)
  until the computer program could handle the update of this information.

* Some students who quit studying still got their monthly payments although
  they had reported their new status properly (Computable 19/1/88).

* Ms Ymke Dykstra (86 years old) got a grant of f 2250 for study although she
  didn't study at all (Computable 19/1/88).

Of course these students were not the only ones suffering from that grants &
loans distribution system.  It was estimated that about 100000 of the 550000
students had trouble because of this unreliable software system (Leids Dagblad
23/10/87).  Apart from actual blunders, a major problem was that the computer
system and organisation couldn't handle the load.  So apparently the response
to any change was to freeze all payments until the entire backlog had been
processed.  In this way many students didn't receive their monthly payment,
but their complaints only increased the load.  It was estimated that in
August, for example, 130000 letters were left unanswered (NRC 13/10/87).
Students who tried to phone couldn't get through either; in August 1.1
million phone calls were attempted but only 60000 got through (NRC 18/9/87),
and those students who did get through were told that nothing could or would
be done because the administration department "was probably working on it"
and that complaints should be submitted in writing (which of course only
worsened the chaos!).  Many desperate students who saw no improvement in
their financial situation travelled personally to Groningen daily (about 2
to 3 hours one way) to plead their case, but all in vain.

Nevertheless, the minister repeatedly denied the occurrence of any problems
until the end of 1987, when an investigation was started.  It turned out that
the people responsible for the software had warned the minister repeatedly
that it could not be ready before 1987.  The minister, however, insisted on
starting one and a half years earlier, at the beginning of 1986 (NRC
15/12/87).  This resulted in total chaos, from which many students suffered.
In the meantime the cost of this project, originally estimated at f 20
million, increased to f 73 million (Computerworld 1/12/87).

F.H.D. van Batenburg

Re: viruses (RISKS-6.23)

10 Feb 88 10:17:11 PST (Wednesday)
It is now clear that certain software houses are using virus as a deterrent to
software piracy. There is at least one commercial system (Softguard 3.00)
designed to destroy the files of a user who attempts to copy software protected
by it.

This activity is, in my personal view, unjustifiable; there is quite enough
trouble with malicious amateurs as it is. I do not believe that any such system
can prevent disc copying by purely hardware devices. There is no reason to
suppose that a dedicated amateur could not break down the protection of the
anti-copy system itself, attach it to hitherto unprotected software, and post
the whole thing to CompuServe or whatever - thus creating another epidemic.

I have adopted certain policies which I would recommend:

1  If you can manage with

2  Buy only unprotected, 'professional' software products from reputable houses
who advertise the fact that their products lack protection devices. Pay the
extra cost cheerfully and expect a professional level of support from the
software house involved.

3  If you run a commercial game program, power down the entire system for at
least five seconds afterwards before doing anything serious. Virus, like RAM
discs, may be reset-survivable.

4  If you detect a software house using virus in its products, then do (a) an
immediate boycott; (b) as much adverse publicity as you can manage.

Software houses who trust their customers not to steal from them should be
respected and supported; there are many in the UK and with luck the number will
increase. Software houses who use virus against their customers are
conspirators to commit criminal damage and should be treated as such.

Chaz Heritage

Disclaimer: these are my personal views and not necessarily those of any other
person or corporate entity.

CompuServe virus - more details et cetera

David HM Spector <spector@vx2.GBA.NYU.EDU>
Wed, 10 Feb 88 15:45:41 EST
An update on the Macintosh virus on CompuServe (and other systems):

The virus mentioned in Risks 6.22 seems also to be in at least one other
HyperCard stack that I found on a BBS in San Jose and on GEnie, General
Electric's Information Service.  The stack is called "The Apple Product
Stack" (or something similar) and claims to be a preview of some upcoming
Apple products.  (I am in the process of contacting the SysOps of the BBS to
inform them of its presence.)  What this stack does is show a badly scanned
image of something indiscernible and then (in the background) installs a
virus into your system file.

Later, I was horrified to find during a check of my Macintosh II at home that
the very virus I had reported being on CompuServe was alive and kicking
in **MY** Macintosh.  [I feel like I have been violated!]

Upon setting a number of disassemblers to work on the virus itself, I was able 
to determine that it's a date-triggered, self-propagating retro-virus.
(Please pardon the abuse of the terminology...)   Its characteristics and
workings are as follows:

It is an "INIT" resource (for the uninitiated an INIT is a code segment that 
gets run by the Macintosh OS at system startup time).  INITs are usually
used to do things like start mail servers, screen blankers, patch OS bugs, etc.

The virus's method of transmission is (surprise, surprise) via floppy disks
*or* by an infected system "mounting" any volume that contains a bootable 
system file.

It sets itself up as a running part of the operating system by modifying 
system traps.  The code is set to do something {I have not yet figured 
out what, but it starts by showing a picture of some sort} on March 2nd, 1988.
There seem to be a few data areas in the middle of the code which may get
jumped-to and then do something else, but I haven't had time to explore it
to that end yet.

If you try to remove it from a running system, and it tries to propagate 
itself, your workstation will crash since the virus code is not present to 
service the system trap request.  And if you transfer control to another
system file/disk without write-locking it (in hardware!) first, you've just
infected the other system!
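The propagation rule described above can be caricatured in a few lines. This is a toy model in plain Python, nothing Mac-specific; the volume fields are invented for illustration, but they capture why hardware write-locking is the only safe barrier.

```python
# Toy model of the propagation rule: an infected system infects any
# writable mounted volume that carries a bootable system file.
# Write-locked volumes and pure data volumes stay clean.

def mount(volume, system_is_infected):
    """Model mounting `volume` on a (possibly infected) system."""
    if (system_is_infected
            and volume["has_system_file"]
            and not volume["write_locked"]):
        volume["infected"] = True   # virus copies its INIT across
    return volume

clean  = {"has_system_file": True,  "write_locked": False, "infected": False}
locked = {"has_system_file": True,  "write_locked": True,  "infected": False}
data   = {"has_system_file": False, "write_locked": False, "infected": False}

for v in (clean, locked, data):
    mount(v, system_is_infected=True)

print(clean["infected"], locked["infected"], data["infected"])
# True False False
```

Only the hardware write-lock keeps the bootable volume clean; a software "don't write" flag would be under the control of the very system that is infected.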

The best solution is the one suggested by Neil Shapiro, the Chief SysOp of 
CompuServe's MAUG; replace the system files ASAP, preferably by booting your 
Macintosh from a write-locked floppy and copying a fresh system onto your 
hard disk and any bootable floppies you have around.

The really "clever" part of this, if you will, was the use of a HyperCard stack
as the initial transmission medium.  HyperCard is a really nifty program that
is extensible with XCMDs and XFCNs (external commands and functions), usually
written in C, Pascal or assembly to provide functionality not present in
Apple's standard HyperCard distribution.  The stack called this "user-supplied"
function, and >>ZAP<< a perfectly useful feature turned into a weapon.

I wonder how many viruses exist in copies of Lotus 1-2-3 on IBM PCs?  I 
understand external functions may be added with either C or Assembly.

On a lighter note:
I am looking into writing some detection programs (for Macs) to look for 
common things that the viruses in my "collection" do in a target program, 
and warn that a program under examination _MAY_ be less than safe.  Not a 
certification by any means but perhaps a way to check for simpler viruses...
(And of course, it would/should have built-in ways to make sure it was not 
itself compromised... if that's possible.  Perhaps by some clever CRC 
scheme -- I don't know right now, as it's just an interesting midnight
project.)
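As a sketch of the kind of checker David describes: record a digest of each critical file while the system is known clean, then compare later. The sketch below uses SHA-256 rather than a CRC (my substitution, not David's; a CRC is easy for an attacker to forge), and the file names in the usage note are placeholders.

```python
# Minimal file-integrity check: baseline digests of trusted files,
# then report any file whose contents no longer match.

import hashlib

def digest(path):
    """SHA-256 hex digest of a file's contents."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def snapshot(paths):
    """Baseline digests, taken while the files are trusted."""
    return {p: digest(p) for p in paths}

def changed(baseline):
    """Files whose contents no longer match the baseline."""
    return [p for p, d in baseline.items() if digest(p) != d]

# Usage sketch (file names are placeholders):
# baseline = snapshot(["System", "Finder"])  # store this write-locked!
# ...later...
# suspects = changed(baseline)               # non-empty => investigate
```

Note David's own caveat applies: the checker and its baseline must themselves live on write-locked media, or the virus can simply update them.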

David HM Spector                New York University
Senior Systems Programmer           Graduate School of Business
Arpa: SPECTOR@GBA.NYU.EDU           Academic Computing Center
UUCP:...!{allegra,rocky,harvard}!cmcl2!spector  90 Trinity Place, Rm C-4
MCIMail: DSpector               New York, New York 10006
AppleLink: D1161     Compu$erve: 71260,1410     GEnie: XJM21945
