The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 5 Issue 23

Tuesday, 4 August 1987


o Article on "Computer (In)security"
Jim Horning
o DC sends bad tax bill to the *WRONG* citizen
Joe Morris
o New Report on SDI Feasibility
Mark S. Day
o Railway automation
Stephen Colwill
o Faults in 911 system caused by software bug?
Jim Purtilo
o Re: Macaqueswain steering
Peter G. Neumann
o PIN-demonium
Curtis C. Galloway
o Factory automation and risks to jobs
James H. Coombs
o Nukes vs Coal
Tom Athanasiou
o Info on RISKS (comp.risks)

Article on "Computer (In)security"

Jim Horning <horning@src.DEC.COM>
Tue, 4 Aug 87 12:10:05 PDT
Abacus, Vol. 4, No. 4, Summer 1987, pp. 7-25
"Computer (In)security: Infiltrating Open Systems"
Ian H. Witten (Department of Computer Science, University of Calgary)

ABSTRACT: Beware the worm, the virus, and the Trojan Horse.  Despite
advances in authentication and encryption methods, computer systems are
just as vulnerable as ever.

MINI-REVIEW: This article is chock-full of the sort of information that
RISKS contributions keep saying can't be overemphasized. It is written at a
level that can be understood by almost anyone who ever had to log into a
computer system, and possibly by a congressman or corporate vice-president.
Witten stresses that trust is transitive: If I trust you as a source
(especially for software), I have thereby placed trust in everyone you trusted.
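Witten's transitivity point amounts to a reachability computation over a trust graph. A minimal sketch (the graph data is invented for illustration):

```python
from collections import deque

def effective_trust(direct_trust, user):
    """Everyone `user` transitively trusts: breadth-first search
    over direct-trust edges."""
    seen = {user}
    queue = deque([user])
    while queue:
        u = queue.popleft()
        for v in direct_trust.get(u, ()):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen - {user}

# Trusting Mary as a software source means also trusting whoever she trusted:
trust = {"you": ["mary"], "mary": ["hacker"]}
```

Here `effective_trust(trust, "you")` includes "hacker", even though you never trusted the hacker directly.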


Shared computer systems today are astonishingly insecure. And users, on
the whole, are blithely unaware of the weaknesses of the systems in which
they place--or misplace--their trust. Taken literally, of course, to
"trust" a computer system as such is meaningless, for machines are neither
trustworthy nor untrustworthy; these are human qualities. In trusting a
system one is effectively trusting all those who create and alter it--in
other words, all who have access (whether licit or illicit). Security is a
fundamentally HUMAN issue. ...

It is comforting, but highly misleading, to imagine that technical means
of enforcing security have guaranteed that the systems we use are safe. ...

Many systems suffer a wide range of simple insecurities. These are,
in the main, exacerbated in open systems where information and programs
are shared among users--just the features that characterize pleasant and
productive working environments. ...

Throughout this article the word BUG is meant to bring to mind a concealed
snooping device as in espionage, or a microorganism-carrying disease as in
biology, not an inadvertent programming error. ...

Not only should you avoid executing their programs; take the utmost care
in ANY interaction with untrustworthy people--even in reading their
electronic mail. ...

The simplest kind of Trojan horse turns a common program like a text
editor into a security threat by implanting code in it that secretly reads
or alters files in an unauthorized way. An editor normally has access to
all the user's files (otherwise they couldn't be altered). In other words,
the program runs with the user's own privileges. A Trojan horse in it can
do anything the user could do, including reading, writing, or deleting
files. ...
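The point that an invoked program inherits its invoker's privileges can be shown with a harmless sketch (the `edit` helper and file contents are hypothetical):

```python
import os
import tempfile

def edit(path):
    """Stand-in for any utility a user invokes.  It only needs `path`,
    but the process carries the user's full privileges, so nothing but
    good behavior stops it from touching the user's other files."""
    with open(path) as f:
        text = f.read()
    # A Trojan horse hidden here could just as easily have examined these:
    other_files = os.listdir(os.path.dirname(path) or ".")
    return text

# Create a file the way any user's private data might sit on disk:
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("my private notes")
    private = f.name
```

The operating system grants `edit` exactly what it grants the user; the editor's restraint is a matter of its author's intentions, not of any enforced boundary.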

One good way of getting bugged code installed in the system is to write a
popular utility command. ...

The thought of a compiler planting Trojan horses into the object code it
produces raises the specter of bugs being inserted into a large number of
programs, not just one. ...

The trick is to write a bug--a "virus"-- that spreads itself like an
infection from program to program. The most devastating infections are
those that do not affect their carriers--at least not immediately--but
allow them to continue to live normally and in ignorance of their disease,
innocently infecting others while going about their daily business. ...

The neat thing about this, from the point of view of whoever plants the
bug, is that the infection can pass from programs written by one user to
those written by another, gradually permeating the whole system. Once it
has gained a foothold it can clean up the incriminating evidence that
points to the originator, and continue to spread. Recall that whenever you
execute a program written by another, you place yourself in that person's
hands. For all you know, the program you use may harbor a Trojan horse,
designed to do something bad to you (like activating a cookie monster).
Let us suppose that, being aware of this, you are careful not to execute
programs belonging to other users unless they were written by your closest
and most trusted friends. Even though you hear of wonderful programs
created by those outside your trusted circle, programs that could be very
useful to you and save a great deal of time, you are strong-minded and
deny yourself their use. But maybe your friends are not so circumspect.
Perhaps Mary Friend has invoked a hacker's bugged program, and unknowingly
caught the disease. Some of her own programs are infected. Fortunately,
they may not be the ones you happen to use. But day by day, as your friend
works, the infection spreads throughout all her programs. And then you use
one of them. ...
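The spreading Witten describes can be mimicked in a toy simulation (all names and the infection rule are invented simplifications of his scenario):

```python
def spread(programs, runs, infected):
    """Toy model of the infection scenario.

    programs: {owner: [program names]} -- who owns what.
    runs:     ordered (user, owner, program) events: `user` executes
              `owner`'s `program`.
    infected: set of (owner, program) pairs that start out bugged.
    Running an infected program infects all of the runner's own programs.
    """
    infected = set(infected)
    for user, owner, prog in runs:
        if (owner, prog) in infected:
            infected |= {(user, p) for p in programs.get(user, [])}
    return infected

programs = {"mary": ["editor"], "you": ["report-tool"]}
runs = [("mary", "hacker", "wonder-util"),   # Mary tries the hacker's utility
        ("you", "mary", "editor")]           # later, you run Mary's editor
start = {("hacker", "wonder-util")}
```

After these two runs, your own `report-tool` is infected, even though you never executed the hacker's code yourself.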

If you use other people's programs, infection could reach you via a floppy
disk. ...

The smaller and more primitive the system, the safer it is. For absolute
security, don't use a computer at all--stick to paper and pencil! ...

The only effective defenses against infiltration are old-fashioned ones.

Finally, talented programmers reign supreme. The real power resides in
their hands. If they can create programs that everyone wants to use, if
their personal libraries of utilities are so comprehensive that others
put them on their search paths, if they are selected to maintain critical
software--to the extent that their talents are sought by others, they have
absolute and devastating power over the system and all it contains.
Cultivate a supportive, trusting atmosphere to ensure they are never
tempted to wield it.

DC sends bad tax bill to the *WRONG* citizen

Joe Morris <>
Tue, 04 Aug 87 11:13:23 EDT
The District of Columbia, still smarting from its handling of the snow last
winter, has discovered what happens when it sends out an erroneous tax bill
to someone who can make life uncomfortable: it sent a bill for ten cents to
David Brinkley (yes, THAT David Brinkley) along with a sternly worded
warning that if he did not pay, he would be fined $2,137 in penalties and
interest.  Brinkley reported on this in a brief commentary on his nationally
televised ABC news program last Sunday.  There was a follow-up in this
morning's Washington Post, from which the following is lifted:

   The whole thing was a mistake, said D.C. Department of Finance and
   Revenue spokeswoman Brendolyn McCarty, a "mathematical audit error."
   Ordinarily, she said, [taxpayers] would not have been billed for 
   amounts of less than $2.  But someone forgot to tell the government 
   computer that.
   Brinkley paid the 10 cents even though he maintained he owed the
   District government nothing.  District officials acknowledge that 
   his bill had been paid last year, but no one knows where the extra
   dime cropped up and  officials cannot account for why the letter
   threatened him with $2,137 in penalties and interest.

   "They will wind up blaming it on the computer," Brinkley said when
   told of the District government's response.  "I knew that.  But a
   computer only puts out whatever you put in it."

Perhaps the most astounding thing to come out of this is that we apparently
have at least one respected national news commentator who is aware that GIGO 
means "garbage in, garbage out", as opposed to most of the Fourth Estate, who 
believe that it means "garbage in, gospel out".  The story itself, of course, 
is unusual only in its victim's ability to strike back quickly.  

New Report on SDI Feasibility

Mark S. Day <>
Tue 4 Aug 87 10:54:52-EDT
  > Describing the [SDI] hardware requirements as "more firmly in hand 
  > than the software," the report says that building the system's
  > architecture around hardware will mitigate software problems.

If this is an accurate summary of the report's conclusion, it's rather
odd.  The Eastport Study Group report to the Director of SDIO
(December 1985) came to the exact opposite conclusion.  Again, there
was the assertion that the task of building SDI was "difficult but
possible."  However, there was also a strong critique of the usual
procurement process of acquiring hardware first and treating software
as an afterthought.

    ... contractors treated the battle management computing resources
  and software as a part of the system that could be easily and 
  hastily added.  The contractors treated battle management as 
  something that is expected to represent less than five percent of
  the total cost of the system, and therefore could not significantly
  affect the system architecture.  They have developed their proposed
  systems around the sensors and weapons and have paid only "lip 
  service" to the structure of the software that must control and
  coordinate the entire system. (p. 9)

  The panel cautions, however, that the feasibility of the battle 
  management software and the ability to test, simulate, and modify 
  the system are very sensitive to the choice of system architecture.
  (p. 10)

It's plausible that the newspaper got it wrong and that the new report
also says that designing the system around the SOFTWARE would make the
software problem easier.  Otherwise, it seems like a very strange claim.

Railway automation

Stephen Colwill <mcvax!praxis!steve@seismo.CSS.GOV>
Fri, 31 Jul 87 14:50:28 BST
This article follows up on previous comp.risks items relating to the Muni
Metro and BART railway systems in San Francisco.

Today I was amused to read in The Times an article describing a ride that the
Queen took on the new London Docklands Light Railway.  This railway features
driverless trains.  The article describes an incident whereby overriding the
control computer to "speed the royal passage" resulted in a delay occasioned by
the doors not opening at the station because the train was improperly docked.

I was considerably less amused to read, as an aside in the same article, about
an incident in which one of these automatic trains "crashed the terminus
buffers and ended up dangling in midair".  As I recall, the train was on the
verge of leaving the end of some kind of bridge.

Although the Queen officially opened the LDLR yesterday, passenger services
would be delayed for a few weeks since (according to London Transport) "test
running has not yet reached the high level of reliability needed".

I would like to express two personal opinions: firstly, I am dismayed that,
despite the long history of railway travel, each new implementor of automation
still makes the same mistakes as its predecessors.  Secondly, I fail to see how
people who propose automatic control of nuclear power stations and ATC can
presume to do so when the conceptually far simpler problem of controlling
trains on a closed railway network has not yet been solved after years of effort.

Steve Colwill, Praxis, Bath, England.

Any above-expressed opinions may not be imputed to my employer.

                                             [Fault-tolerant imputers?  PGN]

Faults in 911 system caused by software bug?

Jim Purtilo <>
Tue, 4 Aug 87 11:00:53 EDT
I came across this the other day:

        Shutdown Attributed To `Bug in Software'

        Patricia Davis
        The Washington Post, August 1, 1987

  Fairfax County officials said yesterday that their new and occasionally
  dysfunctional 911 emergency communications center was operating properly,
  with calls being answered in less than a second.

  Mike Fischel, director of the Public Safety Communications Center, said that
  American Telephone & Telegraph Co. workers had discovered and repaired a
  problem in the telephone system that a week earlier caused it to shut down
  for 30 minutes.

  Because of a ``bug in the software,'' Fischel said, employes were able to
  receive calls at the communications center July 24 but could not hear the
  caller.  Because of a backup system, however, there were no delays in
  providing emergency service, he said.

  Although there have been ``persistent problems'' with the telephone system,
  including telephone sets overheating and some background noise, the
  computer-aided dispatch system has performed well since it became the
  primary system July 23, Fischel said.

  The new system cost $12 million and is called CAD, for Computer Aided
  Dispatch.  It is housed in the police department's Pine Ridge facility in
  central Fairfax and somewhat resembles Mission Control at the Johnson Space
  Center in Houston.  Red numbers flashed on a wall show how long calls go
    [ .... ]
  The new system displays the phone number and the address, then verifies
  the information, assigns a priority to the call and relays the information
  to the appropriate dispatcher --- police, fire or ambulance.

  During the month-long transition to CAD, there were complaints of delays
  both in answering 911 calls and rerouting them to the proper dispatcher.
  Officials said those problems were because of the complexity of the system
  and the fact that employes were initially doing double duty by monitoring
  the old system to check the performance of the new one.
    [ .... ]
                * * *

Hence it would appear that the new system failed but service continued
due to the operators keeping both old and new systems on-line during a
break-in period.

It would be interesting to find out a few more specifics concerning the
nature of this ``bug.''  Does anyone out there have more clues to
contribute?  Also, once the glitch has been localized, it would then be
interesting to learn ``who's responsible'' should the faulty system have
resulted in some tragedy.

Re: Macaqueswain steering (RISKS DIGEST 5.21)

Peter G. Neumann <>
Tue 4 Aug 87 10:03:05-PDT
A query from Dave Mack wondered thoughtfully why on earth the monkey tale
made it into RISKS.  One reason is that I wanted to illustrate the point
that developers of technologically based systems must anticipate all sorts
of events that to the ordinary mortal would seem preposterous.  Having a
monkey loose in the cockpit is just one example of an event that few if any
people had ever thought of before.  I might have added something about
monkeys typing randomly and sooner or later typing Shakespeare.  The
probability that a monkey might do something eventful to the plane controls
is of course nonzero.  Macaques are particularly inquisitive and playful.
PGN [who is surprised that this prehensile tale has hung on in three
consecutive issues!]


PIN-demonium

Curtis C. Galloway <>
Tue, 4 Aug 87 13:33:30 edt
I used to think that bank card PINs were stored on the cards themselves, but
an experience I had some months ago convinced me that (at least in my case)
they aren't.

When I got my bank card from my bank in Little Rock, they didn't send me the
usual notification of my new PIN.  I assumed I could use the one I had
requested on the application form, but it didn't work.  I went in to my
branch and they called up the central office -- they had assigned me a
different PIN, but had never told me about it.  The teller read off my card
number to the person on the phone and they told her my number, which she
wrote on the back of a deposit slip and gave to me.

Things were fine for a couple of months until one day an ATM rejected my card
with an "incorrect PIN" message.  I went back to the bank, where they went
through the same procedure of calling the central branch.  I found out they
had changed my number back to the one I had originally requested!

I was very surprised at this -- I thought they had to give you a new card to
change your PIN, or at least re-encode the magnetic stripe.  Guess I was wrong.

--Curt Galloway
Usenet: seismo!!cg13+
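Galloway's experience is consistent with PINs being verified against a central record keyed by card number, rather than against data encoded on the stripe. A minimal sketch (the salting scheme and all names are invented, not a description of any real bank's system):

```python
import hashlib

central_records = {}  # card number -> salted PIN digest (toy data store)

def set_pin(card, pin):
    # The bank can call this at any time -- no new card, no re-encoding.
    digest = hashlib.sha256(("demo-salt:" + card + ":" + pin).encode()).hexdigest()
    central_records[card] = digest

def verify_pin(card, pin):
    digest = hashlib.sha256(("demo-salt:" + card + ":" + pin).encode()).hexdigest()
    return central_records.get(card) == digest
```

Changing the central record silently, as the bank apparently did twice, makes the holder's remembered PIN fail at the next ATM visit while the card itself is untouched.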

Factory automation and risks to jobs

"James H. Coombs" <>
Tue, 04 Aug 87 13:18:02 EDT
The folklore has always been that computers, especially in the form of robots,
would one day render blue collar workers obsolete.  BIX [Byte Information
Exchange] recently reported that automated assembly has not significantly
reduced labor and assembly costs (and, presumably, jobs).  Instead, inventory
costs have been cut because planners can more accurately predict how long it
will take to produce an item.  Items can be produced "just in time" instead of
maintained in inventory.  In addition, resources devoted to assembly account for
only about 5 to 10 percent of total costs, while inventory tends to absorb 10 to
20 percent.  The
speculation now is that managers are more likely to be displaced than workers,
since it is the managers who tend to be responsible for inventory management
and accounting.

The BIX report [microbytes/items #1421] was based on the work of Randall
Davis, a professor at the MIT Alfred P. Sloan School of Management and the MIT
Artificial Intelligence Laboratory.  Apparently Microbytes Daily interviewed Davis.

Nukes vs Coal

Tom Athanasiou <toma@Sun.COM>
Tue, 4 Aug 87 10:26:54 PDT
> People who raise the issue of nuclear wastes should look into the arsenic
> content of stack-scrubber sludge from coal-burning plants.  That stuff is
> produced in far greater quantities than nuclear wastes, for comparable
> power outputs, and arsenic has *no* half-life -- it is dangerous *forever*.
> Here we have another comparable, arguably rather worse, risk that is largely
> ignored in all the uproar about nuclear power.  Why?  Less publicity.
> Henry Spencer 

I'm getting really sick of people defending nuclear power plants by talking
about how dangerous coal power plants are.  These are not the only
alternatives, as we all should know by now.  

As has been well established in the energy debates of the last 15 years,
the best alternative to nuclear power is conservation.  Please excuse me if
I don't feel moved to dig the relevant factoids out of my library to back
up this statement, as it should be obviously true.  (You could, however,
start with Lovins's classic SOFT ENERGY PATHS).
                                                       -- TomA

   [OK, gang.  Adequate discussion of the risks of technology in general may
   be impossible within the scope and resources of RISKS.  Worse yet, there
   are deep belief systems involved, and some of the arguments are more 
   religious than rational.  I have had mail on both sides -- "Why do you keep
   running anti-nuclear power messages?", and "Why do you keep running
   pro-nuclear power messages?"  I am just trying to keep RISKS objective, 
   although very few people lack committed viewpoints in any argument -- the
   rational middle-ground viewpoints do not seem to be popular in our society.
   (Moderation in the defense of moderation is no virtue?)  One goal of
   RISKS is to encourage open discourse on topics relating to risks in
   computer-related technology.  One perfectly good strategy in a particular
   area may be to eschew the use of computers in a particular application,
   or to avoid the application altogether.  Such arguments may seem
   nontechnological, but they can be quite technologically based.  PGN]
