The RISKS Digest
Volume 8 Issue 02

Wednesday, 4th January 1989

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Christmas 1988 Decnet Worm — Counteracted
Cliff Stoll
Vincennes and the computer
Steve Philipson
Clifford Johnson
Viruses and System Security (a story)
by Dave Platt
submitted to RISKS from rec.humor.funny by Jim Horning and Mark Brader
Stallman, Minsky and Drescher on the Internet Worm
via Martin Minow
FAA Orders Computer Card Security Systems at 270 Airports
Henry Mensch
Info on RISKS (comp.risks)

Christmas 1988 Decnet Worm — Counteracted

Cliff Stoll <cliff%cfa204@harvard.harvard.edu>
Tue, 27 Dec 88 13:44:27 EST
On December 22nd, someone started a virus/worm on the SPAN/Decnet network.
It attacks only Vax/VMS computers, and only those which are connected 
to the SPAN/HEPNET/Decnet network.  It cannot enter Unix systems or PC's.

This virus/worm is benign in that it does not erase information.  The writer 
apparently wishes to embarrass system managers and network administrators.

Language purists will call it a worm:  it does not modify any files,
but merely copies itself from node to node.

Indications point to an origin in Germany.

I spent several hours creating bogus announcements to confuse and counteract 
the virus writer.  I've mailed these to the PHSOLIDE collection point.

The virus writer has collected these announcements, and has no way
to tell which are valid and which are phoney.


Technical details for Decnet/VMS people:

The worm enters through the Decnet Task object, and mails your system's
announcement banner (sys$announce) to Decnet node 20597::PHSOLIDE.
(This node is apparently in France.)

The worm generates a random node address and tries to copy itself onto that
node.  If this fails, it tries different random nodes until it finds one.
Once it finds a valid node, it tries to copy itself using the NETFAL account
(through the Task object).  If you don't have a valid Task object, it tries
to log into account DECNET, with password DECNET.

Once it's in your system, it creates a list of all users on your node, and
mails a message to each of them.  This message is some blather about how
Father Christmas has had a hard time getting "the terrible Rambo-Guns, Tanks
and Space Ships up here at the Northpole." The message itself is written in
a stilted, almost Germanic, style.

You can immunize your system by deleting the TASK 0 Decnet object, and by
making certain that you've changed the Decnet password.  In any case, the
worm is timed to stop after December 24th.  By the time you receive this
message, the worm will have died.
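
For readers who want the mechanics in one place, here is a schematic
rendering of that propagation logic in Python.  (The real worm was a
DECnet command procedure, not a Python program; the node number, account
names, password, and expiry date come from the description above, while
the stub functions merely print what the worm would have done.)

  #!/usr/bin/env python3
  """Schematic sketch of the Christmas worm's propagation loop."""

  import random
  from datetime import date

  COLLECTION_NODE = "20597::PHSOLIDE"    # where banners were mailed
  KILL_DATE = date(1988, 12, 24)         # worm shuts off after this day

  def try_task_object(node):
      # Stub: copy the worm to `node` through the default TASK 0
      # object, using the NETFAL account.
      print(f"trying TASK object on node {node}")
      return random.random() < 0.05      # pretend a few nodes accept

  def try_default_account(node):
      # Stub: fall back to logging in as DECNET, password DECNET.
      print(f"trying DECNET/DECNET login on node {node}")
      return random.random() < 0.05

  def propagate():
      if date.today() > KILL_DATE:       # built-in expiry (true today,
          return                         # so this sketch exits at once)
      print(f"mail SYS$ANNOUNCE banner to {COLLECTION_NODE}")
      print("mail Father Christmas message to every local user")
      while True:                        # hunt for a reachable node
          node = random.randint(1, 65535)   # random node address
          if try_task_object(node) or try_default_account(node):
              print(f"copied self to node {node}")
              break

  if __name__ == "__main__":
      propagate()

Note how the two defenses map onto the two stubs: deleting the TASK 0
object closes the first entry path, and changing the DECNET password
closes the fallback.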

Cliff Stoll, Harvard - Smithsonian Center for Astrophysics,
60 Garden Street, Cambridge, MA 02138         Cliff@cfa200.harvard.edu


Vincennes and the computer

Steve Philipson <steve@aurora.arc.nasa.gov>
Fri, 23 Dec 88 15:03:50 PST
In RISKS-FORUM Digest 7.94 "Clifford Johnson" <GA.CJJ@Forsythe.Stanford.EDU>
[Vincennes:  conclusively, a computer-related error] writes:

>I reflect that *all* the information that panicked the Vincennes crew and
>captain came from the computers.  The captain was not faulted [...]
> The fault was found to lie largely with the computer's initial
>classification of the flight as hostile, and the computers' subsequent unclear
>albeit correct presentation of the ascent data.  The actions taken to remedy
>the deficiencies are improvements in the computer display/ human interface.
>This is a classic case of computer *related* error: unobvious and secondary
>display of critical data.

>What the Pentagon has more or less overtly ruled is that its
>most competent, trained, and alert officers cannot be blamed for
>mistakenly reading and acting on deadly computer displays,
>especially not in combat, i.e. when they're actually used.

   In the case of Vincennes, the computer was definitely NOT the only nor
the most significant source of information.  The ship had been primed with
intelligence reports of hostile intent, was engaged in battle, maneuvering
radically, and taking fire.  The crew could hear bullets and shrapnel hitting
the ship.  They had been briefed to expect attack including aerial attack,
and had the memory of the Stark to remind them of the dangers inherent in
their situation.  They knew they were under surface attack.  They were
ready to believe that they were about to come under aerial attack as well.

   A major conclusion of the report was that people under great stress do 
not function in the same manner as they do in lab conditions.  It's easy 
for us to scour through the records in the comfort of our homes and offices 
and make judgements, but far more difficult to make them under severe time 
pressure, in physically disturbing conditions, under the threat of death.

   This case illustrated that a correct presentation of data is not always 
sufficient to prevent error; it may be necessary to present the data 
correctly and in a form that is highly unlikely to be misinterpreted.  It 
is not clear that we will ever be able to make systems that are immune 
from misinterpretation under such severe conditions.

   There is always confusion in battle, and there always will be, no 
matter what we do with computer systems.  The commander's first duty was 
to protect his ship.  That is what he did, albeit from what turned out to 
be a non-combatant that could not have hurt him.  To censure the crew of 
the Vincennes would undermine the ability of every man in uniform to take 
the necessary actions to protect himself and his country.  The Pentagon 
brass affirmed with their decision that battle zones are places rife with
confusion and danger, and that errors under those conditions are a fact of
life.

   We learn from this incident that battle zones are no place for innocents
(a lesson that is intuitively obvious), and that we have much to learn 
about how to fight with systems based on men and machines. 

[...]


Vincennes and the computer

"Clifford Johnson" <GA.CJJ@Forsythe.Stanford.EDU>
Tue, 27 Dec 88 16:19:08 PST
>  In the case of Vincennes, the computer was definitely NOT the only
>  nor the most significant source of information.

What I meant was that without the computer, there wouldn't have even been a
decision to shoot.  The computer-sensor's recognition of military signals
from the take-off airfield triggered, according to rule, an initial
misclassification as hostile until proven otherwise, and without the
computers' tracking of the flight nobody could have believed that the flight
was diving towards the ship.  That the error was due to bad presentation of
data was the Pentagon's own conclusion, which is why the incident is
conclusively a computer-related error.

> To censure the crew of the Vincennes would undermine the
> ability of every man in uniform to take the necessary actions
> to protect himself and his country.

We agree that the conduct of men in such circumstances is inherently an
input-governed impulse.  But your sentiment overlooks that military mission
takes precedence over personal survival, and that protection of innocent
life in the Gulf was the Vincennes' mission.  Viewed in this light, the
reliance placed on the computer-governed drills is unconvincingly justified.

[...]


Viruses and System Security (a story) [by Dave Platt]

Jim Horning <horning@src.dec.com>
20 Dec 88 00:30:03 GMT
The following story was posted in news.sysadmin recently.

The more things change, the more they stay the same...

Back in the mid-1970s, several of the system support staff at Motorola
(I believe it was) discovered a relatively simple way to crack system
security on the Xerox CP-V timesharing system (or it may have been
CP-V's predecessor UTS).  Through a simple programming strategy, it was
possible for a user program to trick the system into running a portion
of the program in "master mode" (supervisor state), in which memory
protection does not apply.  The program could then poke a large value
into its "privilege level" byte (normally write-protected) and could
then proceed to bypass all levels of security within the file-management
system, patch the system monitor, and do numerous other interesting
things.  In short, the barn door was wide open.
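
The escalation step is easy to model abstractly.  A toy sketch in Python
(purely illustrative; it resembles nothing in real CP-V, and the story
deliberately leaves the "simple programming strategy" unspecified):

  class Process:
      """Toy process with a write-protected privilege byte."""
      def __init__(self):
          self.master_mode = False    # supervisor state?
          self.privilege = 0          # normally write-protected

      def poke_privilege(self, value):
          # Memory protection applies only outside master mode.
          if not self.master_mode:
              raise PermissionError("privilege byte is write-protected")
          self.privilege = value

  p = Process()
  p.master_mode = True        # the unspecified trick: run in master mode
  p.poke_privilege(255)       # poke a large value into the byte
  assert p.privilege == 255   # later security checks all pass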

Motorola quite properly reported this problem to XEROX via an official
"level 1 SIDR" (a bug report with a perceived urgency of "needs to be
fixed yesterday").  Because the text of each SIDR was entered into a
database that could be viewed by quite a number of people, Motorola
followed the approved procedure: they simply reported the problem as
"Security SIDR", and attached all of the necessary documentation,
ways-to-reproduce, etc. separately.

Xerox apparently sat on the problem... they either didn't acknowledge
the severity of the problem, or didn't assign the necessary
operating-system-staff resources to develop and distribute an official
patch.

Time passed (months, as I recall).  The Motorola guys pestered their
Xerox field-support rep, to no avail.  Finally they decided to take
Direct Action, to demonstrate to Xerox management just how easily the
system could be cracked, and just how thoroughly the system security
systems could be subverted.

They dug around through the operating-system listings, and devised a
thoroughly devilish set of patches.  These patches were then
incorporated into a pair of programs called Robin Hood and Friar Tuck.
Robin Hood and Friar Tuck were designed to run as "ghost jobs" (daemons,
in Unix terminology);  they would use the existing loophole to subvert
system security, install the necessary patches, and then keep an eye on
one another's statuses in order to keep the system operator (in effect,
the superuser) from aborting them.

So... one day, the system operator on the main CP-V software-development 
system in El Segundo was surprised by a number of unusual phenomena.
These included the following (as I recall... it's been a while since I
heard the story):

-  Tape drives would rewind and dismount their tapes in the middle of a
   job.

-  Disk drives would seek back and forth so rapidly that they'd attempt
   to walk across the floor.

-  The card-punch output device would occasionally start up of itself
   and punch a "lace card" (every hole punched).  These would usually
   jam in the punch.

-  The console would print snide and insulting messages from Robin Hood
   to Friar Tuck, or vice versa.

-  The Xerox card reader had two output stackers;  it could be
   instructed to stack into A, stack into B, or stack into A unless a
   card was unreadable, in which case the bad card was placed into
   stacker B.  One of the patches installed by the ghosts added some
   code to the card-reader driver... after reading a card, it would flip
   over to the opposite stacker.  As a result, card decks would divide
   themselves in half when they were read, leaving the operator to
   recollate them manually.

I believe that there were some other effects produced, as well.

Naturally, the operator called in the operating-system developers.  They
found the bandit ghost jobs running, and X'ed them... and were once
again surprised.  When Robin Hood was X'ed, the following sequence of
events took place:

  !X id1

  id1:   Friar Tuck... I am under attack!  Pray save me!  (Robin Hood)
  id1: Off (aborted)

  id2: Fear not, friend Robin!  I shall rout the Sheriff of Nottingham's men!

  id3: Thank you, my good fellow! (Robin)

Each ghost-job would detect the fact that the other had been killed, and
would start a new copy of the recently-slain program within a few
milliseconds.  The only way to kill both ghosts was to kill them
simultaneously (very difficult) or to deliberately crash the system.

Finally, the system programmers did the latter... only to find that the
bandits appeared once again when the system rebooted!  It turned out
that these two programs had patched the boot-time image (the /vmunix
file, in Unix terms) and had added themselves to the list of programs
that were to be started at boot time...
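
The mutual-watchdog trick translates directly into modern terms.  Here
is a minimal sketch in Python (nothing CP-V actually ran; the names and
the pgrep-based liveness test are illustrative, and the originals were
presumably far more careful about races):

  #!/usr/bin/env python3
  import subprocess, sys, time

  PARTNER = {"robin_hood": "friar_tuck", "friar_tuck": "robin_hood"}

  def alive(name):
      # Crude liveness test: any process command line mentioning `name`?
      # (Assumes a Unix-like system with pgrep available.)
      return subprocess.run(["pgrep", "-f", name],
                            capture_output=True).returncode == 0

  def watch(me):
      other = PARTNER[me]
      while True:
          if not alive(other):
              print(f"{me}: Fear not, friend {other}!")
              # Resurrect the slain partner under its own name.
              subprocess.Popen([sys.executable, sys.argv[0], other])
          time.sleep(0.005)          # "within a few milliseconds"

  if __name__ == "__main__":
      if len(sys.argv) < 2:          # bootstrap: start both ghosts
          for name in PARTNER:
              subprocess.Popen([sys.executable, sys.argv[0], name])
      else:
          watch(sys.argv[1])

Just as in the story, killing both watchers in the same instant defeats
the scheme, since neither survives to restart the other; surviving a
reboot takes the further step the bandits took, patching whatever list
of programs the system starts at boot.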

The Robin Hood and Friar Tuck ghosts were finally eradicated when the
system staff rebooted the system from a clean boot-tape and reinstalled
the monitor.  Not long thereafter, Xerox released a patch for this
problem.

I believe that Xerox filed a complaint with Motorola's management about
the merry-prankster actions of the two employees in question.  To the
best of my knowledge, no serious disciplinary action was taken against
either of these guys.

Several years later, both of the perpetrators were hired by Honeywell,
which had purchased the rights to CP-V after Xerox pulled out of the
mainframe business.  Both of them made serious and substantial
contributions to the Honeywell CP-6 operating system development effort.
Robin Hood (Dan Holle) did much of the development of the PL-6
system-programming language compiler; Friar Tuck (John Gabler) was one
of the chief communications-software gurus for several years.  They're
both alive and well, and living in LA (Dan) and Orange County (John).
Both are among the more brilliant people I've had the pleasure of
working with.

Disclaimers: it has been quite a while since I heard the details of how
this all went down, so some of the details above are almost certainly
wrong.  I shared an apartment with John Gabler for several years, and he
was my Best Man when I married back in '86... so I'm somewhat
predisposed to believe his version of the events that occurred.

Dave Platt 
  Coherent Thought Inc.  3350 West Bayshore #205  Palo Alto CA 94303

--
Edited by Brad Templeton.  MAIL, yes MAIL your jokes to funny@looking.UUCP
Attribute the joke's source if at all possible.  I will reply, mailers willing.
Remember: If you POST your joke instead of mailing it, I will not reply.


Stallman, Minsky and Drescher on the Internet Worm

<minow%thundr.DEC@decwrl.dec.com>
20 Dec 88 14:53
The following letter appeared in the Business section of the Boston Globe,
20 Dec 1988.  [It does not represent the position of Digital Equipment
Corporation (or my position, either).  Martin Minow]

    Recent computer virus threatens American justice system, too

The recent computer network virus was a prank designed to be harmless.  A
minor programming error made it replicate so much that it clogged Internet,
a research network, with messages.  Now some people want to punish this
accident as deliberate sabotage.

Yes, people should not clog networks.  But the "worm" had parts designed to
avoid clogging; one had an error.  Research is error prone: punishing errors
is futile if limited to errors in pranks.  More rational is to keep critical
computers off research networks, as the military does.

Yes, another worm might be designed to destroy files.  Some people are angry
at these potential future crimes; so angry that they clamor to punish someone
as an example, whether his own deeds deserve it or not.

This clamor threatens the American tradition of justice for each individual
-- something even more valuable than a working Internet.

        Richard Stallman
        Free Software Foundation,
        Cambridge.

        Henry Minsky and Gary Drescher
        MIT Artificial Intelligence Laboratory,
        Cambridge.


FAA Orders Computer Card Security Systems at 270 Airports

Henry Mensch <henry@GARP.MIT.EDU>
Wed, 4 Jan 89 23:05:01 EST
(NY Times, 4 Jan 89) NEW YORK — In a sweeping new move to tighten security
at United States airports, the government Wednesday ordered that computer
card systems be installed at the busiest terminals by early 1991 to keep
people who might threaten airline safety from reaching restricted areas.
Ultimately, a total of 270 airports would have to install either computer
card systems, resembling those used for automatic banking, or alternative
methods providing equal security.  The Federal Aviation Administration rule,
estimated to cost $170 million over the next 10 years, was proposed in
March. 

The move was made as a result of the crash of a Pacific Southwest Airlines
commuter jet in December 1987 that occurred after a passenger, believed to
have been an employee dismissed by an airline that had bought PSA, fired several
gunshots during the flight.  All 43 people aboard were killed.

The decree Wednesday had additional significance in the aftermath of the
bombing of a Pan Am jumbo jet over Scotland last month in which a total of
270 people were killed.

In a section of the rule justifying its action, the FAA said currently used
identification badges "provide a means of control once an individual has
gained access to a restricted area."  "The FAA is concerned," it said, "that
these procedures could allow an individual using forged, stolen or
noncurrent identification to compromise the secured areas."  It added that
former employees could use their familiarity with procedures to enter a
"secured area and possibly commit a criminal act on board an aircraft."
[...]
Burnley noted in Wednesday's announcement that computer-controlled card
systems could be programmed to "keep a record of employees who try to enter
areas for which they are not authorized."  "They can also reject cards that
have been reported lost or stolen, or which have not been surrendered by
former employees," he said.

T. Allan McArtor, administrator of the FAA, said such systems already
were in use at some airports and "have proved to be highly effective
and workable."

Airline officials and airport operators had advanced many objections to the
new rule, including the high cost of installing and operating the
computer-card or other systems.  But in dealing with the cost issue, the FAA
said the total investment "can be recovered fully if one incident, involving
the loss of 170 lives and a wide-bodied jet," were prevented in the next 10
years.  [...]
