The RISKS Digest
Volume 7 Issue 90

Thursday, 8th December 1988

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

"Glass cockpit" syndrome / Vincennes
Rodney Hoffman
VDTs and premature loss of ability to focus eyes
Rodney Hoffman
NEW YORK TIMES reviews novel about computer sabotage
Jon Jacky
"hacker" et al.
RAMontante
Russ Nelson
Douglas Monk
Andrew Klossner
Kenneth Siani
Don Mac Phee
Unquestioning belief in expert testimony
Matt Bishop
Info on RISKS (comp.risks)

"Glass cockpit" syndrome / Vincennes

Rodney Hoffman <Hoffman.ElSegundo@Xerox.com>
6 Dec 88 21:24:53 PST (Tuesday)
The 5 December 1988 'Los Angeles Times' reprints a lengthy 'Washington Post'
story by Sally Squires with the headline and caption:

                        PERIL FOR PILOTS

    Psychologists call it the "glass cockpit" syndrome, a computer
    information overload in which the flood of technical information,
    faulty communication and outside stress lead to judgment errors.

Much of the article is drawn from testimony by American Psychological Assn.
representatives before the House Armed Services Committee, relating to the
Vincennes shootdown in July of an Iranian commercial jetliner.  The
witnesses said the incident "was just a symptom of a larger problem facing
society."  A few quotes from the article:

  Research is badly needed to understand just how much automation to 
  introduce — and when to introduce it — in situations where the 
  ultimate control and responsibility must rest with human operators,
  said Richard Pew of BBN....

  The growing use of high-tech devices in the cockpit or on ships can
  have two seemingly contradictory effects.  One response is to lull crew
  members into a false sense of security.  They "regard the computer's
  recommendation as more authoritative than is warranted," Pew said.
  "They tend to rely on the system and take a less active role in control."
  UTexas psychologist Robert Helmreich calls it "automation complacency."

  Another response is to fall victim to information overload and ignore
  the many bits of data pouring from myriad technical systems.... The 
  stress of combat or poor weather or machine failure only serves to
  compound the errors that can be made.  Yet "most military personnel
  feel impervious to stress," Helmreich said....  

  But many stress effects can be overcome even in combat — if people
  are conscious of their vulnerability.... Helmreich noted that when
  "multiple people verify information and decisions" there is less 
  chance of error....

  "Errors of the sort made by Vincennes personnel can be anticipated,
  and procedures to reduce their likelihood or their gravity can be
  instituted," said UMichigan psychologist Richard Nisbett....

  [Background on the multiple emergencies aboard the Vincennes at the 
  time of the shootdown.]  "The anti-air warfare officer made no attempt
  to confirm the reports [from the crew] on his own," the commander-in-
  chief of the US Central Command reported.  "Quick reference to the
  console directly in front of him would have immediately shown increasing,
  not decreasing, altitude [of the Iranian jet]."  Instead, this
  "experienced and highly qualified officer, despite all of his training,
  relied on the judgment of one or two second-class petty officers,
  buttressed by his own preconceived perception of the threat, and made
  an erroneous assessment to his commanding officer."


VDTs and premature loss of ability to focus eyes

Rodney Hoffman <Hoffman.ElSegundo@Xerox.com>
6 Dec 88 21:00:21 PST (Tuesday)
The 5 December 1988 'Los Angeles Times' reprinted a story from the November
1988 issue of the UCBerkeley 'Wellness Letter' headlined PREMATURE LOSS OF
ABILITY TO FOCUS EYES LINKED TO VDT USE.

The article reports on clinical findings by Dr. James Sheedy, chief of the
VDT clinic at the UCBerkeley School of Optometry.  Sheedy emphasized that
his evidence is preliminary and that his conclusions are based on people
who had come to the clinic with eye problems — not on a controlled study.
These preliminary findings:

  Of 153 patients who averaged six hours a day at a VDT for four or more 
  years, more than half had difficulty changing focus.  Presbyopia, or 
  loss of ability to focus with advancing age, accounted for half of 
  these problems.  The rest of the patients, though, were in their 20s
  and 30s and should have had good focusing mechanisms.

The article goes on to recommend appropriate eyeglasses, nonreflective
screens, frequent breaks, etc.


NEW YORK TIMES reviews novel about computer sabotage

<jon@june.cs.washington.edu>
Wed, 07 Dec 88 09:10:40 PST
The Sunday, Dec. 4 issue of the NEW YORK TIMES BOOK REVIEW (their Christmas
Books issue) prominently reviews a new novel, TRAPDOOR, by Bernard J.
O'Keefe.  The premise (from the review by Newgate Callender, NYT's crime
fiction reviewer):

"A brilliant American woman of Lebanese descent has developed the computer
code that controls the operation of all our nuclear devices.  Turned down for
the job she has sought, convinced male chauvinism is the reason, she is ripe
to be conned by a Lebanese activist.  At his suggestion she inserts a virus
into the computer system that in a short time will render the entire American
nuclear arsenal useless.  ... The Lebanese President ... demands that Israel
withdraw from the West Bank, or else he will tell the Russians that the
United States will lie helpless for a week or so."

Callender's review begins with the lead sentence, "Nov 2, 1988, was the day
computers in America went mad, thanks to the `virus' program inserted by
the now-famous, fun-loving Robert T. Morris, Jr."

Some background on the author, also from the review:

"Bernard J. O'Keefe (is) chairman of the high-tech company EG&G and of an
international task force on nuclear terrorism ... (and is) the author
of a nonfiction book called NUCLEAR HOSTAGES.  (O'Keefe says) "I wrote this
parable to point out the complexity of modern technology and to demonstrate
how one error, one misjudgment, or one act of sabotage could lead to actions
that would annihilate civilization." "

Callender also says "...the execution is less brilliant than the idea....
The book has the usual flashbacks, the usual stereotyped characters, the
usual stiff dialogue."

Although the reviewer doesn't say so, the premise of this novel is quite
similar to a 1985 French thriller, published in the U.S. as SOFTWAR.  That
novel was also based on the idea that a nation's arsenal could be completely
disabled from a single point of sabotage, although in SOFTWAR it was the Soviet
Union on the receiving end.  Popular reviewers of both books apparently find
nothing implausible in the premise.
                                     - Jonathan Jacky, University of Washington


meaning of "hack"

RAMontante <bobmon@iuvax.cs.indiana.edu>
Wed, 7 Dec 88 21:19:20 EST
I met the word "hack" at MIT in 1972, and I must admit that the current
single-minded and increasingly pejorative usage bothers me.  Let me
quote from the Lexicon section of "How To Get Around MIT (1972 ed.)":

    Hack — (1) A noun denoting a trick or prank.  For example, welding
    a streetcar onto the tracks or getting elected UAP are fine hacks.
    (2) A verb meaning to goof off, talk randomly, or just hang around.
    (3) A verb meaning to apply oneself, work hard, try earnestly.
    Example:  A computer hacker.  Also connotes fanaticism.  (4) Harass
    somebody, whether in fun or maliciously.

Meaning (4) covers both the weather balloon planted in the Harvard-Yale
football game some years ago, and the wishing well constructed in a
friend's dorm room one holiday season.

One more quote from HoToGAMIT, under "THE OTHER EDUCATION":

    Metallurgy Shop
    For creative metallurgy or just hacking, 4-133 (the home of Tony
    Zona) is the place to be.  You can learn welding, brazing, and
    soldering...


Hacker's Dictionary definition of Hacker

Russ Nelson <nelson@sun.soe.clarkson.edu>
Wed, 7 Dec 88 09:59:39 EST
Perhaps it's time to go to an authoritative source--the Hacker's Dictionary.

HACKER, noun.
  1. A person who enjoys learning the details of computer systems and
     how to stretch their capabilities--as opposed to most users of
     computers, who prefer to learn only the minimum amount necessary.
  2. One who programs enthusiastically, or who enjoys programming
     rather than just theorizing about programming.
  3. A person capable of appreciating HACK VALUE.
  4. A person who is good at programming quickly.  (By the way, not
     everything a hacker produces is a hack.)
  5. An expert on a particular program, or one who frequently does
     work using it or on it.  Example: "A SAIL hacker."  (This
     definition and the preceding ones are correlated, and people who
     fit them congregate.)
  6. An expert of any kind.  One might be an astronomy hacker, for
     example.
  7. A malicious or inquisitive meddler who tries to discover
     information by poking around.  For example, a "password hacker"
     is one who tries, possibly by deceptive or illegal means, to
     discover other people's computer passwords.  A "network hacker"
     is one who tries to learn about the computer network (possibly
     because he wants to improve it or possibly because he wants to
     interfere--one can tell the difference only by context and tone
     of voice).


hacker/cracker/snacker, and now: "slacker"

Douglas Monk <bro@rice.edu>
Wed, 7 Dec 88 15:45:10 CST
slacker: a person who writes sleazy or shoddy code but should have known
better, was too lazy, or didn't think or care enough.

Doug Monk (bro@rice.edu)


Re: "cracker", "hacker."

Andrew Klossner <andrew%frip.gwd.tek.com@RELAY.CS.NET>
Tue, 6 Dec 88 15:11:17 PST
    "We desperately need a convenient term like "cracker", because
    the nonpejorative primary meaning of "hacker" needs to be
    defended vigorously against misuse by the press and others."

I disagree.  The word "hacker" is now embedded in the American language
to mean a computer attacker.  No amount of energy spent railing against
this new use will put an end to it, any more than we can make a plural
noun again of "data."

Let's spend our vigorous efforts in areas that truly merit them.

  -=- Andrew Klossner   (uunet!tektronix!hammer!frip!andrew)    [UUCP]
                        (andrew%frip.gwd.tek.com@relay.cs.net)  [ARPA]


HACKERS CAN HELP YOUR SECURITY

Kenneth Siani <SIANI@nssdca.GSFC.NASA.GOV>
Thu, 8 Dec 88 15:24:14 EST
   Much has been said recently about the evils of hackers and hacking.
I would like to say a few words in their defence. 
Before entering the 'orthodox' computer security field, I myself
was a mysterious phantom of the night. :-) I am sure quite a number
of persons in this field of computer security started out this way.
Perhaps this fact gives me some insight on the subject of hackers.


   I would like to clear up a few popular misconceptions about hackers.
Hackers are not the people who destroy systems, nor are they the people
who penetrate systems to steal secret information or money. People
who perform such acts are vandals and thieves; they are not true
hackers. The only remote connection between such persons and hackers is
that both often do their work via modem and both exploit
weaknesses in computer security. Another thing that separates true hackers
from the vandals mistakenly called hackers is motivation.
Hackers are motivated by a great and intense quest for computer knowledge
and perhaps the feeling of power that comes from it. 
In addition, hackers have a true love of computers, and would never 
purposely damage any system. Destroyers of systems have no such love, and
their motivations are quite different!


   I am sure many of us have put a great deal of effort into making our
systems as secure as possible. In our effort to make our systems secure, 
we have read the Red Book, the Orange Book, the MITRE reports and all
the many other security related journals, books and reports.
We go to very expensive computer security seminars to hear wisdom 
from the experts and we try to implement all the safeguards the experts
tell us about. Yet, the hackers still get in! Even the National Security
Agency, the 'High Priests' of computer security, have had hackers running
around their systems. Nobody seems to be immune to hackers!


   Rather than focusing on the negative aspects of hacking and hackers,
I would like to share with you a very positive experience with hackers that
I was privileged to be a part of. During the summer of 87, I led a group
of real hackers on an assault of some of NASA's computer systems at the
Goddard Space Flight Center in Greenbelt, MD. One goal of this effort was to
test the electronic vulnerability of selected computer systems at GSFC.
Another objective of this study was to raise the security consciousness
of not only the key personnel of the selected target systems, but of all
the users of the systems at GSFC. 


   This effort was known as "HACK ATTACK". A full report of all the methods
used, and the successes and failures of each, was made. In addition, specific
recommendations were made to improve the security of the penetrated systems.
The official report is quite sensitive, but an unclassified yet very
informative abstract of the report was presented at the AIAA/ASIS/IEEE 
Third Aerospace Computer Security Conference held in Orlando Florida
Dec. 7th - 11th 1987.
You may wish to contact the American Institute of Aeronautics and
Astronautics, 370 L'Enfant Promenade, SW, Washington, DC 20024-2518,
for information about getting a copy of the report. The title is:
THE HACK ATTACK, INCREASING COMPUTER SYSTEM AWARENESS OF VULNERABILITY 
THREATS.


   The Hack Attack was conducted in a very controlled manner, but the
hackers were real and the results were a great surprise to all of us.
Some of the hacker techniques employed were the Forward Hack, Reverse Hack,
Default Account Hack, The Decoy and my personal favorite, Social Engineering.
Some software bugs were exploited, as were some basic human weaknesses.
The hackers ranged in experience, education and computer literacy from
novice high school hackers to more experienced college level hackers.
One thing shared by all of them and myself, was a great enthusiasm for
the project. It truly was a game of wits, the hackers against the systems
and their operators. 
Some systems were penetrated while others were not. In the end there were no
losers, only winners. The security weaknesses of some systems were exposed
while the strengths of others were confirmed. 


    As a direct result of the Hack Attack some software has been modified and
some policy has been changed. But the greatest change of all has been in the
user community. The security consciousness of the users has greatly increased,
and that was of course our primary goal. Perhaps your systems and users
could benefit from such an experience. You would be surprised to find out
just how many hackers would be willing to help you plug the holes in your
system. You would be equally surprised how much more seriously your users
would consider the issue of computer security after they have been exposed
to a Hack Attack.


The Hack Attack has made a lasting impression on people.
It has been over a year since the Hack Attack, but to this very day, people
cover up their terminal screens or log off their computers when
I enter the room.  :-)


DISCLAIMER: The views expressed here are my own and not that of my employer,
NASA, the NSA, or any other person or government agency.

Kenneth Siani
Sr. Security Specialist 
Internet address:  { SIANI@NSSDCA.GSFC.NASA.GOV }


Hacking as a profession... Why not?

Don Mac Phee <BIW137@URIACC.BITNET>
Thu, 08 Dec 88 09:59:40 EST
  I have been involved in mainframe computing for a number of years on systems
ranging from an NCR mainframe (I've blissfully forgotten the type! :-) )
through to a VAX 8600. And in my experience as a user, I've found that a hacker
can be one of the greatest assets to a system. No administrator is omnipotent
and sometimes they can make serious mistakes in the installation of a certain
package, or pay negligible attention to loopholes in the operating system that
can lead to increased downtime or even worse... virus attacks that paralyze the
system. But the question I pose to the RISKS reader is this:

   Why don't the manufacturers hire 'hackers' to debug their operating systems?
I'm not speaking of the countless consulting firms that do security work, or
the people who designed the system. Instead, I speak specifically of the
'hacker' as a bright energetic person who sees the system from a different
viewpoint from that of the administrator, or the designer. Someone who, when
given a set of manuals about the design of the system and the operation, can
'break in' legitimately and show the flaws.

  Maybe I'm naive, but isn't the best way to design a mousetrap to build
it, then send the mice through?

-Don Mac Phee (BIW137@URIACC.BITNET)


Unquestioning belief in expert testimony

Matt Bishop <bishop@bear.Dartmouth.EDU>
Wed, 7 Dec 88 09:00:56 EST
In RISKS 7.89, Clifford Johnson tells of a speeder who was convicted because
of testimony of a university mathematician involving the Mean Value Theorem.
One of the risks of being involved with science and technology is that we often
deal with things the public does not completely understand, and so has to take
our word for them.  This can have quite drastic consequences.

Haber and Runyon's book "General Statistics" has a graphic example of this (see
pp. 156-157).  An elderly woman was mugged in California, and a witness saw a
"blonde girl with a ponytail run from the alley and jump into a yellow car
driven by a bearded Negro."  Eventually, a couple were arrested and tried for
the crime; the evidence against them was that the woman "was white, blonde, and
wore a ponytail while her Negro husband owned a yellow car and had a beard."
Both were convicted, because the prosecutor got an expert witness from the math
department of a nearby college to testify that the probability of a set of
events occurring is the product of the probabilities of each of the events
actually occurring; and using "conservative estimates" (such as the chance of
a car's being yellow is 1 in 10, the chance of a couple in a car being
interracial is 1 in 1,000, etc.), he concluded the odds of any other couple
sharing the characteristics of the defendants were 1 in 12,000,000.  That was
enough for the jury.

Fortunately, the California Supreme Court disagreed.  Aside from the illegality
of "trial by mathematics", the math expert didn't go far enough — assuming
everything he (or she) said was true, one Justice pointed out that there was a
41% chance that at least one other couple in the area would also share those
characteristics! (He attached a 4-page appendix to the opinion demonstrating
this to his own, and five concurring justices', satisfaction.)
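The Justice's 41% figure can be reproduced with a short calculation.  The
sketch below is only illustrative: it assumes, hypothetically, that couples
match independently with the prosecutor's own 1-in-12,000,000 probability and
that the relevant pool holds 12,000,000 couples (roughly the reciprocal of
that probability); the actual appendix argued the point more generally.

```python
# Hypothetical inputs: the prosecutor's own matching probability,
# and an assumed pool of N couples in the area.
p = 1 / 12_000_000
N = 12_000_000

# Treating each couple as an independent trial with success probability p:
p_ge1 = 1 - (1 - p) ** N                      # P(at least one couple matches)
p_ge2 = p_ge1 - N * p * (1 - p) ** (N - 1)    # P(two or more couples match)

# Given that at least one matching couple exists (the defendants), the
# chance that at least one OTHER couple also matches the description:
print(f"{p_ge2 / p_ge1:.1%}")                 # roughly 41.8%
```

So even taking the expert's numbers at face value, a 1-in-12,000,000 match
probability does nothing like single out the defendants once the size of the
population is taken into account.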

Such are the dangers of encouraging blind trust in experts ...
                                                                 Matt Bishop

PS: The woman was released earlier and broke parole when she "lit out
for parts unknown."  The man was in jail at the time of the Supreme
Court's opinion; "he could be tried again, but the odds are against it."
