The RISKS Digest
Volume 7 Issue 87

Monday, 5th December 1988

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Value for money?
Jerry Harper
Corps of Software Engineers
Gary Chapman
DEC Enet and "denial of service" attacks
Willie Smith
Re: Nonsecure passwords/computer ethics ( /dev/*mem and superuser )
Paul E. McKenney
Kendall Collett
PGN
"Hackers," "crackers," "snackers," and ethics
Frank Maginnis
PGN
FM
Darrell Long
Alex Colvin
Computer Risks Revisited
John Markoff
Info on RISKS (comp.risks)

Value for money?

Jerry Harper <jharper@euroies.UUCP>
Sat, 3 Dec 88 20:42:33 GMT
This is excerpted from THE IRISH TIMES of two weeks back:

      " OFFICIALS `COMMITTED [$67m] TO INADEQUATE COMPUTERS' "

The Department of Health was accused yesterday of committing some [$67m] of
State funds to the purchase of an inadequate computer system for the health 
service.  Eleven million pounds will already have been spent on the project
by the end of this year, the Secretary of the Department of Health, Mr Liam
Flanagan, told the Dail [our parliament] Committee of Public Accounts.
...[the decision taken in 1982 to computerise government services... deleted]

   ...Auditor General, Mr Patrick McDonnell, expressed his disquiet at the
lack of planning since that date, and at the fact that no costing was done
until May 1985, by which time [$67m] was committed...
   ...[deleted piece about the authorship of a report]
   ...Mr Flanagan said [$670,000] had been spent on management consultancy.  In
his opinion, this was value for money, despite the fact that some of the
hardware proved to be inadequate with high maintenance costs, and certain items
had to be sold off at half-price to health boards.  In particular, the
committee heard that three of the mini-computers which had cost approximately
[$227,000] were sold back to the supplier for [$123,000].  Two of these were
subsequently supplied to the Eastern Health Board at [$41,000] each.
   ...[deleted piece about the report being referred to the Minister]"
   The system being referred to was put together by McDonnell-Douglas and
"looked after" by the closely related McAuto.  An enormous amount of pressure
was placed on hospital administrators and senior consultants to accept the
system.  The pressure came from the company through the usual sales hype and
several politicians attempting to bend individuals' ears.  A senior consultant I
know in one of the prestigious test site hospitals commented that he was
astounded at the inferiority of what was being offered.  It became so bad at
one stage that maintenance people were practically living in the hospital.  I
don't attribute culpability for the deficiencies of the system to any of the
companies involved, but at the very least they should have beta-tested the
system more thoroughly.  The management consultancy firm could do with a little
of that also.


Corps of Software Engineers

Gary Chapman <chapman@csli.Stanford.EDU>
Mon, 5 Dec 88 09:01:54 PST
Not exactly a risk of computers, but definitely a risk to software engineers:
during the early days of the war in Vietnam, there were some IBM programmers
and system engineers living in-country and working on Army computers.  One
day things got pretty hot--a rocket attack, I believe--and the IBM personnel
demanded to be evacuated.  The Army refused, saying they were essential to the
war effort, that without them the computers would not perform.  The IBM manager
threatened to go to superior authorities, so the Army commander then said that
the nearby airbase was under attack and there were no flights available for
evacuation.  I never heard the resolution of this story, but it was clear these
programmers got more than they bargained for.

For many years there was a "special" draft for doctors--physicians were almost
guaranteed two years of military service, simply because they were doctors.
Might we see something similar for computer professionals in the future?

-- Gary Chapman
   Executive Director, Computer Professionals for Social Responsibility


DEC Enet and "denial of service" attacks

Willie Smith, LTN Components Eng. <w_smith%wookie.DEC@decwrl.dec.com>
5 Dec 88 10:50
Over the five-day Thanksgiving weekend, the crackers of New England
successfully perpetrated a denial-of-service attack on Digital's internal
network.  They managed to do this without touching a keyboard.  Someone in
Corporate-level management decided that the following actions were to be
taken by all system managers:

The first step was to shut down all the routers.  This prevented any network
traffic at all from travelling between different areas and broke the network
into independent LANs, essentially turning off the WAN.

The second step was to disable all dialins at all sites.  This prevented
hackers from gaining access to the machines in the first place.  This included
all external packet network hosts as well as local modems.

The third step was to shut down all "unattended" machines for the duration. 
This additional step prevented hackers from getting to the machines at all, and
greatly reduced the incidence of unauthorised use.  :+}

Well, to give them credit, it worked just fine!  It also caused a bit of havoc
in automatic distribution of Internet and Usenet postings, and gave those of us
who like to dial in to read our mail and internal BBSs the weekend off.  In
fact, it was so successful that there are rumblings that it may be repeated as
needed.  Most of our internal mailers time out after 3 days, and many are set
to one day to conserve disk space.  I've been told that the free access to the
net that we have been taking for granted is going to tighten up considerably
due to frequent intrusions.

Willie Smith  w_smith@wookie.dec.com  w_smith%wookie.dec.com@decwrl.dec.com
{Usenet!Backbone}!decwrl!wookie.dec.com!w_smith


Re: Nonsecure passwords/computer ethics ( /dev/*mem and superuser )

Paul E. McKenney <mckenney@spam.istc.sri.com>
Tue, 22 Nov 88 13:54:00 PST
In RISKS-7.74, PGN says:

    [ . . . ]  In UNIX,
    the /dev/mem vulnerability (a "feature" to some) can be used to capture
    passwords in unencrypted form.  Even the Gould UTX/32S C2 version of
    Unix still has that vulnerability.

UNIX systems -can- be configured so that they do not have this vulnerability.
All memory devices (e.g., /dev/mem) should be protected as follows:
    cr--r-----  1 root     sys        3,   0 Jan 10  1987 /dev/mem
i.e., so that only the super-user (root) or members of the special group
``sys'' (some systems use group ``kmem'', but the principle is the same)
are allowed to access kernel memory.  This may be accomplished with
the following commands:
    chown root /dev/mem /dev/kmem ...
    chgrp sys /dev/mem /dev/kmem ...
    chmod 440 /dev/mem /dev/kmem ...
where the ``...'' is replaced by the pathnames of any system-specific
memory devices (e.g., /dev/vme* on Suns).
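
One quick way to check this (a sketch only; the device names are
system-specific and the exact error wording varies from UNIX to UNIX) is to
look for memory special files still readable or writable by ``other'', and to
try reading /dev/mem from an account that is neither root nor in group
``sys'':
    # any output here names a memory device accessible outside root/``sys''
    find /dev -type c -name '*mem*' \( -perm -004 -o -perm -002 \) -print
    # on a correctly protected system this open should be refused outright,
    # e.g. ``/dev/mem: Permission denied''
    cat /dev/mem > /dev/null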

System programs that need access to kernel memory (e.g., ps) should have the
following protections:
    -rwxr-sr-x  1 bin      sys         47104 Jan 10  1987 /bin/ps*
i.e., the ``ps'' program should not be writable by anyone except the
special user ``bin'', should be executable and readable by all, and should
set its group ID to ``sys'' upon execution (allowing it to read the /dev/mem
file).  This prevents ordinary (read ``potentially malicious'') users from
accessing /dev/mem directly, while allowing authorized system programs to
read it.
This may be accomplished with the following commands:
    chown bin /bin/ps ...
    chgrp sys /bin/ps ...
    chmod 2755 /bin/ps ...
where the ``...'' is replaced by the pathnames of all system programs
that need access to memory devices.
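
To see which programs have actually been granted this access (again only a
sketch; the path /bin/ps and the list of setgid-``sys'' programs differ from
system to system), one can search for executables carrying the setgid bit
with group ``sys'':
    # every file listed here can read the memory devices protected above
    find / -type f -perm -2000 -group sys -exec ls -ld {} \;
    # ps itself should show mode -rwxr-sr-x (2755), owner bin, group sys
    ls -l /bin/ps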

Note that the ptrace system call (which allows transparent debugging)
disables the setuid/setgid facility in the versions of UNIX that I am
familiar with.  If your UNIX is less paranoid, you will also need to
disallow read access to system programs that access /dev/mem.

                    Thanx, Paul


Re: Nonsecure passwords/computer ethics ( /dev/*mem and superuser )

Kendall Collett <kcollett@fang>
Fri, 18 Nov 88 09:30:55 -0600
> From: Peter G. Neumann <Neumann@KL.SRI.COM>  (RISKS-7.74)
> Even the Gould UTX/32S C2 version of Unix still has that vulnerability. [...]

While the memory pseudo-devices (/dev/*mem) exist in Gould UTX/32S, access to
these devices is restricted in two ways (beyond the mode bits on the files):

    1) All untrusted users in UTX/32S operate from within a
           restricted environment (RE).  An RE is a subtree of the file
           system which, for the most part, looks like a complete UNIX
           file system, with the exception that it does not contain
           sensitive commands or files.  When users log onto the system,
           their root directory is set by the system to be the root
           directory of the subtree forming the RE.  Since an RE does
           not contain the /dev/*mem files, from the user's perspective
           the memory pseudo-devices do not even exist.

    2) The memory pseudo-devices on UTX/32S are regarded by the
           system as ``privileged devices''.  Regardless of the mode
           bits on the device special file, only superusers can
           access privileged devices.

It is true a superuser on UTX/32S can access /dev/mem and read
unencrypted passwords, but this does not give a superuser any
capabilities that he or she didn't already have.
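
For readers unfamiliar with the restricted-environment idea, a rough analogue
can be built on an ordinary UNIX with chroot(8).  This is purely illustrative
(the directory /re is invented for the example, and it says nothing about how
UTX/32S actually implements its REs): a subtree that simply contains no
/dev/*mem entries leaves a confined user nothing to open.
    # build a minimal subtree to serve as the restricted environment
    # (must be run as root; assumes a self-contained /bin/sh)
    mkdir /re /re/bin /re/dev /re/tmp
    cp /bin/sh /re/bin
    # note: no mem or kmem special files are created under /re/dev
    # start a shell whose root directory is the subtree; inside it,
    # /dev/mem simply does not exist
    chroot /re /bin/sh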

Kendall Collett                                formerly:
Motorola Inc., Microcomputer Division          Gould Inc., Computer Systems Division
Urbana Design Center                           Software Development Center
1101 E. University Avenue
Urbana, IL 61801

kcollett@urbana.mcd.mot.com
uunet!uiucdcs!mcdurb!kcollett

Disclaimer:  In expressing the above opinion, I am in no way acting as a
             representative of either Gould, Inc. or Motorola, Inc. 


Re: Nonsecure passwords/computer ethics ( /dev/*mem and superuser )

Peter G. Neumann <Neumann@KL.SRI.COM>
Mon, 5 Dec 88 10:11:11 PST
Recalling the $350,000 ATM scam (RISKS-7.85), the mere existence of a superuser
mechanism (no matter how well you think it may be protected) is dangerous.  For
example, the Ethernet and Arpanet are wide open and unencrypted, with passwords
flowing around.  The moral of the story is, to a first approximation, assume
everything is wide open and don't rely on computers and networks to protect
you.  To a second approximation, you (and your system administrators and your
system vendors) can do much more to protect yourself.  But don't count on it.  PGN


"Hackers," "crackers," "snackers," and ethics (RISKS-7.86)

Frank Maginnis <maginnis@community-chest.mitre.org>
Mon, 05 Dec 88 11:04:12 -0500
Perhaps we should consider a less forgiving attitude towards traditional
"hackers," or "snackers," as they are styled here.  There are serious ethical
concerns with an "experimenter" who uses uninformed, unknowing subjects in his
experiments — which is what, in effect, is being characterized as
non-malicious and deserving of a non-pejorative term.  Moreover, in more mature
scientific fields, such as medicine, it is not left up to the experimenter to
decide for himself what is ethically acceptable; he or she must convince review
boards that include both peers and (one hopes) members of the affected public.

Some might consider the comparison with medical research ethics overdrawn.  But
consider: suppose a medical researcher were caught introducing some virus — a
real one — into the environment.  Suppose he thought the virus benign, but it
turned out that it made several thousand people sick, perhaps people who were
in some particularly susceptible population (UNIX system administrators, say).
Not too sick: maybe just enough that they lost a day or two of work, suffered
discomfort, some degree of mental anguish . . .

The computer profession imposes a risk on the general public if it allows
individual experimenters, however well intentioned, to conduct experiments,
demonstrations of vulnerability, or whatever, on "subjects" who do not know
they are a part of the experiment, and have not consented to being so.  We
shouldn't condone such activity, even when the affected community is small
(e.g.  subscribers to a university's time-sharing network), or when the impact
turns out to be benign.  We don't need a non-pejorative term for such activity;
we need a highly pejorative attitude towards it.

Frank Maginnis, MITRE Corporation, McLean, VA
(Standard disclaimers apply)


Re: "Hackers," "crackers," "snackers," and ethics

Peter G. Neumann <Neumann@KL.SRI.COM>
Mon, 5 Dec 88 10:27:35 PST
I am afraid my point may have been misunderstood.  Bob Morris (Sr.) was using
the Berkeley time sharing system (which might have been called experimental at
the time) as an ordinary unprivileged user.  Systems in development are
generally flaky.  He just happened to fall into traps that had been unwittingly
left by the builders, but was not trying to crash the system.

It seems rather obvious that such problems should be found early and fixed,
BEFORE users come to depend (unwisely?) on computer systems.  (Perhaps we
need to add ``slacker'' to the hacker-alias terminology?)

By the way, a discussion that we may need to get into (when the Internet
worm has died down) is the issue of whistle-blowing.  We generally persecute
whistle-blowers.  It may be the "shoot-the-messenger" syndrome.

On the Internet worm problem and in other similar situations, RISKS
contributors seem to regard two positions as antithetical.  In fact these
positions are not incompatible:

  * Hacking can be dangerous (to the hacker and the hackees).  (The Internet
    Worm initiator SCREWED UP BADLY.)

  * Our favorite systems and network technologies are FUNDAMENTALLY FLAWED.
    We should not be surprised by malicious attacks by malevolent hackers in 
    the future.

Neither of these statements negates the other.

[Sorry if RISKS is overly devoted to related problems lately.  It is just
going with the flow (of contributions).  PGN]


Re: "Re: "Hackers," "crackers," "snackers," and ethics"

Frank Maginnis <maginnis%community-chest.mitre.org@gateway.mitre.org>
Mon, 05 Dec 88 16:26:09 -0500
[...]  I misunderstood you to say that he was actually trying to crash the
system, albeit by doing legal things (i.e. a "Black Team" sort of effort).
Actually, I was not trying to criticize Morris Sr., so much as an attitude
that seems to be widespread, if not endemic, in the profession.  This
attitude can be seen in defenses of Morris Jr., along the lines of:  "Well,
it's too bad the worm got out of control, but what he was trying to do was
really benevolent," etc. etc.  I also wanted to inject into the continuing
discussion of computer ethics the idea that we should adopt something like
the notion of "informed consent," as it is applied in medical research.  The
question of whether a university time-sharing system should be considered
"experimental" with respect to this kind of hacking might be one that a
university review committee could consider.  They could look into whether
the users understand that hacking is going on, in the interest of improving
the system, and are willing to accept the risks involved.  But the hacker
shouldn't consider that he has the right to conduct an experiment just
because _he_ considers it to be in everyone's best interests.

Incidentally, with respect to your original item, I'm afraid that trying to
change the connotation of a word once it has gotten into general circulation
is like trying to command the tide not to come in.  Just ask 3M
("cellophane"), Xerox, mathematicians who study chaos theory, grammarians
who rail against "hopefully," etc.  "Hacker" and "virus" will undoubtedly
appear very soon in standard English dictionaries with the general public's
understanding of the terms, not the profession's — "hacker" probably
already has! We'll just have to adapt.


Ethics and caution (the worm)

Darrell Long <darrell@midgard.ucsc.edu>
Mon, 5 Dec 88 10:58:45 PST
I have seen several articles recently complaining about others who state that
they learned from the Internet worm and that the perpetrator actually did
them a favor.  The worm caused all of us to lose a lot of time, and so it
was a serious breach of ethics.  But I would like to remind everyone that
the real bad guys do not share our ethics, and thus are not bound by them.
We should make it as difficult as possible (while preserving an environment
conducive to research) for this to happen again.

The worm opened some eyes.  Let's not close them again by saying
"Gentlemen don't release worms."

Darrell Long, Computer and Information Sciences, University of California
Santa Cruz, CA 95064


"Hackers," "crackers," "snackers," and ethics (RISKS-7.86)

Alex Colvin <mac3n@babbage.acc.virginia.edu>
Mon Dec 5 14:21:58 1988
> But we do have a problem.  We desperately need a convenient term like
> "cracker", because the nonpejorative primary meaning of "hacker" needs to be
> defended vigorously against misuse by the press and others.  ...

In my youth (10 years ago) "hacker" WAS a pejorative term, implying a sort
of dedicated misdirection.  Those were the guys who spent all night writing
login emulators and talk programs with which they could eavesdrop.  See
Weizenbaum's "Computer Power and Human Reason" for similar usage.

I forget what the term of approval was.  "Bit banger?"


Computer Risks Revisited

<John Markoff>
Sat, 3 Dec 88 09:12:40 PST
NETWORKS OF COMPUTERS AT RISK FROM INVADERS 
By JOHN MARKOFF  (c.1988 N.Y. Times News Service)

     Basic security flaws similar to the ones that let intruders gain
illegal entry to military computer networks in recent weeks are far more
common than is generally believed, system designers and researchers say.
     And there is widespread concern that computer networks used for
everyday activities like making airline reservations and controlling the
telephone system are highly vulnerable to attacks by invaders considerably
less skilled than the graduate student whose rogue program jammed a
nationwide computer network last month.
     For example, the air traffic control system could be crippled if
someone deliberately put wrong instructions into the network, effectively
blinding controllers guiding airplanes.
     The two recent episodes have involved military computers: one at
the Mitre Corp., a company with Pentagon contracts, and the other on the
Arpanet, a Defense Department network with links to colleges.  But illegal
access to computer systems can compromise the privacy of millions of people.
     In 1984, TRW Inc. acknowledged that a password providing access to 90
million credit histories in its files had been stolen and posted on a
computerized bulletin board system.  The company said the password may have
been used for as long as a month.
     This year an internal memorandum at Pacific Bell disclosed that
sophisticated invaders had illegally gained access to telephone network
switching equipment to enter private company computers and monitor telephone
conversations.
     Computer security flaws have also been exploited to destroy data.  In
March 1986 a computer burglar gained access by telephone to the office computer
of Rep. Ed Zschau of California, destroyed files and caused the computer to
break down.  Four days later, staff workers for Rep. John McCain of Arizona,
now a senator, told the police they had discovered that someone outside their
office had reached into McCain's computer and destroyed hundreds of letters and
mailing addresses.
     In Australia last year, a skilled saboteur attacked dozens of
computers by destroying an underground communication switch.  The attack cut
off thousands of telephone lines and rendered dozens of computers, including
those at the country's largest banks, useless for an entire day.
     Experts say the vulnerability of commercial computers is often
compounded by fundamental design flaws that are ignored until they are exposed
in a glaring incident.  ``Some vulnerabilities exist in every system,'' said
Peter Neumann, a computer scientist at SRI International in Menlo Park, Calif.
``In the past, the vendors have not really wanted to recognize this.''
     Design flaws are becoming increasingly important because of the
rapidly changing nature of computer communications. Most computers were once
isolated from one another.  But in the last decade networks expanded
dramatically, letting computers exchange information and making virtually all
large commercial systems accessible from remote places.  But computer designers
seeking to shore up security flaws face a troubling paradox: by openly
discussing the flaws, they potentially make vulnerabilities more known and thus
open to sabotage.
     Dr. Fred Cohen, a computer scientist at the University of
Cincinnati, said most computer networks were dangerously vulnerable.  ``The
basic problem is that we haven't been doing networks long enough to know how
to implement protection,'' Cohen said.
     The recent rogue program was written by Robert Tappan Morris, a
23-year-old Cornell University graduate student in computer science, friends
of his have said.  The program appears to have been designed to copy itself
harmlessly from computer to computer in a Department of Defense network, the
Arpanet.  Instead a design error caused it to replicate madly out of
control, ultimately jamming more than 6,000 computers in this country's most
serious computer virus attack.
     For the computer industry, the Arpanet incident has revealed how
security flaws have generally been ignored.  Cohen said most networks, in
effect, made computers vulnerable by placing entry passwords and other secret
information inside every machine.  In addition, most information passing
through networks is not secretly coded.  While such encryption would solve much
of the vulnerability problem, it would be costly.  It would also slow
communication between computers and generally make networks much less flexible
and convenient.
     Encryption of data is the backbone of security in computers used by
military and intelligence agencies. The Arpanet network, which links computers
at colleges, corporate research centers and military bases, is not encrypted.
     The lack of security for such information underscored the fact that
until now there has been little concern about protecting data.
     Most commercial systems give the people who run them broad power
over all parts of the operation. If an illicit user obtains the privileges
held by a system manager, all information in the system becomes accessible
to tampering.
     The federal government is pushing for a new class of military and
intelligence computer in which all information would be divided so that
access to one area did not easily grant access to others, even if security
was breached.
     The goal is to have these compartmentalized security systems in
place by 1992.
     On the other hand, one of the most powerful features of modern
computers is that they permit many users to share information easily; this
is lost when security is added.
     In 1985 the Defense Department designed standards for secure computer
systems, embodied in the Orange Book, a volume that defines criteria for
different levels of computer security.  The National Computer Security Center,
a division of the National Security Agency, is now charged with determining if
government computer systems meet these standards.
     But academic and private computer systems are not required to meet
these standards, and there is no federal plan to urge them on the private
sector.  But computer manufacturers who want to sell their machines to the
government for military or intelligence use must now design them to meet the
Pentagon standards.
     Security weaknesses can also be introduced inadvertently by changes in
the complex programs that control computers, which was the way Morris's program
entered computers in the Arpanet.  These security weaknesses can also be
secretly left in by programmers for their convenience.
     One of the most difficult aspects of maintaining adequate computer
security comes in updating programs that might be running at thousands of
places around the world once flaws are found.
     Even after corrective instructions are distributed, many computer
sites often do not close the loopholes, because the right administrator did
not receive the new instructions or realize their importance.
