The RISKS Digest
Volume 7 Issue 84

Tuesday, 29th November 1988

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

"Program Verification: The Very Idea", by J.H. Fetzer
Nancy Leveson et al.
Internet Worm Tech Report
Gene Spafford
Purchasers of computer systems as causes of the Internet worm
Brandon S. Allbery
Bank of America ATMs Hit a Glitch
PGN
Corps of Software Engineers?
Henry Spencer
Software Uniformity Legislation
Colin M Thomson
Zapping shoplifters in Minnesota
Scot E Wilcoxon
(Counter-)corrective control systems
Jeffrey R Kell
Info on RISKS (comp.risks)

"Program Verification: The Very Idea", by J.H. Fetzer

<leveson@electron.LCS.MIT.EDU>
Tue, 29 Nov 88 07:11:44 -0500
In RISKS-7.61, Brian Randell recommended a paper by J.H. Fetzer in the
Communications of the ACM vol 31, no 9 (Sept. 88), pp. 1048-1063.  There has
been no reply in the RISKS Forum, and some people have apparently interpreted
this absence as everyone's agreement with Brian's view of the paper.  It should
be noted, however, that the paper has created great outrage among many in the
research community due to its misstatements and distortion of the goals of
formal verification, the seemingly low level of knowledge and understanding by
the author about formal verification (e.g., misstatements such as the
implication that it is only applicable to high-level languages and the lack of
references to the verification literature and work of the past 20 years), the
inclusion of inflammatory and unsupported statements (e.g., that the practice
of program verification somehow has serious negative consequences ``not only
for the community of computer science, but for the human race''), and the
general argument that deductive reasoning is not appropriate for computer
programs — which has implications far beyond just formal verification.

Formal, mathematical approaches to system construction and analysis are
directly motivated by the concern to construct trustworthy systems — systems
whose developers will be willing to stand before the public and take
responsibility for the consequences of their deployment.  As yet, this goal is
unrealized, but research in formal verification is a serious contribution
toward that goal and is both socially and scientifically responsible.  Serious
and well-informed discussion of the limitations to formal verification, and of
its strengths and defects relative to other responsible proposals for the
construction of trustworthy systems, is entirely appropriate and would, we are
sure, be welcomed by the supporters of formal verification.  In fact, many in
the field have themselves been the leaders of such discussion (e.g., see the
recent discussion by Avra Cohn, ``Correctness Properties of the Viper Block
Model'', Cambridge University, England, 1988, regarding her own verification of
the Viper microprocessor, and several papers in the book "Mathematical Logic
and Programming Languages" that was published on the occasion of C.A.R. Hoare
being named to the Royal Society).  However, ill-informed and irresponsible
attacks upon work that attempts to make computer science a socially-responsible
engineering endeavor do not seem to us to be useful or productive.

                      Mark Ardis, Software Engineering Institute
                      Victor Basili, University of Maryland
                      Daniel Craigen, I.P. Sharp
                      Susan Gerhart, MCC
                      Donald Good, Computational Logic Inc.
                      David Gries, Cornell University
                      Dick Kemmerer, University of California, Santa Barbara
                      Nancy Leveson, University of California, Irvine
                      John McHugh, Computational Logic Inc.
                      Peter Neumann, SRI International
                      Friedrich von Henke, SRI International

       [Other anticipated signatories were not available for final sign-off.]


Internet Worm Tech Report [Risks of Offering Popular Reports]

Gene Spafford <spaf@purdue.edu>
Tue, 29 Nov 88 16:30:12 EST
    [In response to a query as to why Purdue FTP stopped working today:]

We're off the air because we had 42 ftp processes running at once, and
the ftp daemon got hung.  We now need to reboot the machine to clear it,
but that means kicking over 100 users off (this is a 10 processor
Sequent Symmetry).  So, we're waiting until later this afternoon
until people have gone home for dinner, etc.

    [OK, folks, let's show a little restraint.  I picked up a copy in the
    off hours yesterday.  But FTP FLOOD is clearly just one more denial of
    service problem that must be anticipated in setting system parameters...  

    By the way, ALL of the Arpanet/Milnet mail bridges have been down today,
    reportedly due to "technical difficulties".  PGN]


Purchasers of computer systems as causes of the Internet worm

Brandon S. Allbery <allbery@ncoast.UUCP>
Mon, 28 Nov 88 21:29:06 -0500
A bit of biographical commentary:  I am a consultant and programmer who
specializes in systems administration and DBMS systems.  This puts me in a
position to see how our clients manage their computer systems.  For those who
are interested, we handle Altos X86 systems, SCO Xenix, and the occasional 3B/2
and even more occasional other small UNIX systems.

I should note that the term "Standard Security Speech" is used sarcastically;
insofar as our clients are concerned, it's more of a rant about something that
"can't possibly apply to them."  It's not written down or otherwise formalized;
its content is as variable as the security problems at each client site.

And the context, from the Usenet newsgroup news.sysadmin:

  As quoted from <563@husc6.harvard.edu> by reiter@endor.harvard.edu (Ehud Reiter):
  +---------------
  | I think the vendors bear the lion's share of guilt in this affair.
  | Why the ---- didn't Sun and friends fix these security holes ages ago?
  +---------------

That portion of my response which is relevant to RISKS follows.  ++bsa

  --------

I can answer this, perhaps not for Sun but in general.

I've annoyed many a client with "Standard Security Speech #1", discussing
the importance of not running all their programs from an unpassworded "root"
login.  And many of those clients have modems.  I didn't realize just how
bad the situation was until one of those clients argued back that they
bought an ***** (name deleted to avoid advertising) system because a
business associate had complained about &&&&'s not allowing "root" to log in
on non-console terminals.  Why was this so bad?  "We don't want to have our
users be restricted in what they can do."

The logic of these sysadmins is simple and extremely dangerous:  People are
ignorant about computers.  People don't want security.  People want to load
their applications into their computers and trust that god will keep the
crackers out.  And there have been cases when a company will refuse to buy a
particular computer because it comes with security enforcement.
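The unpassworded-"root" condition Allbery describes is mechanically easy to detect: in the classic /etc/passwd format, an empty second field means anyone can log in as that account with no password. A minimal sketch (the sample data is invented for illustration):

```python
# Flag accounts whose password field is empty in /etc/passwd-format
# text -- i.e. accounts anyone can log in as, including root.

def unpassworded(passwd_text):
    bad = []
    for line in passwd_text.splitlines():
        if not line or line.startswith("#"):
            continue
        fields = line.split(":")
        user, pw = fields[0], fields[1]
        if pw == "":
            bad.append(user)
    return bad

sample = ("root::0:0:Super-User:/:/bin/sh\n"
          "uucp:x:5:5::/usr/lib/uucp:\n")
print(unpassworded(sample))   # ['root']
```

A check like this run at installation time would catch exactly the configuration the salespeople were shipping.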

The vendors have made mistakes, certainly.  But their customers have a nasty
tendency to consider these mistakes to be features.  Common arguments used
by these people when confronted with the flaws in their reasoning:

"Nobody knows our computer's phone number." — Demon-dialer programs are
  trivial, especially when used with smart modems that can recognize voice
  answers.

"We don't have any information that anyone would want." — Fine, so you
  don't have to worry about industrial espionage.  This would not have
  bothered in the least the cracker gang that was broken by the FBI earlier
  this year, that operated in the Cleveland area [where I live], much less
  interstate gangs courtesy PC Pursuit or that West German group that used
  an improperly installed Gnumacs to break into systems.

"It {won't,can't} happen to us." — Needs no commentary.  Ask any sysadmin
  on the Internet.

Worse is that almost *every* small Un*x system out there has NO security,
because the salespeople that installed them and set them up didn't know
about it.  They have everyone run as unpassworded root.  They load
applications into /tmp, where any cracker can destroy the entire system with
just ONE publicly-executable "rm".  They don't say word one about backup
procedures.  And many of them don't give their customers the master disks to
their software, so if their programs get blasted they're gone for good.
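The /tmp problem above comes down to one mode bit: if an application file is world-writable, any user can replace or destroy it. A sketch of a scan for that condition (directory names are up to the caller; nothing here is specific to any vendor's layout):

```python
# Find world-writable files under a directory -- the condition that
# lets any user clobber an application binary installed in /tmp.
import os
import stat

def world_writable(root):
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            mode = os.stat(path).st_mode
            if mode & stat.S_IWOTH:      # "other" write bit is set
                hits.append(path)
    return hits
```

Running `world_writable("/tmp")` (or over wherever the salespeople loaded the applications) would enumerate everything a hostile caller could trash with one `rm` or overwrite at leisure.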

That last paragraph is the worst part.  We work primarily with reasonably
pure Xenix and Unix System V — no sendmail, no fingerd, no ftpd, no
susceptibility to the *current* worm.  And capable of quite good security.
But setting up security takes some work — it always has, it always will —
and most salespeople are too busy counting their commissions to consider
doing that work.  If they even know anything about security, which I would
doubt after some of the things I've seen.  And even if they did, the people
mentioned above would forbid it.  (Did I mention the system administrator
who told me that he had his people running as root because he didn't want
them to be stuck in a restricted shell?  No, he didn't mean /bin/rsh.)

The Internet worm is well on its way to becoming the kernel of my "Standard
Security Speech #2".  Maybe a few people will pay attention this time; one of
*****'s failures is that systems ship with a "uucp" login enabled and security
disabled even in HDB UUCP.  All it'd take is a UUCP version of the Internet
worm and a demon-dialer program to wreak havoc in these small systems.
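For the HDB UUCP case, the shipped-wide-open "uucp" login can be tightened in the Permissions file. The entry below is illustrative only (login name and paths are the conventional defaults, not any particular vendor's): it stops the caller from requesting arbitrary files, confines transfers to the public directory, and limits remote execution to rmail.

```
# /usr/lib/uucp/Permissions -- restrict a caller logged in as "nuucp"
# (names and paths illustrative):
LOGNAME=nuucp MACHINE=OTHER \
    REQUEST=no SENDFILES=call \
    READ=/usr/spool/uucppublic WRITE=/usr/spool/uucppublic \
    COMMANDS=rmail
```

With no such entry, a demon-dialed login as uucp can pull or push files far outside the spool area, which is the opening a UUCP worm would need.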

Vendors have some blame, but their oh-so-naively-trusting customers and
oh-so-ignorant salespeople (or distributors' salespeople, who the vendors have
no control over) have even more.  Education is the answer here.  It is a sad
but true fact that only an actual invasion of their systems will get any
response out of them.

            Brandon S. Allbery, Telotech, Inc.
Brandon S. Allbery, comp.sources.misc moderator and one admin of ncoast PA UN*X
uunet!hal.cwru.edu!ncoast!allbery <PREFERRED!>     ncoast!allbery@hal.cwru.edu
allberyb@skybridge.sdi.cwru.edu  <ALSO>        allbery@uunet.uu.net
comp.sources.misc is moving off ncoast — please do NOT send submissions direct
      Send comp.sources.misc submissions to comp-sources-misc@<backbone>.


Bank of America ATMs Hit a Glitch

Peter G. Neumann <Neumann@KL.SRI.COM>
A computer malfunction at BofA's main data center appears to have shut down
all of the bank's 1450 automated teller machines in California for three
hours on Sunday afternoon, 27 Nov 88 (normally a very busy day).  The
shutdown also affected BofA customers throughout the country.  [Source: San
Francisco Chronicle, 29 Nov 88, p. C1, article by Kenneth Howe]


Corps of Software Engineers?

<attcan!utzoo!henry@uunet.UU.NET>
Tue, 29 Nov 88 02:02:48 EST
Just noticed in an old Aviation Week editorial (Oct 17):

    "Flexibility is software's strong suit, allowing the military
    to make changes in how a weapon system functions, even after
    it is fielded... [discussion of gratuitous changes deleted]
    ...making changes in a hurry during a conflict is imperative
    if software is to help US forces prevail."

    "Traditionally, armies have had combat engineers to build
    makeshift bridges, ports, and even airfields in a hurry.
    But where is the US corp of software engineers that can fix
    a key software module quickly so the next airstrike can
    account for an unexpected SAM threat?  Do the armed services
    expect contractor personnel to volunteer for duty on the
    front lines?  Clearly some minimal level of expertise is
    needed in the field and on board ship to make sure that
    weapons systems programs can accommodate unexpected
    circumstances... there is too much riding on software and
    too little expertise in the military to deal with it."

                                     Henry Spencer at U of Toronto Zoology
                                 uunet!attcan!utzoo!henry henry@zoo.toronto.edu


Software Uniformity Legislation

Colin M Thomson <CMTA@gm.rl.ac.uk>
Mon, 28 Nov 88 16:06 GMT
Conleth O'Connell sent a note to the soft-eng list about US uniformity
legislation for software.

I guess there's a risk lurking in there. Either we get legislation appropriate
to life-critical software, and all the commercial games outfits go bust, or we
end up with nuclear reactors and flight control systems required to meet
mickey-mouse standards.  Maybe the legislators will try to classify software,
and set up rules appropriate to different sorts of products; if they do, will
they get it right?
                          Tom Thomson, ICL, Manchester M12 5DR, UK
                          tom@prg.ox.ac.uk cmta@alvey.uk
***** VIEWS EXPRESSED ABOVE MAY NOT CONFORM WITH THOSE OF MY EMPLOYER *****


Zapping shoplifters in Minnesota

Scot E Wilcoxon <sewilco@datapg.mn.org>
29 Nov 88 11:11:40 CST (Tue)
RISKS has already reported on the possibilities of cash registers reporting
the purchases of individuals to marketing databases.  Some Minnesota stores
are now using electronic technology to monitor thieves during individual
crimes and at a higher level.

A new superstore recently opened in this area.  While reporters were waiting
for a demonstration of the security system, an actual theft demonstrated it.
In addition to security agents on the shopping floor and the now-common TV
cameras, purchases made on the electronic cash registers can be monitored by
the security office.  This allows actual thieves to be confronted outside
the store without disturbing legitimate customers with false alarms (though
the need for a paper record of purchases and confirmation by the clerk will
disrupt the use of that cash register after a thief has used it).

Minnesota police have noticed that some shoplifters have been getting caught
for many small thefts in different counties.  Minnesota law allows many small
losses during a six-month period to be added together, and if the total
exceeds the minimum for a felony the thief may be charged with a felony.
Police and major retailers are now sharing information on shoplifters so
they can more severely prosecute the professional shoplifter.  A database
in a computer is being used to track the totals.
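The aggregation rule described above is simple to state as code: sum a shoplifter's losses over any six-month window and compare the total against the felony minimum. The sketch below uses a placeholder dollar threshold and invented theft records, not Minnesota's actual figures:

```python
# Sketch of the six-month aggregation rule as described (threshold and
# data are placeholders, not Minnesota's actual numbers).
from datetime import date, timedelta

FELONY_MINIMUM = 500          # placeholder threshold, in dollars
WINDOW = timedelta(days=183)  # "six-month period"

def felony_eligible(thefts):
    """thefts: list of (date, amount).  True if the thefts in any
    six-month window total at least the felony minimum."""
    thefts = sorted(thefts)
    for i, (start, _amt) in enumerate(thefts):
        total = sum(a for d, a in thefts[i:] if d - start <= WINDOW)
        if total >= FELONY_MINIMUM:
            return True
    return False

thefts = [(date(1988, 1, 5), 120), (date(1988, 2, 9), 250),
          (date(1988, 5, 30), 180)]
print(felony_eligible(thefts))   # 550 within six months -> True
```

The interesting RISKS angle is the input side: the verdict is only as good as the shared database feeding it, which is where the mistaken-identity worry below comes in.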

Anyone with access to the security system line from the cash registers can
more conveniently gain information about any customer.  Other than its
invisibility, this is not much worse than the information disclosed to the
clerks or the other persons waiting in line.  However, if a store does not
already use the purchase information for marketing purposes, the installed
equipment can easily deliver information to a marketing system.

The thief database is more of a threat to thieves than to the general
public, except of course in cases of mistaken identity.

Scot E. Wilcoxon, Data Progress, Minneapolis, MN   +1 612-825-2607
sewilco@DataPg.MN.ORG    {amdahl|hpda}!bungia!datapg!sewilco


(Counter-)corrective control systems

Jeffrey R Kell <JEFF@UTCVM.BITNET>
Wed, 23 Nov 88 11:23:40 EST
Last week, our operator informed me that our laser printer was indicating
a low voltage condition.  Indeed, the expected 240v was down to 214v, just
below the 216v minimum allowed by the controller software.  We contacted
physical plant, who in turn discovered the building mains were also low.
Next in line was the power board, who reluctantly investigated and after
some deliberation and delay concluded that it was within their limits.
This continued until we contacted the second shift supervisor at the power
board and confirmed that the voltage was 10% below nominal.  They tweaked
the appropriate substation and voltage came back to normal.

On the following morning, the low 214v returned.  The previous day's
iterations repeated, and voltage was again restored late that afternoon.

Next morning, as you might guess, 214v again, etc.

Eventually the power board's two shifts finally communicated.  There was
a bad (miscalibrated) sensor at some point showing a higher voltage than
was really there.  This was known to the second shift crew, who compensated
accordingly (without correcting the sensor); but first shift was taking the
reading verbatim.
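The dynamic above can be shown in a toy model (all numbers illustrative, chosen to match the 214v/240v figures in the story): the sensor reads high by a fixed bias; second shift knows the bias and compensates, first shift trusts the reading verbatim, so the true voltage swings daily.

```python
# Toy model of the miscalibrated-sensor story (numbers illustrative).
# Each shift sets substation output so that *its* idea of the voltage
# equals nominal; only one shift's idea is correct.

NOMINAL = 240.0
BIAS = 26.0    # sensor reads true voltage + 26 V (illustrative)

def set_by_reading(target):
    """First shift: drive the sensor *reading* to target."""
    return target - BIAS            # true voltage ends up low

def set_compensated(target):
    """Second shift: knows the bias, drives the *true* voltage to target."""
    return target

true_voltage = [(set_by_reading(NOMINAL) if shift == "first"
                 else set_compensated(NOMINAL))
                for shift in ["first", "second"] * 3]
print(true_voltage)   # alternates 214.0, 240.0, ...
```

Two controllers with inconsistent models of the same sensor produce exactly the daily sawtooth the operators saw; fixing the sensor, rather than compensating around it, removes the oscillation.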
