The RISKS Digest
Volume 14 Issue 11

Friday, 27th November 1992

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Re: Computer Security Act and Computerized Voting Systems
Roy G. Saltman
Rebecca Mercuri
How Is Technology Used for Crime in the Post-Hacker Era?
Sanford Sherizen
Re: Nuclear plant risks
Brad Dolan
Re: Installer Programs
John Bennett
Mathew
Re: How to tell people about risks?
Sanford Sherizen
Mark Day
Change in the Maximum Length of International Telephone Numbers
Nigel Allen
Humorous submissions for a book
Andrew Davison
Info on RISKS (comp.risks)

Re: Computer Security Act and Computerized Voting Systems

Roy G. Saltman <SALTMAN@ECF.NCSL.NIST.GOV>
Fri, 27 Nov 1992 15:39:01 -0500 (EST)
This is in response to Rebecca Mercuri [RISKS-14.09] and Bill Murray
[RISKS-14.10] on the subject of the applicability of the Computer Security Act
of 1987 (P.L. 100-235) to computerized voting systems.

The purpose of the Computer Security Act was to create, according to Section 2
(a) of that act, "a means for establishing minimum acceptable security
practices" for "improving the security and privacy of sensitive information in
Federal computer systems."  Systems containing national security (i.e.,
classified) information are excluded from the purview of the act.  In Section
2 (b), the act assigned to the National Bureau of Standards (now the National
Institute of Standards and Technology, NIST) "responsibility for developing
standards and guidelines needed to assure the cost-effective security and
privacy of sensitive information in Federal computer systems..."  By assigning
primary responsibility to NIST for providing the necessary guidance, the act
limited the influence of DoD, as Mr. Murray states, although that was only one
of the purposes, not "the" explicit purpose.

The Computer Security Act of 1987 applies to "Federal computer systems," where
such a system, in the language of the act "means a computer system operated by
a Federal agency or by a contractor of a Federal agency or other organization
that processes information (using a computer system) on behalf of the Federal
Government to accomplish a Federal function."

Now, Mr. Murray is incorrect when he states that "control over voting
procedures is reserved to the states."  That is specifically untrue in the
case of Federal elections.  Federal elections are elections for President,
Vice-President, U.S. Senators, and U.S.  Representatives.  The U.S.
Constitution provides, in Article I, section 4 that the Congress may "at any
time by law alter such regulations" that the State Legislatures impose for
determining the time, places, and manner of holding [Federal] elections.  In
addition, in Article I, section 5, the Constitution states that "each House
[of Congress] shall be the judge of the elections, returns and qualifications
of its own members."

A number of Constitutional Amendments and Federal laws have changed (expanded)
voting rights.  In the most recent Congress, a bill mandating very specific
procedures for voter registration was passed, but President Bush vetoed it
and the veto was not overridden.  While such laws affect only
Federal elections, the states invariably apply them to their own state
elections, because they cannot afford two sets of procedures.  This was
specifically true for the amendment granting voting rights to 18-year-olds, as
well as for the Voting Rights Act of 1965.

As the President, Vice-President, and members of Congress are Federal
officials remunerated through appropriations of the Federal budget, a state
or local government computer system on which votes are cast for them is a
"system operated by ... [an] other organization that processes information
... on behalf of the Federal Government to accomplish a Federal function."

Clearly, votes in Federal elections are sensitive information under the
Computer Security Act.  The reason that vote-tallying in Federal elections is
not covered is that the act mandates actions by Federal agencies to implement
the act.  However, the definition of "Federal agency" given in the act
specifically omits the U.S. Senate and the U.S. House of Representatives.
These are the agencies ultimately responsible for Federal election data, and
they have excluded themselves from coverage of the act.  Thus, the only
sensitive Federal data omitted from coverage is the most important for the
continuation of our democratic government.


Re: Computer Security Act and Computerized Voting Systems

Rebecca Mercuri <mercuri@gradient.cis.upenn.edu>
Fri, 27 Nov 92 18:28:39 EST
In response to William Hugh Murray's posting in RISKS Forum 14.10:

First of all, let me state that I agree with much of what was said, some of
which has even appeared in my earlier writings on this subject: most notably
the observation that DREs are similar to ATMs (on which I comment extensively
in my March 1992 paper published in the Proceedings of the Fifth
International Computer Virus & Security Conference) and the statement that
"control over voting procedures is reserved to the states" (which also
appears in that same paper).  I had also alluded to the insufficiency of
security standards and theory applicable to this problem in my recent
(November 92 CACM) Inside Risks column.  So it appears that we are in
agreement here (with sufficient citations in evidence of this fact).

Readers should note that my use of the phrase "Orange Book" was intended
generically (as is common) rather than to refer to the specific book itself.
The primary intent of my posting was to make the public aware of the fact
that the FEC and NASED have not made sufficient use of the ongoing work by
NCSC in their establishment of guidelines and controls for voting equipment.
I was not suggesting that the NCSC documents would BE SUFFICIENT for voting
applications, only that they should be consulted, as they constitute the
largest body of standards work to date regarding computer security, and MUCH
(not all) of the rainbow series material IS applicable to the voting area (as
well as in banking, where it is certainly considered), regardless of the
original intention of the Act.

Where Murray and I are not in agreement is in his assertion that the shared
resource problem is not applicable to vote recording, reporting and tallying.
He may have conceived of computerized voting as taking place on independent,
stand-alone systems, but in actuality this is often not the case. Many
computer systems used for voting purposes are networked and time-shared. The
RISKS implications of this should be obvious, now that I have pointed this
out.

A recent voting machine security document that I was asked to review contained
the following statement:

    "The proposed DEC computer operating system (VMS) is one
     of the most secure systems available today. Security
     features that comply with the National Computer Security
     System (NCSC) C2 classification have been built in as an
     integral part of the operating system."

I include these sentences not for their merit (with which I have considerable
difficulty), but to point out that the operating system problem is ALSO highly
applicable to the voting problem.

Both Peter Neumann and I are well aware of the ramifications of getting what
is asked for. We have each publicly called for greater care in developing
voting systems with, as Murray said, "extraordinary, not standard, rigor and
... in the light of independent scrutiny." But I must bring to light the fact
that this is not presently the case. I also agree that the laws which affect
voting machines should be specific to their manufacture, and have personally
lobbied to improve these laws, but as some states have failed to adopt even
weak regulations, there must be some place to start, and an existing Computer
Security Act might be a good place to at least look. Note that Congress IS
responsible for overseeing FEDERAL (Presidential and Congressional) elections,
and such SHOULD be considered a matter of national security. Congress did,
though, exempt itself from the CSA and hence it does not apply to elections at
present.

Due to time constraints I am unable to elaborate at length on this
subject, but numerous documents and publications are available that will
confirm the fact that electronic voting systems are not being administered
with the care that should be given, that current laws are inadequate,
standards are lacking, and rigorous theories are needed in this area.  CPSR
is a good place to contact if one would like to become more fully aware of
this problem; I hope that more individuals will then join the effort to
ensure accurate, anonymous voting.

Rebecca Mercuri.


How Is Technology Used for Crime in the Post-Hacker Era?

Sanford Sherizen <0003965782@mcimail.com>
Fri, 27 Nov 92 14:38 GMT
I am working on a research project and book involving the process by which
some people have been able to "invent" computer and other high tech crimes.
The issue that I am trying to understand is how people adopt new technologies
for purposes other than those envisaged by the developers.  Specifically, some
people are very skilled in what I call thinking like a thief.  They see
technology from a particular perspective, based on how it can be used for
anti-social, deviant, and/or criminal acts.  Counterparts to these people are
those who come up with new uses for a technology, uses that may also be
viewed as negative by developers but that can often lead to new applications
(and even new industries).

It could be quite important to establish how other technologies have been used
for "inventing" crimes.  How do technologies get expanded and changed,
especially by "non-official" people, i.e., they do not work for the original
technology developers or owners? These "non-official people" see new
possibilities, whether for a positive or negative use.  I know that the terms
positive and negative raise many definitional and value difficulties.  Yet, I
am interested in how people view technology, how they are able to frame
questions about other uses, how this information gets spread to others, and
the positive as well as negative consequences of this process.

Drawing on my background in criminology, sociology, and information security,
I have collected examples of how other crimes have developed and been
responded to by technologists and law enforcement.  Some examples come to
mind.  The car was relatively quickly adopted by bank robbers to improve
their chances of getting away from the crime scene; this forced police forces
to become motorized, which had many other consequences for the command and
control of police deployment and for how the police dealt with citizens.  The
Xerox or photocopying machine certainly led to better protection of documents
by allowing the making of file copies, but some quickly understood that it
could also be used to steal copies as well as to leak documents to
unauthorized recipients, e.g., the Pentagon Papers.  The automatic dial
system was invented by Strowger to prevent Kansas City telephone operators
from depriving him of business by giving busy signals or wrong numbers to
potential customers.  Ithiel de Sola Pool, in his book on the telephone,
discusses how the telephone was portrayed as both the promoter and the
conqueror of crime: "call girls" and obscene callers were two of the ways
that telephones were viewed as contributing to moral decay, while the first
electric police-communication system of record was installed in 1867.

Answering these questions could provide some clues as to how computer crime
is developing and which areas require particular security attention.  We have
not been well prepared for the "Post-Hacker Era", and we have not been
raising questions about the crime potential of new technologies.

Any ideas, reactions to these comments, and suggestions of any social
historical studies about these issues are welcomed.  I will post a summary of
responses.

Sanford Sherizen, Data Security Systems, Inc., Natick, MA MCI Mail: 3965782


Re: Nuclear plant risks (RISKS-14.09, 14.10)

Brad Dolan <71431.2564@compuserve.com>
27 Nov 92 15:36:59 EST
Mr. Wexelblat points out that recent legislation (mostly) eliminates
post-construction hearings from the nuclear plant licensing process.

I believe the second-stage review was often used as a platform by intervenor
groups to delay plant licensing, adding millions, even billions of dollars,
to plant costs.  Utilities pass these costs along to ratepayers, who must
either reduce
their consumption of electrical energy or reduce spending on other things to
pay increased electrical bills.  In the southeastern U.S., many people have
begun heating their homes with wood or (unvented!) kerosene heaters, which
expose them to potentially substantial risks.

I would like to see a comparison of safety benefits (in terms of expected
lives saved, property saved, or whatever) resulting from intervenors'
interventions with the safety detriments which have resulted from increased
electrical bills.
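
As a back-of-the-envelope illustration of how such a comparison might be
framed (a Python sketch; ALL numbers below are invented placeholders, chosen
only to show the structure of the calculation, not to suggest a conclusion):

  # Expected-value framing of the requested comparison.
  # Every figure here is a hypothetical placeholder, not data.
  lives_saved_per_hearing = 0.01        # hypothetical benefit per plant
  plants = 50                           # hypothetical number of plants
  extra_cost_per_plant = 1.0e9          # dollars, hypothetical
  deaths_per_dollar_shifted = 2.0e-9    # hypothetical risk from higher bills

  benefit = lives_saved_per_hearing * plants
  detriment = extra_cost_per_plant * plants * deaths_per_dollar_shifted
  print("expected lives saved by hearings:  %.1f" % benefit)
  print("expected lives lost to cost shift: %.1f" % detriment)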

Brad Dolan 71431.2564@compuserve.com 10288-700-NUCLEAR N4VHH


Re: Installer Programs (Thorson alias mmm, RISKS-14.08)

John Bennett, Defence Science & Technology <BENNETT@dstos3.dsto.gov.au>
Thu, 26 Nov 1992 9:58:23 +1030 (CST)
Unfortunately, a de-installer is likely to introduce its own risks if it is
written by the writer of a buggy installer, e.g.,
  a. It might not deinstall everything which should be deinstalled
  b. It might deinstall something which should not be deinstalled
  c. Something else might go wrong.
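
A minimal sketch of why modes (a) and (b) arise (hypothetical code, names
invented): a typical de-installer simply replays a manifest recorded at
install time, so anything the installer failed to record survives removal,
and anything wrongly recorded gets deleted.

  # Hypothetical manifest-driven de-installer (dry run for safety).
  def deinstall(manifest, dry_run=True):
      for path in manifest:
          print(("would remove " if dry_run else "removing ") + path)

  # (a) APP.PREFS was created after installation, appears in no manifest,
  #     and is silently left behind.
  # (b) SHARED.LIB was recorded by a buggy installer even though other
  #     applications depend on it, so it would be removed.
  deinstall(["C:/APP/APP.EXE", "C:/SYS/SHARED.LIB"])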

Sometimes I think Macs are too smart; there are times when I would like some
manual intervention in automatic processes such as Mak describes.

John Bennett, DEFENCE SCIENCE AND TECHNOLOGY ORGANISATION PO Box 1500
Salisbury, South Australia 5108 Phone : +61 8 259 5292 FAX : +61 8 259 5537


Risks of Believing Install Programs

mathew <mathew@mantis.co.uk>
Thu, 26 Nov 92 12:33:59 GMT
I recently installed Microsoft Windows for Workgroups on a Toshiba T5200
"portable" computer.  (I say "portable" because the machine is a few years
old, and hence measures 37x10x40cm and weighs nearly 9kg.)

The machine is connected to our Novell Netware network, and so the process
included installation of Netware and Ethernet drivers.

Windows for Workgroups detected the Ethernet card automatically, and
presented a dialogue box suggesting that it was a Novell NE2000 Anthem card.
This was absolutely correct, so I confirmed it.  The installer then presented
another dialogue, this time with the DMA address and IRQ interrupt number.
The DMA address was correct, and I reasoned that since the machine had worked
everything else out it probably had the IRQ number correct also.  I hit the
OK button.

A warning box appeared stating that the IRQ number on my Ethernet card would
clash with the machine's second serial port, and would have to be changed if
I wanted to use that port.  I'd never used it before, but I did want to now.
I abandoned the installation.

Twenty minutes, three technical reference manuals, twelve fixing screws and
three removable covers later, I had the Ethernet card in my hands.  I
peered at the card.  It was set to address 320, IRQ 2.  Windows had reported
this as address 320, IRQ 3.  I left it exactly as it was, put the machine
back together again, and went through the install once more.

The risk, of course, is that because the install program seemed to know
exactly what sort of Ethernet card I had and what DMA address it used, I
automatically assumed it knew what it was talking about when it told me the
IRQ number.  In my defence, I must say that Windows had been spot-on at
working out the IRQ number for the card in my Elonex machine.

The message for authors of install programs is that perhaps they should avoid
making their programs supply default values using guesswork if the software
can't actually read the value.  If the program makes a couple of lucky
guesses, the user might be tempted to believe that the program knows
something he doesn't.
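
A minimal sketch of that suggestion (a hypothetical installer in Python, all
names invented): values the probe actually read are presented as facts, while
pure guesses are labelled as guesses, so the user knows which ones to verify.

  FACTORY_DEFAULT_IRQ = {"NE2000": 3}  # a guess table, not a probe result

  def probe_card():
      # Suppose the probe can identify the card and read its I/O address,
      # but has no way to read the IRQ jumper setting.
      return {"card": "NE2000", "address": "320h", "irq": None}

  def propose_settings():
      found = probe_card()
      print("card:    %s  (read from hardware)" % found["card"])
      print("address: %s  (read from hardware)" % found["address"])
      if found["irq"] is None:
          guess = FACTORY_DEFAULT_IRQ[found["card"]]
          print("IRQ:     %s  (GUESSED from factory default; "
                "please check the jumpers)" % guess)

  propose_settings()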

The message for users is: even if an install program tells you exactly what
hardware you have attached to your machine, never trust it to pick correct
settings based on that information.
                                                mathew


Re: How to tell people about risks?

Sanford Sherizen <0003965782@mcimail.com>
Fri, 27 Nov 92 14:24 GMT
Several years ago, public safety messages would appear prior to major holidays
warning drivers in the U.S. about accidents.  The messages said (paraphrase)
"350 people will die this Thanksgiving holiday.  Don't be one of these
fatalities.  Drive carefully."  Researchers from the Safety Council found that
these risk messages were not helpful at all.  Drivers thought that an accident
would not happen to them.  The risks became someone else's problem.

The Safety Council then decided to personalize risk.  They ran ads directed at
kids suggesting that the kids ask their parents to drive safely and to use
seatbelts.  This approach to risk was much more successful and may have been
one of the many reasons why traffic fatalities have declined over the years.

Sanford Sherizen, Data Security Systems, MCI MAIL: SSHERIZEN  (396-5782)


Telling people about risks (ingredient labels)

Mark Day <mday@jukebox.lcs.mit.edu>
Fri, 27 Nov 92 11:40:25 -0500
One contributor to the discussion about describing risks made two alarming
comments.  I read both comments as arguing (basically) "since ingredient
labels pose certain problems, we should eliminate them and thereby eliminate
the problems".  The comments struck me as a good example of creating a major
risk to correct a minor one, perhaps because I'm more interested in this issue
as a consumer of food than as a producer of food.

Here's the first comment:

  I do not believe that there is any point in attempting to
  convey numerical, let alone statistical, information to a 'general
  public' increasingly deskilled, de-educated, aggressive and litigious.

In essence, since the 'general public' can't be trusted to interpret
facts properly, better not to tell them.

The writer would presumably not include himself in the 'general public' that
he characterizes as so unpleasant; no doubt he could interpret the facts
properly.  But unless the information is on the carton of whatever he's
buying, he won't have access to that information either.  Realistically, the
choice is between making the information available to everyone or making it
available to no-one.  The writer will not be able to go to a special store
(reserved for those who pass a test of numerical skills) to buy goods with
labels describing their contents.

On to the second alarming comment:

  An example of the perceptions of the general public: European food
  manufacturers, who are subject to labelling requirements, find it
  expensive to produce a different label for each country, translating
  the complex chemical names of food additives for each language group.

  Solution: give each additive - they are more or less standardised - a
  European code number (e.g. "E123"). This can then be looked up in a
  language-specific list to provide the chemical name in the required
  language, if anyone is really that interested.

  [goes on to describe that consumers didn't like this]

This is a dreadful solution, and I can understand why consumers would be less
than thrilled about it.  The purpose of ingredient labels on goods is to
convey information to potential purchasers, and this code-word scheme
definitely conveys less information in the real world.
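
To see concretely what the scheme pushes onto the consumer, consider a sketch
(in Python) of the required lookup.  The table here is abridged and purely
illustrative, though E123 really does denote amaranth:

  # The label carries only codes; the consumer must carry the
  # language-specific table needed to recover the names.
  ADDITIVE_NAMES = {
      "E123": {"en": "amaranth", "fr": "amarante", "de": "Amaranth"},
      # ... hundreds more entries the consumer is expected to know ...
  }

  def decode_label(codes, language):
      # A code missing from the consumer's (inevitably partial) table
      # stays opaque -- which is precisely the problem.
      return [ADDITIVE_NAMES.get(c, {}).get(language, c) for c in codes]

  print(decode_label(["E123", "E465"], "en"))  # ['amaranth', 'E465']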

An analogous argument would eliminate ingredient labels altogether, and
require only that the manufacturers provide a telephone number to call for the
ingredients.  After all, the information is still "available", just much less
convenient to check when you actually need it.

If I've learned that I must avoid products containing monosodium glutamate
because of an allergy, I'm going to be unhappy when I need to learn a new code
name for it.  The convenience for the manufacturer is balanced by an
inconvenience for me.

And frankly, the whole business seems like a fraud to me.  It's not just
chemicals that have different names in different European languages, it's also
simple things like water, sugar, salt.  Either all of those get code numbers
(at which point the ingredient labels become utterly useless) or else the
manufacturers still have to produce multiple labels for an identical product.

--Mark Day


Change in the Maximum Length of International Telephone Numbers

Nigel Allen <nigel.allen@canrem.com>
Wed, 25 Nov 1992 19:00:00 -0500
Any application that does not allow for fifteen-digit international telephone
numbers is a potential risk today, and will become a real risk on January 1,
1997.

The following information from Stentor Canadian Network Management, the
consortium of major Canadian telephone companies, was distributed through the
Canadian Department of Communications' Terminal Attachment Program Advisory
Committee as TAPAC bulletin 92-13.

Advance Notice of Change in the Maximum Length of International Telephone
Numbers

Beginning midnight December 31, 1996, CCITT Recommendation E.164 will be
implemented.  This recommendation allows the expansion of international
telephone numbers from 12 to 15 digits.  Although the North American Numbering
Plan will not change, some foreign administrations may assign numbers up to 15
digits long to subscribers. Therefore, terminal equipment originating calls to
these subscribers must be able to handle the additional digits.

For more information on this change, please contact:
 Marion Norman
 Stentor Canadian Network Management
 410 Laurier Ave. West, 8th Floor
 P.O. Box 2410, Station D
 Ottawa, Ontario
 Canada  K1P 6H5
 telephone (613) 560-3420
 fax (613) 560-3226

Note from NDA: I think the "international telephone number" in this context
includes the country code but excludes the access prefix, such as 011 in
Canada and the United States.
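
As a sketch of the length check this implies (Python; the function and the
test numbers are invented for illustration, not taken from the bulletin):
accept up to 15 digits for the international number proper, counting the
country code but not the access prefix.

  E164_MAX_DIGITS = 15  # was 12 before 31 December 1996

  def valid_international_number(dialed, access_prefix="011"):
      digits = "".join(c for c in dialed if c.isdigit())
      if digits.startswith(access_prefix):
          digits = digits[len(access_prefix):]
      return 1 <= len(digits) <= E164_MAX_DIGITS

  assert valid_international_number("011 44 71 123 4567")  # 11 digits: fine
  assert valid_international_number("011" + "9" * 15)      # 15 digits: legal
  assert not valid_international_number("011" + "9" * 16)  # 16: too long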

Readers in the United States should presumably contact the North American
Numbering Plan Administration at Bellcore rather than Stentor.

 Nigel Allen              nigel.allen@canrem.com

Canada Remote Systems  - Toronto, Ontario
World's Largest PCBOARD System - 416-629-7000/629-7044

             [And so, dear RISKS readers, beginning 1 Jan 1997 we
can expect various strange reports to appear...   PGN]


Humorous Submissions

Andrew Davison <ad@munta.cs.mu.OZ.AU>
Fri, 27 Nov 1992 11:53:46 +1100
CALL FOR SUBMISSIONS FOR A NEW BOOK

Tentatively titled:
        "Humour the Computer:
         Humorous Writings concerning Computing"

We seek contributions of a humorous nature for a new book called `Humour the
Computer', an anthology of the comic, amusing and laughable aspects of
Computing. It is scheduled to be published by MIT Press early in 1994.

Topics of interest include:

* computing experiences: e.g. `a day in the life',
  computing jobs - industry, commerce, teaching, research, etc;
  getting funded in the 90's, getting a job in the 90's,
  home computing, how computer people relax, exam marking stories,
  getting published, office politics

* the history of computing: e.g. the mistakes, the disasters, the egos
  gone mad (with names changed to prevent lawsuits), folklore (not
  necessarily true)

* the future of computing

* computing in the Real World: e.g. in the media, computer salesmen,
  misconceptions about computing, the misuse (abuse) of computing, faulty
  software/hardware, the `training' course

* fake journal articles: e.g. AI, expert systems, machine architectures,
  software engineering, new programming languages, semantics;
  surveys of `important' fields: Virtual Reality, neural nets,
  OO, whatever, etc.

* jokes and cartoons (these will only occupy a small portion of
  the book)

* suggestions for a better name for the collection

Contributions will be judged on clarity, insignificance, irrelevance,
incorrectness, originality, but most of all for their ability to make the
editor laugh. Authors should make their papers understandable to a broad
audience - not too many `in-jokes' please.

Authors should submit three (3) copies (preferably double-sided) of the
prospective article to the editor; people without access to a photocopier may
submit a single copy. Electronic submissions are *not* encouraged.

Each manuscript should have a title page with the title of the essay, full
name(s) and affiliation(s) of author(s), complete postal and electronic
addresses, telephone number(s), and a 150 word summary.

Contributions must *not* exceed 4000 words (approximately 8 A4 pages typeset
10-point on 16-point spacing, or 12 pages if typewritten double-spaced).
Shorter articles will be viewed more favourably than longer ones.

Previously unpublished material is preferred. Contributors of published
material must have copyright to it. All contributors will be asked to sign a
permission letter giving MIT Press the right to print the material but NOT
transferring copyright.

Submissions must be received by April 3rd, 1993. Authors will
be notified of acceptance or rejection by July 1st, 1993. Full
versions of the accepted articles must be formatted according to MIT Press
conventions, and camera-ready copy must be received by the editor
by August 24th, 1993.

For further information, contact the editor:

Dr. Andrew Davison, Department of Computer Science
University of Melbourne, Parkville, Victoria 3052, Australia
Email: ad@cs.mu.oz.au  Fax: +61 3 348 1184  Tel: +61 3 344 7207 / 5230
Telex: AA 35185

Summary of Important Dates:

April 3rd, 1993:    Due date for 3 copies of submission
July 1st, 1993:     Notification of acceptance
August 24th, 1993:  Due date for final manuscript
Early 1994:     Publication

    [If you submit something that you saw in RISKS, which even I shall
    probably do, you might be very careful to indicate all of the
    indicated sources — including RISKS...  Good luck!  PGN]
