The RISKS Digest
Volume 13 Issue 83

Tuesday, 29th September 1992

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Computer `Kills' Prospective Jurors
Fernando Pereira
RISK with limited precision arithmetic
Lars-Henrik Eriksson
Risks of Safety Systems
Christopher Wood
Computer Systems Security and Privacy Advisory Board of NIST
Dave Farber
Therapy
Sean Matthews
Garage Door Openers
Mark Thorson
Re: Airliners playing chicken
Robert Dorsett
Gordon R. Dickson story "Computers Don't Argue"
Vernor Vinge
Alex Heatley
Re: Datamation fiction
Marc Horowitz
Info on RISKS (comp.risks)

Computer `Kills' Prospective Jurors

Fernando Pereira <pereira@mbeya.research.att.com>
Tue, 29 Sep 92 10:20:33 EDT
The Associated Press reported on 9/29 from Hartford, Conn., that for the last
three years Hartford residents have been excluded from the federal grand jury
pool. The problem was discovered in a lawsuit disputing the racial composition
of the federal grand jury that indicted a minority defendant for bail-jumping.
Apparently, the city name for Hartford residents had been typed in the wrong
place (wrong field?) in computer records, with the effect that the "d" in
"Hartford" overflowed into a status field, indicating the named person as
deceased.
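
A minimal sketch in C of how this class of bug can arise.  The record
layout below is purely hypothetical (the real system's format was not
reported): a city name one column too long for its fixed-width field
spills into the adjacent one-character status column.

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        /* Hypothetical card-image layout: columns 0-6 hold the city
           name; column 7 is a status code in which 'd' = deceased. */
        char line[9];
        memset(line, ' ', 8);
        line[8] = '\0';

        memcpy(line, "Hartford", 8);  /* 8 letters into a 7-column field */

        if (line[7] == 'd')
            printf("juror skipped: marked deceased\n");
        return 0;
    }

The trailing "d" of "Hartford" lands in the status column: exactly the
failure mode described above.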

Fernando Pereira, 2D-447, AT&T Bell Laboratories, 600 Mountain Ave, PO Box 636
Murray Hill, NJ 07974-0636   pereira@research.att.com

    [Noted by others as well...  PGN]


RISK with limited precision arithmetic

Lars-Henrik Eriksson <lhe@sics.se>
Mon, 28 Sep 1992 19:40:11 GMT
My union just sent out a letter to its members containing (among other
things), the note:

"If your salary is less than 11844 or greater than 32767, please
notify us immediately and we will adjust your membership fee."

The membership fees are dependent on the salary. My salary is higher
than the first figure quoted and I already have the maximum fee. So
why the request to people with a salary greater than 32767 Swedish
crowns?

Since 32767 is the greatest integer you can represent in 16 bits with
2's complement arithmetic, I am willing to bet that their computer
misrepresents larger salaries. I guess that someone with an income of
33000 crowns (say) is charged the fee for an income of
33000-32767=233 crowns! That would be the minimum fee...

Someone must have noted this, and now they must correct those cases
manually...
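
A sketch of the failure mode in C, for the curious.  The exact mangled
value depends on how the program stores and reads the field; on a
machine where short is a 16-bit two's complement integer, truncation
gives -32536 rather than 233, but the computed fee is badly wrong
either way.

    #include <stdio.h>

    int main(void)
    {
        long salary = 33000;           /* actual salary in crowns */
        short stored = (short)salary;  /* forced into 16 bits */

        /* Two's complement wraparound: 33000 - 65536 = -32536 */
        printf("%ld is stored as %d\n", salary, (int)stored);
        return 0;
    }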

Lars-Henrik Eriksson, Swedish Institute of Computer Science, Box 1263
S-164 28  KISTA, SWEDEN      Phone (intn'l): +46 8 752 15 09


Risks of Safety Systems

Christopher Wood <ccw@prefect.cc.bellcore.com>
28 Sep 1992 15:32 EDT
The incident _was_ caused by a computer.  An automated system was "in the
loop" that examined sensor inputs (including the controls for the thrust
reversers and brakes (and, in the other incident, the flaps), and the
indication given by the "squat" switch) and decided whether or not to deploy
the thrust reversers and the landing-wheel brakes.  It seems odd to me (but
then again, I'm not an engineer designing commercial aircraft) that the squat
switch should _also_ disable the brakes, which don't seem to pose a safety
hazard if used during flight.

This seems like an area where a cockpit crew needs to be able to explicitly
override the safety system.  I can imagine a sort-of dialog (not with spoken or
even typed words, but by command actions - pulling levers and switches, and so
on...)

Crew: deploy thrust reversers
Safety Systems: No. Use of thrust reversers in flight will destroy the
                aircraft.
Crew: Acknowledged.  Override safety constraints! Deploy Thrust
                Reversers NOW!

[...]

We still put human crews in airliners.  Maybe the next step is to admit
that the safety systems are fallible, and give crews a way to overcome
that fallibility.

There are numerous design issues involved, though.  The safety systems
are there for a purpose, and the bypass mechanism should have enough
restrictions that it is only used in an emergency, rather than as a way
of avoiding a routine bother.  On the other hand, if the restrictions
are too severe, the crew will be unable to override the systems when
they have to.  At around 200 MPH, a jetliner runs out of runway _very_
quickly, and there isn't a lot of time for access codes, or even
synchronizing movements of two crew members.  Perhaps requiring a lot of
paperwork _after_ the use of the override system would be appropriate.
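
As a purely illustrative sketch (invented for this note; no real
avionics system works this way as written), the dialog above might
reduce to logic in which a refused command arms a one-shot override
that the crew can confirm with the very next command:

    #include <stdio.h>

    enum mode { ON_GROUND, IN_FLIGHT };

    static int override_armed = 0;    /* set only after a refusal */

    void request_reversers(enum mode m, int override)
    {
        if (m == ON_GROUND) {
            printf("reversers deployed\n");
        } else if (override && override_armed) {
            printf("OVERRIDE: reversers deployed in flight (logged)\n");
            override_armed = 0;
        } else {
            printf("refused: reverser use in flight may destroy aircraft\n");
            override_armed = 1;       /* crew may confirm on next command */
        }
    }

    int main(void)
    {
        request_reversers(IN_FLIGHT, 0);  /* Crew: deploy.  System: No. */
        request_reversers(IN_FLIGHT, 1);  /* Crew: override!  Deployed. */
        return 0;
    }

The one-shot arming means the override cannot be the crew's first,
casual action (the "routine bother" worry), yet confirming it costs
only one more command action (the time-pressure worry).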


Computer Systems Security and Privacy Advisory Board of NIST TESTIMONY

Dave Farber <farber@central.cis.upenn.edu>
Sun, 27 Sep 92 09:24:43 -0400
From the EFFector 3.05

Following are excerpts from the testimony of Professor David Farber, a
member of the EFF Board of Directors, before the Computer Systems
Security and Privacy Advisory Board of the National Institute of
Standards and Technology (NIST) on September 16, 1992.

Mr. Chairman and Members of the Advisory Board:

    My name is David Farber. I am Professor of Computer Science at the
University of Pennsylvania and a member of the Board of Directors of the
Electronic Frontier Foundation (EFF). I am here today representing only
the views of EFF. I want to thank you for inviting us to testify today
as part of your investigation.

    We are pleased to be included at this early phase of the Advisory
Board's inquiry and offer a brief set of principles for proceeding with
this inquiry. First, it is essential that in examining discrete issues
such as the desirability of various cryptography standards, the Board
take a comprehensive view of what we call "digital privacy" policy as a
whole. Such a comprehensive view requires a clear vision of the
underlying civil liberties issues at stake: privacy and free speech. It
also requires looking beyond the cryptography questions raised by many
to include some of law enforcement's recent concerns about the pace of
digital infrastructure innovation. Second, for the sake of promoting
innovation and protecting civil liberties, the Board should bear in mind
the principle that computer security policy is fundamentally a concern
for domestic, civilian agencies. This principle, as articulated in the
Computer Security Act of 1987, can serve as an important guide to the
work of this Board.

A. THE GROWING IMPORTANCE OF DIGITAL PRIVACY TECHNOLOGY

    With dramatic increases in reliance on digital media for
communications on the part of private individuals, government, and
corporations, the need for comprehensive protection of privacy in these
media grows. For most in this room, the point seems trite, but the
digital communications revolution (of which we stand at only the very
beginning) is the key event of which the Advisory Board should take
note. As an example, a communication which is carried on paper through
the mail system, or over the wire-based public telephone network is
relatively secure from random intrusion by others. But the same
communication carried over a cellular or other wireless communication
system is vulnerable to being overheard by anyone who has very
inexpensive, easy-to-obtain scanning technology.

    For the individual who relies on digital communications media,
reliable privacy protection cannot be achieved without the protection of
robust encryption technology. While legal restrictions on the use of
scanners or other technology which might facilitate such invasions of
privacy seem to be attractive preventative measures, these are not
lasting or comprehensive solutions. We should have a guarantee — with
physics and mathematics, not only with laws — that we can give
ourselves real privacy of personal communications through technical
means. Encryption strong enough that even the NSA can't break it. We
already know how to do this, but we have not made encryption technology
widely available for public use because of public policy barriers.

B. THE BOARD SHOULD UNDERTAKE A COMPREHENSIVE REVIEW OF DIGITAL PRIVACY
ISSUES

    Inasmuch as digital privacy policy has broad implications for
constitutional rights of free speech and privacy, and for international
competitiveness and economic vitality in the information age, these
issues must be explored and resolved in an open, civilian policy
context. These questions are simply too important to be decided by the
national security establishment alone. This principle is central to the
Computer Security Act of 1987. [1] The structure of the Act, which is the
basis for the authority of this Advisory Board, arose, in significant
part, from the concern that the national security establishment was
exercising undue control over the flow of public information and the use
of information technology. [2]

    When considering the law in 1986, the committee asked the question,
"whether it is proper for a super-secret agency [the NSA] that operates
without public scrutiny to involve itself in domestic activities...?"
The answer was a clear no, and the authority to establish computer
security policy was vested in NIST (then the National Bureau of Standards).

    In this context, we need a robust public debate over our
government's continuing heavy-handed efforts to control commercially
developed cryptography. It is no secret that throughout the cold war
era, the Defense and State Departments and the National Security Agency
have used any and all means, including threats of prosecution, control
over research, and denial of export licenses to prevent advanced secret
coding capabilities from getting into the hands of our adversaries. NSA
does this to maximize its ability to intercept and crack all
international communications of national security interest.

    Now the Cold War is over but the practice continues. In recent
years, Lotus, Microsoft, and others have developed or tried to
incorporate powerful encryption means into mass market software to
enhance the security and privacy of business, financial, and personal
communications. In an era of computer crime, sophisticated surveillance
technologies, and industrial espionage it is a laudable goal.

    Although NSA does not have the authority to interfere with domestic
distribution of DSA, RSA, and other encryption packages, its licensing
stranglehold over foreign distribution has unfortunate consequences.
Domestic firms have been unable to sell competitive security and privacy
products in international markets.  More important, because the cost of
producing two different products is often prohibitive, NSA policy
encourages firms to produce a single product for both domestic and
worldwide use, resulting in minimal privacy and security for users both
here and abroad.

    While we all recognize that NSA has legitimate national security
concerns in the post cold war era, this is a seriously flawed process.
Foreign countries or entities who want to obtain advanced encryption
technology can purchase it through intermediaries in the United States
or from companies in a host of foreign countries who are not subject to
US export restrictions.  There is a big, big hole in the national
security dike. Taking a page from "The Emperor's New Clothes", NSA
acts as if the process works, and continues to block export.

    In order to get some improvement in mass market encryption, the
Software Publishers Association, representing Microsoft, Lotus, and
others, had to use the threat of legislation to get NSA to engage in the
negotiations that finally led NSA to agree to expedited clearance for
the export of RSA encrypting software of limited key lengths. Still, all
concede that the agreement does not go far enough and that far more
powerful third-party products are commonly available in the US,
including the fifteen-year-old US Data Encryption Standard.  SPA knows
that specifying maximum key lengths offers little long-term security
given advances in computer processing power, but was willing to
compromise because of NSA's refusal to budge.
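
The arithmetic behind that concern is easy to sketch.  The search rate
below is an arbitrary illustrative figure, not a measurement of any
real attacker; the point is only that each added key bit doubles the
work.

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        double keys_per_sec = 1e6;   /* assumed exhaustive-search rate */
        int bits;

        for (bits = 32; bits <= 64; bits += 8)
            printf("%2d-bit key: about %g years to exhaust\n",
                   bits, pow(2.0, bits) / keys_per_sec / 3.15e7);
        return 0;
    }

At a million keys per second, a 40-bit key falls in days while a 64-bit
key holds out for hundreds of millennia; capping keys at the low end of
that range concedes the long term entirely, which is the SPA's point.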

    Does this kind of policy make any sense in the post Cold War era?
Mass market products offer limited security for our citizens and
businesses. Determined adversaries can obtain much more powerful
products from foreign countries or by purchasing them here in the US. Is
the NSA policy of slowing down the pace of encryption use by foreigners
and adversaries --even if demonstrable--any longer worth the significant
price we pay in terms of failing to meet our own communications privacy
and security needs?  That is the policy challenge for this Board to
address by a frank, open, and inclusive public debate.

C. THE BOARD MUST ADDRESS THE DIGITAL PRIVACY ISSUE IN A COMPREHENSIVE
MANNER WHICH REQUIRES CONSIDERING THE FBI'S DIGITAL TELEPHONY PROPOSAL
AND ITS IMPLICATIONS.

    The public policy debate on electronic privacy issues over the last
few years has demonstrated that a comprehensive approach to digital
privacy policy cannot be complete without examining both questions
regarding the availability of encryption technology, and the
corresponding infrastructure issues, such as those raised by the FBI's
Digital Telephony Proposal. Attempting to solve one issue without
addressing the other is an exercise in irrational policy-making and
should be avoided by this Advisory Board.

    Last year, the FBI first proposed a "Sense of the Congress"
resolution stating that communications firms and computer and
communications equipment manufacturers were obligated to provide law
enforcement access to the "plain" text of all voice, data, and video
communications, including communications using software encryption. The
Electronic Frontier Foundation (EFF) played an active and leading role
both in opposing such a law and in seeking to find more acceptable means
for meeting legitimate law enforcement needs. Because of our advocacy
and coalition-building efforts with communications and privacy groups,
we were successful in persuading Senate Judiciary Chairman Joseph Biden
to remove the Sense of the Congress Resolution from active consideration
as part of Omnibus crime legislation last year.

    Putting aside its attempt to control the use of encryption systems,
this year the FBI has come forward with proposed legislation that would
require telephone companies, electronic information providers, and
computer and communications equipment manufacturers to seek an FCC
"license" or Attorney General "certification" that their technologies
are susceptible to electronic surveillance. We are in danger of creating
a domestic version of the export control laws for computer and
communications technology.

    While the FBI claims that neither of this year's proposals addresses
encryption issues, the Bureau has made it clear it plans to return to
this issue in the future. The Board needs to hear from the broad
coalition made up of telephone companies such as AT&T, computer firms
such as IBM, Sun Microsystems, and Lotus Development Corporation, and
public interest groups such as the EFF. The EFF will shortly release a
white paper representing coalition views on the need for the FBI to
explore more realistic, less vague, and less onerous policy
options for meeting legitimate law enforcement needs.

    The resulting multi-front battle being waged about digital privacy
creates formidable roadblocks to a final resolution of the policy
disputes at issue. Those who seek greater privacy and security cannot
trust a settlement on one front, because their victory is likely to be
undermined by action on the other issue. And law enforcement and
national security concerns cannot be adequately addressed without a
sense of the overall solution being proposed on both the encryption and
infrastructure fronts. This Advisory Board can play a valuable role for
the policy process by conducting a comprehensive review of digital
privacy and security policy, with a consideration of both of these sets
of issues.

[1] Pub. L. No. 100-235.
[2] House Committee on Government Operations, H.R. Rep. No. 99-753,
    Pt. 2, at 5.


Therapy

"Sean Matthews" <sean@mpi-sb.mpg.de>
Sat, 26 Sep 92 16:45:38 +0200
Some papers run personals among the classified ads.  The New York Review of
Books runs not only personals, but, right above them, therapy ads too.  This
appeared in the October 8, 1992, edition:

  FEELING HELPLESS ABOUT DEPRESSION? Overcoming Depression 2.0 provides
  computer based cognitive therapy for depression with therapeutic
  dialogue in everyday language.  Created by Kenneth Mark Colby, M.D.,
  Professor of Psychiatry and Biobehavioural Sciences, Emeritus, UCLA.
  Personal Version ($199), Professional version ($499).  Malibu
  Artificial Intelligence Works, 25307 Malibu Rd, CA 90265.
  1-800-497-6889.

The risks, to coin a phrase, are obvious.  If anyone who happens to live in the
'States followed this up, I would be fascinated to know what exactly this thing
is.

Sean Matthews, Max-Planck-Institut fuer Informatik, Im Stadtwald,
W-6600 Saarbruecken, Germany  +49 681 302 5363  (sean@mpi-sb.mpg.de)


Garage Door Openers

<mmm@cup.portal.com>
Fri, 18 Sep 92 20:45:16 PDT
With regard to garage door opener security, I recently was asked to
inspect the malfunctioning garage door opener transmitter for a
friend's mother.  I used a screwdriver to open it up, and found a
broken battery wire.  The unit included a microcomputer and a
DIP switch for a 12-bit password.  I don't think I'd be revealing
any great secret to tell you what her password was.  It was
the binary number 000000000001.
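
For perspective, a 12-bit code space is tiny even against naive brute
force.  A sketch of the arithmetic (the one-second-per-attempt figure
is an assumption):

    #include <stdio.h>

    int main(void)
    {
        int codes = 1 << 12;         /* 12 DIP switches: 4096 settings */
        double secs_per_try = 1.0;   /* assumed time to send one code */

        printf("%d possible codes; trying all of them takes at most "
               "%.1f hours\n", codes, codes * secs_per_try / 3600.0);
        return 0;
    }

And that is the worst case; against factory defaults like
000000000001, the first few guesses suffice.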

Mark Thorson (mmm@cup.portal.com)


Re: Airliners playing chicken (Somos and Seiler, RISKS-13.82)

Robert Dorsett <rdd@cactus.org>
Sat, 26 Sep 92 00:50:59 CDT
Leslie J. Somos:

>I can understand preventing deployment of spoilers or thrust reversers while in
>the air, but I don't understand preventing brake application.

A lot of the replies have missed a pretty fundamental component of this
problem: the increasing design modality of airliner systems.
*Landing* modes, *Ground* modes, *Takeoff* modes, *Flight* modes, ad
nauseam.  Conditional logic is used to disable systems or alter the
behavior of control devices to fit the projected use in a specified mode.

We have had squat switches for years.  They're useful.  The problem arises,
as I see it, when they provide an online datum for evaluation and use by
client devices in a highly abstract *design* context.  The brake question is
one such example.  The brakes weren't enabled because it made no sense to
enable them, from the perspective of the cockpit control logic.  It's a
"tidiness" that makes for clean block diagrams, but in many ways, lends a
higher level of complexity to a system interface.  In a conventional
interface, the pilot would be able to massage the brakes to his heart's
content, in the air, gear stowed, or whenever.  This may not make much LOGICAL
sense, though, so the feature's *turned off* in the air...  It's yet another
manifestation of the conflict of old-fashioned "open-loop" design, vs.
"modern" "consider-all-cases" (and hope we got it right!) design.

Larry Seiler:
>I agree that it is better to have an occasional accident due to a safety
>interlock system that fails than to have more accidents due to people
>accidentally doing fatal things like engaging the thrust reversers

I don't see this as an "override" issue.  We need to differentiate between
items that can cause disasters, and items that don't fit an abstract design
model.  A failure of thrust-reverser safety interlocks can kill an airplane,
as the Lauda crash showed.  "Modality" logic in the case of the brakes makes
very, very little sense, however-- it's likely, as Fokker learned, that the
modality *decreased* the safety margin, with *no* increase in safety in a
properly-functioning system, anyway!

Technology for technology's sake, once again.  Electronic toilets, anyone?  :-)

                                 [Big story on the French electronic toilets
                                 in New York in this weekend's papers!   PGN]

Robert Dorsett  rdd@cactus.org  ...cs.utexas.edu!cactus.org!rdd


Gordon R. Dickson story "Computers Don't Argue"

Vernor Vinge <vinge%saturn@sdsu.edu>
Fri, 25 Sep 92 20:23:31 -0700
The fictional story of a book club hassle escalating to a capital case
via compounded system errors:
  "Computers Don't Argue" by Gordon R. Dickson, ANALOG SF&SF Magazine,
  September 1965, pp.84-94.
Reprinted (as of 1976):
  ANALOG 5, 1967, Doubleday, J. W. Campbell, ed.
  ASTOUNDING ANALOG READER, Vol 2, 1973, Doubleday, Harrison and Aldiss, eds.
  NEBULA AWARD STORIES, 1966, Doubleday, D. Knight, ed.
  TRANSFORMATIONS II: UNDERSTANDING AMERICAN HISTORY THROUGH SCIENCE FICTION,
    1974, Fawcett Crest Books,  D. Roselle, ed.
  WONDERMAKERS 2, 1972, Fantasy Premier Books, R. Hoskins, ed.

It may have been reprinted in DATAMATION, too. (But the above citations are
from William G. Contento's INDEX TO SCIENCE FICTION ANTHOLOGIES AND
COLLECTIONS, 1976.)

It's a great story (though by now more an archetypal contribution to RISKS than
science fiction).

-- Vernor Vinge,   vinge@sdsu.edu


Re: RISKS DIGEST 13.82

Alex Heatley <Alex.Heatley@vuw.ac.nz>
Sat, 26 Sep 92 19:22:23 +1200
The Story of Escalating Computer Mistakes entitled "Computers Don't Argue"
by Gordon R. Dickson appears in "Computer Crimes and Capers" edited by
Isaac Asimov, Martin H. Greenberg, and Charles G. Waugh, ISBN 0-14-007310-8
(British edition, published by Penguin Books); according to the title page,
the copyright is held by Conde Nast Publications (1965).

This book is recommended to all RISKS readers for its inclusion of two
stories which highlight risks-related issues.  The first is "An End of
Spinach" by Stan Dryer (which also appeared in the Magazine of Fantasy and
Science Fiction and carries a copyright of 1981) and the second is "Sam
Hall" by Poul Anderson (copyright 1953 by Conde Nast Publications).

Having just flipped through the volume again I can also recommend
"While-U-Wait" by Edward Wellen (copyright 1978, Magazine of Fantasy and
Science Fiction).

"Computers Don't Argue" also appears in "The Best of Creative Computing
Vol Two" edited by David Ahl. But it is likely that this is now out of
print, however it does mention that the story originally appeared in the
magazine "Analog".

It is interesting to note that many risks mentioned in this forum were
considered by SF writers in the fifties and sixties...


Re: Datamation fiction (Weinstein, RISKS-13.81)

Marc Horowitz <marc@Athena.MIT.EDU>
Sat, 26 Sep 92 01:58:56 EDT
> The classic treatment of the "computer-induced" nightmare through
> "minor" errors must be the humorous (fictional) piece done by
> "Datamation" in the early 70's.

Another classic treatment of computer automation gone overboard is the film
"Brazil", by Terry Gilliam.  It makes what we read about here seem like a
day at the park :-)

Marc
