The RISKS Digest
Volume 6 Issue 32

Friday, 26th February 1988

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Back-Seat Driving Goes High Tech
PGN
Lottomatic computing
PGN
Billion Dollar Software for $900 ??
Ken De Cruyenaere
Airbus Fly-by-Wire Controversy
Nancy Leveson
File matching
Barry Nelson
Mistaken Identity and Display of Retrieved Sets
James H. Coombs
Re: Taxing information
Dick King
Jeff MacKie-Mason
jong
Re: the risks of voice recognition in banking services
Jerry Kew
SDI S/W
Fred Baube
Request for Viruses to be used to test AntiBiotics
Amir Herzberg
Viruses and "The Adolescence of P-1"
Pat Reedy
Info on RISKS (comp.risks)

Back-Seat Driving Goes High Tech

Peter G. Neumann <NEUMANN@csl.sri.com>
Fri 26 Feb 88 14:32:37-PST
A 1977 Dodge van with a computerized loud-mouth back-seat driver designed to
avoid collisions was demonstrated at the Governor's Regional Transportation
Management Conference.  Upon detecting an impending collision to which the
driver does not respond, the system barks out simulated voice messages such
as "WATCH IT!  WATCH IT!  LOOK OUT!  LOOK OUT!" or "FALL BACK!  FALL BACK!".
When the driver does nothing, the computer applies the brakes and slows the
vehicle smoothly.  "It was like driving with a loud, nervous and ill-tempered
co-driver."  The system is called "Lookout", and is made by Radar Control
Systems, Inc.  The computer is about the size of a cigaret pack.  (Source: A
front page article by Kevin Leary, with the above title, San Francisco
Chronicle, 26 Feb 88.)
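
As described, the device's logic is a simple escalation: warn first, then brake
if the driver does not react.  A minimal sketch of that escalation (the timing
threshold, the callback names, and everything else here are my assumptions; the
actual "Lookout" internals are not public):

# Hypothetical sketch of the warn-then-brake behavior described above (Python).
def on_radar_update(time_to_collision, driver_braking, speak, brake):
    """Called on every radar reading; times in seconds, threshold assumed."""
    if time_to_collision > 3.0:
        return                                  # nothing imminent; stay quiet
    speak("WATCH IT!  WATCH IT!  LOOK OUT!  LOOK OUT!")
    if not driver_braking:
        brake()                                 # slow the vehicle smoothly

# Example: an inattentive driver 1.5 seconds from a collision.
on_radar_update(1.5, driver_braking=False,
                speak=print, brake=lambda: print("[computer applies brakes]"))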

From the RISKS point of view, this could be a scary development.  Inordinate
dependence on this technology by people who are not sensible in the first place
may tend to make matters worse.  Drivers who are drunk, stoned, or sleepy may
soon be taking to the roads with alacrity, possibly causing collisions among
cars that do not have the devices even if the drivers themselves were magically
protected.  Some drivers may keep a failed unit mounted, so that in case of a
collision, they could blame it on the computer.  A second-order concern is that
lawsuits against the manufacturer are likely in the event of accidents IN SPITE
OF the device (e.g., if it was turned off).  (Yes, lawsuits BECAUSE OF the
device might also be expected — e.g., if simultaneous approaches from two
sides caused signals to cancel each other, due to a design flaw.)  Thus, we
need to check out rather carefully the social and other implications of this
technology.  Blind trust in such a technology may be more dangerous than the
risks of the technology themselves...  PGN


Lottomatic computing

Peter G. Neumann <NEUMANN@csl.sri.com>
Fri 26 Feb 88 14:40:02-PST
GTECH Corp, which operates California's on-line real-time lotto control
system, has been fined more than $730,000 because of various computer
system failures that have prevented bets from being placed.  GTECH blamed
"an overly complex system design that has proved to be too much for the
lottery's central computers."  (San Francisco Chronicle, 26 Feb 88, p. 2)

  [Here is a need for nonstop, reliable, secure, high-integrity computing.  
  I wonder whether the system designers really anticipated the requirements 
  properly, and whether GTECH anticipated the risk of such a fine!  PGN]


Billion Dollar Software for $900 ??

Ken De Cruyenaere <KDC%UOFMCC.BITNET@CUNYVM.CUNY.EDU> 204-474-8340
Thu, 25 Feb 88 09:30 CST
From the Feb. 23 issue of the Winnipeg Sun (reprinted without permission):

COMPUTER PURCHASE OFFERS A BLUEPRINT FOR SUCCESS

Toronto (CP) A man who bought computer equipment for $900 at auction last
September is being sued by a Canadian subsidiary of a U.S. telecommunications
giant, which says software included in the sale is worth billions of dollars.
The story could prove embarrassing to the Ontario government.  One of its
agencies, the Ontario Development Corporation, turned the valuable material
over to a receiver.  Norbert Stoeckl, president of the Scarborough Bone
Analysis Clinic, purchased the source code and manuals for the UNIX operating
system at an auction by Danbury Sales Ltd.


Airbus Fly-by-Wire Controversy

Nancy Leveson <nancy%murphy.ics.uci.edu@ROME.ICS.UCI.EDU>
Tue, 23 Feb 88 18:43:54 -0800
There is currently some controversy over the certification of the Airbus
320 in England.  In case you are unfamiliar with this aircraft, it is
to be the first truly fly-by-wire civilian aircraft.  Much of the argument
I have read from Airbus in support of the claim that the software is highly
reliable rests on the fact that they use n-version programming.
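
For readers unfamiliar with the technique: n-version programming runs several
independently developed implementations of the same specification and votes on
their outputs, on the assumption that the versions will not fail on the same
inputs.  A toy sketch in Python (versions and voter are my own illustration;
nothing here resembles the actual A320 software):

from collections import Counter

# Three "independently developed" versions of the same trivial specification.
def version_a(x): return 2 * x
def version_b(x): return x + x
def version_c(x): return x << 1        # integers only; a latent fault

def n_version_vote(x, versions=(version_a, version_b, version_c)):
    outputs = Counter(v(x) for v in versions)
    answer, agreeing = outputs.most_common(1)[0]
    if agreeing < 2:                   # no majority: fail safe rather than guess
        raise RuntimeError("versions disagree; no majority output")
    return answer

print(n_version_vote(21))              # 42, provided at least two versions agree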

The London Sunday Times of December 13 contained the following article:

   "A math professor is preparing to go to court in an attempt to prevent the
   world's most advanced civilian aircraft coming into service because he
   believes it is unsafe."

   "Michael Hennell, Professor of computational mathematics at Liverpool 
   University, wants to stop the Civil Aviation Authority licensing the latest 
   European Airbus, the 320.  He alleges that the computer program that will 
   fly the plane is flawed."

   "Hennell, 47, has worked for the government and the EC on computer design.
   He accused the aircraft's designers of making "absurd" safety claims and has
   challenged Airbus Industrie to prove that the computer would break down no
   more than once in every billion hours of operation, as the company claims."

   "He is supported by Bev Littlewood, Professor of Software Engineering at 
   City University, London.  Littlewood says he also has serious doubts about 
   the reliability of the computer system and believes Airbus's claims are 
   unrealistic."

   "Airbus yesterday rejected the charges, and said the 320 would be the safest
   passenger aircraft ever.  `We believe that the safety requirement of a total
   breakdown occurring only once every billion hours is achievable,' a 
   spokesman said.  Airbus dismissed Hennell's fears as extravagant and 
   `wildly off target,' but admitted the computer had failed during test 
   flying.  The breakdowns were caused by teething problems and the aircraft 
   had landed safely, it said."

   ...

   "The 320 is the latest and most advanced Airbus built by the four-nation
   consortium...It is the first Airbus to use a computer system, nicknamed
   `fly-by-wire,' to carry out many tasks normally performed by a pilot."

   "Airbus said fly-by-wire made the aircraft safer by preventing it stalling
   or manoeuvering [sic] too violently.  It also saved fuel costs by keeping
   the aircraft on optimum trim."

   "But Hennell claimed the aircraft relied too heavily on the system. `There
   are always inherent faults in the software.  If the Airbus computer breaks
   down it will put the plane in jeopardy.'"

   "Hennell pointed to the crash of a US F-18 military aircraft, in which the
   pilot failed to recover from a spin because the on-board computer thought
   his commands were `too extreme' and blocked them."

   "He is to apply for an injunction to stop the CAA [similar to the U.S. FAA] 
   approving an airworthiness certificate for the 320.  The CAA said
   yesterday it did not believe there was a safety problem with the Airbus
   computer. `The CAA has rigorous procedures for the certification of all 
   aircraft systems ... In the case of the Airbus we are satisfied that the
   tests carried out achieve the safety objectives.'"


File matching

Barry Nelson <bnelson@ccb.bbn.com>
Fri, 26 Feb 88 18:14:34 EST
Well-I-suspected-as-much Department:

I discovered this tidbit in the Federal Register (52 FR 49556, 31 DEC 1987) and
thought I'd pass it along to the group.  Other such systems may already be in
place at other agencies, but I just happened to notice this one today.

COMPUTER MATCHING PROGRAM - US Postal Service/Federal Creditor Agencies - 

The Post Office "...intends to conduct continuous matches [between] files of
delinquent debtors [supplied by various Federal agencies] and its payroll file.
Using the Social Security Account Number, USPS will [prepare a list of USPS
employees who] may be subject to salary offset under the Debt Collection Act of
1982 [subject to due process]. [Of course we'll manually verify any hits and
carefully discard erroneous information, so nobody will retain an undeservedly
bad reputation]."

In other words, "We're using your SSN, which we solicited solely for IRS
record-keeping purposes, to check on your bill-paying habits too." 
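
Mechanically, such a match is just a join of the agencies' debtor files against
the payroll file on SSN.  A toy sketch in Python (all records and field names
invented) shows how cheap it is to run, and why the bracketed promise of manual
verification matters: a hit is only a candidate.

payroll = [{"ssn": "123-45-6789", "name": "J. Smith"},
           {"ssn": "987-65-4321", "name": "K. Jones"}]
debtors = [{"ssn": "123-45-6789", "agency": "Dept. of Education", "amount": 2400}]

by_ssn = {d["ssn"]: d for d in debtors}
hits = [(e, by_ssn[e["ssn"]]) for e in payroll if e["ssn"] in by_ssn]

for emp, debt in hits:   # each hit still needs manual verification and due process
    print("%s: possible salary offset for %s debt of $%d"
          % (emp["name"], debt["agency"], debt["amount"]))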

What next?  Badge-readers that make you write a check to get in the door?

Barry C. Nelson


Mistaken Identity and Display of Retrieved Sets

"James H. Coombs" <JAZBO%BROWNVM.BITNET@MITVMA.MIT.EDU>
Thu, 25 Feb 88 23:29:37 EST
Amos Shapir writes:

         The Israeli state collection agency issued a warrant for the
         arrest of a debtor; since they had only his name (a rather
         common one) and the town he lived in, a clerk completed the
         missing information - full address, ID number and father's name
         - from the first entry for a person of the same name he found
         in the citizen's registry.

At first, this clerk's action sounds extremely irresponsible.  It's quite
common, however, for a system to retrieve a set of records and display them
one at a time.  A naive operator may well not be aware that more than one
record has been retrieved (yes, there may still be some irresponsibility
here).  Whether or not the incident followed this scenario, we should keep
the possibility in mind and consider displaying the number of records
retrieved before displaying any records.  (Or an alert box might work as
well for a Mac-style interface.)
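
Here is a minimal sketch (Python, with invented records and fields) of that
safeguard: report the size of the retrieved set before showing any single
record, so the operator cannot mistake the first hit for the only one.

registry = [{"name": "A. Cohen", "town": "Haifa", "id": "001"},
            {"name": "A. Cohen", "town": "Haifa", "id": "002"}]

def lookup(name, town):
    matches = [r for r in registry if r["name"] == name and r["town"] == town]
    print("%d record(s) retrieved." % len(matches))
    if len(matches) > 1:
        print("WARNING: more than one person matches; supply further"
              " identifiers before acting on any single record.")
    return matches

for record in lookup("A. Cohen", "Haifa"):
    print(record)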

PGN comments:

     [This one is computer-related in the sense that input data should
     acquire an appropriate measure of trustworthiness and then be
     handled accordingly.  That measure should stay with the data, as
     is the case with a security label.  PGN]

What does this mean?  Practically?  How would one implement a "measure of
trustworthiness" for a data set such as this?  Also, I have treated it as
a retrieval problem; but PGN focuses on input.  Does this mean that there
should be something like a primary key, and that this primary key must be
involved in all retrievals?  Furthermore, would this primary key have to
be something more descriptive than an automatically generated surrogate,
such that any reasonably trained and attentive operator would notice an
error immediately?  But then what would the key consist of to defeat the
sort of error that Amos reports?
                                               --Jim

Dr. James H. Coombs, Software Engineer, Research 
Institute for Research in Information and Scholarship (IRIS), Brown University

    [In this case, the OUTPUT should bear a credibility label such as 

       "THE FOLLOWING ITEM IS ONE OF POSSIBLY MANY THAT MATCHES THE REQUEST."

    If data is marked on input or on acquisition as to its credibility,
    and then the output process further diminishes the credibility based
    on the contextual nature of the processing, a lot of the false matches
    might have less impact on the user.  This is a serious problem in the
    identification of suspects based on partial information, where the
    input data may not have been verified and the processing may introduce
    further uncertainties.  ("Fuzzy logic" revisited?)  PGN]
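
One possible reading of the comment above, sketched in Python (my own
illustration, not an established mechanism): carry a credibility figure with
each value, and let any processing step that adds uncertainty, such as an
ambiguous retrieval, lower it before output.

class Labeled:
    def __init__(self, value, credibility):   # credibility: 1.0 = verified input
        self.value, self.credibility = value, credibility

def ambiguous_retrieval(candidates):
    factor = 1.0 / len(candidates)            # ambiguity dilutes credibility
    return [Labeled(c.value, c.credibility * factor) for c in candidates]

records = [Labeled("A. Cohen, 12 Main St.", 0.9),
           Labeled("A. Cohen, 7 Elm Rd.", 0.9)]
for r in ambiguous_retrieval(records):
    print("%s  [credibility %.2f; one of %d possible matches]"
          % (r.value, r.credibility, len(records)))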


Re: Taxing information

Dick King <king@kestrel.ARPA>
Wed, 24 Feb 88 08:36:15 PDT
    Date: 17 Feb 88 07:48:28 GMT
    From: Steven Koinm <goog@a.cs.okstate.edu>
    Subject: Taxing of information
    Organization: Oklahoma State Univ., Stillwater

    I recently came across an interesting idea presented by a hacker
    while doing research for a paper.  The hacker said that he could
    not consider information property because it cannot be taxed. [...]

Seems bogus to me.  The hacker's lament is that the value of the piece
of information cannot be precisely measured.

There are other pieces of information whose values cannot be precisely
measured.  I understand that they are sometimes taxed [or split in a
marital property settlement, which is a similar idea] based on the
cost of acquiring them, sometimes on a market value, and sometimes on
an estimated value of unclear origin.

Examples of each of these valuation methods include an oilfield of
unknown extent, a patent, and a professional license.

    Would this make people stop collecting HUGE amounts of information
    that they keep around just for the sake of "I'll need that
    someday" or "Why bother erasing it, it may still be valid."

Information depreciates.  A software concern can sometimes depreciate
the software over three years rather than expensing the effort of
producing the software as it is expended.


Re: Taxing of information (RISKS-6.30)

<Jeff_MacKie-Mason@um.cc.umich.edu>
Wed, 24 Feb 88 21:28:20 EST
In many countries, one form of information *is* taxed.  In most western
European countries, information that is covered by a valid patent is not
protected unless the patentee pays an annual renewal fee, effectively
taxing the value of that intellectual property to its owner.  Of course,
the fees make no attempt to assess the value of the property to the owner,
but many taxes take on a fixed-fee form.

Jeff MacKie-Mason, Dept. of Economics, University of Michigan


Re: Taxing of Information

<jong%delni.DEC@decwrl.dec.com>
25 Feb 88 12:15
An unnamed hacker has raised the question of taxing information.  This is
perhaps only a "risk" if it catches on, but the technical question is how it
could be done.  Well, taking my cue from Xerox, which keeps a cycle counter
in its machines and thus charges a cent or so per copy, I say it's simply a
matter of an application program keeping a counter of how many times it was
invoked.  It could also track how many times it opened individual data
files.  If the counter was encrypted, it might be safe from hacking.
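
A minimal sketch of such a metering counter in Python (the key, file name, and
everything else are invented; a keyed checksum stands in for the encryption
mentioned above, and a determined user could of course still delete or roll
back the file):

import hashlib, hmac, json, os

KEY = b"issued-by-the-taxing-authority"   # assumption: a secret the user never sees
COUNTER_FILE = "usage_counter.json"

def _mac(count):
    return hmac.new(KEY, str(count).encode(), hashlib.sha256).hexdigest()

def load_count():
    if not os.path.exists(COUNTER_FILE):
        return 0
    with open(COUNTER_FILE) as f:
        data = json.load(f)
    if data["mac"] != _mac(data["count"]):
        raise ValueError("usage counter has been tampered with")
    return data["count"]

def record_invocation():
    count = load_count() + 1
    with open(COUNTER_FILE, "w") as f:
        json.dump({"count": count, "mac": _mac(count)}, f)
    return count

print("This program has now been invoked %d times." % record_invocation())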

Egads! Every time I fire up PageMaker I pay a one cent tax to the IRS.  Or
worse, a tax plus a royalty to Aldus! I can see that adding up fast.  Of
course, the IRS will create a standard withholding for users of computers; you
will have to prove that you didn't actually use the program as much as was
assumed, by including the encrypted Federal program ID/counter string on a
form that you must file every year by August 10th (one copy per program);
except for shareware authors, who must file a form listing all users who have
registered, as failure to notify the IRS of a user of a shareware program is a
criminal offense...


Re: the risks of voice recognition in banking services (RISKS-6.30)

<kew%hldg00.DEC@src.dec.com>
Wed, 24 Feb 88 03:16:04 PST
If it is the TSB service, then funds transfers can only be made to
pre-arranged destinations; i.e., you go into the bank and set up the service
for phone, gas, electricity, etc., to pay your bills.  So the worst someone
can do is pay your bills for you.  They could also find out your balance.
They also offer a keypad which fits over the microphone, allowing you to
enter a PIN and then drive a menu of voice-synthesized options.
                                                               Jerry Kew


SDI S/W

Fred Baube <fbaube@note.nsf.gov>
Thu, 11 Feb 88 08:50:41 -0500
For a paper on the future of strategic (i.e. nuclear) stability between the
superpowers, I'd like to hear about sources that explore the prospects for
systemic stability in Star Wars software.  Possible topics:

- The possibility of unstable software behavior in a tightly-
  linked system due to feedback .. a la Black Monday, say.

- Design techniques to forestall/circumvent such built-in unstable behavior 

- The prospects for keeping human decision makers in the loop
  during a crisis involving SDI

- Lessons learned from other large distributed S/W systems, such as the ATC 
  upgrade, or the stock market, or even telecommunications

- The prospects for SDI S/W research creating the ability to generate
  error-free S/W directly from algorithmic or even English-language functional
  descriptions (assuming that such a description is itself error-free,
  naturally).

I'm looking for articles, manuscripts, ruminations, anecdotes, personal
speculation, SDIO blatherings, whatever.  Also ANY info about the National Test
Bed contract to Martin Marietta.  Also general info about the use, misuse, and
abuse of simulations, and how the SDI S/W developers plan on convincing us that
they have avoided these pitfalls.  Thanx in advance.

#include <disclaimer.h>
Disclaimer #2: This paper is not for my employer.


Request for Viruses to be used to test AntiBiotics

Amir Herzberg <amirh%TECHUNIX.BITNET@CNUCE-VM.ARPA>
Mon, 22 Feb 88 19:01:40 +0200
The risk of viruses, especially on computers without a hardware-supported
secure OS, has been of much concern lately.  We intend to develop software to
protect against viruses in an unprotected environment (e.g., a PC, even an AT
with MS-DOS).  Some of the software is "preventive", the rest "corrective".
The software will be developed as projects in the "Lab for Advanced
Programming" course.

  To test the software, and to improve understanding of the Viruses, we need
samples of viruses. Anybody who has a contaminated disk is requested to send
it to me: Amir Herzberg, Comp. Science Dept., Technion, Haifa, Israel.  I will
return a disk (if requested, with the programs when done).  Physical disks may
be better than e-mailed files.  To check whether I already have your virus, or
for more details, e-mail amirh@techunix.bitnet or amirh@techsel.bitnet.  Thanks
for the co-operation!!!
                                           Amir Herzberg

P.S. In this entire matter I represent only myself, not the Technion (or
anyone else...).
P.P.S. Detailed information would also be most welcome.

   [See my comment on Dave Horsfall's message in RISKS-6.31 on the dangers
   of Trojan horses (and bugs!) in allegedly antiviral software.  What a
   wonderful opportunity to plant Trojan horrors, in both directions --
   to Amir and from Amir.  The risks are more than Amir pittance.  PGN]


Viruses and "The Adolescence of P-1" (Re: Risks-6.31)

<preedy@nswc-wo.ARPA>
Thu, 25 Feb 88 08:26:46 est
I just finished reading the novel "The Adolescence of P-1" by Thomas J.
Ryan, which was mentioned by Kian-Tat Lim.  This was a very
thought-provoking novel.  Considering the learning capabilities that exist
when using neural networks, it is hard to say where fact meets fiction in
this book.  That is scary.  Could a computer possibly take over?  What risk
are we taking when we teach a computer to learn?
                                                        Pat Reedy

              [The Adolescence of P-1, by Thomas J. Ryan, was published by
              Collier in 1977.   JPAnderson@DOCKMASTER.ARPA]
