The RISKS Digest
Volume 4 Issue 13

Tuesday, 18th November 1986

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Framing of life-and-death situations
Jim Horning
On placing the blame
Peter J. Denning
Computer picks wife
Matthew Kruk
Re: Micros in cars
Brint Cooper
Re: They almost got me!
Will Martin
Re: A variation of the Stanford breakin method
Joe Pistritto
Microfiched income-tax records stolen
John Coughlin
Re: Copyrights
Andrew Klossner
Info on RISKS (comp.risks)

Framing of life-and-death situations

Jim Horning <horning@src.DEC.COM>
Tue, 18 Nov 86 17:31:40 pst
In the "1986 Accent on Research Magazine" published by Carnegie Mellon
University there is an article on "The Science of Decision Making" by
Robyn Dawes. The whole article is interesting, but I was particularly
struck by a passage that succinctly states an issue we have often skated
around in Risks:

    ... Such a contradiction violates any model of human decision making
    based on a premise of rational choice. Such framing effects also
    lead decision makers faced with life and death situations to act
    conservatively when the alternatives are framed in terms of lives
    saved (because the first life saved is the most important), but take
    risks when the same alternatives are framed in terms of lives lost
    (because the first life lost is the most important--thereby leading
    to a desire to avoid losing any lives at all). The result can be a
    contradictory choice for identical life and death problems, depending
    upon how they are framed.

    ... have demonstrated not only that framing affects decision, but
    that people systematically violate the rules of probability theory
    by adopting--either explicitly or implicitly--certain heuristics to
    evaluate the likelihood of future outcomes. ...
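
To make the contradiction concrete, here is a small numerical sketch in the
spirit of that passage, using the figures from the well-known
Tversky/Kahneman "disease problem" (600 lives at risk; the numbers are
illustrative and are not taken from Dawes's article).  The four programs
below are numerically identical gambles, yet subjects reliably prefer the
sure thing in the "saved" framing and the gamble in the "lost" framing:

    # Illustrative sketch (not from the article): the standard
    # Tversky/Kahneman "disease problem" figures, 600 lives at risk.

    def expected_lives_saved(outcomes):
        """Expected lives saved, given (probability, lives_saved) pairs."""
        return sum(p * saved for p, saved in outcomes)

    TOTAL = 600

    # "Lives saved" framing: most subjects pick the sure thing (A).
    program_a = [(1.0, 200)]              # 200 saved for certain
    program_b = [(1/3, 600), (2/3, 0)]    # 1/3 chance all saved, else none

    # "Lives lost" framing: most subjects pick the gamble (D), even
    # though C merely restates A and D merely restates B.
    program_c = [(1.0, TOTAL - 400)]      # 400 die for certain
    program_d = [(1/3, TOTAL), (2/3, 0)]  # 1/3 chance nobody dies

    for name, prog in [("A", program_a), ("B", program_b),
                       ("C", program_c), ("D", program_d)]:
        print(name, expected_lives_saved(prog))   # every program: 200.0

A rational-choice model says all four should be ranked identically, since
the expected outcomes are the same; the framing effect is the systematic
reversal between the two presentations.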

Jim H.


On placing the blame

Peter J. Denning <pjd@riacs.edu>
Tue, 18 Nov 86 14:34:50 pst
In recent issues of RISKS there were two items that on the surface
did not appear to be in the stated purview of RISKS:

   A.  Two jetliners in near-miss.  Controller unable to warn
       the pilots because there was an open microphone jamming
       the frequency.

   B.  Young girl suffocates from carbon monoxide fumes generated by a
       home grill after the power company turned off power for non-payment
       of bills but delayed resumption due to operator error.

I asked Peter Neumann about this.  With respect to (A), he said,
radar is a vital component of the system: it is called INPUT.
Vulnerabilities of radars affect the ability of the computer to
do its job.  With respect to (B), he said, a computer operator
put in incorrect data, which contributed to the problem.

In both cases, there is a total system containing an embedded computer
system.  In (A), for example, the total system includes the jetliners,
the pilots, the radars, the radios, the computers, and the controllers.
In (B), the total system includes the customers (especially the
unfortunate family), power distribution, review of requests for welfare
status, and the computer accounting system.

In both cases, there is a temptation to ascribe safety failures in the
total system to one of its components, the embedded computer, and by
implication to make the designers of that software responsible.  In (A),
the computer could not possibly have compensated for jammed radio
frequencies.  In (B), there is a possibility that, had the computer
operator entered correct data, power would have been restored a few
days sooner, in time to forestall the death of someone in that
household; however, the child's parent, not the computer designers or
operator, chose to heat the cold house with a lethal fuel and to defer
application for welfare status until after the power was turned off.

In both cases, a variety of factors combined to create the unfortunate
circumstance.  The embedded computer systems could not have been
programmed to prevent the mishap.  And yet the news reports contain
suggestions that computers, or their operators, are somehow at fault.
Have some journalists become unduly accustomed to fingering the
computer for every mishap?  Have some computer people become unduly
eager to accept the blame when there is a mishap in a system that
contains a computer?

Peter Denning


Computer picks wife

<Matthew_Kruk%UBC.MAILNET@MIT-MULTICS.ARPA>
Mon, 17 Nov 86 08:00:52 PST
(Associated Press) November 15th

      IZMIR, Turkey - A man who divorced his wife after a bitter
   six-year court battle and turned to a computer service to find
   himself the "ideal" mate was surprised when - from 2,000
   prospective brides - the machine selected his former wife.

      "I did not know that my ex-wife had been the ideal counterpart
   for a marriage," Suleyman Guresci was quoted as saying by the
   Anatolia News Agency before re-marrying Nesrin Caglasa.

      "I decided to try being more tolerant toward her," He said.

      The couple, whose first marriage lasted 21 years, were divorced
   nine months ago due to "severe disharmony" after living apart for
   six years, Anatolia reported.


Re: Micros in cars

Brint Cooper <abc@BRL.ARPA>
Mon, 17 Nov 86 15:42:40 EST
There's another risk in re-programming your engine control ROMs:
it's a federal offense to remove or alter the operation of emission
control equipment.  Since fuel mixture and ignition affect emission
levels, they are considered emission control equipment.


Re: They almost got me!

Will Martin — AMXAL-RI <wmartin@ALMSA-1.ARPA>
Tue, 18 Nov 86 9:50:24 CST
Your note on RISKS impressed me tremendously. What you described has so many
odds against it that the fact that it happened just HAS to be significant.
Just what that significance is, I am not sure, but it must be important!
The odds against the occurrence of the unlikely combination of grades and
data that would get through the filtering code are themselves high, but,
as you said, at least two people's records produced this — the number
of possible students and their grade combinations could easily explain
it, so that, in itself, isn't significant. But the fact that you,
yourself one of the very few who fit this unusual mix of historical
data and who took this special course, were then asked to rewrite
the computer program that contained this flaw is an incredible coincidence
in itself. Moreover, the fact that this was a special honors humanities
course, whose graduates would NOT be likely to be computer or
programmer types, takes the odds out of the merely "incredible" category
and puts them into some utterly indescribable astronomical range.

Thanks for sharing this with us. 

Regards,
Will Martin


Re: A variation of the Stanford breakin method

Joe Pistritto (JHU|mike) <@RELAY.CS.NET,@CSNET-RELAY.CSNET:jcp@BRL.ARPA>
    [+ SECURITY@RUTGERS]

What you have here is the standard 'spoofing' problem.  I think the only way
to control this problem (for a system attached to the Internet) is to route
all the traffic through a gateway (over which you have physical access
control) that will immediately DROP any packets originating from the
Internet world with SOURCE addresses that are anywhere on your local nets.
(You could put insecure nets on the other side of a similar gateway
in-house, to protect the 'trusted' networks.)  This prevents anyone from
masquerading as one of your hosts.  (It might cause some loopback features
of TCP to stop working in some implementations, however.)  And yes, it
means that the 'trusted' hosts have to be on 'trusted' networks that are
physically distinct (and of course physically secure).

Begins to sound like DoD already, doesn't it...
                            -jcp-

PS: Security is a pain in the ass...  [So may be the absence of security!  PGN]


Microfiched income-tax records stolen

John Coughlin <JC%CARLETON.BITNET@WISCVM.WISC.EDU>
17 Nov 86 23:41:00 EST
It was announced in the Canadian House of Commons today that microfiche
containing personal income tax records for 16 million Canadian taxpayers was
stolen from a Toronto office of Revenue Canada on November 4.  The
microfiche was returned November 17 after being retrieved by the RCMP.  It
is not known whether the material was duplicated by the thief, who has not
been identified.

CTV news said that several hundred people had access to the microfiche in
the Toronto office.  Duplicate copies are kept in several district offices
as well.  This incident adds a new dimension to the recently discussed RISKS
of easily portable information media, such as hospital medical records on
computer diskettes.
                                                                 /jc

     [This item is at first blush of marginal relevance to RISKS strictly
      from the computer point of view — unless the microfiche was computer
      generated (it was probably just a record of actual returns).  
      Nevertheless, I include it as symptomatic of the deeper problems.  PGN]


Re: Copyrights (RISKS DIGEST 4.8)

Andrew Klossner <tektronix!hammer.TEK.COM!andrew@ucbvax.Berkeley.EDU>
Mon, 17 Nov 86 10:23:58 PST
    [Andrew wished to clarify the issue of whether there is a risk in 
     using "(c)" or a half-circled "c".  Although his response does not
     seem strictly RISKS related, I think it may clarify a thorny issue 
     for some of you who are willing to contribute to RISKS but want to
     protect your rights.  I have abridged it somewhat.  PGN]

It is the considered opinion of the chief legal counsel at Tektronix that
the genuine circled-c can be replaced only by the string "Copyright (c)".
Both the word "Copyright" and the pseudo-glyph "(c)" are required...

The three basic elements needed to obtain copyright protection in the
United States and the member countries of the Universal Copyright
Convention (most countries of any significance) are the copyright
symbol (circle-c or string "Copyright (c)"), the name of the copyright
owner, and the year date of first public distribution.  The law
requires that the notice "be affixed to the copies in such manner and
location as to give reasonable notice of the claim of copyright."

The phrase "All rights reserved" extends protection to member countries
of the Buenos Aires Convention that are not also members of the
Universal Copyright Convention (a few Latin American countries).

Whenever the program or document is revised significantly, the year
date of the revision must be added to the notice, as in:

    Copyright (c) 19XX, 19YY.
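
As an illustration only (not legal advice), a notice following the rules
described above might be assembled like this; the owner name and years are
invented examples:

    # Compose a notice per the rules above: the word "Copyright", the
    # "(c)" pseudo-glyph, the year of first distribution plus any
    # significant-revision years, the owner's name, and "All rights
    # reserved" for Buenos Aires Convention coverage.

    def copyright_notice(owner, years):
        """Build a notice string from an owner name and a list of years."""
        year_list = ", ".join(str(y) for y in sorted(years))
        return "Copyright (c) %s %s.  All rights reserved." % (year_list, owner)

    print(copyright_notice("Example Corp.", [1984, 1986]))
    # Copyright (c) 1984, 1986 Example Corp.  All rights reserved.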

When licensing software to the (US) federal government under the
Defense Federal Acquisition Regulation Supplement (DFARS), a completely
different set of legends is required.

  -=- Andrew Klossner   (decvax!tektronix!tekecs!andrew)       [UUCP]
                        (tekecs!andrew.tektronix@csnet-relay)  [ARPA]
