The RISKS Digest
Volume 15 Issue 06

Tuesday, 5th October 1993

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

RISKs of trusting e-mail
Theodore M.P. Lee
Stocks and Wands
Paul Dorey
Dead for 3 years, but computer kept paying bills
Alan Frisbie
Newton Tale of Woe
Steven Sargent via Paul M. Wexelblat
RISKS of unverified driving records
Mich Kabay
Redundant data
Mich Kabay
Conditioning and human interfaces
Robert Dorsett
Portable phones and fire detectors
Trevor Kirby
Re: Cancer Treatment Blunder
Paul Smee
Re: Security holes and risks of software ...
Bob Bosen
Bank of America fires employee after reading his e-mail?
David Jones
E-mail for denial of services and corruption
Fredrick B. Cohen
Software Quality vs Staff Size
Mike Willey
Info on RISKS (comp.risks)

RISKs of trusting e-mail

Theodore M.P. Lee <tmplee@tis.com>
Fri, 1 Oct 1993 11:43:00 -0600
Until such time as either the general population learns what to expect or
digital authentication (such as PEM) becomes widespread, I suspect we will
hear more of this kind of incident.  This academic year the University of
Wisconsin started providing e-mail accounts to all students at its Madison
campus (6,000, maybe?).  The students, both technical and non-technical, are
being encouraged to use e-mail as a way of interacting with their instructors.
They access the accounts either through University-supplied machines scattered
throughout the campus or through dial-up Serial Line Internet Protocol (SLIP)
connections.  A mix of Macintoshes, PCs, and other assorted workstations is
involved.

Last week (note how early in the school year) a group of five students,
several from the Honors floor of one of the freshman dorms, were caught having
forged several pieces of e-mail. Most potentially damaging was a note saying
it was from the Director of Housing, to the Chancellor of the University,
David Ward; note that the previous Chancellor is now Pres.  Clinton's
Secretary of HHS, so the present Chancellor is new to the job.  The forged
message was a submission of resignation.  Ward's secretary had just returned
from vacation and apparently assumed the proffered resignation was legitimate.
The secretary accepted it and started to act upon it; only in the course of
doing so was it discovered to be a fake.

The students also sent messages purporting to be from the Chancellor to
other students asking them to pay their tuition. They also forged a message
from the Chancellor (my information doesn't say who it went to) saying he
was going to "come out of the closet" and announce it Sept. 25.

The students were only caught through a combination of circumstances.  First,
since they used one of the dial-in connections, there were logs of who dialed
in when.  Second, during the course of their experiments they botched some
addresses, which sent enough traffic to the dead-letter office that
investigators could narrow down what was happening.  (It should be pointed
out that
the forgery was fairly easy to accomplish using the Eudora mail client on a
Macintosh: the user has complete choice over the "from:" field of a message.)
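
As a minimal sketch of why the forgery is so easy (modern Python rather than
anything the students used, and with hypothetical addresses and hostname): in
plain SMTP, both the "From:" header and the envelope sender are ordinary
client-supplied text that the receiving machine does not verify.

  import smtplib
  from email.message import EmailMessage

  msg = EmailMessage()
  msg["From"] = "chancellor@example.edu"  # arbitrary text; nothing checks it
  msg["To"] = "registrar@example.edu"
  msg["Subject"] = "Resignation"
  msg.set_content("This message only *claims* to come from its sender.")

  # The envelope sender (MAIL FROM) is likewise caller-supplied and
  # independent of the header.  Only the Received: trace headers added
  # by each relay record where the message actually entered the system.
  with smtplib.SMTP("mail.example.edu") as s:
      s.send_message(msg, from_addr="chancellor@example.edu")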

The FBI is investigating whether any federal crime was involved and,
needless-to-say, the students are likely to be expelled at the least.

Ted Lee, Trusted Information Systems, Inc., PO Box 1718, Minnetonka, MN  55345
   612-934-5424   tmplee@tis.com


Stocks and Wands

<Paul.Dorey@barclays.co.uk>
Thu, 23 Sep 1993 12:30:26 +0100
The market report in the London Evening Standard of 17th September 1993
reported:

  ...With so many people away from their desks for Yom Kippur there was little
  inspiration but dealers were startled to see the price of P&O race up from
  653p to 863p, triggering thoughts of a dawn raid.  The price then
  disappeared from screens and it was put down to a computer error.  The real
  price was 603p, up 2p.

     [Someone was playing the P&O blindfolded.  Perhaps what was needed was
     a Yom Kippured Herring Aide who could hear "2p on Travel".  PGN]


Dead for 3 years, but computer kept paying bills

"Alan Frisbie" <frisbie@flying-disk.com>
Sun, 19 Sep 93 17:44:18 PDT
From the Reuters News Service, printed in The Los Angeles Times
 Sunday, September 19, 1993:

Computers Paid Bills as Woman in Sweden Lay Dead for 3 Years

Stockholm — The body of an elderly woman who died in 1990 lay undiscovered in
her apartment for more than three years while computers received her pension
and automatically paid her bills, Swedish police said Saturday.  "It's very
unusual for someone to be dead so long without anyone else reacting," a police
duty officer in the Stockholm suburb of Farsta told the national news agency
TT.

The woman's last-opened mail was dated May 11, 1990, police said, indicating
she had died at the age of 72.  Her name has not been made public.  Police
were called to break into the apartment by its landlord after he had made
repeated efforts to gain the occupant's permission to renovate it.

Alan E. Frisbie, Flying Disk Systems, Inc., 4759 Round Top Drive, Los Angeles,
CA 90065  (213) 256-2575 (voice) (213) 258-3585 (FAX) Frisbie@Flying-Disk.Com

   [Also noted by Trevor Jenkins tfj@apusapus.demon.co.uk .
   The RISKS archives also show a previous similar case.  PGN]


Newton Tale of Woe

"Paul M. Wexelblat" <wex@cs.uml.edu>
Fri, 1 Oct 1993 13:26:54 -0400 (EDT)
The following was culled from Gene Spafford's Yucks Digest.

I have no idea whether the tale is apocryphal or not, but the implications of
putting yet another layer of technology between the mind and the bits (with no
sanity clause)...

[Yes, Virginia, there is a Sanity Clause (cf. Groucho)]

Anyhow, excerpt follows:

==============================

Date: Thu, 23 Sep 93 18:51:39 PDT
From: uunet!frame.com!sbs (Steven Sargent)
Subject: "I think my face is on fire."
To: various

One of my spies on the net reports:

>
> ... I thought you might like to know what Abraham Lincoln would've
> said if he'd been writing his notes on a Newton instead of paper.
>
>       "Bookstore avis screen deans ago, our fort fathers brownies
>       front it on fits continent a new nation, concerned in in berry
>       and  bridge area to fire proposition that air me fire created
>       erasers...."


RISKS of unverified driving records

"Mich Kabay / JINBU Corp." <75300.3232@compuserve.com>
11 Sep 93 15:10:52 EDT
In Canada's Globe and Mail for Saturday, Sept. 11, 1993, Mary Gooderham,
Applied Science Reporter, has an article on page A3 entitled
"Technology points finger at poor drivers: Car rentals could become more
difficult as motorists' records are shared."

The article explains that for $5, the Ontario Transportation Ministry will
provide any registered driver's record (including name, license number, date
of birth, sex, convictions--criminal and highway--and accident history).
Such information is supposedly restricted to "authorized requesters",
including police, collection agencies and insurance companies.

In contrast, the province of British Columbia will begin in October 1993 to
require written permission from data subjects before releasing their accident
records.

Gooderham writes that some US rent-a-car agencies routinely check the driving
record of applicants from NY, MD, FL and OH.  Poor drivers are refused
service.

In Canada, the Interprovincial Record Exchange is managed by the Canadian
Council of Motor Transport Administrators, which is considering "giving third
parties access." In NY, TML Information Services serves as broker between the
State and rental-car agencies.  According to Gooderham, TML's CEO, Sean
Doherty, expects "half of U.S. drivers will be covered by the system by the
middle of next year."

+ + +

The prospect of reducing road accidents and thefts by keeping track of rotten
drivers and no-goodniks appeals to my orderly side.  However, in the light of
extensive discussions in RISKS-14 about faulty credit, driving, and criminal
records, the lack of clear information or procedures for _checking_ and
_correcting_ such records is a problem.  How would you, a model citizen and
driver, like to discover on the morning of your vacation/business/emergency
trip that you appear to have been disqualified by an erroneous database
record?  Arguing with the poor soul on the other side of the counter will
clearly not help.  I expect to see a growing number of lawsuits as a result of
database errors or possibly even program bugs when innocent people suffer from
data corruption.

It will not be good enough to allow just _anyone_ to make ostensible
corrections in our records, either. Some method of identification and
authentication will have to be devised to prevent nasty people from damaging
other people's histories.

And just what, pray, will that entail?  A national identification card? Perhaps
we are headed that way.  The social security number is already becoming an
equivalent that can tie together many independent databases to provide a
detailed vision of an individual's personal and professional life.

Without adequate provisions for maintaining data integrity and validity, the
growing use of databanks containing personal information will result in costly
and perhaps dangerous errors.

Michel E. Kabay, Ph.D., Director of Education, National Computer Security Assn


Redundant data

"Mich Kabay / JINBU Corp." <75300.3232@compuserve.com>
11 Sep 93 15:11:34 EDT
As a followup to the article on drivers' records I just posted, I would like to
explore the consequences of having many interlinked but independently-managed
databanks describing us.  These problems have been familiar to data processing
personnel for the last 50 years, but they will be new to some of the
designers, administrators and users of interlinked personal-information
databases being established throughout government and industry.

Agency A maintains a databank and links in to Organization B, which links up
with Institution C.  Data flow from A -> B -> C.  An error creeps into the
record for Percy Perfect in database A.  It propagates to B and C.

Percy discovers the problem one morning when he lands at Seattle Airport and
tries to pick up his Superior Rent-A-Car vehicle so he can make a 10 am
meeting 120 miles away in the back woods of Washington state.  He is refused
because his record now shows that he stole a car in Florida 3 years ago.
Actually, Percy has never stolen so much as a jellybean in his life and has
never been to Florida.  It's one of those identity mix-ups.  Oops, sorry,
we'll fix it.

Three weeks later, the correction finally makes it into database A.

Question:  is there a mechanism in place to record the fact that the record had
in the past been sent to database B?  And will B also "know" that it ought to
send a correction to database C?

If all works well, there is no problem.

But what if C doesn't "know" that B got data from A?  What if A, not "knowing"
that C got some of _its_ data from B, signs an agreement to begin sharing data
FROM C?  Then we have A -> B -> C -> A: a circular data path.

As soon as there is a loop in the topology, the loop may inadvertently become
an accumulator or buffer.  So A sends a correction to B, which propagates it
to C; but unfortunately, just moments before the correction arrives, C ships
an update to A showing the erroneous data.  Will A overwrite its own corrected
record with an old record that is in fact wrong?  Yes it will, unless
provisions are implemented to forestall such problems.
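
Here is a minimal sketch of that hazard and of the timestamp guard (the first
item in the list below) that forestalls it; the record fields are
hypothetical, and the sketch is illustrative rather than a prescription:

  from datetime import datetime, timezone

  def naive_apply(db, rec):
      # Last arrival wins: a stale copy circulating around the
      # A -> B -> C -> A loop can overwrite a correction.
      db[rec["id"]] = rec

  def guarded_apply(db, rec):
      # Keep an incoming record only if its coordinated time stamp is
      # newer than the one already held.
      cur = db.get(rec["id"])
      if cur is None or rec["updated"] > cur["updated"]:
          db[rec["id"]] = rec

  correction = {"id": 42, "stole_car": False,
                "updated": datetime(1993, 10, 5, tzinfo=timezone.utc)}
  stale_copy = {"id": 42, "stole_car": True,
                "updated": datetime(1993, 9, 1, tzinfo=timezone.utc)}

  naive_db = {}
  naive_apply(naive_db, correction)
  naive_apply(naive_db, stale_copy)      # the error is resurrected
  assert naive_db[42]["stole_car"] is True

  guarded_db = {}
  guarded_apply(guarded_db, correction)
  guarded_apply(guarded_db, stale_copy)  # arrives later, but is older: ignored
  assert guarded_db[42]["stole_car"] is False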

Some of the techniques that will have to be evaluated include

     o    time stamps using coordinated time to allow a system to establish
          which of two records is newer.

     o    a standardized format for exchanging the _history_ of a record:
          where did it come from?  when?

     o    mechanisms for unique identification of a database.

     o    mechanisms for alerting member databases to circular references.

     o    provision for authentication of updates; perhaps message
          authentication codes in the style of ANSI X9.9, including a sequence
          number in the MAC to prevent insertion and deletion of transactions
          (a minimal sketch follows this list).
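
A minimal sketch of that last point, assuming a key shared between the two
databases in advance; HMAC-SHA-256 stands in here for the DES-based MAC that
ANSI X9.9 actually specifies, and the field layout is made up for
illustration:

  import hashlib, hmac, json

  KEY = b"key shared by the two databases"   # assumption: pre-arranged

  def seal(seq, record):
      # Bind the sequence number into the authenticated body so that
      # inserted, deleted, or replayed transactions become detectable.
      body = json.dumps({"seq": seq, "record": record}, sort_keys=True)
      mac = hmac.new(KEY, body.encode(), hashlib.sha256).hexdigest()
      return {"body": body, "mac": mac}

  def accept(update, expected_seq):
      # Verify the MAC first, then check that the transaction arrives
      # in sequence.
      good = hmac.compare_digest(
          update["mac"],
          hmac.new(KEY, update["body"].encode(), hashlib.sha256).hexdigest())
      return good and json.loads(update["body"])["seq"] == expected_seq

  u = seal(7, {"id": 42, "stole_car": False})
  assert accept(u, expected_seq=7)
  assert not accept(u, expected_seq=8)   # out of sequence: rejected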

A printout of the data subject's record should be delivered to the data
subject on demand when and where the information is used.  In addition, as a
means of catching errors fast, it would help if a printout were delivered to
the data subject every time the record is modified.  These practices would not
prevent or identify all problems (e.g., they'd fail when the address is wrong)
but at least they'd be on the right path.

The Internet contains files which co-exist in different repositories without
too much conflict.  Let us hope that this model of collaborative data storage
will serve as an example of how to accommodate redundant data without
bureaucratic and governmental meddling.

Michel E. Kabay, Ph.D., Director of Education, National Computer Security Assn


Conditioning and human interfaces

Robert Dorsett <rdd@cactus.org>
Fri, 1 Oct 93 05:32:34 CDT
After testing a computer-based training program last night, on a PC, it was
time to quit.  The program used a GUI-ish interface, one of those home-
grown interfaces that appeared on the PC before Windows became established.

So, I selected quit, saw the prompt, and hit "no."  Automatically.  Then
returned to the main menu.  Did this three times, very fast, figuring I
missed the button.  Finally, my host said, "Try clicking on yes."

The prompt?  "Do you want to quit?"  Yes/No.

But *I*, a Mac-user, have been conditioned to see: "Save changes before
quitting?"  Yes/No/Cancel.  This is the only time you see "are you sure"
prompts on exiting a Mac program.  I was just screwing around, so of COURSE
I hit "No."

I'm sure there's a RISK in there, somewhere...:-)  It was COMPLETELY
instinctive for me to hit "No"...


And while I'm at it, an old vending-machine story, which someone suggested
I send to RISKS at the time.  This was one of those machines with an alpha
A-F "category" selector, with a numeric item selector.  So if you wanted the
fifth item on row C, it would be C-5.  After entering the proper amount,
one hit the button "C", then the button "5."

Except in this case, the item was C-11.  What did I, techno-nerd, do?
"C-1-1."  Twice.  Getting the item C-1.  Then I looked down, and saw that
the numeric selectors went down to 12, including 10, 11, and 12.  Oh.


In this vein, airliner avionics requiring numerical entry tend to use
"phone-style" keyboards, with 123 the top row of keys.  Yet all calculators
are exactly the opposite, with 123 the second-to-bottom row.  What happens
when cultures clash, in THAT case? :-)  Or the 0 key is transposed from
the left bottom-most key, to the right?  Or put to the left of the 1?  :-)


Profound questions at 5AM....

Robert Dorsett   rdd@cactus.org   ...cs.utexas.edu!cactus.org!rdd


Portable phones and fire detectors

"Trevor Kirby" <Trevor.Kirby@newcastle.ac.uk>
Fri, 1 Oct 93 13:01:53 BST
Michel E. Kabay was wondering about the risks of new digital phones to
microprocessor-based equipment.  I don't know about that, but I do know that
using certain types of portable phone near certain types of fire detector can
set off a false alarm.

The problem was that a portable phone could cause an ionization detector
to send out a signal that the system interpreted as a genuine detection.
It came to light when the customer used a portable phone to complain about
the late arrival of the engineer; needless to say, he was sitting right under
one of the detectors.

 Trev


Re: Cancer Treatment Blunder (Randell, RISKS-15.05)

Paul Smee <P.Smee@bristol.ac.uk>
Fri, 1 Oct 1993 10:28:19 +0000 (GMT)
Brian.Randell@newcastle.ac.uk wrote:

> Details of the errors were disclosed after a clinical inquiry by senior
> radiologists who examined the cases of all 1045 patients who had
> radiation doses of up to 35% less than prescribed. Their report blamed
> human error by Margaret Grieveson, a physicist, who unnecessarily
> programmed a correction factor into the radiography computer in 1982."

The inquiry has indeed blamed Dr Grieveson, but from the news reports it is
not totally clear to me that this is fair - there are some unanswered
questions.

The nature of the error is that she manually instructed the new computer
controlling the X-ray to apply a standard correction factor (as had been
required previously) based upon the distance of the radiation source from the
patient.  Unknown to her, however, the computer program already had this
correction coded into it, so that in essence it was being applied twice.
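
The arithmetic of such a double application is simple.  The numbers below are
illustrative only (the reports give nothing beyond the "up to 35% less"
figure), but they show how one redundant application of a factor f yields a
(1 - f) underdose:

  # Hypothetical distance-correction factor (f < 1); the real values
  # varied from patient to patient, hence "up to" 35%.
  f = 0.65
  prescribed = 100.0             # arbitrary dose units

  # Correct use of the new computer: the program applies f internally,
  # so the operator enters the prescribed dose unmodified.
  delivered_correct = prescribed                    # 100.0
  # The blunder: the operator applied f by hand as well, so the
  # correction took effect twice.
  delivered_actual = prescribed * f                 # 65.0
  print("underdose: %.0f%%" % (100 * (1 - delivered_actual / prescribed)))
  # -> underdose: 35%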

It's the 'unknown to her' that bothers me, and that the reports have not
addressed.  Does this imply that she didn't bother to RTFM, or is it the case
that the manufacturers of the equipment thought it so obvious that the program
would include the correction, that they didn't bother to mention it?  I'd
really like to know.

Whichever is the case, it clearly demonstrates the risk of changing
methodologies without making absolutely certain that everyone fully
understands the new methods, and how they compare to the old ones.

Finally, if the board of inquiry is able to determine, merely by reference to
the set of records which were routinely kept, that patients were consistently
being underdosed, it seems probable that a review of the case papers in
each individual case, while treatment was going on, could have revealed the
same thing.  Does this also demonstrate the risks involved in changing
procedures without creating a mechanism for monitoring the effects of the
change?


Re: Security holes and risks of software ... (Peterson, RISKS-15.05)

Bob Bosen <bbosen@netcom.com>
Fri, 1 Oct 1993 16:51:24 GMT
> The interesting part is that for the first time we are approaching the point
> where true separation is possible. Not in a mainframe, nor in a UNIX machine
> but in the client-server network (not peer-peer though).

> IMHO this changed world-view is going to cause the single greatest change in
> information security that we have ever seen. Networks will cease being
> "unsecurable" and become the only accepted means for protection of data.

I hope you're right, Padgett, but we've got a LONG way to go. It's amazing how
many network users are unaware of the ease with which packets can be
monitored, copied, and replayed. Every time I present my lecture on
public-domain software tools to monitor LAN segments, most of the audience is
shell-shocked!

-Bob Bosen-  Enigma Logic Inc.


Bank of America fires employee after reading his e-mail?

David Jones <djones@cim.mcgill.ca>
1 Oct 1993 12:55:50 -0400
I read a fax of a photocopy of a newspaper article (sigh), on which,
unfortunately, the name of the newspaper as well as the date and author of the
article were obscured.  Although I am confident this is authentic, please keep
the nature of the source in mind.

    "This is the new field of what [...] calls
    `electronic voyeurism'.     [...]
    The results of electronic peeping are as
    troubling as they are bizarre.
    Supervisors with no intent to do mischief
    may watch employee message traffic on computer systems.
    That kind of surveillance led a major California
    financial institution, the Bank of America,
    to fire an employee after his electronic mail
    indicated that, when his day job ended,
    he worked nights as a professional gay stripper."

(1) Can anyone out there supply more factual details about this case?
    It must have hit the papers in California.

(2) We are left with a legal morass ...

    (a) At least in Canada, public and private institutions do have the
    right to monitor "private communications", but only if their intent
    is to check the performance and security of the systems
    that they are responsible for maintaining.  By the way, one might
    argue that the date, time, volume, originator, recipient
    of communications should be sufficient for this purpose, making
    perusal of the contents, under the guise of checking performance
    or security, questionable.

    [caveat: the absence of a "reasonable expectation of privacy"
    makes listening much easier, as in cellular phones, or
    product support phone calls to an employee in the service dept
    of a company — these issues are orthogonal to the following...]

    (b) Privacy laws protect personal communication.

    (c) Some Universities have rules of the following nature:

    (i)  it is against policy to read another person's
         files or e-mail.

    (ii) a person's personal files and e-mail *are* admissible
         as evidence against an accused person.

    One might imagine that (ii) is ineffective because
    while it says personal files and e-mail can be used, (i) says
    no one can ever obtain them!  Implicit in the presence of (ii),
    however, is the notion that if such personal information
    *magically* appears at the public printer and is handed to
    someone evaluating the accused, then ... so be it.

    A motivated technical person could even conveniently
    "suspect" a problem with the computer or communications system,
    totally eviscerating the privacy rule, by appealing to (a).


E-mail for denial of services and corruption

Fredrick B. Cohen <fc@Jupiter.SAIC.Com>
Thu, 30 Sep 93 03:48:34 PDT
I just did an experiment sending massive quantities of e-mail to a typical
Unix box, and of course, I was able to overrun the disk capacity on the
recipient machine, thus making the system grind to a crunching halt for lack
of space.  Since I sent it to daemon, nobody noticed the mail for quite some
time, and it took a bit before they figured out the problem and were able to
fix it.

I don't know for sure, but I think a lot of systems are susceptible to this
attack, and there is no easy solution, at least if you still want to get mail.

To assess the degree to which this might be a threat, I got a listing of DoD
and US Government sites from the Chaos Computer Club (thank you Charles) and
tried sending mail to them - only 1 refused the mail out of 67 tried.  Several
told me there was no such mail recipient, but gave me a directory of other
recipients with similar names - how helpful.  A few told me they didn't have
such a user and identified that they were a particular type of system - now I
know for certain what UID to send to.

    Under some versions of Unix, you can put quotas on users, but not on
e-mail space - as far as I know.  The ULIMIT prevents unbounded growth, but it
is now set high enough by default on most systems that it won't stop this
attack.  You can explicitly refuse mail on some systems, but I don't think
there is a general way to do this selectively enough to defend against this
attack.  The default is almost always to get all that comes to you.  Your
suggestions are welcomed - FC
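
There may be no clean way to refuse such mail, but a watchdog over the spool
directory can at least raise the alarm before the disk fills.  A minimal
sketch (modern Python; the path, threshold, and one-file-per-user mbox layout
are assumptions, since spool layout varies by system):

  import os, shutil, sys

  SPOOL = "/var/spool/mail"      # typical location, but system-dependent
  LIMIT = 10 * 1024 * 1024       # flag any single mailbox over 10 MB

  def oversized_mailboxes(spool=SPOOL, limit=LIMIT):
      for name in os.listdir(spool):
          path = os.path.join(spool, name)
          if os.path.isfile(path) and os.path.getsize(path) > limit:
              yield name, os.path.getsize(path)

  total, used, free = shutil.disk_usage(SPOOL)
  print("spool partition: %d MB free" % (free // 2**20), file=sys.stderr)
  for user, size in oversized_mailboxes():
      print("warning: mailbox %s is %d MB" % (user, size // 2**20),
            file=sys.stderr)

Run periodically (from cron, say), this would have flagged the swelling
"daemon" mailbox long before the system ground to a halt.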


Software Quality vs Staff Size

Mike Willey <mwilley@feenix.metronet.com>
Wed, 22 Sep 1993 08:34:36 -0500
My company is involved with a client that is producing a device that will
be used in open heart surgery.  Our responsibility is the design and
implementation of the electronics and software that will control this device.

Our client is pushing to increase the staff size 2 to 3 times beyond the
number of individuals actually required to do the work.  Our contention is
that "too many cooks spoil the broth", an oversized staff is less likely to
produce a high quality, safe product.

Does anyone in these newsgroups have information on academic or industry
references relating to this subject, pro or con?  FDA or military project
histories would be especially useful.

Thanks for the help.  Mike Willey
