The RISKS Digest
Volume 11 Issue 29

Friday, 15th March 1991

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Paranoia and Telephone risks revisited
Larry Nathanson
Smartening up coexists with dumbing down
R Mehlman
"US study discounts VDT-miscarriage link"
Martin Minow
Interesting analysis of drug poisoning risks
Jerry Leichter
Drawing the correct conclusions from that Miami NTSB report
Donald A Norman
Known bug turns 25 years old?
Jerry Bakin
RISKS in Dateline
anonymous
"What the laws enforce" [RTM]
Allan Pratt [and PGN]
What does "authorisation" mean? [RTM]
Peter da Silva
PGN
Info on RISKS (comp.risks)

Paranoia and Telephone risks revisited (W.A.Simon, RISKS-11.27)

Larry Nathanson <lan@bucsf.bu.edu>
Thu, 14 Mar 91 22:24:13 -0500
     * Our medical files are violated by law enforcers as a matter of
     routine.  Employers are not far behind.  Insurance companies are
     misusing this same information.  Government agencies are unable
     to protect it against leaks.  And, big insult, I am not allowed
     to see the file my own doctor keeps on my subject.

This paragraph ranges from blatant misunderstanding to just dead wrong.
Medical confidentiality is being taken very seriously nowadays.  I really can't
think of many reasons why a cop would want your medical file (drug testing is
generally not part of your record), and I certainly don't see any health
professional allowing a cop access to medical records without a warrant.  For
one thing, it's unethical; for another, it's illegal.

I don't see how insurance companies are "misusing" the medical information.
They may be doing a lousy job of interpreting it, and their payments might not
be fair, but if you want them to pay, you have to tell them what the doctor
did.  That may be poor use of information, but it's not misuse.

Government agencies might not be able to protect your medical record against
leaks, but that's not terribly surprising — it's not their job.  Last I heard,
Bush was not deploying the Marines into every doctor's office.  The medical
record is part of the doctor-patient relationship; thus it is the doctor's
responsibility to keep it private.  It is the government's responsibility to
hear the case if you sue him for not doing so.

(Quoted without permission from _The Rights of Patients_, by George Annas.
VERY highly recommended for anyone interested in their medical rights.  George
Annas is one of the top Health Law people in the country, and is quite a good
speaker and author.)

Q: Does the patient have the legal right to see and copy the medical record?
A: The majority of states grant individuals a legal right to see and copy their
   medical records ...  In most other states individuals have a probable legal
   right to access without bringing suit.  ...  On the federal level the
   Privacy Act of 1974 requires direct access in most circumstances.

Q: What should the patient do if denied access to the medical record?
A: Raise Hell.  There is no valid, ethical, or legal reason to deny a competent
   patient access to the medical record.  ...  (exceptions of "would do
   harm...etc" listed.)

Q: Is the maintenance of confidentiality a legal or an ethical obligation of
   health care providers?
A: It is both.  Historically, the doctrine was an ethical duty applicable only
   to physicians.  Currently it is also the legal duty of physicians ... and it
   is becoming a legal duty of other health care practitioners, as well.
   (Quotes Hippocratic Oath, AMA Principles of ethics, ANA Code)

         Therefore I propose all medical files should be made public.
         If I am discriminated against by an insurance company for
         smoking pot, why should their president be sheltered from
         the public revelation that he is a drunk and a wife abuser?

Because, technically, pot smoking is illegal, abusive drinking is not, and wife
beating is illegal only if she brings charges.  What kind of insurance are you
applying for, that requires a drug test?

        * All of my past employers know about my checkered past.
       I know nothing about theirs.  Why should their resume be
       confidential?  I ask that human resource departments make
       all resumes public.

Ummm.. because you are working for them, not vice versa???

       * Police and politicians keep tabs on our every move through
       passport control and credit cards.  But politicians travel with
       diplomatic passports and they use assumed names to protect their
       privacy.

Whoa!  While the general feeling on RISKS (mine too) is that technologies can
ALLOW one to be tracked, I think postulating that such a system is officially
in place and active is getting a bit paranoid.  And why in the world do you
think the politicians are going around incognito for their privacy?  Where do
you get this stuff from?

         What makes our privacy so fragile is that we value it.

Ah!  Finally, we agree on one thing.


Smartening up coexists with dumbing down

"GRUMPY::RMEHLMAN" <rmehlman%grumpy.decnet@uclasp.igpp.ucla.edu>
14 Mar 91 21:59:00 PDT
The following is excerpted from Digital Review, March 11, 1991, p.16,
"Files Getting Bigger All the Time" by Bill Hancock:

  A customer called today and said that one of our network security products
  had set off a security alarm proclaiming that excessive data was being
  transmitted from one node to another.  [...]

  It seems that the user of the offending microcomputer had a hard-disk head
  crash about a year ago and has been extremely diligent ever since about
  performing system backups between the microcomputer and the system with
  the file server software.

  Well, the user of the microcomputer received a CD-ROM optical disk player
  (read only) and thought it had to be backed up every night.  The result?
  A 650MB CD-ROM was being backed up to the file server each night, and it was
  setting off the excessive data alarm in the network security product.


"US study discounts VDT-miscarriage link"

Martin Minow 15-Mar-1991 1009 <minow@ranger.enet.dec.com>
Fri, 15 Mar 91 07:26:26 PST
>From the Boston Globe, Mar 14, 1991: "Medical Notebook" by Judy Foreman
(abridged slightly by MM):

"Government researchers say a new study settles a question that has worried
women of childbearing age for more than a decade: Can using a video display
terminal cause miscarriage?

"The answer is clearly no, according to the National Institute for
Occupational Safety and Health in Cincinnati, which studied 730 pregnant
directory assistance or general telephone operators, some of whom  worked
at VDTs [15% miscarriage rate] and some of whom did not [16%]."

... "The new study, published today in the New England Journal of Medicine,
did find that three factors are linked to increased risk of miscarriage:
heavy drinking, smoking and the presence of a thyroid disorder."

   ----

My comments: Brodeur's New Yorker articles quote a researcher who suggests
that electromagnetic-field related miscarriages may occur at such an early
stage of development that the woman doesn't realize she's pregnant.  I
suppose one could control for this by noting the number of months between
"intent to get pregnant" and "missed period" (i.e., whether there were
miscarriages in the first two weeks after fertilization).  This is an
extremely noisy measurement, and would probably require a much larger
sample size to elicit reliable data.  Also, VDT and non-VDT operators might
both work in an "electromagnetic-field-rich" environment inside a telephone
office, which would mask any differential effects of the VDTs.
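
One way to see why a much larger sample would be needed is a back-of-the-
envelope two-proportion power calculation.  The sketch below is my own rough
arithmetic, not taken from the study; it uses the 15%/16% rates quoted above
and the standard normal-approximation formula.

    # Approximate subjects needed per group to distinguish two miscarriage
    # rates with 80% power at alpha = 0.05 (normal approximation).
    from math import ceil

    def n_per_group(p1, p2, z_alpha=1.96, z_beta=0.84):
        variance = p1 * (1 - p1) + p2 * (1 - p2)
        return ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

    # Distinguishing 15% from 16% needs on the order of 20,000 women per
    # group, far more than the 730 operators studied; a subtle effect
    # confined to very early losses would need more still.
    print(n_per_group(0.15, 0.16))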

My own suspicion, however, is that the researchers are correct and that
whatever VDT-related effects exist are better explained by job-stress and
general economic-related issues: the VDT offers a visible symbol, even
if it is more of an effect than a cause.

Martin Minow        minow@ranger.enet.dec.com
The above does not represent the position of Digital Equipment Corporation


Interesting analysis of drug poisoning risks

Jerry Leichter <leichter@LRW.COM>
Fri, 15 Mar 91 11:09:30 EDT
The following is extracted and summarized from an essay, "Sudafed's the Last
Thing to Be Afraid Of", by Paul H. Rubin, former chief economist of the
Consumer Product Safety Commission, soon to be a professor at Emory University.
It appeared in the 13-March issue of the Wall Street Journal.  Much of it
concerns the specific risk of drug poisoning; but many of the same issues and
arguments arise for all kinds of risks.                              Jerry

    Two people have died after taking poisoned Sudafed.  These
    murders, apparently due to a random act of wanton violence, are
    tragic but not unique.  In the past decade, there have been at
    least six previous episodes of ... poisoning [and] perhaps 10
    deaths....  These incidents can provide several lessons:

    - As a society, we have trouble adapting to small risks.  The
    number of deaths involved have been very small by any measure.
    There are more than two million deaths a year from all causes in
    the U.S.  There are 5000 deaths ... from accidental poisoning,
    and more than 20,000 ... homicides.  If there continues to be
    one death per year from tampering, risk to a typical consumer is
    on the order of one in 30 million.  This is 30 times smaller
    than the smallest risk that the [EPA], not a conservative body,
    will address.  In the grand scheme of things, deaths from
    product tampering are a minute problem, whether relative to all
    deaths or to homicides.

    Nonetheless, consumers have reacted strongly to these poison-
    ings.  By some measures, Tylenol has never fully recovered from
    the loss in brand value from two poisoning incidents [in 1982
    and 1986]....  [The recalls cost some $100 million; one estimate
    of actual losses to the maker, counting loss of market share and
    necessary price cuts, is on the order of $1 billion.]

    Before mass communications, a person probably would learn of a
    hazard only if it harmed someone in the community.  Such risks
    were likely to be sufficiently probable to be worth worrying
    about.  Today, however, our intuitions about risk are not a good
    guide to action.  We can learn of risks with minute probabilities.
    Indeed, the more unusual the death, the more newsworthy it
    becomes, and we may be more likely to learn about such risks
    than about more common and more significant risks, such as death
    from automobile accidents.  The Alar episode teaches that we may
    even learn about nonexistent risks.

    Because of the publicity given trivial risks, people incorrectly
    perceive that the world is becoming riskier.  In reality, it is
    becoming ever more safe, and life expectancies are continuously
    rising.  Life expectancy at birth in the U.S. was 59.7 years in
    1930; by 1987 it had risen to 75 years.  Moreover, death rates
    from accidents of all sorts are falling in the U.S.

    - Businesses must respond to consumer perceptions, even if the
    perceptions are not objectively justified.  [This is expensive -
    tamper-resistant packaging costs $50 to $75 million a year just
    for materials.  Tylenol is no longer sold in capsules, a loss
    of convenience.] ... Ralph Nader and others who claim that
    businesses are indifferent to consumer risk have it exactly
    backwards.  Businesses are exceedingly sensitive to consumer
    perceptions of risk, even when those perceptions are biased.

    - ...[T]he tort system serves only to create additional injury,
    not to provide real benefits.  There were at least four claims
    of $5 million each filed in the Tylenol matter, and at least
    one claim has already been filed against Sudafed.  There is
    no negligence and no blame in either case, and nothing the
    manufacturers could reasonably have done to prevent the inci-
    dents. The only effect of such litigation is to raise the
    price [to consumers].

    - Some harms may have no cure.  Unfortunately, there is nothing
    feasible to do about product poisonings.  After the Tylenol
    poisonings, many firms began using tamper-resistant packaging,
    perhaps a rational response given the consumer fears generated
    by the poisonings.  No package, however, is entirely tamper-
    proof.

    We can see this clearly with the Sudafed murders.  Like all
    over-the-counter medications, Sudafed was sold in tamper-
    resistant packaging - but tampering occurred.  A walk through a
    grocery store or a drug store will quickly indicate that there
    are innumerable places where one could insert poison.  (Remember
    the poisoned Chilean grapes?)  A determined killer can always
    find something to poison.

    In the case of over-the-counter drugs, tens of millions of
    dollars are already being spent on a hazard that experience
    tells us may involve one death a year.  Regulatory agencies ...
    commonly estimate that it pays to spend $1 million to $2 million
    to save a life; upper-bound estimates are about $9 million.  The
    tens of millions we are spending on packaging could save more
    lives if spent more efficiently - and, anyway, do not seem to
    be effective in saving even one life.

    Some authorities suggest that consumers examine packages more
    carefully to determine if tampering has occurred.  However, the
    risk is so small that this would not be a useful way to spend
    time.  Time spent in other actions, such as buckling seat belts,
    would save more lives than time spent examining drugs.

    - Government reactions are often counterproductive.  After the
    first Tylenol incident, the [FDA] began to require tamper-
    resistant packaging....  As we've seen, this hasn't been
    effective (though, of course, it's impossible to tell if it's
    deterred less-determined poisoners.)  While firms may decide
    to introduce such packaging themselves, there is no reason to
    mandate it by law.  Now, as a result of the Sudafed incident,
    there have been calls for the FDA to ban capsules for all
    over-the-counter medications....  There are benefits to cap-
    sules, such as ease of swallowing, and it would be misguided
    to ban them.

    The only appropriate government response to such incidents ...
    may be increased resources [to catch poisoners] and increased
    punishment....  The threat of execution is more likely to be
    effective ... than are increased requirements for packaging....

    Most of us do not routinely wear bullet-proof vests, even though
    the chance of being shot by a mugger or being a victim of a ran-
    dom bullet in a drug war is vastly greater than the chance of
    buying a poisoned capsule.  Public authorities and a responsible
    press should indicate the order of magnitude of risks, and allow
    people to take whatever precautions are appropriate.  Unfortunately,
    sometimes there aren't any.
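
To make the essay's arithmetic concrete, here is a tiny sketch using only the
round numbers it cites.  The $60 million packaging figure is my assumed
midpoint of the quoted $50-75 million materials cost; the one-death-per-year
rate is the essay's own assumption, not independent data.

    # Implied cost per statistical life saved by tamper-resistant packaging,
    # *if* it prevented the one tampering death a year the essay assumes.
    annual_tampering_deaths = 1
    individual_annual_risk = 1 / 30_000_000     # the essay's "one in 30 million"
    packaging_cost = 60_000_000                 # assumed midpoint of $50-75M materials cost
    regulatory_low, regulatory_high = 1_000_000, 9_000_000

    print(f"annual risk per consumer: about 1 in {round(1 / individual_annual_risk):,}")
    cost_per_life = packaging_cost / annual_tampering_deaths
    print(f"cost per life saved:      ${cost_per_life:,.0f}")
    print(f"regulatory benchmark:     ${regulatory_low:,} to ${regulatory_high:,}")
    # Roughly $60 million per life saved, against a $1-9 million benchmark.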


Drawing the correct conclusions from that Miami NTSB report

Donald A Norman-UCSD Cog Sci Dept <danorman@UCSD.EDU>
Thu, 14 Mar 91 21:45:19 PST
In previous issues of RISKS there was considerable discussion of the case
of the missing o-rings on all three engines:

NTSB. (1984). Aircraft Accident Report — Eastern Air Lines, Inc., Lockheed
L-1011, N334EA, Miami International Airport, Miami, Florida, May 5, 1983
(Report No. NTSB/AAR-84/04). National Transportation Safety Board.  Washington,
DC

Sorry for not responding earlier, but I am in hiding this academic year.

I thought that the real impact of this report was missed by all the
discussants.  This is a classic RISK case example, and perhaps ought to be
required reading.  The incident has several unrelated phases, each all by
itself a case story of hazard and risk.  Here is a very brief review of the
high points.

Part I:   Maintenance of the engines.

A plug is removed from each engine to check the oil.  The plug is magnetic,
and by examining the metal particles on it, you can tell the state of the
engine oil.  Obviously, after removing such plugs, they have to be
replaced.  With a plug and an o-ring.

Item: all three o-rings were missing, one each from three engines.

Item: two different mechanics replaced the plug/o-ring combination on the three
engines (the center engine is somewhat different from the two wing ones).

Item: they signed off the checklists, attesting that they had installed the
o-rings on the plugs, but without actually putting on the o-rings.

Item: they NEVER put on o-rings for as long as they had worked there, so
they always (properly?) signed off the checklists saying they had put on
o-rings because in fact, the o-rings were always already on.  And you know
those stupid forms that ask you to certify things that have no relevance.
(My comment).

Item: the o-rings were already on because the supervisor, for years, got
the parts from the storeroom, put on the o-rings, and left them in his
desk.

Item: this one night — out of many years — the desk drawer was empty, so
the parts had to be gotten from the storeroom.  And the parts are
separate, not attached together in the storeroom.

Item: the maintenance crew started the engines and looked for leaks, but
didn't find any.  (But it was night and the engines may not have been left
on long enough.)

------
moral so far: Never try to do favors — if people then count on you and you
fail once, severe problems can arise. So, the supervisor was partly to blame.

Don't sign off for something you didn't do, even if it seems necessary in order
to continue: insist on having the checklist changed, or have the supervisor
sign off.  (Minor moral — you could get yourself in trouble for following this
one: it would certainly be seen as unfriendly behavior.)  So the mechanics were
partly to blame.

Moral: How come the o-rings were not packaged with the plugs if they ALWAYS
had to be installed on them?  So the system was partly to blame.

Moral: If you invent a check (turn on engines and look for oil leaks), TEST IT!
Leave out the o-rings or don't screw in the plug all the way and see if the
leak can be seen.  At night, with a flashlight (standing on a ladder, on
tiptoes, peering into and around the engine?).  So the maintenance procedures
were partly to blame.

Moral: who is to blame?  The system.

======================================================================

part II

The crew discovered low oil pressure in one engine and shut it down.  They
then turned around to go back to Miami, even though they were more than halfway
to their destination.  The NTSB hints at why but doesn't say.  (Did they trust
the mechanics at Miami more?)

When the other two engines showed low oil pressure, the captain said my
favorite phrase of all time:

It can't be true!  It's a one-in-a-million chance that all three engines
would have low oil pressure at the same time.
  (I apologize that this is not an exact quote; my NTSB report is buried
  somewhere where I can't find it.)

The captain was right. And he was that one in a million.

A one in a million chance is NOT good enough.  We have close to 8,000,000
departures a year in commercial aviation, one in a million means about 8
incidents a year.

People are notoriously bad at reasoning about low-probability events.  And this
is why it is hard to get engineers and designers to take safety seriously
enough.  ("That could never happen," they say, meaning one in a million.)  I
heard someone say that elevators were very safe because there was less than one
chance in 10,000 that it would get stuck or break between floors.  Yikes.  That
means I am guaranteed to be in one of them.
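
Norman's arithmetic generalizes: with many independent trials, even a tiny
per-trial probability makes an occurrence nearly certain.  A minimal sketch,
using the round figure of 8,000,000 departures cited above:

    # A "one in a million" per-departure event, repeated over a year of
    # commercial departures (round figure from the text above).
    p = 1e-6
    departures = 8_000_000

    expected = p * departures                    # about 8 incidents a year
    p_at_least_one = 1 - (1 - p) ** departures   # about 0.9997

    print(f"expected incidents per year: {expected:.1f}")
    print(f"probability of at least one: {p_at_least_one:.4f}")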

So the captain didn't trust the gauges and kept the engines going until
they failed.  (Even had he believed the problem, he didn't have much
choice anyway.)

======================================================================

Part III.

The communication between flight crew and cabin crew was close to zero.
These are really two separate operations, one with high status, one with low,
and the flight crew don't bother to inform the cabin crew of everything.  And,
to be fair, they were rather busy.

So, great panic in the cabin.  Passengers screaming, panicking.  Cabin crew
pretty scared themselves.  Told to assume crash position far too early.  No
idea how much time they had.

Read the report.

==========

Yup, they got the first engine working again just barely in time, so they
landed.  And the engine then died on the runway.

=====

All accidents involve a complex chain of events.  Never believe anyone who
gives you "the reason" for an accident.  I have never seen an industrial
accident with a single reason.  Remember that, RISKS folks.  And look for
the subsidiary effects — like how well the passenger side was handled.

As I said, this is a classic study; I am amazed nobody else mentioned all
the side effects.

Don Norman, Department of Cognitive Science 0515, University of California, San
Diego, La Jolla, California 92093 USA  dnorman@ucsd.edu  dnorman@ucsd.bitnet


Known bug turns 25 years old?

<JERRY@HMCVAX.CLAREMONT.EDU>
Fri, 15 Mar 1991 00:09 PST
Bug celebrates its Twenty-Fifth Anniversary!

The March 31 issue of Aviation Week & Space Technology discusses the X-31, a
research aircraft intended to explore high angles of attack.  The craft has a
long fuselage, a delta wing, canards, and thrust-vectoring paddles in the
engine exhaust.  From the bottom, it looks vaguely similar to a thin space
shuttle.

Here's a quote I found interesting:

"The Honeywell digital flight control system is derived from one used on the
Lockheed C-130 high-technology testbed aircraft. ... The control system went
into a revisionary mode four times in the first nine flights, usually due to a
disagreement between the two air data sources.  The air data logic dates back
to the mid-1960s and had a divide-by-zero that occurred briefly.  This was not
a problem in its previous application, but the X-31 flight control system would
not tolerate it.  This was fixed with software in January, and the problem has
not reoccurred...."

So what does this mean?  Have they had a known bug from 1965 which was never
fixed?  What is a divide-by-zero which occurs briefly?

Is the X-31 better or worse than its predecessor?  Perhaps the previous
application ignored divide-by-zeros and produced spurious results which were
never noticed (except by grieving relatives?).  The X-31 now proudly catches
these errors.  Or maybe the X-31 demands more accuracy and doesn't catch these
errors and dies instead.

Or, perhaps the previous application had an error-handler which could recover
and it's the X-31 engineers who never considered a divide-by-zero....
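
For what it's worth, here is a purely speculative sketch (not the Honeywell
logic, just an illustration) of how a transient divide-by-zero can arise in
air-data computations, and of the two handling styles being contrasted above.
All names and values are hypothetical.

    # A ratio of two sensor readings whose denominator passes through zero
    # for a sample or two.
    def ratio_ignore(num, den, last_good):
        # Quietly reuse the last good value on a zero denominator.
        return last_good if den == 0.0 else num / den

    def ratio_strict(num, den):
        # Treat a zero denominator as a fault, forcing a fallback mode.
        if den == 0.0:
            raise ZeroDivisionError("air-data denominator is zero")
        return num / den

    samples = [(1.2, 0.8), (1.1, 0.0), (1.3, 0.9)]   # hypothetical sensor pairs
    last = 1.0
    for num, den in samples:
        last = ratio_ignore(num, den, last)   # rides through the transient zero
        # ratio_strict(num, den) would trip on the second sample

The first style masks the anomaly and may quietly propagate a stale value; the
second exposes it immediately, which seems closer to how the X-31 system
reacted.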

Jerry Bakin.                                    Jerry@ymir.claremont.edu


RISKS in Dateline

<[anonymous]>
Thu, 14 Mar 91 12:18:27
My wife and I met through Dateline.  This is easily the largest computer dating
agency operating in the UK.  It is commercially successful, has been running
for over twenty years, and claims to have the largest client list in the
country.  Their advertisements appear almost everywhere.

The annual fee is about \pounds 100 at the moment (it was \pounds 70 when I
joined four years ago) — call it $200.  For that you get a questionnaire asking
for personal characteristics (age, education, religion, etc.); a personality test
(complete the following doodles, do you prefer hiking to watching TV, and the
like); and a section on required, desirable, neutral, disliked and rejected
characteristics in prospective partners.  After pattern matching (and they make
a point of using fuzzy matching, though they don't use that term in their
promotional literature) you will receive a list of 6 contact names and
addresses and/or phone numbers.  Further lists are available for a nominal fee
-- a couple of pounds or so.  (Incidentally, quite a few women reveal only
their phone numbers, presumably as a misguided security precaution.  In every
case but one, a trawl through the phone directory revealed an address.  Not a
computer-related risk, as I used the paper version.)

In my case, I was pretty tolerant of age differences.  I was 30 and thought that
women much younger or older than I would not likely be appropriate, so I put my
REQUIRED age range as 24-36, i.e., plus or minus 6 years.  My wife, for her
part, had +3 years to -5 years.

My wife is now 40 and I am 33.  We've been married 18 months.

In this particular case, a bug in Dateline's pattern matcher has been
beneficial.  Whether that is a RISK or not is open to question.

    [In case you were not reading carefully, the program did not adhere to the
    spec, violating the specified spread of 6 on his part and 5 on hers.
    Fuzzy matching, eh?  Apparently the organization had advertised that it
    would be a stickler for specified constraints, but permitted "don't cares".
    So, who's counting, especially if it works.  Maybe the real risk is in
    overspecifying your constraints.  PGN]
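
Out of curiosity, here is a hypothetical sketch of the strict check that a
non-fuzzy matcher might have applied.  The ages (he was 30; she would have been
about 37 at the time) are reconstructed from the story and are approximate.

    # Would each member's REQUIRED age window have admitted the other?
    def within_required_range(candidate_age, own_age, years_younger_ok, years_older_ok):
        return own_age - years_younger_ok <= candidate_age <= own_age + years_older_ok

    his_age, her_age = 30, 37
    print(within_required_range(her_age, his_age, 6, 6))   # his 24-36 window: False
    print(within_required_range(his_age, her_age, 5, 3))   # her 32-40 window: False

A strict matcher would have rejected the pairing on both sides; only a fuzzy
(or buggy) one could have proposed it, happily in this case.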


"What the laws enforce" (RTM Conviction, RISKS-11.25)

Allan Pratt <apratt@atari.UUCP>
Tue, 12 Mar 91 18:08:57 pst
In a RISKS article about RTM's conviction being upheld, PGN writes: "It
seems to me that there is still a significant gap between what it is
thought the laws enforce and what computer systems actually enforce."

This is true in all parts of the law.  A "Keep off the grass" ordinance
need not be "enforced" by a fence to be legitimate.  Laws provide
punishments for violators; they don't prevent violations.

   [My response not saved.  His response to my response follows.  PGN]

"Keep off the grass" is just a sign.  A cop giving you a citation for
walking on the grass is authority.  There are better examples where there's
no sign, and yet doing something is against the law. Giving a cigarette to
a duck in Arizona comes to mind as one of those silly "did you know it's
against the law to..." things.  There's no sign saying, "Don't commit
murder," either.  If nobody catches you, you get away with it, but that
doesn't make it right or legal.  That's the point I was trying to make.

The article I was responding to said, "How can you say something is wrong
and then not put up security barriers against doing it?"  That statement
implies that when you want to say something is wrong or illegal, you have
to put up barriers against doing it.  I think that logic is flawed.

-- Allan Pratt, Atari Corp.       ...ames!atari!apratt  [Standard disclaimer]

     [WWWHOA! THAT IS *NOT* A QUOTE FROM RISKS-11.25.  YOU ARE INVENTING A
     STATEMENT AND ATTRIBUTING IT TO SOMEONE ELSE, a classical example of
     SHOOTING A STRAW HERRING IN THE MOUTH.  I did not even SUGGEST that you
     have to put up security barriers, merely pointing out that a gap exists,
     which can be addressed by a variety of means, technological, social,
     legal, ethical, etc.  PGN]


What does "authorisation" mean? (RTM Conviction, RISKS-11.25)

Peter da Silva <peter@taronga.hackercorp.com>
Thu, 14 Mar 1991 03:29:23 GMT
Should the computer systems be required to enforce the law? Should an absence
of protection imply authorization? If so, say goodbye to a goodly part of
Usenet.

It's not so in the real world, you know. Am I allowed to steal my neighbor's
lawnmower simply because he left it out?
                                           (peter@taronga.uucp.ferranti.com)


Re: What does "authorisation" mean?

Peter G. Neumann <risks@csl.sri.com>
Thu, 14 Mar 1991 9:51:37 PST
I don't think that is the conclusion that should be drawn from the gap.  But if
the laws say that exceeding authorization is illegal and the computer systems
require no authorization, then it seems to be a MISAPPLICATION of the law to
say that Morris was guilty of exceeding authorization or misusing authorization
or whatever... [with respect to the use of finger, the debug option, .rhosts,
...]  [The laws could be a little sharper.]


Re: What does "authorisation" mean?

Peter da Silva <peter@taronga.hackercorp.com>
Thu, 14 Mar 91 23:16:29 CST
Only if you assume the computer system is solely responsible for enforcing
the authorization. Is that a valid assumption? Do you want it to be?

    [Of course not.  See my foregoing comments on Allan Pratt.  PGN]
