The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 4 Issue 25

Sunday, 7 December 1986

Contents

o Child electrocuted
Anonymous
Brad Davis
Paul Nelson
o On models, publications, and credibility
Bob Estell
o Encryption and criminals
Perry Metzger
Fred Hapgood
o Mode-C altitude transponders
Dan Melson
o ATM Limits
Richard Outerbridge
o Taking the 5th
Jerry Leichter
o Info on RISKS (comp.risks)

Child electrocuted (RISKS-4.24) (anonymous contribution)

<NEUMANN@CSL.SRI.COM>
Fri, 5 Dec 86 17:19 PST
This contribution was sent to me privately, but is being distributed
anonymously -- at my request -- with the permission of the author.

  I used to volunteer in the Emergency Room at a SF hospital, and the heart
  monitoring lines there had about six pins arranged in a circle, similar to
  the bottom of vacuum tubes.  The exposed pins were shielded by a heavy
  (1/8-in thick) metal ring with a key which permitted it to be plugged into
  the proper receptacle in only one orientation.  Every EKG line I've ever
  seen (including some at other locations) is compatible with this
  configuration.

  Unless someone had built a non-standard connector for this particular
  monitor line, such an electrocution would not have been possible with a
  standard electrical receptacle.  [...]


Child electrocuted

Brad Davis <b-davis%utah-cai@utah-cs.arpa>
Fri, 5 Dec 86 18:42:50 mst
If this is true, then I don't think the equipment met Underwriters
Laboratories (UL) safety specs.  UL has strict requirements on what certain
plug designs can be used for and how current-carrying plugs can be configured.

Brad Davis  {ihnp4, decvax, seismo}!utah-cs!b-davis   b-davis@utah-cs.ARPA


Child electrocuted

<ssc-vax!ssc-bee!nelson@beaver.cs.washington.edu>
Fri, 5 Dec 86 15:30:32 pst
[...] The leads were incorrectly inserted into the end of a power cord for
an IV pump, causing the electrocution.  This particular IV pump had a
detachable power cord for portable battery-powered operation.  Almost all news
reports that I have heard put the blame on human error (Nurse electrocutes
child patient...).

How this could happen was beyond my comprehension until I watched the news
and had a look at the ends of the power cord for the IV pump and the cord
for the heart-monitor equipment.  The ends were very similar in shape and
the heart-monitor leads actually fit into either [termination] without much
difficulty.

Besides the obvious finger-pointing consequences of this incident, I was
immediately struck by the grave psychological damage that must have been
inflicted on both the nurse and the child's family.  The ultimate risk of
living in our "high tech" society had certainly been realized by them.

                Paul Nelson, Boeing Aerospace Co.


On models, publications, and credibility

"ESTELL ROBERT G" <estell@nwc-143b.ARPA>
5 Dec 86 11:21:00 PST
Perhaps one way to encourage researchers and authors to take more care
with their data and their models would be for some leading journals
[e.g., ACM and IEEE pubs, among others] to encourage authors to submit
complete listings of programs, and data, in appendices to papers.
For lack of page space, many such appendices would NOT be printed
with the articles.  But the information could be made available from 
the publisher, via network, or floppy disks, for a reasonable fee.

Some work will involve sensitive data, or proprietary models.  In such
cases, sometimes the data can be "sanitized" and sometimes the model can
be described generically.  That won't be a lot different from today's 
situation, where models and data rarely appear in detail.

On the subject of "getting the right(?) answer" we need to remember
[and tell our non-computing colleagues] that even the "facts" that we
seek as data influence the decision; ditto the model design, and the
parameterization of the model.  One of my grad school profs told a story
of working for Getty Oil: J. Paul's two sons [half brothers] were rivals;
one had North American operations; the other, European.  The European
leader proposed some corporate scheme; my prof's assignment was to
"prove him wrong."  So they went to work on a model; fed it some good
estimates; and it agreed with the European recommendation; modified the
parameters, and re-computed. ... On the 253rd such iteration, the model 
finally said that the brother was wrong.  It was that last case that
the USA manager took to his dad, who believed it.

That doesn't necessarily mean they cheated.  How many models of the DNA
structure were wrong, before the right one was found?  How many bad airplane
designs crashed before Kitty Hawk?  How many flawed page replacement
algorithms, or sort algorithms, et al, have we tried?  For the candy maker,
good "fudging" is obviously progress; the rest of us have to wonder.

The power of computer models is that they allow us to try out so many
ideas, or variations of them, so rapidly, so inexpensively.  The risk
of computer models is that we accept their results, without critique.
I contend that tinkering with a model and its data is proper; and that
the results of the Nth iteration may well be better than the results of
the "first best guess."  But the reasons for believing any model output
must rest on a *causal link to reality.*  
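
The kind of tinkering the Getty Oil story describes is easy to mechanize.
Here is a toy sketch of how quickly a parameter search can "prove" a
predetermined conclusion; the model, prices, and thresholds are invented
for illustration and have nothing to do with the actual anecdote:

```python
# Toy illustration of the risk described above: sweeping a model's
# parameters until it produces the answer you were told to find.
# The "profit model" and all numbers here are invented.

def model(price, demand_elasticity):
    # Stand-in for any parameterized model: units sold fall as
    # elasticity rises, so revenue falls too.
    return price * (100 - demand_elasticity * price)

def prove_wrong(target_profit):
    """Sweep the elasticity parameter until the model's output
    drops below the target -- i.e., until it 'disproves' the plan."""
    iterations = 0
    for e in range(1, 1000):
        iterations += 1
        elasticity = e / 10
        if model(price=10, demand_elasticity=elasticity) < target_profit:
            return elasticity, iterations
    return None, iterations
```

With enough parameter settings available, *some* iteration will usually
give the answer management wants; the question is whether that setting
has any causal link to reality.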

Bob


Encryption and criminals

Perry Metzger <metzger@heathcliff.columbia.edu>
Fri, 5 Dec 86 19:02:23 EST
One of the classic books on this subject, "The Codebreakers" by David Kahn,
discusses incidents during Prohibition involving rumrunners and encryption.
It seems that earlier in the century commercial codes were widely used.

One of the more humorous incidents recounted occurred during the trial of
one set of smugglers, in which a star witness was a cryptanalyst who was
quite incompetently questioned by the defense.  The lawyer's ignorance of
the techniques used was hysterical, and reminiscent of what happens today.

But back to the subject: during Prohibition the law-enforcement
agencies would quite often call in outside help to try to break the
codes involved, often with success.  So far as I could tell from the
book, this did not lead to wide-scale abuses of any sort involving the
police trying to crack commercially used codes and the like.

After all, breaking a code is a long and labour intensive task. You
don't do it unless you have to. Routine breaking of encryption by the
police will not be a reality any time soon.

Perry Metzger

     [Although in a real crunch, there are skilled cryptanalysts 
      around who could probably be brought into the fray.]


Encryption and criminals

"Fred Hapgood" <SIDNEY.G.HAPGOOD%OZ.AI.MIT.EDU@XX.LCS.MIT.EDU>
Sun 7 Dec 86 07:40:35-EST
    Re encryption by criminals.  Some years ago I fell into conversation
with a gentleman who worked as an IRS prosecutor.  Occasionally he brought a
house of prostitution before the bar; such houses routinely encrypt their
client lists and financial records, and probably have for millennia.
    He had no interest in spending the time trying to break their codes.
What he did was subpoena the records from their towel company, multiply the
number of towels they used by the average charge, and bill them for the tax
due on that amount.  He said the courts proved happy to accept that document.


Mode-C altitude transponders

Dan Melson <crash!dm@pnet01>
Sat, 6 Dec 86 17:14:35 PST
ronys@wisdom writes: "ATC is trained to never trust a transponder."

I'm sorry, but this is incorrect information.  ATC is trained to verify, at
the time the pilot checks on frequency, that the mode-C is accurate.  If,
of course, the pilot is not talking to the controller, there is no way for
that controller to know that mode-C is verified.

Phraseology for issuing traffic on unverified mode-C readouts includes
telling the pilot that we have no confirming report that mode C is correct.

However, a verified mode-C readout *is* used as basis for separation.

                                                DM


ATM Limits

Richard Outerbridge <outer%csri.toronto.edu@RELAY.CS.NET>
Sat, 6 Dec 86 07:44:10 est
Typically ATMs are hung off a controller, which acts as a front-end for the
bank's mainframe host.  The controller often performs a lot of the normal
processing anyway - for instance, PIN verification and sanity checking - and
can usually "stand in" for the host while the latter is down.  One mechanism
used to prevent fraud is a "cycle file".  This keeps a record of all the
cards used within a 24-hour period along with the amount of cash dispensed
to each.  The "cycle limit" is either pre-defined (according to the "type"
of card) or read from the card itself.  So, if the host is down, you may be
able to withdraw up to your daily "cycle" limit at 23:55 and again at 00:05,
but only every two days.  If the cycle limit is recorded on the card, by
re-writing that field you may also be able to withdraw virtually unlimited
amounts of cash (again, if the host is offline).
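
The cycle-file mechanism described above can be sketched roughly as
follows.  This is a minimal illustration only - the field names, the
calendar-day cycle, and the interface are assumptions, not any actual
controller's code:

```python
# Hypothetical sketch of a controller's "cycle file" stand-in logic.
# Assumes the cycle rolls over at midnight, which is what makes the
# 23:55 / 00:05 double-withdrawal trick possible.

from datetime import datetime

class CycleFile:
    """Tracks cash dispensed per card per calendar day while the
    host is down, so the controller can authorize off-line."""

    def __init__(self):
        self.records = {}  # card_number -> (date, cash dispensed that day)

    def authorize(self, card_number, amount, cycle_limit, now):
        today = now.date()
        day, dispensed = self.records.get(card_number, (today, 0))
        if day != today:
            day, dispensed = today, 0   # cycle rolls over at midnight
        if dispensed + amount > cycle_limit:
            return False                # over the off-line cycle limit
        self.records[card_number] = (day, dispensed + amount)
        return True
```

Note that `cycle_limit` is supplied by the caller here; if that value is
read from a rewritable field on the card itself, as described above, the
check is only as trustworthy as the card.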

If the controller is down, the ATMs will be closed, but usually the
controller is more stable than the host.  In the event of hardware
failure the only solution is a "hot" backup controller which can be
switched over to resume processing, albeit after a brief interruption
of service.  If more than one controller is attached to the host, then
each will maintain its own cycle file; if you knew the network you
could withdraw your cycle limit from each.


Taking the 5th

<LEICHTER-JERRY@YALE.ARPA>
6 DEC 1986 10:35:22 EST
I asked a lawyer friend about this issue - a criminal with encrypted records
refusing to divulge the key, citing the fifth amendment - a couple of years
back.  His strong feeling - and, of course, until someone actually pushes such
a case, probably to the Supreme Court, all you can GET are feelings - was that
there was no way a court would uphold such a claim.  The Fifth Amendment lets
you refuse to provide information ABOUT possibly-criminal activities.  It does
NOT allow you to avoid turning over evidence.  In general, the courts guard
their rights to obtain evidence very jealously, and interpret limitations on
those rights as narrowly as they possibly can.  (Consider the various "shield
laws" that states have passed to allow journalists to protect their sources.
Even with fairly explicit laws on the books, courts, when they've found a need
for journalists' testimony, have found ways to force it.)

In practice, I doubt it makes much difference.  The worst that is likely to
happen to you for refusing to testify is a couple of months in jail.  (There
are typically two stages:  The court first jails you "until you reconsider
your refusal".  In principle, this can be forever.  In practice, when it
becomes clear that you will not change your mind, we move to a second stage,
where you may or may not be held in contempt of court.  I don't know what the
maximum sentence for contempt is, but typical contempt sentences seem to be a
couple of months.)  So a real criminal is likely to see this as an excellent
trade-off.

This whole issue, BTW, illustrates an interesting point.  Those of us who are
heavily involved with computers, networks, and so on, as technologists, tend
to see what we do as entirely new and unprecedented.  Lawyers tend to view
EVERYTHING as a variation of some precedent.  It's been my experience that the
lawyers are usually closer to the truth.  You really don't need computers to
encrypt bookies' records - bookies have been doing that by hand for years.
(Perhaps you can figure out the quantities of money, but no bookie worth his
salt leaves customers' names in his records in any recognizable form.)

In fact, you don't need to consider encryption AT ALL in deciding whether
the Fifth Amendment applies in cases like this.  Consider, for example, an
arrested man found in possession of an unmarked key to a safety deposit box.
It's very, very likely that the box contains valuable evidence.  Can he be
compelled to reveal where the box is?  I don't know, but I'm sure similar
cases have arisen over the years.
                            -- Jerry
