The RISKS Digest
Volume 8 Issue 07

Sunday, 15th January 1989

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Re: Medical Software (Are computer risks different?)
Jon Jacky
Ground proximity warning
Bill Standerfer via Mark Brader
Aircraft
Dale Worley
You don't know what you've got till it's gone.
Phil Agre
Data integrity
Brent Laminack
Quality of Evidence
Bill Murray
D.Robbins' conclusions (Authenticity of Information)
Allan Pratt
Risks of trusting the press
Brad Templeton
Risks of Remote Student Registration: Another Interaction Story
Gary McClelland
Medical information systems
Jerry Harper
Info on RISKS (comp.risks)

Re: Medical Software (Are computer risks different?)

<jon@june.cs.washington.edu>
15 Jan 1989 18:13:46 EST
> (Regarding a posting on the Therac-25 radiation therapy accidents, Don
> Alvarez writes)  ... The problem of testing and validating advanced 
> hardware is not in any way unique to computers.  (Then he gives examples
> of accidents that might arise from people abusing non-computerized 
> equipment: trimming hedges with electric lawn mowers and putting portable
> radios in the shower).

I work in radiation therapy and just finished a lot of research on the Therac
accidents, and there are two points I would like to make:

First, the Therac accidents were *not* examples of people abusing equipment
contrary to instructions, as in the examples Don gives.  The accidents happened
because the machine included faults in software and, many would argue,
additional design errors in the hardware, which provided insufficient protection
against software faults.  It is arguable that the clinics do bear some
responsibility also, because they continued to use the machines after they
had some evidence that there were problems with the machine --- but faults
in the machine were the source of the problem.

Second, are computer-controlled devices a *special problem*?  Overall, I
agree with Don that the problems of testing and validating machinery are
broadly similar whether the machinery includes a computer or not.  However,
we currently have a special problem with computer-controlled devices because
industry practices in software development are often much worse than for
other kinds of technology.   The Therac is a glaring example of this;
the physical design of the radiation-producing apparatus was considered
superb; the control system, and in particular the software, were (it is now
clear) very poor relative to the safety requirements of this application.
Therefore, I do not think that articles in the press (or RISKS postings)
devoted to this problem are in any way analogous to "blinders"; rather,
they are well-deserved attention to a problem that ought to be fixed.

In particular, it is very important to understand that people are not picking
on the Therac-25 just because the faults involved a computer.  This machine
was more dangerous than machines with similar functionality that were
not computer controlled, even the ones built by the same manufacturer.
The particular hazard manifested in the Therac accidents has been 
well-understood since a similar series of accidents with one of the first
(non-computerized) accelerators in 1966.  Ever since, this hazard has
been adequately handled in most machines by non-programmable hardwired
interlocks.

It is reasonable to expect that successive product generations that introduce
new technologies should represent progress overall.  When a new product
turns out to be *less* safe than its predecessors, that is newsworthy. 

- Jonathan Jacky, University of Washington


Ground proximity warning

Mark Brader <msb@sq.sq.com>
Sat, 14 Jan 89 03:18:58 EST
         [Gerald McBoeing-Boeing and the Near-Sighted McCrew?]

Path: sq!geac!yunexus!utzoo!utgpu!watmath!clyde!att!pacbell!ames!pasteur!
      agate!ucbvax!hplabs!hpda!hpcuhb!hpcilzb!bills
From: bills@hpcilzb.HP.COM (Bill Standerfer)
Newsgroups: rec.aviation                                  [with one typo fixed]
Subject: Boeing Sense of Humor?
Date: 10 Jan 89 16:37:33 GMT
Organization: HP Design Tech Center - Santa Clara, CA

I was paging through a recently acquired 727 manual and came across this little
gem of wisdom.  (GPWS is the ground proximity warning system.  It tells the
crew when the ground is getting too close for what they're doing.)

     "Note: the GPWS will not provide a warning if an airplane is flying
     directly towards a vertical cliff."

Gee, thanks.  I'll keep that in mind. :-}

Bill Standerfer, KG6FQ — hplabs!hpdtc!bills — bills%hpdtc@hplabs.hp.com
Hewlett Packard Design Technology Center
5301 Stevens Creek Blvd., Santa Clara, CA  95052 — 408-553-3139
Restoration crew chief - B-29A and KC-97L - Castle Air Museum


Aircraft

Dale Worley <worley@compass.UUCP>
Fri, 13 Jan 89 18:18:02 EST
In reply to Steve Philipson's remarks about aircraft, a friend once pointed out
to me that fighter aircraft are designed to a lower safety standard than
civilian aircraft, "because if 1 in 1000 crashes due to mechanical problems,
that's far less than are lost due to combat" — as a matter of policy, some
safety is sacrificed for improved performance.

Mr. Philipson also wisely points out that people involved in technology should
point out to the public the risks associated with that technology, so that
intelligent policy debate can be carried out.  Unfortunately, new technology is
often sold as "risk-free", when it isn't.  Even more unfortunately, new
technology often won't be allowed by the public unless a (false) appearance of
no risk is maintained — people reject new technologies on the basis of risks,
even if larger risks are already accepted in old technologies.  (A bizarre case
is AIDS in the United States — the number of people who have ever died of AIDS
in the U.S. is less than the number who die yearly of motor vehicle accidents,
but we don't convene national commissions on motor vehicle accidents!)

Dale Worley, Compass, Inc.                      compass!worley@think.com


You don't know what you've got till it's gone.

<Agre@AI.AI.MIT.EDU>
Sun, 15 Jan 89 14:21 PST
By now we've seen several cases in which computer-based systems failed because
they did not implement features which had been implicit in the physical systems
they replaced.  Thus, for example, physical mechanisms do a great deal of
implicit sanity-checking, inasmuch as ten and ten thousand look much more
different when coded as angular velocities than when coded in binary.
Computational abstraction is attractive because it is less cumbersome than
physical realization, but cumbersomeness is very often a virtue in itself since
it assures that important parts of the world will tend to move at manageable
speeds.  Drawing up balance sheets of risks and benefits of various uses of
computer technology is a good and necessary thing.  The problem is that we've
always benefitted from the implicit virtues of physical objects without ever
having to articulate them.  The time to make a good, thorough list of these
virtues is now, before we've lost them for good.


Data integrity (Re: RISKS-8.5)

Brent Laminack <brent@itm.UUCP>
13 Jan 89 14:28:27 GMT
    A few random thoughts:

    Yes, the time is here when we can no longer believe photographs
we see published.  This even goes for the bastion of reliability:
The National Geographic.  At least two of their covers have been
digitally retouched.  One was of two pyramids and a camel in the sunset.
One of the pyramids was moved over to fit the space requirements of
the cover.  Another cover was a photo of an old man somewhere in
the mid-east, I believe.  They liked his face, but also liked the
headdress another man was wearing, so they put the other headdress
on his head.  It looked real.  Painters have done this for years.
The Mona Lisa was a composite.  What is new is the technology for
doing it in a supposedly "trusted" medium.

    But this realization is catching on.  A friend was in an auto
accident.  No one was hurt, but damage was done to the car.  One of
the parties took a Polaroid photo of the scene.  The attendant police
officer asked to see it.  He signed and dated it on the back.  Otherwise
he said it would be inadmissible as evidence.  His signature was there
to state that yes, that's the way things looked.

    As to evidence of computer crime, I believe U.S. Federal rules
regard whatever the computer prints out as "best evidence".  Scary.

    The intelligent gun brought to mind a friend who's an Electrical
Engineer.  An appliance manufacturer came to him to design an intelligent
toaster.  It has a knob on the front and an LED readout of the "brownness"
setting.  Unfortunately, underneath it is just a timer circuit: the knob sets
a time, not a measured brownness.  The old way of doing things (a bimetal strip)
had feedback from the active site.  Not so the new.  The intelligent
toaster with an open heating element will proudly pop up raw bread
after 90 seconds.  Worse yet, flaming toast could keep being heated 
until it's supposedly brown enough.
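
To make the contrast concrete, here is a small illustrative sketch (in Python,
with made-up numbers and function names, not anything from the actual
appliance): the timer-only design declares the toast done after a fixed time
no matter what the element did, while a feedback design keeps heating only
until it actually senses the target brownness.

    def open_loop_toast(timer_seconds, heat_per_second):
        """Timer-only design: run for a fixed time, blind to the element."""
        brownness = timer_seconds * heat_per_second
        return brownness, "done"          # always claims success when the timer expires

    def closed_loop_toast(target, heat_per_second, max_seconds=600):
        """Feedback design: keep heating until the sensed brownness reaches the target."""
        brownness, t = 0.0, 0
        while brownness < target and t < max_seconds:
            brownness += heat_per_second  # the 'sensor' reads the toast, not the clock
            t += 1
        return brownness, ("done" if brownness >= target else "gave up")

    # With an open (broken) heating element, heat_per_second is 0:
    #   open_loop_toast(90, 0.0)    -> (0.0, "done")     raw bread pops up "brown enough"
    #   closed_loop_toast(1.0, 0.0) -> (0.0, "gave up")  the failure is at least visible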

    On the computerization of hospitals, a friend of mine (who shall
obviously remain nameless) was working on software for a hospital.  One
project was the scheduling of IVs.  A typical regimen would be to administer
a unit of saline mixed with some drug every six hours.  i.e. noon, six p.m.,
midnight, six a.m., etc.  Daylight saving time then happened.  Being a good
UNIX system, it carried right on: noon, six p.m., midnight (time change)
seven a.m., 1 p.m., etc.  The hospital was up in arms.  They claimed the IVs
were an hour late.  My friend had to give in.  So now between the midnight
and six a.m.  doses, there may be five or seven hours depending on the time
change.  The administration wasn't particularly worried about over or under
medicating the patients.  Doses around 2 a.m. tend to get skipped.  Moral:
don't leave your money in the bank around the year 2000, and don't check
into a hospital around the daylight saving time changeover.
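
For readers who want to see the arithmetic, here is a small sketch (Python
3.9+ with the standard zoneinfo module; the time zone and the modern dates are
assumed purely for illustration) of why the two schedules split by an hour at
the spring clock change: dosing every six elapsed hours drifts against the
wall clock, while wall-clock dosing silently stretches one interval.

    from datetime import datetime, timedelta, timezone
    from zoneinfo import ZoneInfo

    tz = ZoneInfo("America/New_York")               # assumed hospital time zone
    start = datetime(2024, 3, 9, 12, 0, tzinfo=tz)  # noon, the day before clocks spring forward

    # What the UNIX scheduler did: a dose every six *elapsed* hours.
    elapsed = [(start.astimezone(timezone.utc) + timedelta(hours=6 * i)).astimezone(tz)
               for i in range(5)]

    # What the ward expected: doses at fixed wall-clock times (noon, 6 p.m., midnight, 6 a.m., noon).
    wall = [start + timedelta(hours=6 * i) for i in range(5)]

    for e, w in zip(elapsed, wall):
        print(e.strftime("%a %H:%M %Z"), "elapsed   vs  ", w.strftime("%a %H:%M %Z"), "wall clock")
    # After the 2 a.m. jump the schedules diverge: 07:00 vs 06:00, then 13:00 vs 12:00.
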
                                                             brent laminack 


Quality of Evidence

<WHMurray@DOCKMASTER.ARPA>
Fri, 13 Jan 89 14:03 EST
>Are we approaching the point (or have we reached it already?) where
>truth is, for all practical purposes, whatever the computer says it is?
>Where what is accepted as truth is easily manipulated by those who are
>privileged to have access to the digital keepers of truth?

Recently, in an archeological excavation in the middle east, a large stone
tablet was unearthed.  Scholars determined that it was an ancient audit
report, complaining about the use of papyrus scrolls by the scribes.  It was
clear that such scrolls lacked the evidential integrity of stone and clay
tablets.

As recently as when I got into data processing, auditors were complaining
that punched cards lacked the integrity of ledger cards.  I had to work very
hard to convince the auditors that the new batch controls were equal to the
transaction-by-transaction controls to which they were accustomed.  There is
a cruel irony to the fact that I am still here to hear them complain about
the passing of batch controls and the return to transaction controls.

The more things change, the more they stay the same.  What goes around, comes
around.  Those who fail to heed the lessons of history are doomed to repeat
them.

The same computers that enable us to manipulate records also enable us to make
so many copies that no one person can alter them all.  The same computers that
enable us to digitize an analog record (e.g. a photograph), manipulate it, and
return it to analog, also enable us to create digital signatures to make any
such tampering obvious and the absence of such tampering equally obvious.
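
As a concrete illustration of that last point, here is a minimal sketch of
signing and verifying a digitized record.  It assumes the third-party Python
"cryptography" package and an Ed25519 key; the record bytes are a hypothetical
stand-in for a scanned photograph, and a real deployment would also have to
manage and certify the keys.

    from cryptography.hazmat.primitives.asymmetric import ed25519
    from cryptography.exceptions import InvalidSignature

    def sign_record(private_key, record: bytes) -> bytes:
        """Bind the signer to exactly these bytes."""
        return private_key.sign(record)

    def is_untampered(public_key, record: bytes, signature: bytes) -> bool:
        """True only if the record is bit-for-bit what was signed."""
        try:
            public_key.verify(signature, record)
            return True
        except InvalidSignature:
            return False

    key = ed25519.Ed25519PrivateKey.generate()
    photo = b"...bytes of a digitized photograph..."       # stand-in for the scanned image
    sig = sign_record(key, photo)
    print(is_untampered(key.public_key(), photo, sig))         # True
    print(is_untampered(key.public_key(), photo + b"x", sig))  # False: one altered byte shows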

In the nineteenth century wills and contracts were expected to be handwritten.
When the typewriter came along, they continued to be handwritten for some time
for reasons of admissibility as evidence.  Today, a handwritten will is
suspicious.  Even though digitally signed wills and contracts are orders of
magnitude more difficult to forge than typewritten ones, typewritten documents
will likely survive, even be preferred, for two more decades.

There was a time when the testimony from memory of the elders was preferred to
written records.

In this context, it is interesting to note that a vanishingly small number of
transactions are disowned.  Almost none are litigated.  A single forgery hardly
ever carries the day.  Hardly ever is the record of the contract at issue; it
is almost always the intent.

Written on the list of heresies and other words I try to live by, it says
"there is no truth, there are only hypothesies and evidence."  In the short
run, while we rethink our ideas of evidence yet again, the forgers may have a
field day.  I am not much worried for the long run.

William Hugh Murray, Fellow, Information System Security, Ernst & Whinney
2000 National City Center Cleveland, Ohio 44114                          
21 Locust Avenue, Suite 2D, New Canaan, Connecticut 06840                


D.Robbins' conclusions (Authenticity of Information)

Allan Pratt <apratt@atari.UUCP>
Fri, 13 Jan 89 11:57:40 pst
In RISKS volume 8 issue 5, Dave Robbins writes:

> We have no practical means of verifying the integrity of 
> such large volumes of information, and are thus left with no choice but 
> to trust that the electronic records are accurate. 

On the contrary.  Our other choice is to REFUSE to trust the accuracy of
the records.  If there is a computer record of a $100,000 withdrawal
from my savings account, the bank does not have to trust the record. 
The computer record is circumstantial evidence: it might provide useful
insight for further investigation, but it is not to be trusted as
conclusive proof. 

> It is wholly 
> impractical, for example, for the Social Security Administration's 
> entire data base (how many hundreds of millions of individual records?) 
> to be manually audited to verify its accuracy.

It would be no less impractical if all that information were on 3x5"
cards.  When dealing with volumes of information like this, you accept a
certain RISK of fraud and error as the norm, and investigate (manually
audit) the most egregious cases.  You can't blame computers for causing this
situation, and I think you'll have to give them credit for helping
ameliorate it.

Opinions expressed above do not necessarily reflect those of Atari Corp.
or anyone else.          Allan Pratt, Atari Corp.        ...ames!atari!apratt


Risks of trusting the press

Brad Templeton <brad%looking.uucp@RELAY.CS.NET>
Fri Jan 13 14:32:20 1989
The Hacker's Conference episode is just one of many.  Readers of USENET
last month closely followed attempts by the press to shut down my own
moderated newsgroup.  As in the CBS case, where you were "guaranteed" that
the story would put you in a good light, the reporter who interviewed me
acted in a very sympathetic manner.

Ha.

With most reporters I have encountered in this area, the fact is this:
If the reporter decides in advance that you're a wrongdoer, then just
about anything is ethical to get the story.  In particular, they will
pretend to agree with you and indicate that they are writing a favourable
story.   After all, it's not unethical to lie to criminals to get them to
expose themselves, is it?

This is general advice, but we must be particularly careful when it comes
to public exposure of modern technology.   People are predisposed to
fear it.  People are now predisposed to link "hacker" with "criminal."  People
are predisposed to link "computer network" with "underground."
Watch out for this.  If you suspect the slightest bit of prejudice, clam
up.  Don't trust a word they say — their motives are not yours.

The image of technology is very important to RISKS.  It controls what
technologies people will trust, and how they will trust them.


Risks of Remote Student Registration: Another Interaction Story

<MCCLELLAND_G%CUBLDR@VAXF.COLORADO.EDU>
Mon, 28 Nov 88 09:54 MDT
An anonymous contributor in RISKS-7.82 notes the dangers of computer course
registration procedures using touchtone phones.  Our university also has the
same system and also uses the easily accessible SSN and birthdate as id's and
passwords.  Those risks are bad enough but I'm more fascinated by the risks
produced by the unexpected interaction among new computer technologies.  Our
university is much more concerned presently with getting computer registration
to work right than about security of the system.  Last semester, the system's
first run, many more students than anticipated had incomplete schedules because
the computer, not knowing any better, actually enforced prerequisites that had
long been ignored, blocked out the entire three hours scheduled for a lab that
everyone knew really only lasted one hour, etc.  This meant an astounding
number ("astounding" means about 30 times more than the system was designed to
handle) of students had to complete their schedules in a two-day period using
touchtone phones and a few scattered terminals to drop and add courses.  Of course,
most students trying to call the computer got busy signals.  Now here's the
interaction: not long ago the university also installed a fancy local switch
that gave all campus phones, including one in every dorm room, all sorts of
fancy features.  Not only was automatic redialing available, but also a cute
feature that calls you back when the busy line you are calling becomes free.
No telling how much of the switch's resources are required for that little
goodie.  The obvious outcome was that both the computer registration system AND
the campus phone network were brought to their knees.  Smart students then
figured out they were better off calling from off campus even without the
auto-redial features, but then the whole community phone system became
sluggish.
                                 Gary McClelland


Medical information systems

Jerry Harper <jharper@euroies.UUCP>
Fri, 6 Jan 89 12:29:32 GMT
A few weeks ago I mentioned the problems that the Irish Department of Health
had with the system installed by MCAUTO IRELAND.  I would be very grateful if
anyone reading this with experience of said system in the US would take the
trouble to email me their observations.  
