The RISKS Digest
Volume 12 Issue 24

Wednesday, 4th September 1991

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Radiation therapy machine dose rate doubled by configuration error
Lawrence W. Berkley and James A. Purdy
summarized by Jon Jacky
Salomon Brothers — Database Design [anonymous]
Airworthiness Directive for 747-400 electrical system
Robert Dorsett
`Risk perception'
Phil Agre
Re: Risk Assessment High Priesthood
Robert W. Kerns
Re: A number is no name
EKristia...
Re: Re: +&*#$
Bob Frankston
Re: "Thieves Hit Social Security Numbers"
Urban Fredriksson
Re: Risks of a Universal Identifier
Martin Minow
Info on RISKS (comp.risks)

Radiation therapy machine dose rate doubled by configuration error

Jon Jacky <JON@GAFFER.RAD.WASHINGTON.EDU>
Wed, 4 Sep 1991 13:52:02 PDT
This incident was reported in a poster presentation at the annual meeting of
the American Association of Physicists in Medicine (AAPM) held in San Francisco
last July 21 - 25.  A brief abstract appeared in MEDICAL PHYSICS, 18(3),
May/June 1991, p. 608.  Some of the material quoted here will also appear in a
forthcoming AAPM Task Group 35 Report on Medical Accelerator Safety
Considerations.  Included here with permission.  Jon Jacky,
jon@gaffer.rad.washington.edu, University of Washington, Seattle

                       ==============================

Excerpts from Lawrence W. Berkley and James A. Purdy, "The Need for Better
Communication between Accelerator Manufacturers and In-House Service Engineers"

... This [incident involved] the switching of the "target and filter" interlock
boards between a Varian Clinac 1800 and a Clinac 2100C ...   Please note that
good quality assurance practices [at the clinic] detected the problem before
any significant increased dose was delivered to any patient.

A PC board intended for use on a Varian Clinac 2100C was placed in a Clinac
1800 by an in-house service engineer [an engineer was on the clinic's staff,
not the vendor's staff]. ...

The boards for the two machines had the same part numbers. ...  EPROM's on the
boards for the two machines were programmed differently due to  [to accommodate
the different characteristics of] different types of ion chambers [present on
the two different accelerator models].

When the incorrect PC board was in the Clinac 1800, the calibration changed by
over 100% for some beams. [A single therapy machine can produce several types
of beams that differ in particle (photons or electrons) and energy]. ... For
these beams, the calibration changed from 1.00 cGy/MU [centigrays (a unit of
radiation dose) per monitor unit (the indicator on  the therapy machine display
screen or control panel)] to 2.08, 2.51, and 1.09 cGy/MU for the 9, 12, and 16
MeV [electron] beams respectively.  This was due to reduced ion chamber
sensitivity.  ...

The dosimetry error was detected during a routine constancy check of beam
output.  [It is usual good practice to check each machine's internal dosimetry
system by making frequent independent measurements at the clinic with equipment
that is entirely separate from the therapy machine.  The "morning constancy
check" is usually performed every day.]

When the incorrect Target and Filter interlock board was placed in the Clinac
1800 ... no dosimetry fault was tripped and the machine appeared to be running
normally.

Varian was aware of the possibility for this to occur but did not alert new
owners of the Clinac 1800's from 1987 to 1990. ... Although notice was sent
to users of the equipment in 1986 following a similar incident, new owners were
not notified of the potential problem. ...

The fact that the incident described is for a Varian accelerator, is not
intended to imply that such problems exist only with the Varian organization.
We feel strongly that it is an industry wide problem and a solution must be
found rapidly to avoid any serious consequences. ...

The report relating to this incident issued by the Problem Reporting Program of
the FDA failed to mention the large change in machine calibration ...

Recommendations:

A cumulative list of problems unique to each model of accelerator should be
maintained by all manufacturers for their models.  This list should be made
available to all existing users and should be brought to the attention of new
purchasers of accelerators. ...

Medical physicists should be constantly aware that accelerators are capable of
large changes in calibration with no indication of a problem.  This reinforces
the need for frequent output checks.

Medical physicists should be aware that manufacturers and the Problem Reporting
Program of the FDA may temper their notifications to users such that serious
problems appear to be fairly benign.


Salomon Brothers — Database Design

<[anonymous]>
Tue, 3 Sep 91 08:57:07 EDT
The recent Salomon Brothers securities scandal was caused in part by sloppy
database design, according to an employee in the database programming
department.
Normally whenever there is a buy or sell order, several "confirmations" are
sent to individuals of the represented organization.  Four traders were
able to exploit this system by setting the number of confirmations to zero
and subsequently trading in an unauthorized and unsupervised fashion.

Among the many changes in the Salomon Brothers firm, a new requirement for
management to authorize setting confirmations to zero is being implemented in
their software, along with a new audit trail of the confirmation process.
Surprisingly, some organizations actually want no confirmations at all.
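
A minimal sketch of the guard being described, with entirely hypothetical
names (Salomon's actual system is of course not public): a zero confirmation
count demands a recorded authorizer, and every change lands in an audit trail.

  # Hypothetical sketch: zero confirmations require management
  # sign-off, and every change is appended to an audit trail.
  import time

  audit_trail = []

  def set_confirmations(order_id, count, changed_by, authorized_by=None):
      if count == 0 and authorized_by is None:
          raise PermissionError("zero confirmations need management sign-off")
      audit_trail.append((time.time(), order_id, count, changed_by,
                          authorized_by))
      # ... the order record itself would be updated here ...

  set_confirmations("T-1042", 3, "trader17")                        # normal
  set_confirmations("T-1043", 0, "trader17", authorized_by="mgr4")  # logged

Of course, the trail is only a control if someone actually reads it.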

The problem is certainly not new to readers of RISKS, and the proposed solution
is not particularly inspired.  Another unread audit trail or a scandal at the
next level of hierarchy both seem possible, unaddressed, and unacknowledged.
What is new is the fact that the Salomon Brothers scandal has become an
international incident with severe political and economic consequences.
Butterfly wings and programmer fingertips can both cause hurricanes.

Of course, had the traders set the confirmations to `O' instead of `0',
Stephen could be the most influential man on Wall Street...

It should also be noted that Salomon Brothers does not allow the programming
department direct access to the Internet or Usenet, and is a company that
routinely monitors electronic mail.


Airworthiness Directive for 747-400 electrical system

Robert Dorsett <rdd@rascal.ics.utexas.edu>
Tue, 3 Sep 91 22:12:04 CDT
From the Federal Register 56:159, August 16, 1991, pp. 40773-40774.  Ties
into discussions of common-cause-of-failure cases on RISKS a couple of
years ago.  The 747-400 is Boeing's newest 747, featuring a glass cockpit
with conventional hydromechanical flight controls.  The aircraft was
rolled out January 26, 1988.

"Summary: This amendment adopts a new airworthiness directive (AD), applicable
to certain Boeing Model 747-400 series airplanes, which requires rerouting and
adding shielded wiring associated with the differential protection current
transformers in the P6 panel.  This amendment is prompted by the results of a
Model 747-400 electrical system safety assessment, which demonstrated that the
potential exists for a single event causing the loss of all normal sources of
airplane electrical power.  This condition, if not corrected, could result in
the loss of all normal sources of electrical power to the airplane essential
busses, limiting power availability to that provided by the standby system.

"...No commenter expressed any technical objection to... the rule...  Two
commenters requested that the proposed compliance period of 180 days be
extended to 12 or 15 months so that the modification could be performed during
other scheduled maintenance...  One commenter stated that, since this AD is
based on a safety assessment, and not an actual occurrence, an increase in the
proposed compliance time to 15 months would not compromise safety; and since
the airplanes affected by this proposed rule are relatively new and most have
been only recently delivered, chafing of the affected wire bundles during that
time seems unlikely...

"One commenter ... recommended that ... compliance time be reduced to 60 or 90
days.  This request was based on the commenter's stated opinion of the dire
consequences of losing all electrical power on a long overwater flight..."

[The FAA doesn't concur with any of these; it recommends that the rule be
adopted as proposed.]

"There are approximately 107 Model 747-400 series airplanes of the affected
design in the worldwide fleet.  It is estimated that 18 airplanes of US
registry will be affected by this AD, that it will take approximately 8
manhours per airplane to accomplish the required actions, and that the average
labor cost will be $55 per manhour.  The cost of required parts per airplane is
estimated to be $20.  Based on these figures, the total cost impact of the AD
on US operators is estimated to be $8,280."
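
[The total is internally consistent: 18 airplanes x (8 manhours x $55/manhour
+ $20 in parts) = 18 x $460 = $8,280.]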

Robert Dorsett               UUCP: ...cs.utexas.edu!rascal.ics.utexas.edu!rdd
Internet: rdd@rascal.ics.utexas.edu


`Risk perception'

Phil Agre <pagre@weber.ucsd.edu>
Tue, 3 Sep 91 16:46:16 pdt
Wm Gardner's message is helpful in that it makes clear just how much is
at stake in accepting or rejecting the basic validity of research on `risk
perception'.  First a few side issues.  I did not mean to suggest that
``attributing irrational judgment to ordinary people can make them seem
responsible for the morbidity and mortality they suffer.''  That may well
be true, but I haven't actually encountered that particular interpretation.
Also, the LA Times article I cited contained a quote from an interview with
a risk expert to the effect that ``ordinary people are unwilling to accept
any risk at all'' though I am afraid that I no longer have my copy of the
article.  I doubt if anyone would say this in an official journal, and the
rhetorical context was something in the spirit of irony or hyperbole, but
I have encountered such opinions regularly in conversation with people who
work in industries where `risk' is a controversial issue.  Next, when I said
that the `findings' quoted in the LA Times article are ``easily explained''
in such-and-such a way, I took that to be a figure of speech meaning,
in scientific language, ``those findings are equally compatible with the
hypothesis that such-and-such, so that (1) no conclusion of irrationality
can be drawn at this point and (2) in searching for an explanation, it is
necessary to explore other new directions of research.''  Finally, I want
to make sure that WG and others don't take me to be saying that most `risk'
researchers operate in bad faith.  I am sure that many such people share
a concern with responsible technology, however misguided I may find their
approach to it, and I should probably not have put the word `research' in
sneer quotes when discussing this work.  However...

WG and I have different views about the social context within which `risk'
research operates — or, more precisely, about the consequences of that
social context.  Whereas my argument emphasizes the role of corporate PR
and recent history, WG wishes to portray risk research as a scientific field
with a history and origins like any other.  Every scientific field tries to
back-date itself to distant origins and risk research cannot be singled out
for any special criticism in this respect.  Certainly `risk' is an old word
that has been used in various contexts for a long time.  Now, WG says, ``The
concept of risk in the sense used in risk perception studies dates (at least)
from the beginnings of epidemiology and from the integration of probability
into the theory of insurance in the 18th century.''  The examples given here
involve a natural phenomenon and a business-risk calculation performed within
a single firm.  But what is new and distinctive about contemporary `risk'
controversies is that the `risks' involve dangers to the public at large that
result from human (bureaucratic) action.  At this point, `risk' becomes part
of something altogether different.  Though WG says that ``[p]sychological
research on risk perception and probability judgments was well established
when Mobil ran its ads,'' I did not mean to imply that the whole thing
sprung fully grown from Mobil's ads.  What I did mean to imply is that, with
the growth of public controversy around the politics and morality of large
technology-based organizations, there has evolved a huge ideological machinery
that propagates things *like* those Mobil ads through all manner of media.
This is the important thing to understand about PR.  It's not just a matter
of quarter-page ads and talking heads spewing reassuring nonsense when things
blow up.  And it's not just a matter of people being bad.  It's a matter of
large corporations involved in extremely competitive businesses, where profits
or losses can turn on sustained public assent to marginal technological
projects.  Ten thousand Mobils large and small are at work as we speak, driven
by the imperatives of the market to attempt to capture and occupy the workings
of civil society.  The principal weapon in this battle is not cash or fast
talk; rather it is precisely the sort of ideology I outlined in my note.

The people of Bhopal may have some trouble distinguishing between this
motivated sort of `risk'-talk and the properly scientific sort.  The readers
of the LA Times are probably having a hard time too.  Even so, it is important
for us to reflect carefully on the conditions under which it is possible to
conduct responsible scientific research.  WG's message argues in effect that
we can judge `risk' research in isolation from its social context.  Can we?
Let us suppose that risk researchers, by and large, are willing to dissociate
themselves from (for instance) those Mobil advertisements.  (Remember the
ones — cited in Langdon Winner's essay on `risk' in his book ``The Whale
and the Reactor'' — about how `risk-taking' is a good old American value,
how `risk'-avoidance is thus un-American, and how the very essence of the
Americanness of companies like Mobil is all of the business risks that made
them what they are today?  The word `risk' is uncannily plastic in this sort
of way.)  Let us also suppose that an audit of the sources of research funding
for `risk' research would also be beside the point.  (I don't actually know
what such an audit would turn up; and I do indeed think that such an audit
would not, all by itself, mean anything.)  My own argument would be that the
logic of `risk', however unconscious or unintended, is inherently ideological
in the way my note asserted.  I do not pretend that this is an easy argument
to make.

I am glad that WG has chosen as his test case the sexual lives of young gay
men.  It is easy to concur with a call (not actually present in WG's note, but
presumably implied) for the wide dissemination of accurate information about
HIV and AIDS.  This is not the issue.  The issue is how this process should
proceed.  The example is an excellent one because the AIDS epidemic itself is
a terrific illustration of a non-unilateral, community-based response to a
serious public emergency.  Gay community activists differ in their beliefs and
approaches, but collectively they have been an inspiring model of how people
can take scientific, medical, and social issues into their own hands.  My own
view, which I think many in this community would share, is that phrases like
``availability heuristic'' are intolerably dull instruments for understanding
and acting upon human questions.  The world of gay people is a complicated
place, shot through with the social and psychological effects of eons of
homophobic prejudice, often dressed up as the most extraordinary scientific
nonsense.  Terms like ``availability heuristic'' do not begin to scratch the
surface of the epistemological situation of a young gay man.  Around 1985 the
gay community decided that it was not going to wait around while people with
generalized expertise about `risk' and the like designed studies ``that can
powerfully discriminate among many competing plausible explanations,'' all
of them founded in ignorance and likely to be wrong.  Through detailed study,
community-based research, and vociferous activism, they changed the conduct
of American science for the better.  And, to the extent that they have been
permitted by the keepers of `public service announcement' space and the like,
they have also been conducting an imaginative and highly successful education
campaign, based on their own extensive discussions among themselves, which
not only says ``sex can give you germs'' but also places the epidemic within
a broadly drawn political and economic context.  *This* is what socially
responsible medicine looks like — and it is a success that ought to be
replicated by the victims of toxic wastes, unsafe workplaces, and several
other horrors of the market as well.

In conclusion, it is my argument that `risk' research is inherently complicit
with the ideology of Mobil oil so long as it persists in understanding `risk
perception' as a narrowly drawn cognitive matter, and not as a reflection of
rational responses to corporate sophistry and fundamental disagreements about
the social organization of technology and medicine.  Probability theory is not
wholly irrelevant, but the numbers only make sense in a very large context.
Until `risk' research decides to start at the beginning, it cannot *help* but
portray ordinary people as irrational, jumping at shadows and losing out to
experts who are paid to tell us why we don't think that everything's all right.

Phil Agre, UCSD


Risk Assessment High Priesthood

Robert W. Kerns <rwk@Crl.dec.com>
Tue, 3 Sep 91 22:22 EDT
In regard to Phil Agre's remarks on PR and the Risk Assessment field: I too am
offended by most of what I see in this area.  (Note: What I see is a biased
sample, and not representative of all research done in the field).  I consider
the field to be seriously contaminated by self-serving interests, to the point
where I don't trust ANY conclusions presented, unless I'm also given the data,
methodology, and chain of reasoning.

The most fundamental flaw in my view, and one I haven't seen discussed except
peripherally, is the assumption that risk can be reduced to comparing numbers
of deaths, with all deaths being equal.  Whenever the public at large doesn't
agree with this, "public reaction" is labeled as being irrational.  It is the
strong parallel here between religious dogma and Risk Assessment
"professionals", which leads me to term this a High Priesthood.  Time and time
again, I see results which are surprising when viewed from an "all-deaths-equal"
viewpoint, used to argue that people's perceptions of risk are flawed.

I KNOW there's value to scientific methodology in analyzing risk, and I would
like to make use of it in forming my opinions.  But when most of what I see
masquerading as analysis is contaminated with this religious assumption, I am
really thwarted in using Risk Analysis.

Let me take one oft-cited example, coal vs. nuclear, and twiddle it a bit to
make what I'm talking about clear.  (Numbers invented; I'm illustrating a RA
concept, not comparing coal and nuclear.)

  Coal:  5 deaths per 10,000 man-years.
  Nuclear:  1 death per 10,000 man-years.

Sounds like Nuclear is the clear winner here, right?

But let's consider a couple scenarios:

1)  Coal:  Equal geographic distribution of risk over area of benefit.
    Nuclear:  Risk concentration around the plant.

2)  Coal:  Constant, predictable rate of deaths.
    Nuclear:  Low rate, except in rare accident, resulting in very
    high cost of health care, entire families wiped out.

The results I've seen indicating that people's perceptions of risk are "skewed"
indicate to me that people are more averse to risks with particular
characteristics:

*  Unfair distribution
*  High concentration; that is, occasional disaster is worse than
   continual high risk.
*  High subsidiary cost, such as an area of land rendered
   uninhabitable, or high health-care costs.
*  Low amount of individual control over individual risk factors.
*  Risks whose assessments are based on questionable data or come from
   sources whose veracity is suspect.

To me, this seems to be a much more rational approach than reducing
the debate to numbers killed by coal or nuclear.

From this, you might assume that I am wildly pro-coal and
anti-nuclear, which would exaggerate my position considerably.
(I'm anti-coal, and consider chemical waste to be more serious
than nuclear.  Nuclear waste decays, but heavy metals are forever!)

I see the same phenomena operating in the assessment of air-travel risks, where
people are more concerned about air accidents, which wipe out entire families,
and over which there is little personal control and choice, than about
automobile travel, which is far more dangerous.  This concern is viewed as
non-rational, but I find it eminently rational.  Killing off and injuring large
numbers of people at once overloads our mechanisms for dealing with tragedy, by
overloading emergency health-care, wiping out large segments of families,
wiping out entire upper-level management of companies, or rock groups, or
what-have-you.

There's more structure to risk than a single scalar value, deaths per 100,000
{man-hours, passenger-miles, etc.}, and these scalar numbers are not what
society tries to optimize.

Real scientific research would try to go to the next step, and model and
quantify what people really DO try to optimize.
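
To sketch what such a model might look like (every weight and score below is
invented purely to show the structure, not to argue for either technology):

  # Toy multi-attribute risk comparison.  All numbers are invented;
  # the point is the structure, not the conclusion.
  weights = {"death_rate": 0.30, "concentration": 0.20, "unfairness": 0.20,
             "subsidiary_cost": 0.15, "lack_of_control": 0.15}

  coal    = {"death_rate": 0.5, "concentration": 0.1, "unfairness": 0.2,
             "subsidiary_cost": 0.3, "lack_of_control": 0.4}
  nuclear = {"death_rate": 0.1, "concentration": 0.9, "unfairness": 0.8,
             "subsidiary_cost": 0.9, "lack_of_control": 0.8}

  def perceived_risk(scores):
      return sum(weights[k] * scores[k] for k in weights)

  print("coal:    %.2f" % perceived_risk(coal))     # ~0.32
  print("nuclear: %.2f" % perceived_risk(nuclear))  # ~0.62

On the scalar death rate alone, nuclear wins five to one; on the weighted
score the ranking reverses, and nobody in the model is behaving irrationally.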

But it's easy to look scientific if you ignore this and say "see, it all
reduces to this number, which scientifically PROVES it."


Re: A number is no name (RISKS-12.20).

<EKRISTIA@estec.bitnet>
Tue, 03 Sep 91 08:52:17 CET
In his accompanying note, PGN is musing whether characters traditionally
considered non-alphabetic, such as "!" (I hope the exclamation mark translates
correctly from my IBM PROFS to the machine running the list) could be a name.

I recall reading in Scientific American some time ago about a people somewhere
in Africa named !KUNG. The exclamation mark is part of the name and is
pronounced as a [Zulu] "click" (as in Miriam Makeba's "Click song").

In fact, this is just an example of a much broader issue: National alphabets.
Computer alphabets like ASCII and EBCDIC originated in an English-speaking
country and consequently know only the 26 letters of the English alphabet.
Most other countries using, basically, the Roman alphabet, however, have a few
more characters. Examples: a, o, u, with two dots on top ("umlaut") in German
and Swedish; a with a small circle on top in Danish, Swedish, Norwegian; n with
a tilde in Spanish; accents in French, Spanish and Turkish, and many more.

National extensions to e.g. ASCII exist, but unfortunately they tend to overlap
and to "steal" characters like the square brackets away from the US-ASCII.  The
result is that names (or anything else) containing one of the national
characters may print quite differently on a computer assuming another national
alphabet. I recently printed a C program on a German PC. All the square
brackets came out as A-umlaut and o-umlaut, respectively. Not very readable!
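
The mechanism is easy to demonstrate.  The German national variant of 7-bit
ASCII (DIN 66003, one of the ISO 646 family) reassigns a handful of code
points, so the very bytes that mean brackets and braces in US-ASCII display
as umlauts there.  A small sketch in Python:

  # Code points that DIN 66003 (German ISO 646) reassigns relative to
  # US-ASCII.  The bytes are identical; only the glyphs differ.
  DIN_66003 = {"@": "§", "[": "Ä", "\\": "Ö", "]": "Ü",
               "{": "ä", "|": "ö", "}": "ü", "~": "ß"}

  def on_german_terminal(text):
      return "".join(DIN_66003.get(c, c) for c in text)

  print(on_german_terminal("a[i] = b[i] | mask;"))
  # prints: aÄiÜ = bÄiÜ ö mask;

Eight-bit character sets such as ISO 8859-1 sidestep the problem by giving
each national letter its own code point, at the price of yet another encoding
everyone must agree on.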

With the increasing mobility of people, more and more people end up in places
where computers cannot handle their names properly because they contain
characters which are not in the alphabet of the country of residence. In
practice, of course, there are work-arounds like representing special letters
by letter combinations which are phonetically reasonably close. But this is not
very satisfactory.

I wonder what your legal situation is if you refuse to accept documents, say,
from public authorities, as long as they cannot spell your name properly.  (I
am a Dane living in Holland, but I have no "non-standard" letters in my name,
so I cannot put it to a test.)


Re: Re: +&*#$ (Blinn, RISKS-12.22)

<frankston!Bob_Frankston@world.std.com>
3 Sep 1991 20:56 -0400
Admittedly I wasn't trying to be exact in my representation of NH plates.
But the response reminds us that the risks of technology make us understand
existing risks better.  The problem of encoding license plates is present
even in paper systems.  Is a "-" significant?  Might there be both 12-134 and
121-34 or, even, 12<space>13-4?  How about +RMF+ (which I've seen on an NH
plate)?  Obviously the Maine Lobster is decoration.  Or is it?  Does NH
purposely create a situation that makes it less likely for their drivers to
get out of state tickets since other states can't refer to the plate numbers?

I once had my office manager order property stickers.  Since there were to be
both removable and nonremovable stickers, I ordered some of each in
different colors.  I didn't expect her to order the same series of numbers (1
to 1000) on both sets. Obviously she saw the color as a significant
distinction while I was assuming it was unrelated to the actual numbering.

In Massachusetts the number series are reused for each class of plates.  Thus
the Taxis are numbered from 1. (The Governor, however, is G-1 because one of
the previous Governors refused to return his "1" plate — politics triumphs
over technology.)  Of course, the absence of check digits on license plates is
also naive, considering the importance of accuracy in recording the numbers.
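
One simple scheme, invented here for illustration (real registries vary): a
position-weighted sum mod 10, which catches most single-character misreadings
and makes even the punctuation significant.

  # Sketch of a check character for plate numbers: position-weighted
  # sum mod 10.  The scheme is invented for illustration only.
  ALPHABET = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ"

  def check_digit(plate):
      total = sum((i + 1) * ALPHABET.index(c)
                  for i, c in enumerate(plate) if c in ALPHABET)
      return ALPHABET[total % 10]

  print(check_digit("12134"))                          # '0'
  print(check_digit("121O4"))                          # '4': 3 misread as O
  print(check_digit("12-134"), check_digit("121-34"))  # '8' '7'

Because positions are counted with the punctuation in place, 12-134 and
121-34 would even carry different check characters.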


Re: "Thieves Hit Social Security Numbers" (RISKS-12.23)

Urban Fredriksson <urbanf@yj.data.nokia.fi>
Wed, 4 Sep 91 10:49:58 EET_DST
One reason SSNs are seldom 'stolen' in Sweden is the fact that they are public.
For important things, nobody [should] believe that I am who I claim to be just
because I know my number.

If I take out a loan card at a library, they will want my number (so if I don't
return books and move they can track me), but they won't give me a card without
seeing an ID.

When I opened accounts at a bank I hadn't done business with before, they
asked me for my number, but no ID.  Unsafe, you say?  Well, they also didn't ask
me for my address, and didn't give me anything at their office.  They sent all
confirmation papers to my address anyway, so I had to sign for them at the post
office.

But there have been cases of serious SSN abuse: One man, a drug addict, went to
a hospital wanting aid, and gave them his brother's SSN. For _security_
reasons, no hospital computer records may be erased, so now the brother has
a permanent record of being a drug addict.

And at the same time, the health services don't use our SSNs for keeping
track of what prescribed drugs we are given, so you can go to 20 doctors, and
be given 20 prescriptions for the same (mildly) narcotic drug.

Urban Fredriksson, Stockholm, Sweden.


Re: Risks of a Universal Identifier

Martin Minow <minow@ranger.enet.dec.com>
Tue, 3 Sep 91 13:11:41 PDT
I used to live in Sweden where there is a universal identifier, assigned to
natives at birth, and to immigrants when they get a residence visa (but not
to tourists).  There are a number of advantages:

-- when you move, you fill out one postcard and all of your magazine
   subscriptions (etc.) change, since all of the publishers subscribe to the
   "change of address tape."

-- since the "personnummer" is an official id and Sweden has an extremely
   strong data privacy law, there are safeguards surrounding its use.  The
   Swedish data privacy law controls information processing where there is a
   "risk for personal integrity." (For example, when I applied for an
   American Express card, I gave my bank reference. Two weeks later, my bank
   sent me — unsolicited — a copy of the credit report they sent AmEx. This
   wasn't because of the d.p. law, but illustrates the way private information
   is disclosed in Sweden.)

There are also risks — fewer, however, than there would be in America, with
its reliance on industry to do the right thing without unnecessary government
meddling in the workings of the free enterprise system.

By the way, and not entirely off the subject, there is an interesting
background to the now twenty-year-old Swedish data privacy law. The "change
of address tape" contains public information which MUST be made available to
any requestor. Public information, in Sweden, includes civil status,
profession, age, sex, weapon-license possession, and taxable income. As I
recall the story (from a Swedish newspaper), one of the incidents that led
to the d.p. law was the monthly purchase of the change of address tape by a
large American company in the credit-bureau business. When the government
discovered this, they realized that this was ideal material for "economic
espionage." Since access couldn't be restricted under the "Sunshine Laws,"
they restricted its use.

Like most laws, the Swedish d.p. law is only a few pages long.  If someone
could get me a current copy, I could try to knock together a translation
for Risks (or perhaps CACM).
                               Martin Minow      minow@ranger.enet.dec.com
