Forum on Risks to the Public in Computers and Related Systems
ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator
Volume 6: Issue 20
Tuesday, 2 February 1988
Contents
Unusual Computer Risk -- Harem Scarem?- Mike Bell
Mistaken AIDS warnings- Al Stangenberger
Human error vs human error (and bad design)- George Michaelson
Technology Transfer Policy- Henry Spencer
Re: Blowing Whistles or Blowing Smoke?- Ronni Rosenberg
Dan Franklin
Jonathan Kamens
Phil Agre
Steve Philipson
Frank Houston
Re: Virus anxiety expressed in NY TIMES- Amos Shapir
Info on RISKS (comp.risks)
Unusual Computer Risk -- Harem Scarem?
Mike Bell <mcvax!camcon!mb@uunet.UU.NET>
1 Feb 88 13:17:06 GMT
Reproduced from COMPUTER TALK - 1 February 1988

COMPUTER PROGRAM DRIVES ARAB TO SEXUAL EXHAUSTION

A Saudi Arabian millionaire thought he was heading for conjugal bliss when he
had the bright idea of organising his harem by computer. Unfortunately his
plan misfired. Instead of leaving him with the satisfied smile of a clever
Casanova, Saleh-el-Modiia's rigorous regime left him completely knackered -- a
fact which one of his four wives tearfully related to a newspaper in the Saudi
city of Riyadh. "The computer has gone haywire. It's making Saleh too
exhausted... he just falls asleep in my arms," she said.

The computer devised a weekly schedule for the 38-year-old failed Lothario
after he had keyed in his wives' ages, birthdays, clothes sizes and medical
details. The schedule told him whom to see, what to wear, and what he was
meant to do. But even though Modiia's wives are complaining, he refuses to
ditch the computer. "It's only gone wrong once. That was when I was in
hospital and all four wives came to visit me at the same time," he said.

Mike Bell, Cambridge Consultants Ltd, Science Park, Milton Road, Cambridge CB4 4DW
UUCP: ...!ukc!camcon!mb or mb%camcon.uucp   +44 223 358855

  [I saw this a while back, but I don't recall it appearing in RISKS.  PGN]
Mistaken AIDS warnings
Al Stangenberger <forags@violet.Berkeley.EDU>
Tue, 2 Feb 88 08:44:07 PST
I heard a report on KCBS this morning that two Berkeley hospitals have
mistakenly sent letters to an unknown number of former patients, warning that
they might have been exposed to AIDS through contaminated blood transfusions.
Naturally, the mistake was attributed to a computer error.

Al Stangenberger, Dept. of Forestry & Resource Mgt.
145 Mulford Hall, Univ. of California, Berkeley, CA 94720
forags@violet.berkeley.edu    uucp: ucbvax!ucbviolet!forags    (415) 642-4424
Human error vs human error (and bad design)
George Michaelson <munnari!ditmela.oz.au!george@uunet.UU.NET>
02 Feb 88 14:06:09 +1100 (Tue)
There is an interesting article in "New Scientist" of 21st January '88,
titled "The Zeebrugge-Harrisburg syndrome", which, broadly speaking, is
about the crossover between human error and bad design.
(article by Stephen Pheasant, two extracts without permission):
1. Three Mile Island:
"...Another example of catastrophic system failure in which ``human error''
is generally acknowledged to have played a critical role took place at the
Three Mile Island Unit 2 nuclear reactor .... They thought that the reactor
was in danger of ``going solid'', that is, overfilling because they were
unaware that a relief valve was open and that water was flowing out almost
as quickly as they were pumping it in. The Status of this indicator
changed when a control signal was sent to the valve, rather than when the
valve itself closed. It was technically easier to do it that way and
nobody had ever thought the difference would be important."
2. A British Motor Car
"...basic error-including mechanisms may have consequences which range
from the catastrophic to the trivial. The Headlamp switch on a certain
British motor car is mounted on the left hand side of the steering column
and is pushed for ``on'' contrary to the general stereotype. On leaving the
vehicle it is easy for the driver to operate this switch accidentally with
the knee. The worst that can result is a flat battery but in another context
(such as the cockpit of an aircraft) the accidental operation of a control
could be catastrophic..."
I'm sure the former item is well known to many (apologies if raised before in
this forum) and I bet there are more examples of "lazy engineering" decisions
having massive consequences.
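
To see how small the Three Mile Island design decision was, consider a toy
sketch (mine, not from the article; all names are made up) of the difference
between an indicator wired to the command sent to a valve and one wired to a
sensor on the valve itself:

    class ReliefValve:
        # Toy model: the valve's actual state can diverge from its
        # commanded state, e.g., when it sticks open.
        def __init__(self):
            self.commanded_closed = False   # what the control system asked for
            self.actually_closed = False    # what the hardware really did

        def command_close(self, sticks_open=False):
            self.commanded_closed = True
            if not sticks_open:
                self.actually_closed = True

    def indicator_as_built(valve):
        # TMI-style design: the light tracks the *control signal*.
        return "CLOSED" if valve.commanded_closed else "OPEN"

    def indicator_with_sensor(valve):
        # Safer design: the light tracks a position sensor on the valve.
        return "CLOSED" if valve.actually_closed else "OPEN"

    valve = ReliefValve()
    valve.command_close(sticks_open=True)    # the TMI failure mode
    print(indicator_as_built(valve))         # "CLOSED" -- misleading
    print(indicator_with_sensor(valve))      # "OPEN"   -- the truth

The first design is cheaper because it needs no sensor on the valve itself,
which is exactly why it was "technically easier to do it that way".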
George Michaelson
ACSnet: george@ditmela.oz.au
Postal: CSIRO, 55 Barry St, Carlton, Vic 3053 Phone: (03) 347 8644
Technology Transfer Policy
Henry Spencer <henry%utzoo.uucp@RELAY.CS.NET>
1 Feb 88 20:09:47 GMT
One negative consequence of the US's attempt to apply its technology-transfer
rules to foreign nationals outside the US is that it makes international
agreements much more difficult. One of the (several) problems that have been
stalling negotiations on international participation in the space station
is that the US wants its technology-transfer laws to apply to foreign users
of the station as well, and the would-be partner countries find this
outrageous and unacceptable.
Henry Spencer @ U of Toronto Zoology {allegra,ihnp4,decvax,pyramid}!utzoo!henry
Whistle Blowing
Ronni Rosenberg <ronni@VX.LCS.MIT.EDU>
Tue, 2 Feb 88 15:02:27 est
In response to the recent RISKS article that bashes whistle-blowing (Guthery,
"Blowing Whistles or Blowing Smoke?", RISKS 6.19), I again want to defend
whistle blowing as an ethically responsible -- sometimes ethically required --
action for some engineers in some circumstances.

Guthery writes: "the very last thing a whistle blower is interested in is
accepting responsibility," a claim that is not supported by the literature on
whistle blowing. Whistle-blowing engineers typically are responsible for some
aspect of a system's current use, not its original engineering. In this
sense, they are concerned about problems that others caused; e.g., Roger
Boisjoly did not design the original shuttle O-rings, but he was responsible
to some degree for their effectiveness. Complex systems are worked on by so
many people, for so long, that the original engineers are likely to be gone by
the time the system begins to be used and a problem arises -- assuming one can
even determine who was responsible for the original work. Is pointing out a
critical problem in one's area of responsibility, when one becomes aware of
it, really just "pointing my finger at everybody else in sight"?

Guthery's other main point, that "most whistle blowing has to do with how the
papers were shuffled and the most predictable aftereffect of whistle blowing
is still more bureaucracy," also is not supported by the literature. The
whistle-blowing case studies that I've seen had to do with conscious
decision-making to reject the concerns raised by engineers (as in the Boisjoly
case, where Morton Thiokol managers appear to have knowingly decided to launch
with unsafe O-rings). Entrenched bureaucracy clearly is a problem; most of
the cases I've read about took place in very large organizations, where it is
hard to get things done. But like it or not, most engineers work in large
organizations with a lot of money at stake, and you cannot enact major changes
any other way. The results of whistle-blowing often are not just paper
shuffling; sometimes they are saved lives or safer systems. Is the assumption
that only papers will be shuffled just a rationalization for remaining silent
when you should speak out?

I couldn't agree more with Guthery's statement that "I don't think we know
enough about building computer systems to build good systems without making
mistakes," but I disagree with his conclusion that we should just be allowed
to make our mistakes, without the annoyance of whistle blowers pointing them
out. We have the right to make mistakes only if we (1) acknowledge up front
that this is the way we have to work, and (2) do not put a system into use,
particularly in a critical application, if we are not sure that it works.

(1) Although the RISKS community seems to agree that many mistakes are made in
any large system, for the most part the computing "profession" does not admit
this. The for-profit part of the industry claims -- through ads, sales
people, grant proposals -- to deliver systems that work, period. But new
products/systems are routinely delivered with many important bugs. Funders
and customers get upset when they see what they really have to go through and
spend to get a system that works reasonably well. Sometimes, as in the recent
bank case, the customer abandons the whole project; you can be sure that time
for "making mistakes" was not adequately built into the bank project.
(2) Whistle blowers usually act in situations where critical systems are in
use, don't appear to be working safely, but are alleged to be working fine.
What gives us the "right" to make mistakes in such situations? All the
literature on professional ethics agrees that people with special expertise,
such as engineers, have a special OBLIGATION to inform and educate others,
including the general public, about the limits and risks of the systems they
build.

I am upset to see in the RISKS Forum the standard technological enthusiast's
argument, that people who criticize technology are just Luddites. Some
critics are more concerned about the uses of technology than engineers, who as
we know can get so wrapped up in the technology that they fail to consider the
people whom the system will affect. Most whistle-blowers come from inside the
system, are not normally inclined to get involved in nontechnical issues, and
try every internal channel before going public. We owe them special attention
when they raise problems.

Before condemning whistle blowers because they've criticized a neat system, I
encourage you to read about their cases and view the Boisjoly videotape
(available for rent from CPSR/Boston). When you read about what they've
suffered as a result of their complaints, and when you hear the anguish in
Boisjoly's words, you may change your mind. For a good, readable discussion
of engineering ethics, including several case studies of whistle-blowing, read
Stephen H. Unger, CONTROLLING TECHNOLOGY: ETHICS AND THE RESPONSIBLE ENGINEER
(New York: Holt, Rinehart and Winston, 1982).

  [The response here was almost unprecedented, indicating significant interest
  in the topic. Yes, the following messages contain MUCH overlap. However, in
  this case let me try not to reject or edit, and let the discussion speak for
  itself. You may skip the rest of the issue if you have had enough.  PGN]
Re: Blowing Whistles or Blowing Smoke? [RISKS-6.19]
Dan Franklin <dan@WILMA.BBN.COM>
Tue, 02 Feb 88 11:04:01 -0500
I find Guthery's reaction to whistleblowing bizarre. In none of the
whistle-blowing cases I've read about (including the ones in Nancy Leveson's
article) did the whistle-blowers immediately run to a phone and call the
Times as soon as they found anything wrong. They tried to straighten it out
with their superiors. Unfortunately, their superiors were part of the
problem! Guthery provides no advice for what to do in that case.
In Roger Boisjoly's case, not only his immediate superiors but several layers
of management above that simply didn't want to hear what he had to say.
In Sylvia Robins's case, she was FIRED. How on earth could she stay and fix
the problem then? I think her response--going to the NASA inspector general
and the FBI--was entirely appropriate. If she had immediately called the New
York Times, perhaps Guthery would have a case, but she didn't; she went
through what appropriate channels were left to her.
As Nancy Leveson's article showed, whistleblowers DO accept personal
responsibility for the quality of their work--and when their management makes
it impossible to turn out work that meets safety standards, they do their best
to get their management overruled. That will often entail contacting channels
outside the company.
Dan Franklin
The motivation behind whistle-blowing
Jonathan Kamens <jik@ATHENA.MIT.EDU>
Tue, 2 Feb 88 12:55:43 EST
I cannot agree with the claim that, "What a whistle blower is saying to me is,
`Something is wrong here and rather than fix it and risk being held even
partially responsible, I'll make sure I'm perceived as being wholly blameless
by being a really Good Person and blowing this whistle and pointing my finger
at everybody else in sight.'"

Instead, I think it might be more correct as follows: "What a whistle blower
is saying is, `I have found something wrong with my organization. I have
tried to remedy the situation through the proper channels, but I have been
rebuffed and impeded every step along the way. The only way, therefore, to
solve the problem is to step outside of the proper channels and to blow the
whistle on the improprieties that are being perpetrated.'"

Roger Boisjoly, the Morton Thiokol engineer who attempted to prevent the
January 1986 Challenger launch, is an excellent example of the second type of
whistle-blower. He realized that there was a problem, and he did everything
within his power both to bring the problem out into the open and to accept
responsibility for remedying the situation. When his efforts were thwarted,
he chose to go outside of normal channels and jeopardize his job.

-- Jonathan Kamens | jik@ATHENA.MIT.EDU
us rationals, them luddites
Phil Agre <Agre@AI.AI.MIT.EDU>
Mon, 1 Feb 88 21:48 CST
Can you think of any cases of `whistle-blowers' who actually had it in their
power to fix the problems they were complaining about? Almost always they had
spent a lot of time trying to go through channels before taking the great
personal risk of going public. Almost always they encountered indifference or
cowardice or mendacity among the `teams' within which they were supposed to be
`players'. Besides, someone who blew a whistle on something they had the
ability to fix would look pretty silly, wouldn't they?

Do whistle blowers complain about `mistakes'? No. Most often they complain
about lies. Falsification of test data. Systematic suppression of contrary
evidence. People who design and implement and approve and release systems
that they know will not work, that they know will be impossibly expensive to
maintain, that they know will be dangerous.

Are these things inherent in large organizations? If so, then we have some
hard thinking to do.

Phil Agre
Re: RISKS DIGEST 6.19 Who's really blowing smoke?
Steve Philipson <steve@ames-aurora.arpa>
Tue, 2 Feb 88 12:31:26 PST
In a RISKS digest on Monday, Feb 1, "guthery%asc@sdr.slb.com" puts forth
several ideas on "whistle blowers" that demand to be challenged. Guthery
states that whistle-blowers are not interested in accepting responsibility.
Case histories of whistle-blowers show this not to be the case. Many
such people expended a large amount of effort within their organizations
working through normal channels to have problems corrected. It is only
after such attempts fail that these people "go public" or leak
information to appropriate agencies. The personal risk these people take
is very high -- they risk loss of their jobs and financial security
because they feel a moral imperative to right a wrong. These are exactly
the kind of people I'd trust with my security. Even before they went
outside of their organizations, these people were fired, harassed, and
threatened with death or harm to their families. It is unnecessary to
cite cases here -- anyone who reads has seen enough of these to know
that at least some of them are real.
Guthery further argues that the only outcome of whistle-blowing activity
is to create more paperwork, which produces no gain because bureaucracies
have no positive effect. If this is true, why not abolish all rules and
laws? This line of reasoning is faulty. Problems in our systems and
companies must be exposed to view and be corrected. Legal means are but
one mechanism. Public outcry is sometimes enough in and of itself as
companies are concerned with public image (and its effect on profits).
If we do not protect those who seek to protect us, then we are in
complicity with the wrongdoers. If we allow the whistle blowers to
be harassed and injured, then we are as guilty of the crimes they
expose as those who commit them. It seems to me that it is not
the whistle blowers who are blowing smoke, but rather it is Guthery.
Steven Philipson, NASA/Ames
Smoke and Whistles, guthery, risks 6.19
Frank Houston <houston@nrl-csr.arpa>
Tue, 2 Feb 88 13:06:34 est
This may be a "flame", but since the subject is smoke, I decided to send it
anyhow. I could not let Guthery's comments about whistle blowers pass.

What is whistle-blowing, anyway? I suggest that it assumes various forms, the
most extreme being either calling reporters to disclose shortcuts that slight
safety in favor of schedule, or privately informing a customer of potential
problems that are being ignored in your company's product or service.

<... In other words, encouraging whistle-blowing provides a DISINCENTIVE to>
Re: Virus anxiety expressed in NY TIMES (RISKS DIGEST 6.19)
Amos Shapir <nsc!taux01!taux01.UUCP!amos@Sun.COM>
2 Feb 88 15:05:53 GMT
jon@june.cs.washington.edu (Jon Jacky) writes:

>May 13 will be the 40th anniversary of the last day Palestine existed as a
>political entity; Israel declared its independence on May 14, 1948. ...
>Israeli officials suggested a "Friday the 13th" coincidence, but Mr. Rakavy
>said the virus was coded to ignore Nov. 13, 1987.

Israel celebrates holidays according to the Jewish calendar; this year's
independence day falls three weeks before May 13. I suspect November 13 was
ignored just to allow the virus more time to spread. (Note that this gives us
a clue to the time the virus was initiated.)

Amos Shapir, National Semiconductor (Israel)
6 Maskit St., P.O.B. 3007, Herzlia 46104, Israel    Tel. +972 52 522261
amos%taux01@nsc.com
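
  [For illustration only: the trigger logic described above fits in a few
  lines. The sketch below is a hypothetical Python reconstruction, not the
  actual virus code; the dates are those reported in the article.]

    import datetime

    def should_trigger(today):
        # Hypothetical reconstruction: fire on any Friday the 13th, but
        # exempt 13 November 1987 so the virus can spread before striking.
        if today == datetime.date(1987, 11, 13):
            return False
        return today.day == 13 and today.weekday() == 4   # 4 = Friday

    print(should_trigger(datetime.date(1987, 11, 13)))  # False (exempted)
    print(should_trigger(datetime.date(1988, 5, 13)))   # True (first strike)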