The RISKS Digest
Volume 6 Issue 20

Tuesday, 2nd February 1988

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Unusual Computer Risk — Harem Scarem?
Mike Bell
Mistaken AIDS warnings
Al Stangenberger
Human error vs human error (and bad design)
George Michaelson
Technology Transfer Policy
Henry Spencer
Re: Blowing Whistles or Blowing Smoke?
Ronni Rosenberg Dan Franklin Jonathan Kamens Phil Agre Steve Philipson Frank Houston
Re: Virus anxiety expressed in NY TIMES
Amos Shapir
Info on RISKS (comp.risks)

Unusual Computer Risk — Harem Scarem?

Mike Bell <mcvax!camcon!mb@uunet.UU.NET>
1 Feb 88 13:17:06 GMT

Reproduced from COMPUTER TALK — 1 February 1988

COMPUTER PROGRAM DRIVES ARAB TO SEXUAL EXHAUSTION

A Saudi Arabian millionaire thought he was heading for conjugal bliss when he had the bright idea of organising his harem by computer.

Unfortunately his plan misfired. Instead of leaving him with the satisfied smile of a clever Casanova, Saleh-el-Modiia's rigorous regime left him completely knackered, a fact which one of his four wives tearfully related to a newspaper in the Saudi city of Riyadh.

 “The computer has gone haywire. It's making Saleh too exhausted… he just falls asleep in my arms”, she said.

The computer devised a weekly schedule for the 38-year-old failed Lothario after he had keyed in his wives' ages, birthdays, clothes sizes and medical details. The schedule told him whom to go and see, what to wear, and what he was meant to do.

But even though Modiia's wives are complaining, he refuses to ditch the computer.  “It's only gone wrong once. That was when I was in hospital and all four wives came to visit me at the same time”, he said.
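
A rota of this kind amounts to nothing more than a rotation over the keyed-in records. The following minimal sketch (in Python; the record fields, names and seven-day cycle are purely hypothetical, not taken from the story) illustrates the idea:

    from itertools import cycle

    # Hypothetical records of the kind the story says were keyed in.
    wives = [
        {"name": "Wife 1", "birthday": "12 Mar", "clothes_size": 10},
        {"name": "Wife 2", "birthday": "30 Jun", "clothes_size": 12},
        {"name": "Wife 3", "birthday": "05 Sep", "clothes_size": 8},
        {"name": "Wife 4", "birthday": "21 Dec", "clothes_size": 14},
    ]

    def weekly_schedule(records, days=("Sat", "Sun", "Mon", "Tue", "Wed", "Thu", "Fri")):
        """Cycle through the records, assigning one visit per day of the week."""
        rota = cycle(records)
        return {day: next(rota)["name"] for day in days}

    print(weekly_schedule(wives))
    # Note that four records spread over seven days is already uneven --
    # perhaps part of the reported exhaustion problem.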

Mike Bell  UUCP:  …!ukc!camcon!mb   or   mb%camcon.uucp    +44 223 358855 
Cambridge Consultants Ltd, Science Park, Milton Road, Cambridge CB4 4DW

Mistaken AIDS warnings

<forags@violet.Berkeley.EDU>
Tue, 2 Feb 88 08:44:07 PST

I heard a report on KCBS this morning that two Berkeley hospitals have mistakenly sent letters to an unknown number of former patients warning that they might have been exposed to AIDS through contaminated blood transfusions. Naturally, attributed to a computer error.

Al Stangenberger                    Dept. of Forestry & Resource Mgt.
forags@violet.berkeley.edu          145 Mulford Hall
uucp:  ucbvax!ucbviolet!forags      Univ. of California
(415) 642-4424                      Berkeley, CA  94720

Human error vs human error (and bad design)

<munnari!ditmela.oz.au!george@uunet.UU.NET>
02 Feb 88 14:06:09 +1100 (Tue)

There is an interesting article in New Scientist of 21st January '88 titled “The Zeebrugge-Harrisburg syndrome” which, broadly speaking, is about the crossover between human error and bad design.

(article by Stephen Pheasant, two extracts without permission):

  1. Three Mile Island:

    “…Another example of catastrophic system failure in which “human error” is generally acknowledged to have played a critical role took place at the Three Mile Island Unit 2 nuclear reactor … They thought that the reactor was in danger of “going solid”, that is, overfilling, because they were unaware that a relief valve was open and that water was flowing out almost as quickly as they were pumping it in. The status of this indicator [the valve-position light] changed when a control signal was sent to the valve, rather than when the valve itself closed. It was technically easier to do it that way and nobody had ever thought the difference would be important.”
  2. A British Motor Car

    “…basic error-inducing mechanisms may have consequences which range from the catastrophic to the trivial. The headlamp switch on a certain British motor car is mounted on the left hand side of the steering column and is pushed for “on”, contrary to the general stereotype. On leaving the vehicle it is easy for the driver to operate this switch accidentally with the knee. The worst that can result is a flat battery, but in another context (such as the cockpit of an aircraft) the accidental operation of a control could be catastrophic…”

I'm sure the former item is well known to many (apologies if raised before in this forum) and I bet there are more examples of "lazy engineering" decisions having massive consequences.
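
To make the Three Mile Island flaw concrete: the control-room indicator was driven by the command signal sent to the relief valve, not by a sensor on the valve itself, so a stuck-open valve still read "closed". A minimal sketch in Python (the class and function names are mine, purely illustrative, not from the article):

    class ReliefValve:
        """Hypothetical model of a pilot-operated relief valve."""
        def __init__(self):
            self.commanded_open = False   # what the control system last asked for
            self.actually_open = False    # physical state of the valve
            self.stuck_open = False       # fault: valve fails to reseat

        def command_close(self):
            self.commanded_open = False
            if not self.stuck_open:       # a stuck valve ignores the command
                self.actually_open = False

    def indicator_by_command(valve):
        """The cheap design: the light shows the last command sent."""
        return "OPEN" if valve.commanded_open else "CLOSED"

    def indicator_by_sensor(valve):
        """The safer design: the light shows the sensed valve position."""
        return "OPEN" if valve.actually_open else "CLOSED"

    v = ReliefValve()
    v.commanded_open = v.actually_open = True   # valve opened during the transient
    v.stuck_open = True                         # ...and then failed to reseat
    v.command_close()
    print(indicator_by_command(v))   # "CLOSED" -- what the operators saw
    print(indicator_by_sensor(v))    # "OPEN"   -- what was actually happening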

George Michaelson
ACSnet: george@ditmela.oz.au
Postal: CSIRO, 55 Barry St, Carlton, Vic 3053   Phone:  (03) 347 8644

Technology Transfer Policy

Henry Spencer <henry%utzoo.uucp@RELAY.CS.NET>
1 Feb 88 20:09:47 GMT

One negative consequence of the US's attempt to apply its technology-transfer rules to foreign nationals outside the US is that it makes international agreements much more difficult. One of the (several) problems that has been stalling negotiations on international participation in the space station is that the US wants its technology-transfer laws to apply to foreign users of the station as well, and the would-be partner countries find this outrageous and unacceptable.

Henry Spencer @ U of Toronto Zoology {allegra,ihnp4,decvax,pyramid}!utzoo!henry


Whistle Blowing

Ronni Rosenberg <ronni@VX.LCS.MIT.EDU>
Tue, 2 Feb 88 15:02:27 est

In response to the recent RISKS article that bashes whistle-blowing (Guthery, “Blowing Whistles or Blowing Smoke?”, RISKS 6.19), I again want to defend whistle blowing as an ethically responsible — sometimes ethically required — action for some engineers in some circumstances.

Guthery writes:  “the very last thing a whistle blower is interested in is accepting responsibility,” a claim that is not supported by the literature on whistle blowing. Whistle-blowing engineers typically are responsible for some aspect of a system's current use, not its original engineering. In this sense, they are concerned about problems that others caused; e.g., Roger Boisjoly did not design the original shuttle O-rings, but he was responsible to some degree for their effectiveness. Complex systems are worked on by so many people, for so long, that the original engineers are likely to be gone by the time the system begins to be used and a problem arises — assuming one can even determine who was responsible for the original work. Is pointing out a critical problem in one's area of responsibility, when one becomes aware of it, really just “pointing my finger at everybody else in sight”?

Guthery's other main point, that “most whistle blowing has to do with how the papers were shuffled and the most predictable aftereffect of whistle blowing is still more bureaucracy,” also is not supported by the literature. The whistle-blowing case studies that I've seen had to do with conscious decision-making to reject the concerns raised by engineers (as in the Boisjoly case, where Morton-Thiokol managers appear to have knowingly decided to launch with unsafe O-rings). Entrenched bureaucracy clearly is a problem; most of the cases I've read about took place in very large organizations, where it is hard to get things done. But like it or not, most engineers work in large organizations with a lot of money at stake, and you cannot enact major changes any other way. The results of whistle-blowing often are not just paper shuffling; sometimes they are saved lives or safer systems. Is the assumption that only papers will be shuffled just a rationalization for remaining silent when you should speak out?

I couldn't agree more with Guthery's statement that “I don't think we know enough about building computer systems to build good systems without making mistakes,” but I disagree with his conclusion that we should just be allowed to make our mistakes, without the annoyance of whistle blowers pointing them out. We have the right to make mistakes only if we (1) acknowledge up front that this is the way we have to work, and (2) do not put a system into use, particularly in a critical application, if we are not sure that it works.

  1. Although the RISKS community seems to agree that many mistakes are made in any large system, for the most part, the computing "profession" does not admit this. The for-profit part of the industry claims — through ads, sales people, grant proposals — to deliver systems that work, period. But new products/systems are routinely delivered with many important bugs. Funders and customers get upset when they see what they really have to go through and spend to get a system that works reasonably well. Sometimes, as in the recent bank case, the customer abandons the whole project; you can be sure that time for "making mistakes" was not adequately built into the bank project.
  2. Whistle blowers usually act in situations where critical systems are in use, don't appear to be working safely, but are alleged to be working fine. What gives us the "right" to make mistakes in such situations? All the literature on professional ethics agrees that people with special expertise, such as engineers, have a special OBLIGATION to inform and educate others, including the general public, about the limits and risks of the systems they build.

I am upset to see in the RISKS Forum the standard technological enthusiast's argument, that people who criticize technology are just Luddites. Some critics are more concerned about the uses of technology than engineers, who as we know can get so wrapped up in the technology that they fail to consider the people whom the system will affect. Most whistle-blowers come from inside the system, are not normally inclined to get involved in nontechnical issues, and try every internal channel before going public. We owe them special attention when they raise problems.

Before condemning whistle blowers because they've criticized a neat system, I encourage you to read about their cases and view the Boisjoly videotape (available for rent from CPSR/Boston). When you read about what they've suffered as a result of their complaints, and when you hear the anguish in Boisjoly's words, you may change your mind. For a good, readable discussion of engineering ethics, including several case studies of whistle-blowing, read Stephen H. Unger, CONTROLLING TECHNOLOGY: ETHICS AND THE RESPONSIBLE ENGINEER (New York: Holt, Rinehart and Winston, 1982).


Re: Blowing Whistles or Blowing Smoke? [RISKS-6.19]

<dan@WILMA.BBN.COM>
Tue, 02 Feb 88 11:04:01 -0500

I find Guthery's reaction to whistleblowing bizarre. In none of the whistle-blowing cases I've read about (including the ones in Nancy Leveson's article) did the whistle-blowers immediately run to a phone and call the Times as soon as they found anything wrong. They tried to straighten it out with their superiors. Unfortunately, their superiors were part of the problem! Guthery provides no advice for what to do in that case.

In Roger Boisjoly's case, not only his immediate superiors but several layers of management above that simply didn't want to hear what he had to say.

In Sylvia Robins's case, she was FIRED. How on earth could she stay and fix the problem then? I think her response—going to the NASA inspector general and the FBI—was entirely appropriate. If she had immediately called the New York Times, perhaps Guthery would have a case, but she didn't; she went through what appropriate channels were left to her.

As Nancy Leveson's article showed, whistleblowers DO accept personal responsibility for the quality of their work—and when their management makes it impossible to turn out work that meets safety standards, they do their best to get their management overruled. That will often entail contacting channels outside the company.

Dan Franklin

The motivation behind whistle-blowing

<jik@ATHENA.MIT.EDU>
Tue, 2 Feb 88 12:55:43 EST

I cannot agree with the claim that, “What a whistle blower is saying to me is, ‘Something is wrong here and rather than fix it and risk being held even partially responsible, I'll make sure I'm perceived as being wholly blameless by being a really Good Person and blowing this whistle and pointing my finger at everybody else in sight.’”

Instead, I think it might be more correct as follows: “What a whistle blower is saying is, ‘I have found something wrong with my organization. I have tried to remedy the situation through the proper channels, but I have been rebuffed and impeded every step along the way. The only way, therefore, to solve the problem is to step outside of the proper channels and to blow the whistle on the improprieties that are being propagated.’”

Roger Boisjoly, the Morton Thiokol engineer who attempted to prevent the January 1986 Challenger launch, is an excellent example of the second type of whistle-blower. He realized that there was a problem and he did everything within his power both to bring the problem out into the open and to accept responsibility for remedying the situation. When his efforts were thwarted, he chose to go outside of normal channels and jeopardize his job.

— Jonathan Kamens | jik@ATHENA.MIT.EDU


us rationals, them luddites

<Agre@AI.AI.MIT.EDU>
Mon, 1 Feb 88 21:48 CST

Can you think of any cases of ‘whistle-blowers’ who had actually had it in their power to fix the problems they were complaining about? Almost always they had spent a lot of time trying to go through channels before taking the great personal risk of going public. Almost always they encountered indifference or cowardice or mendacity among the ‘teams’ within which they were supposed to be ‘players’. Besides, someone who blew a whistle on something they had the ability to fix would look pretty silly, wouldn't they?

Do whistle blowers complain about ‘mistakes’? No. Most often they complain about lies. Falsification of test data. Systematic suppression of contrary evidence. People who design and implement and approve and release systems that they know will not work, that they know will be impossibly expensive to maintain, that they know will be dangerous. Are these things inherent in large organizations? If so then we have some hard thinking to do.

Phil Agre


Re: RISKS DIGEST 6.19 Who's really blowing smoke?

Steve Philipson <steve@ames-aurora.arpa>
Tue, 2 Feb 88 12:31:26 PST

In a RISKS digest on Monday, Feb 1, “guthery%asc@sdr.slb.com” puts forth several ideas on “whistle blowers” that demand to be challenged. Guthery states that whistle-blowers are not interested in accepting responsibility.

Case histories of whistle-blowers show this not to be the case. Many such people expended a large amount of effort within their organizations working through normal channels to have problems corrected. It is only after such attempts failed that these people “went public” or leaked information to appropriate agencies. The personal risk these people take is very high — they risk loss of their jobs and financial security because they feel a moral imperative to right a wrong. These are exactly the kind of people I'd trust with my security. Even before they went outside of their organizations, these people were fired, harassed, and threatened with death or harm to their families. It is unnecessary to cite cases here — anyone who reads has seen enough of these to know that at least some of them are real.

Guthery further argues that the only outcome of whistle-blowing activity is to create more paper work, which produces no gain because bureaucracies have no positive effect. If this is true, why not abolish all rules and laws? This line of reasoning is faulty. Problems in our systems and companies must be exposed to view and be corrected. Legal means are but one mechanism. Public outcry is sometimes enough in and of itself as companies are concerned with public image (and its effect on profits).

If we do not protect those who seek to protect us, then we are in complicity with the wrongdoers. If we allow the whistle blowers to be harassed and injured, then we are as guilty of the crimes they expose as those who commit them. It seems to me that it is not the whistle blowers who are blowing smoke, but rather it is Guthery.

Steven Philipson, NASA/Ames

Smoke and Whistles, guthery, risks 6.19

Frank Houston <houston@nrl-csr.arpa>
Tue, 2 Feb 88 13:06:34 est

This may be a “flame”, but since the subject is smoke, I decided to send it anyhow. I could not let Guthery's comments about whistle blowers pass.

What is whistle-blowing, anyway? I suggest that it assumes various forms, the most extreme being either calling reporters to disclose shortcuts that slight safety in favor of schedule, or privately informing a customer of potential problems that are being ignored in your company's product or service.

 >… In other words, encouraging whistle-blowing provides a DISINCENTIVE to
 >the acceptance of personal responsibility and accountability.  …

Not a completely fair generalization, I think. Ponder for just a moment what “accepting personal and financial responsibility” means to most individuals. Despite “legal safeguards”, a whistle-blower has very little effective protection and stands a very good chance of losing both income in the short term and career in the long run. I suggest that the disincentives to whistle-blowing far outweigh any negative effects on personal accountability.

Moreover, the whistle-blower may not be in a position to solve the problems in question. There may not be a solution, or the solution may conflict with other corporate, governmental or political objectives. Sure, the ideal way to address a problem is just to solve it quietly, but when the solution requires unbudgeted resources or time, the bureaucrats will balk. Then the responsible, ethical person has a difficult decision, namely, whether the problem is significant enough to warrant risking both livelihood and career by blowing the whistle.

    >Do you want to risk your family's financial security to a guy who's going
    >to start lobbing fault grenades at the first sign of difficulty or
    >something unexpected?

I don't think most guys are going to risk their own financial security by lobbing fault grenades at the first sign of problems.

    >…and the most predictable aftereffect [Sic] of whistle-blowing is still
    >more bureaucracy. … And anyone who thinks that bureaucracies build
    >safe, reliable compuer [Sic] systems should visit the Social Security
    >Administration's data processing center or their favorite nuclear reactor
    >project.

The truth hurts, and I believe this is truth. If anything should be a major disincentive to whistle-blowing, it is humanity's persistent dream that laws and regulations and more “controls” (usually meaning six more approval signatures, minimum) can solve problems. If we can believe TIME, Feb. 1, 1988, NASA's safety bureaucracy does not seem to have prevented their contractors from slipping some fairly obvious problems through the system. When economic incentives conflict with human values, the outcome is fairly predictable no matter how many or how strong the countervailing controls, regulations or laws may be. The most painful result of bureaucracy is the addition of significant costs (both to provide review and to cope with it) for little or no appreciable effect on the problems.

    >Indeed, it is exactly the process of making mistakes that will teach us 
    >how to build good ones [computer systems] and avoid building bad ones.  
    >whistle-blowers would deny us this learning and condem [Sic] us to      
    >building with our current and quite incomplete state of knowledge.  In  
    >the main, they are 20th century Luddites blowing smoke not whistles.

Making mistakes can be excused; repeating them should not. Hiding mistakes should be condemned. Whistle-blowers can ensure that mistakes are neither hidden nor forgotten, thus helping ensure that they are not repeated. It seems that computer scientists, or whatever you call these professionals, have plenty of mistakes from which to learn. I suggest that the so-called 20th century Luddites may be motivated by a desire to force the professions to learn from their collective and individual mistakes, and that, I submit, is more substantial than smoke.

I was very appreciative of the comment that appeared somewhere in the same number, “Every six months computer science loses its memory (if not its mind).” It seems appropriate to repeat it at this point.

Frank Houston   FDA, CDRH  (houston@nrl-csr.arpa)
Disclaimer: The views presented above are those of the author alone and do not represent policies of either the FDA or the CDRH.

Re: Virus anxiety expressed in NY TIMES (RISKS DIGEST 6.19)

Amos Shapir <nsc!taux01!taux01.UUCP!amos@Sun.COM>
2 Feb 88 15:05:53 GMT

jon@june.cs.washington.edu (Jon Jacky) writes:

>May 13 will be the 40th anniversary of the last day Palestine existed as a 
>political entity; Israel declared its independence on May 14, 1948. …
>Israeli officials suggested a “Friday the 13th” coincidence, but Mr. Rakavy
>said the virus was coded to ignore Nov. 13, 1987."

Israel celebrates holidays according to the Jewish calendar; this year's independence day falls 3 weeks before May 13. I suspect November 13 was ignored just to give the virus more time to spread. (Note that this gives us a clue to when the virus was initiated.)

Amos Shapir National Semiconductor (Israel)           amos%taux01@nsc.com
6 Maskit st. P.O.B. 3007, Herzlia 46104, Israel       Tel. +972 52 522261
