The RISKS Digest
Volume 1 Issue 3

Friday, 30th August 1985

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Miscellaneous comments on V1#2
Dave Curry
Computer/hardship list
Jerome Rosenberg
Medical KBES — Some AI systems may need FDA approval
Health hazards of CRT use
Robin Cooper

Miscellaneous comments on V1#2

Dave Curry <davy@purdue-ecn.ARPA>
Thu, 29 Aug 85 21:22:44 EST

Just some miscellaneous comments on some of the things in RISKS V1#2. Hope this isn't too long.

  1. Fishermen. This sounds like a crock to me. I wonder whether the broken buoy or the fact that the storm was not predicted was the deciding factor in the case. Since the NWS/NOAA is providing a service, which nobody is required to use, I can't understand how they can be sued for not predicting a storm. What would happen if they predicted a storm which never showed up? Could all the fishermen who stayed home sue for their lost profits? I can see it now…. "Cloudy Thursday, rain Friday — use this information at your own risk."
  2. Union Carbide. I always wonder in cases like this whether the plant is actually having more accidents than usual, or if because of Bhopal we're just hearing about it more because the press has a new victim to pick on. The number of accidents at that plant is disgraceful. Does anyone think the government will shut it down?
  3. Bob Carter's comments. I think I agree with PGN on these… I would prefer to see RISKS cover more or less anything related to computer "hazards", rather than centering on one or two things. There are plenty of other lists which already take certain parts of this (e.g. SOFT-ENG for "who's responsible" type stuff, ARMS-D for SDI). I also like the SEN quotes — I don't personally read SEN, and even if some of the stuff is dumb (computer kills scientist), overall I think the brief summary PGN provided in V1 #1 gives a nice broad range of topics to discuss.
  4. Medical programs. I'm not sure I trust these fully yet. I'd have no qualms about my doctor using one to suggest things to him, but I would draw the line at his accepting the program's diagnosis unless he could verify on his own that it was correct. For example, a heart specialist interpreting a heart-diagnosis program's output would be good; a general practitioner's taking it as gospel would not be good. We need to make sure the doctor is capable of knowing when the program is wrong. (I saw a comment about MYCIN once - "if you brought MYCIN a bicycle with a flat tire, it would try like hell to find you an antibiotic.")
  5. SDI. I'm going to leave this for the experts. I personally lean towards Parnas's "side", but I don't know enough about it. I do like reading the comments on it though. (BTW, for those of you who haven't yet read Herb Lin's paper, it's excellent.)

Great list so far… keep it coming. As a (possibly) new topic, did anyone go to this AI show in San Diego (?) or wherever? I saw a blurb on it somewhere… how about a review of what the current toys are and what risks they may take? I remember seeing something about a program to interpret the dials and gauges of a nuclear power plant….

—Dave Curry
davy@purdue-ecn

Computer/hardship list

Rosenberg Jerome <jerome@wisc-rsch.arpa>
Thu, 29 Aug 85 14:00:58 cdt

Peter: One basis for a focused discussion of risks would be to try to establish a list of those computer systems whose failure would cause great hardship — economic, political, social — to a significant number of our citizens. For example, the failure of our computer-controlled electric power grid or the failure of the Federal Reserve's check-clearing system.

Your readers/participants could be asked to suggest the systems to be included on the list. Your forum could then discuss probabilities of failure, costs of failure vs. failure time, etc.
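[A modern aside, not part of Jerry's original message: the comparison he suggests — probability of failure weighed against the cost of a failure — can be sketched as an expected-cost ranking. All system names and numbers below are invented purely for illustration.]

```python
# Illustrative sketch: rank hypothetical critical systems by
# expected annual hardship cost = P(failure per year) * cost per failure.
# Every entry here is a made-up example, not real data.

systems = {
    "power grid control":  {"p_fail_per_year": 0.02, "cost_per_failure": 5_000_000_000},
    "check clearing":      {"p_fail_per_year": 0.05, "cost_per_failure":   200_000_000},
    "air traffic control": {"p_fail_per_year": 0.01, "cost_per_failure": 1_000_000_000},
}

def expected_cost(entry):
    """Expected hardship cost per year for one system."""
    return entry["p_fail_per_year"] * entry["cost_per_failure"]

# Sort highest expected cost first.
ranked = sorted(systems.items(), key=lambda kv: expected_cost(kv[1]), reverse=True)
for name, entry in ranked:
    print(f"{name}: expected annual cost ${expected_cost(entry):,.0f}")
```

Even this toy version shows why the list matters: a low-probability failure of a very expensive system can dominate a frequent but cheap one.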

Jerry

Medical KBES

Ave decus virginum! <goun%cadlac.DEC@decwrl.ARPA>
Friday, 30 Aug 1985 05:37:48-PDT

Some AI systems may need FDA approval

Expert systems come within the FDA's ambit to the extent that they supplement doctors' work, according to Richard Beutal, a Washington, D.C., attorney specializing in the legal aspects of technology.

An expert system may be defined as a computer program that embodies the expertise of one or more human experts in some domain and applies this knowledge to provide inferences and guidance to a user. Some of the earliest and most sophisticated systems were developed for medical diagnosis: MYCIN, EMYCIN, CADUCEUS, and ATTENDING. [There are several more in use in Japan. —mjt]

Beutal called attention to proposed FDA regulations that, if implemented, would require medical expert systems to obtain FDA pre-marketing approval. Given that FDA approval for Class III devices could take up to 10 years, and that reclassifying such devices can take almost as long, these regulations would virtually cause investment to dry up.

{Government Computer News Aug 16, 1985}


Health hazards of CRT use

Robin Cooper <cooper@wisc-ai.arpa>
Thu, 29 Aug 85 10:35:20 cdt

With respect to the introduction of the topic of the health hazards of using video terminals, I would be particularly interested in seeing discussion of risks to pregnant women and their unborn children. Both Sweden and Canada have apparently introduced legislation which gives pregnant women the right to change job assignments, whereas the official US line seems to be that there is not sufficient risk to warrant this.

Robin Cooper
