Forum on Risks to the Public in Computers and Related Systems
ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator
Volume 4: Issue 60
Monday, 9 March 1987
Contents
Feel better now?- Martin Minow
Computers in the Arts (or The Show Must Go On ...)- Jeannette Wing
Sensitive Intelligence Document Published On Magazine Cover- Stevan Milunovic
Mode-C Transponders- Phil R. Karn
Physical risks and software risks- Eugene Miya
Safe software- Scott E. Preece
Helicopter rotor failures- Peter Ladkin
Re: Electronic steering- D. V. W. James
Altitude Encoders... expensive for some- Herb Lin
F-104- Elliott S. Frank
Info on RISKS (comp.risks)
Feel better now? [Risk probabilities in nuclear power]
I need a vacation <minow%thundr.DEC@decwrl.DEC.COM>
09-Mar-1987 1623
From a long article in the Boston Globe, Mar 9, 1987:

"When the owners of the Seabrook nuclear power plant recently proposed
shrinking the plant's emergency evacuation zone from 10 miles to 1 mile, they
based their argument on what may be the most comprehensive computer study
ever done of a nuclear reactor.

"Engineers spent $4 million and 35 man-years of work assembling millions of
bits of data on Seabrook's design, construction and maintenance, then plugged
them into a huge main-frame computer programmed to simulate how the reactor
would handle anything that could conceivably go wrong. What emerged was a
50-foot high computer printout analyzing 4.5 billion possible accident
scenarios, from minor valve failures to catastrophic core meltdowns.

"A 4,700 page study concluded that with a one-mile evacuation zone, the risk
each year of a member of the public's dying from an accident at Seabrook
would be less than one in 10 million -- low enough, Seabrook's owners said,
to justify a smaller zone. ...

"If the US Nuclear Regulatory Commission accepts that logic and the courts
reject a likely legal challenge, the $4.5 billion reactor will be able to
open despite [Mass.] Gov. Dukakis' refusal to participate in what he says is
an unworkable evacuation plan. There are six Massachusetts communities inside
the 10-mile zone, but none fall within the one-mile zone."

The article continues by discussing criticisms of the study, and "the
little-understood field of probabilistic risk assessment."

Martin Minow   minow%thundr.dec@decwrl.dec.com
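The core arithmetic of a probabilistic risk assessment like the one the
article describes is a sum over scenarios of (probability of the initiating
event) times (conditional probability of the harm). The following is a
minimal sketch of that calculation only; every scenario name, probability,
and result below is invented for illustration and is not Seabrook data:

```python
# Minimal sketch of probabilistic risk assessment (PRA) arithmetic.
# All scenario names and numbers are invented; they are NOT Seabrook data.

scenarios = [
    # (scenario, annual probability of the initiating event,
    #  conditional probability that it kills a nearby member of the public)
    ("minor valve failure",        1e-2, 1e-9),
    ("loss of coolant, contained", 1e-4, 1e-6),
    ("core melt with release",     1e-7, 1e-2),
]

# Total annual individual risk: sum over (assumed independent) scenarios
# of P(event) * P(death | event).
total_risk = sum(p_event * p_death for _, p_event, p_death in scenarios)

print(f"annual individual risk: {total_risk:.2e}")
```

Note how the total is dominated by whichever scenario has the largest
product, which is why critics of such studies focus on whether the rare,
high-consequence scenarios have been assigned realistic probabilities.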
Computers in the Arts (or The Show Must Go On ...)
<Jeannette.Wing@k.cs.cmu.edu>
Monday, 9 March 1987 10:39:47 EST
Over the weekend I attended a dance concert put on by a local college
company here in Pittsburgh. It was announced before the show started that
the computer that controlled the lighting was not working, but the show
would go on. However, only stage lights would be used so that the
audience would not get the intended effect and mood that color and spotlights
could give. People were offered their money back--no one left.
I wonder what backup strategies are typically used for professional music,
dance, and theatrical productions. For example, some people in the audience
wondered why the lights could not just be done by hand. Do Broadway shows
use backup computers just in case of failure?
[There have already been two big losers -- "Grind" and "Les Miserables",
reported in earlier RISKS issues. This is the old local-optimization
false-economy problem. One can economize with cheap computer control
systems, but if they crash on you, the overall cost may be quite high.
I imagine there is some backup here. But, as you well know, there are
many cases where the main system and the backup system both fail, or
where it is the redundancy mechanisms themselves that fail! PGN]
Sensitive Intelligence Document Published On Magazine Cover
Stevan Milunovic <Milunovic@SRI-STRIPE.ARPA>
Thu 5 Mar 87 02:54:09-PST
[The following item is not directly computer related, but is illustrative
of a kind of risk not previously noted here -- although I vaguely remember
other cases in which sensitive VDT screen images have appeared in
photographs. PGN]
Sensitive Intelligence Document Published On Magazine Cover
By CLYDE H. FARNSWORTH
c. 1987 N.Y. Times News Service
WASHINGTON - A picture on the cover of the current issue of The Foreign
Service Journal shows a readable copy of one of the government's most
sensitive intelligence documents, according to government officials. The
Foreign Service Journal, published for members of the Foreign Service, is
generally available to the public and has a circulation of 10,000. The
document, a copy of the National Intelligence Daily, which is produced by
the Central Intelligence Agency in traceable, numbered copies exclusively
for the president and a small circle of others with top-secret clearance,
was photographed on the desk of Ronald I. Spiers, the Under Secretary of
State for Management. Spiers was the subject of the article referred to on
the magazine's cover. The CIA intelligence summary, which reports the
latest intelligence evaluations by the agency, was open to two pages,
apparently about the situation in Lebanon.
A map of Lebanon was partly blocked by Spiers' left hand. He had some
hand-written notes partly shielding the print on the facing page, but
clearly visible at the bottom of the page was the number 121. Some text as
well as codes, also at the bottom of the page, were not legible with normal
magnifying equipment, but a Congressional aide with a background in
intelligence said, ''Based on my time in the business, this is the kind of
thing you could blow up and clarify what the final thing is with not even
very sophisticated equipment.'' The aide continued, ''This is a major
breach of security.'' An aide to Sen. Jesse Helms of North Carolina, the
ranking Republican on the Senate Foreign Relations Committee, said,
''Anybody else in the government who did this would have been fired if this
had happened to them.'' [...]
Mode-C Transponders (Re: RISKS 4.59)
Phil R. Karn <karn@faline.bellcore.com>
Mon, 9 Mar 87 15:57:34 est
As far as I'm concerned, people who fly on airliners only as passengers have
every right to complain about general aviation aircraft without
altitude-encoding transponders, since they seem to collide in mid-air with
airliners with alarming frequency. I really get tired of this "I can do
what I want with my neck, why is the government trying to tell me what's
good for me?" routine.
The simple fact is that your actions put others (like me) under involuntary
risk, and preventing this sort of thing is the fundamental reason why laws
and governments exist. I don't care whether 5% or 50% or 100% of small
planes lack electrical systems; if they can't be flown without hazard to
other planes, then they shouldn't be flown at all.
Phil
Physical risks and software risks
Eugene Miya <eugene@ames-nas.arpa>
Mon, 9 Mar 87 11:05:26 PST
I've been thinking about the nature of physical systems and the addition of
software to them. The comments by Martin Harriman, and the comments on
bridges and buildings, moved me. I am reasonably familiar with Sikorsky
helicopters, and it makes me wonder whether we should put information about
long-term physical degradation into the software that monitors or controls
such systems. It has some interesting consequences, and it would be
difficult to think all of them out beforehand. Parnas points out that
computers are basically discrete systems (an obvious over-simplification),
but real systems are less so. Biodegradable software, anyone?

--eugene miya, NASA Ames Research Center
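One way to make the suggestion concrete is software that tracks accumulated
wear against a life limit, so degradation becomes visible to the system
rather than discovered at inspection. This is a toy sketch only; the
component name, wear rates, and life limit are invented for illustration:

```python
# Toy model of software that accounts for long-term physical degradation.
# Component name, severity factors, and life limit are invented.

class Component:
    def __init__(self, name, life_limit_hours):
        self.name = name
        self.life_limit_hours = life_limit_hours
        self.consumed_hours = 0.0

    def log_flight(self, hours, severity=1.0):
        # severity > 1.0 models harsher-than-nominal operation, which
        # consumes fatigue life faster than calendar hours suggest.
        self.consumed_hours += hours * severity

    @property
    def overdue(self):
        return self.consumed_hours >= self.life_limit_hours

bearing = Component("elastomeric rotor bearing", life_limit_hours=100)
bearing.log_flight(60)                 # nominal operation
bearing.log_flight(30, severity=1.5)   # harsh operation wears faster

print(bearing.consumed_hours, bearing.overdue)
```

The interesting (and hard) part, of course, is choosing wear models and
severity factors that match the continuous physics, which is exactly the
discrete-versus-continuous gap noted above.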
Safe software
Scott E. Preece <preece%mycroft@gswd-vms.ARPA>
Mon, 9 Mar 87 08:36:49 CST
geraint%sevax.prg.oxford@Cs.Ucl.AC:

> The answer to questions like ``why can't I install my own scheduler?''
> has surely to be that this is not a question that an applications
> programmer should know how to ask! In particular, if one is writing
> real-time programs, then the correctness of one's code had better not
> depend on how it is scheduled.

Eh? The real-time code I've heard about has depended very strongly on tight
control of scheduling -- cyclic scheduling of tasks and strong control of
priorities and sequencing of tasks. Whether the people writing real-time
systems are "application programmers" in the sense conventional in the US
is another question...

scott preece, gould/csd - urbana, uucp: ihnp4!uiucdcs!ccvaxa!preece
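The cyclic scheduling mentioned here refers to the classic cyclic-executive
structure, in which tasks run at fixed points in a repeating frame, so the
program's correctness depends directly on the schedule. A generic sketch of
the idea (not code from any system discussed in this issue; the task names
are invented, and the timing padding of real executives is omitted):

```python
# Minimal sketch of a cyclic executive: tasks run in a statically fixed
# order within repeating minor frames, so correctness depends on the
# schedule itself. Task names are invented for illustration.

def read_sensors(trace): trace.append("sense")
def control_law(trace):  trace.append("control")
def telemetry(trace):    trace.append("telemetry")

# Each minor frame lists the tasks that must run, in order. A real
# executive would also pad each frame to a fixed duration.
MINOR_FRAMES = [
    [read_sensors, control_law],             # minor frame 0
    [read_sensors, control_law, telemetry],  # minor frame 1
]

def run_major_frame(trace):
    """One pass through all minor frames of the schedule."""
    for frame in MINOR_FRAMES:
        for task in frame:
            task(trace)

trace = []
run_major_frame(trace)
print(trace)
```

Because the control law always runs immediately after the sensor read, any
change to the schedule changes the data the control law sees -- which is
precisely why such code cannot be indifferent to how it is scheduled.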
Helicopter rotor failures
Peter Ladkin <ladkin@kestrel.ARPA>
Mon, 9 Mar 87 15:46:02 pst
As far as I remember, Martin Harriman is referring to the rotor failure on a Bristow Helicopters' Sikorsky S76A in Scotland. The rotor hub has elastomeric bearings, which were wearing prematurely, and the bolt on the inside of the rotor shaft was taking shear as well as strain forces, whereas it was only designed for the latter. The inappropriate finishing technique to which Harriman refers was a contributory factor in the failure of the bearing under the shear loads. The wear was the main factor. I believe that the aircraft was also operating out-of-inspection, being ferried to a maintenance shop with an illegal passenger aboard. The only moral relevant to RISKS would be not to take a free ride in aircraft that are out of inspection.
Re: Electronic steering
D. V. W. James <vnend@ukecc.uky.edu>
9 Mar 87 20:00:52 GMT
>From: "Hien B. Tang" <hbt@ICSE.UCI.EDU>
>Side note: Isn't the F-16 a fly-by-wire plane? If electronic steering is
>safe, and reliable enough for combat jets, why wouldn't it be safe enough
>for everyday car?
Several reasons. Primarily, while a combat jet is constantly
maintained, your average car on the road is driven until something
breaks and makes it undrivable before repair is even thought of.
Also, there are far more cars of a given model on the road than there
are aircraft of a given type in the air.
Second, your average F-16 pilot is well trained and knowledgeable
about his aircraft, as is his ground support (though less so than the
pilot). Your average (American, though I have never seen any real evidence
that other countries do a better job) automobile driver is barely aware
of the way a car should be driven. How can it be otherwise? To get your
license in the US all you have to do is prove you exist, answer a few
questions (mostly about signs and such), and then drive a total of
at most a mile at low speed. The most harrowing part of the test for
most people is the parking! But this may be irrelevant: what could
the drivers of an automobile do if they suddenly found out that they
had no directional control? And what warning signs of impending
(electronic) steering failure could they notice?
It certainly sounds like a nightmare to me...
cbosgd!ukma!ukecc!vnend; or vnend@engr.uky.edu; or vnend%ukecc.uucp@ukma.BITNET
Also: cn0001dj@ukcc.BITNET and Compuserve 73277,1513
Altitude Encoders... expensive for some
<LIN@XX.LCS.MIT.EDU>
Mon, 9 Mar 1987 22:36 EST
... There are too many self-appointed aviation safety experts out there,
like Ann Landers, whose only qualification is that they fly on airliners
a lot.
This is scary to me. The aviation community does NOT affect only itself.
The "mere" qualification that someone flies a lot is certainly good enough to
give that person a legitimate interest in safety concerns. If a solution
won't work, then it's up to you "real" experts to say why not, and to
explain it in a way that others will understand it. Telling them to "stay
out" just doesn't wash.
F-104 (Re: RISKS 4.56)
Elliott S. Frank <amdahl!esf00@Sun.COM>
Mon, 9 Mar 87 16:24:40 PST
The story referred to by munnari!csadfa.oz!davidp@seismo.CSS.GOV is
an old one: it dates back (at least) to the early or mid sixties. [Aside:
"The Choking Doberman, and other Urban Folklore" should be required
reading for RISKS contributors.]
The F-104 suffered a spate of crashes when it was first adopted by the
West German Air Force: the pilots thought that they were smarter than
the terrain-following radar with which the planes were equipped. They
were. However, the planes were faster than the pilots' reflexes. After
a sufficient number of crashes, the cause was determined.
I also heard a similar story about early versions of the terrain-following
radar on the F-111.
Elliott S Frank ...!{ihnp4,hplabs,amd,nsc}!amdahl!esf00 (408) 746-6384
[the above opinions are strictly mine, if anyone's.]
