The RISKS Digest
Volume 9 Issue 25

Friday, 15th September 1989

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator



o Risks of distributed systems
Eugene Miya
o Medical accreditation: good for big shops only?
Douglas W. Jones
o The role of government regulation
Douglas W. Jones
o Is modern software design contributing to societal stupidity?
Tom Comeau
o Re: Aircraft simulators
Alan J Rosenthal
Robert Dorsett
o Mission: Impossible
Robert Dorsett
o Info on RISKS (comp.risks)

Risks of distributed systems

Eugene Miya <>
Thu, 14 Sep 89 19:14:35 PDT
Peter on his soap box notes the "tricky" problems of distributed systems.
Anita Jones in her March 1982 survey of multiprocessors in Computing Surveys
noted that programming parallel (or distributed) processors as in the Cm*
system was no more difficult than other programming.  This has been challenged
by numerous other authors, BUT the software engineering community has by and
large ignored issues which might be "unique" to distributed and parallel
systems.  The education of most students does not include distributed systems,
there being a) no standard product lines, all home grown in different ways from
standard components, and b) no consistent standard software (despite rough
standardization on protocols).  One does not buy a distributed system as one
buys a computer system (although there aren't many differences); one builds
from "scratch." People do not address issues such as synchronization,
atomicity, security, etc., in software design classes; these are issues for the
"operating systems" class.  And it isn't the researchers and developers of
distributed systems who need exposure--we have plenty--it's the students who
need this (as if there really were much of a distinction 8).
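Miya's list of deferred topics--synchronization, atomicity--can be made concrete
with a minimal sketch (present-day Python, purely illustrative, not part of the
original digest): two threads doing an unprotected read-modify-write on shared
state can silently lose updates, and a lock is the simplest way to make the
update atomic.

```python
import threading

counter = 0
lock = threading.Lock()

def deposit(times):
    """Increment the shared counter `times` times, one unit at a time."""
    global counter
    for _ in range(times):
        # Without this lock, the read-modify-write below can interleave
        # with the other thread's and lose increments.
        with lock:
            counter += 1

threads = [threading.Thread(target=deposit, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 200000 -- guaranteed only because of the lock
```

Dropping the `with lock:` turns this into the textbook lost-update race, which
is exactly the sort of issue that tends to be quarantined in the "operating
systems" course rather than treated as everyday software design.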

In short, I believe programming is going to get harder before it gets easier.
I believe this because I am grappling with these issues NOW.  I only have to
move my mouse cursor into the appropriate window on my workstation to reach one
of my parallel processors.
                                            --eugene miya

Medical accreditation: good for big shops only?

Douglas W. Jones <>
Fri, 15 Sep 89 08:51:05 CDT
In RISKS 9.23, Frank Houston suggests that software shops use an accreditation
system modelled on that used by hospitals.  The suggestion is interesting, but
there are some big differences between hospitals and software shops.

One primary difference, from the accreditation point of view, is that most
hospitals are big, while many quite competent software shops are small.
Hospitals with fewer than 100 employees are rare, while there are many
corporations in both the software and hardware fields that are smaller.  Many
of these corporations are some of the most creative ones in the field, and the
development of major ideas in both hardware and software can be credited to
them.  The administrative overhead of seeking and maintaining accreditation may
be easy to absorb into a large organization, but it is likely to be prohibitive
for such small organizations.

Another difference is that most hospitals have only a small number of staff
physicians.  Most physicians practicing in a hospital are not on the staff, but
instead, have an associate relationship.  The analogous structure in a software
shop would be to have the secretaries, machine operators, and a few of the
programmers on the payroll, but to have the majority of the programmers working
there on a contract basis, paid by the customers (patients) and not by the
software shop.  I don't know what effect this has on the notion of hospital
accreditation versus software shop accreditation, but I have a hard time
believing the effect is small.

Finally, a hospital is a clearly defined organizational unit; even if it is
part of a hospital chain, the physical and staff boundaries are easy to define.
Many software organizations are far harder to circumscribe.  Finding
accreditable units in a large corporation may be quite difficult, and I doubt
that entire corporations (say Ford or Xerox) are appropriate units of
accreditation.  From three job interviews I did at Xerox in 1980, I can state
that some software groups in that company were clearly among the best I've
encountered, while others were horribly staffed and poorly managed.  I would
hope that these different groups would be separately examined for accreditation
purposes, but that brings up the problem of size and expense, because the
largest group (and by far the poorest, by all my measures) had only 60 people.

Douglas Jones, Department of Computer Science, University of Iowa

The role of government regulation

Douglas W. Jones <>
Fri, 15 Sep 89 09:52:32 CDT
Bob Ayers's comment in RISKS 9.24 on Frank Houston's accreditation proposal in
RISKS 9.23 was representative of a number of previous comments on my own
and other programmer certification proposals in earlier RISKS issues.  He said:

    The ultimate lever for accreditation is the action of government in
    defining non-accreditation as proof of the absence of quality, and,
    ultimately, banning non-accredited service on that basis.

Similar comments were made in response to earlier proposals for programmer
certification.  These comments are reasonably representative of a fairly
extreme libertarian view that government regulation of the free market is
inherently inept and has a corrupting influence on the quality of services
offered by the market.  I believe that these comments need to be answered.

When there is one big customer, I will agree that such a customer has unusual
clout.  Charles Babbage wrote extensively on this, pointing out in his book, On
The Economy of Machinery and Manufactures, that the theory that the free
market is optimal works only if there are both many competing suppliers and
many competing customers.  The market price will not be the optimal price for
goods in the presence of a monopoly among either the buyers or sellers.

Medicare is certainly a large customer for medical services, and it is indeed
a government program, but it does not follow that government is the "ultimate
lever for accreditation".  In most states, the Blue-Cross Blue-Shield
insurance organization is a bigger customer than Medicare, and their policies
are certainly a bigger lever.  It might be argued that such huge insurance
conglomerates can be viewed as pseudo-governmental, but they are largely
products of the free market at work regulating itself.

Charles Babbage pointed out a second factor that corrupts the free market, the
inability of a customer to discern the quality of competing products.
Babbage's examples were largely drawn from early 19th century scandals
involving milk adulteration, but what he said applies just as well to medical
practice, where I, as customer, am ill equipped to judge the quality of medical
care I purchase.

Babbage said that, in the presence of quality problems and in the absence of
easily discernible indicators of quality, customers will pay prices well
above the market price to buy from a vendor in whom they have confidence.
In such a situation, both customers and vendors benefit from independent
certifying authorities such as, in the classic case of agricultural products,
the USDA.  With colleges and universities, accreditation arose largely
without government involvement, although the US Department of Education now
accredits accrediting agencies.

My favorite example of a purely non-governmental regulatory agency is
Underwriters Laboratories.  They are a creation of the insurance industry, and
they are the primary and oldest regulator of consumer product safety in the US.
The insurers involved cover the manufacturer's liability in case of lawsuits
over faulty products, while others offer fire and casualty insurance to
consumers.  Both profit from reduced claims when manufacturers submit products
for UL certification, and customers buy only UL certified products.

My point is this: When it is in the best interests of the buyers and sellers of a
product to have a regulated marketplace, the marketplace will be regulated, and
government has little to do with it.  Of course, in times when governments are
big and all powerful, government becomes a natural choice as a regulatory
authority, but in the absence of strong government involvement, insurance
companies, market cooperatives, and other organizations will emerge to regulate
the market.

Government did not invent the Computing Sciences Accreditation Board, and it
did not invent the Institute for the Certification of Computing Professionals.
Right now, nobody is forced to seek accreditation or certification, but it is
not hard to imagine a few court cases establishing the principle of strict
liability for losses caused by software failures, and if this happens, I see
little hope for avoiding forced adherence to some kind of accreditation or
certification standards in the software industry.  If government does not force
these on us, the insurance industry that emerges to provide malpractice
insurance for programmers will do it.

Douglas Jones, Department of Computer Science, University of Iowa

Is modern software design contributing to societal stupidity?

Chairman, Von Neumann Catastrophe Relief Fund <"STSYSC::TCOMEAU"@SCIVAXM.STSCI.EDU>
Fri, 15 Sep 1989 16:06:07 EDT
A mentor from early in my career once told me that if we made a system
idiot-proof, only an idiot would want to use it.  Are modern software systems
directed toward that end?  Is it really too much to ask for people to read and
understand, and even use, documentation?  That's the question at the heart of
the article which follows.

Tom Comeau, Space Telescope Science Institute

  From _The DEC Professional_, September 1989, p. 160.

   John C. Dvorak, "The Stupidity Factor"

    ....I recently read [an] article...about research on the topic of stupidity
being done by Jon Miller of the Public Opinion Laboratory at Northern Illinois
University.  He discovered that 36 percent of the American public believes that
boiling radioactive milk will make it safe to drink.
    The more we study stupidity, the more we realize that the technological
society toward which we're headed must protect itself from its own inability to
keep up with things because of its own stupidity.  The public will be
overloaded with bad information and will be unable to distinguish hokum from fact.
    People involved in the PC revolution aren't any smarter.  Like the general
public, they suffer from an overall incompetence that stems from lack of
initiative, fear of the unknown, and plain old sloth.  ...
    Watching Microsoft Windows try to turn the corner on its quest for
popularity reflects this.  As easy as Microsoft Windows is to use, it's still
too hard to use.  ...
            [Discussion of the difficulties in bringing up Microsoft Windows,
            and consumer resistance to the product.]
    The attitude seems to be that if the machine booted Windows automatically
and if a lot of extra work wasn't needed, people might like it.  Currently,
it's too much trouble.  The biggest fear is that you'll go through a lot of
effort only to be disappointed with the results: The package won't work as
advertised, or run your favorite program, or it crashes.
    The only computer that has overcome this sloth factor is the Macintosh,
with its logical interface.  Steve Jobs realized that many people don't read.
He put the documentation in pamphlet form.  Plenty of information was omitted,
but who cares?  ...
    People don't read documentation.  This is part of a national trend towards
stupidity, because people don't read anything!  ...
            [Discussion of author's son, who has difficulty reading and
            following installation instructions for a software game.]
    Is he different from anyone else who refuses to read documentation — the
majority of today's users?  In the past, it was easy to condemn documentation
writers for their mediocre and hard-to-understand prose.  But much
documentation today is well-organized, simple and easy to follow.  Still,
nobody reads it.  Even sophisticated users — the ones who used to read
documentation — have joined the forces of the illiterate.  They argue that
life is too short and that a good program doesn't need documentation.
    What do we end up with?  The market demands bulletproof software that's
extremely intuitive.  Can software be so intuitive that it communicates its
commands through some nether world of non-verbal signals?  We can expect
researchers to find out.  Meanwhile, interface engineers will make a lot of money.
    What does the future hold?  Windows will have to change drastically to be
popular, and soon 50 percent of Americans will believe that boiling radioactive
milk makes it safe to drink.

Re: Aircraft simulators (RISKS-9.24, Rob Boudrie)

Alan J Rosenthal <>
Thu, 14 Sep 89 18:23:29 EDT
> ... not confusing reality and simulation.  How about
>deliberately confusing the two - a pilot in an emergency situation would
>not know if it was real or simulation, and could therefore be expected to
>behave in a calm, professional manner without panic.  (Sort of like not
>telling school children if the fire bell is a true alarm or a drill).

I think this would be a very bad idea.  Pilots in emergency situations should
indeed behave in a calm, professional manner, but they should not necessarily
behave the same as if it were a practice simulation.  I can't give a good
example because I don't know anything to speak of about planes, but I certainly
can give a fire alarm example.

In a fire situation, you might jump from a second- or third-story window, and
sprain some muscles badly, or break some bones.  In a fire drill, you would
never do such a thing.  In extreme situations you would say, "and at this point
I would jump from the window, possibly breaking my leg."  You wouldn't do it.

The trade-offs during practice and emergencies are different, and should be.
It's sad that this means that you can't really simulate emergencies fully.
Nevertheless, it's *true* that this means that you can't really simulate
emergencies fully.


p.s. when I was in public school, we were always warned about drills.  We knew
whether fire alarms were real or not.

Aircraft simulators (Rob Boudrie, RISKS-9.24)

Robert Dorsett <>
Thu, 14 Sep 89 17:44:05 -0500

The successful resolution of emergencies invariably results in the aircraft
being landed ASAP, which isn't terribly profitable.  An *emergency* continues
until the aircraft is stopped and the passengers deplaned.

As for mucking with the readouts to provide "fault-solving" practice,
the problems are that (a) if we have a system so sophisticated, why bother
with pilot control in the first place; and (b) once pilots learn to distrust
an instrument (say, through negative training), accidents can happen (look
at the recent British Midland 737 crash last January--it was partially
attributed to pilot reluctance to trust the new Smiths Industries
engine vibration indicator, which replaced a device that, on older aircraft,
had proved to be quite unreliable).

>There is also a vast untapped market for an "aircraft passenger simulator".
>(get some cramped seats, small rest rooms, hard to view movies and poor food
>for a few hours).  The market would be large, and there are many more aircraft
>passengers than there are pilots.

Before launching an airplane, the large manufacturers do extensive "comfort"
studies on full-scale cabin mockups.  They have employees (or volunteers)
play the part of victims, etc.  They've experimented with some pretty wild
layouts, covering seat-back TV to evacuation scenarios.  By the time the
aircraft is ready for a "launch" decision, a mockup is prepared for customer
inspection, and entire "flights" are flown, complete with uniformed stewardesses
and meal service.  I may be wrong on this, but I believe Boeing pioneered the practice.

Robert Dorsett    UUCP:!!rdd

Mission: Impossible (RISKS-9.24)

Robert Dorsett <>
Thu, 14 Sep 89 17:34:21 -0500
Lest anyone credit Mission: Impossible with advancing the state of computer
literacy, here's a review I wrote a few weeks back on rec.arts.movies...

Mission: Impossible looked interesting this evening, so I watched it.  My impressions follow.

Tonight's episode was concerned with computer viruses.  It starts up with
a US nuclear submarine "under attack" from an emergency buoy they went to
investigate.  The buoy infected the ship with a "virus," which screwed up
all the control systems.  The sub was ultimately destroyed when torpedoes
(shot to destroy the buoy as a last-ditch measure, no pun intended) blew up
in the loading bay.

The IMF (no, not the International Monetary Fund) is called in.  Phelps strolls
up to an F-111, just landed, and chats with the pilot.  The pilot says "climb
on in!"  What does Phelps find, but his CD player!  (the CD replaces the
tape recorder/envelope used in the old series).  The CD player informs him
that this nasty A-rab arms dealer is in the market for the virus.  He wishes
to market it to "clients in the Gulf," so as to disrupt American warships.

This provides an opening.  The arms dealer is going to Hong Kong to bid for
the virus.  They arrange for him to be detained and a surrogate sent in his
place.  The person doing the dealing is a US admiral, the ex-head of the
American-Russian team on "computer virus disarmament," and is an expert on
"digital warfare."

Still with me? :-)

Anyway, bidding starts.  The IMF gasses the estate where the bidding's
being held, right after the admiral explains that he's giving out the virus
for free--it's the *antidote* that he's offering for sale.  Bidding goes up
to $12 mil before everyone passes out.  The admiral crushes the 3.5" Sony
disk in his hands before passing out.  The evaluation of the man on the
spot is that "He's destroyed the disk!" Phelps: "We have to go to Plan B."

Plan B involves setting up a sub simulator.  In classic IMF style, the admiral
is fooled into believing that the collision of two supertankers caused the gas
cloud that knocked everyone out (some combination of chemicals, you see :-)).
A Russian warship is also taken out, as are various airplanes.  One airplane
which crashes is a Russian sub-hunter, which the American sub intercepts.  One
of the crewmen brings a distress buoy on board.  "No, you fools!  Don't bring
that on board!"  Anyway, one thing leads to another, the admiral is
conveniently fooled, and, at the last minute, as the ship's about to be
"destroyed," he enters the "antidote" by hand: he jumps in front of a computer
terminal, types in raw hex at about 150 characters per second, and saves the day.

IMHO, this show pushed computer literacy standards back to, oh, 1950s science
fiction standards.

(Incidentally, someone mistakenly concluded that the above "jumping in front
of terminal" incident involved a miraculous use of a password...  there was
no password!)
