The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 9 Issue 17

Wednesday 23 August 1989

Contents

o Hazards in Airliners and Medicine
Nancy Leveson
o Re: Technology Doesn't Have to Be Bad
Mike Trout
Robert Dorsett
o "Drive-by-wire": What about bicycles?
Anne Paulson
Donald A Norman
o Re: Autopilots
Brinton Cooper
o Re: Automated Highways
George H. Feil
o Roads made safer or not?
Pete Lucas
o Training & Software Engineering, a reply...
Edward A. Ranzenbach
o Info on RISKS (comp.risks)

Hazards in Airliners and Medicine (Brint Cooper, RISKS-9.16)

Nancy Leveson <nancy@murphy.ICS.UCI.EDU>
Wed, 23 Aug 89 11:45:03 -0700
  Subject: Hazards of Airliner Computerization (Brint Cooper, RISKS-9.16)
  >Such reasonableness checks as humans would be capable of performing,
  >would be far from "make work" and would reduce significantly some of the
  >risks associated with increasingly automated flying.

Such reasonableness tests in software are not easily constructed.  Building
them is much more difficult than you seem to think.
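To illustrate the difficulty, here is a minimal sketch of a naive plausibility check on an altimeter feed.  All names, units, and limits are invented for illustration, not taken from any real avionics system.  The check rejects a legitimate reading during a deliberate emergency descent, yet passes a stuck sensor:

```python
# Naive "reasonableness" check: flag any altitude reading whose implied
# climb/descent rate exceeds a fixed bound.  All figures are hypothetical.

MAX_PLAUSIBLE_FPM = 6000  # assumed bound on climb/descent rate, ft/min

def reading_is_reasonable(prev_alt_ft, new_alt_ft, dt_min):
    """Return True if the new reading is within the assumed rate bound."""
    rate_fpm = abs(new_alt_ft - prev_alt_ft) / dt_min
    return rate_fpm <= MAX_PLAUSIBLE_FPM

# A genuine, deliberate emergency descent is rejected (false alarm):
print(reading_is_reasonable(35000, 34800, 1/60))  # 12000 ft/min -> False

# ...while a stuck sensor repeating the same value sails through:
print(reading_is_reasonable(35000, 35000, 1/60))  # 0 ft/min -> True
```

The hard part is choosing bounds that are tight enough to catch faults without raising alarms during legitimate but unusual flight conditions, which is exactly the trade-off that makes such checks far from "make work."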

   Subject:  Computers in Medicine (Brint Cooper, RISKS-9.16)
   > 1. Is my perception correct?  Are there proportionally more
   >    life and property threatening computer-related faults in banking,
   >    transportation, and national defense than in medical applications?

No.  No, there are not.  They are proportionally the same, given the relative
complexity of the systems and the amount of use.

   > 2. If there's even a modicum of truth in #1, then why?

There is no truth in it.

   > 3. Or are the physicians merely burying their mistakes again?

Be careful to have your facts straight before attacking a group of people.  In
fact, medical computer problems are more carefully reported because the FDA
requires it, while no similar requirements exist in other application areas.
There are large numbers of reported computer errors and recalls in the FDA
database.

I have noticed, however, that many medical equipment manufacturers take quality
control more seriously than some other industries because of the potential
liabilities and costs involved in an FDA-ordered recall of a medical device.


Re: Technology Doesn't Have to Be Bad (RISKS-9.15)

Mike Trout <miket@brspyr1.brs.com>
23 Aug 89 17:06:15 GMT
Don Norman (dnorman@ucsd.edu) writes:

> 4. The report of airliner crew fatigue...

My profound apologies for omitting part of the BBC report; I typed it in from
memory.  Don's statements reminded me that the BBC report included considerable
discussion on the subject of allowing aircrew napping.  The basic argument was
generally that which Don raised, i.e., that minor aircrew napping would
actually make things safer.  It did, however, sound like the "powers that be"
were rather reluctant to allow the public to know that the folks up front may
be sawing logs.

As an aside, a few weeks back NPR had an extensive report on napping.  The
contention was that studies conclusively show that napping dramatically
improves concentration, productivity, and the ability to deal with problems.
Yet society strongly disapproves of napping, and anybody caught napping on the
job will be very lucky if they don't get fired.  Napping is viewed as "wasted
time"; time which could be spent doing something "constructive."  Never mind
that major portions of most workers' afternoons are often spent in an advanced
state of bleary-eyed semi-coherence.  It will be a long time before this
attitude changes, despite the fact that some of history's most effective
personalities, such as Winston Churchill, took frequent naps.

Michael Trout, BRS Information Technologies, 1200 Rt. 7, Latham, N.Y. 12110
(518) 783-1161


Re: Technology Doesn't Have to Be Bad (RISKS-9.15)

Robert Dorsett <mentat@walt.cc.utexas.edu>
22 Aug 89 23:52:50 GMT
Don Norman wrote:
  >... It is well known that the circadian rhythm has two minima -- ...

1.  It's extremely difficult to apply simple principles of the analysis of
circadian rhythm when the crew in question might be transiting multiple time
zones, on a continuous basis, on irregular schedules.

2.  Part of the problem is cockpits that lull their pilots to sleep.  In _The
Journal of Navigation_, an oft-repeated statistic through 1983 and 1984 was
that Britannia Airways found that 767 cockpit crews were far more sedate (lower
heart rates) than those flying the 737.  This was ascribed to better-designed
seating, noise reduction, and low workload.

In cruise, modern crews are given very little to do.  To address this problem,
very positive steps are being taken--such as having the stewardess check in,
airline policy requiring manual calculation of navigation problems, mandatory
(unnecessary) radio call-ins every fifteen minutes, etc.  I see no reason to
criticise such measures: they attempt to make the best of what is, it is
becoming increasingly clear, a bad situation.

3.  Operational practice on many airlines (even if it's not *policy*) IS to
let a pilot take a snooze if he just can't keep his eyes open.  The other
pilot's notified, and the guy takes a nap.  Great.  But that reduces the
redundancy in a modern cockpit by half.  So what happens if the remaining guy
(often on the same sleep schedule) finds himself nodding off?  (If you've never
flown long-distance, it's quite startling, after being *awake* 25 hours or so,
to suddenly wake up--without ever discovering you had fallen asleep.)  Modern
autopilots are incredibly reliable, and it's easy to argue that there's nothing
wrong with even *both* pilots napping for a while.  The problem is that,
even in cruise, even with modern flight-assist mechanisms, problems can
manifest themselves faster than the crew can "get back into the loop."  One of
the recommendations of the NTSB report on the China Airlines flipover over Los
Angeles a few years ago was to bring pilots closer into the control loop.


My personal gripes with flight automation are:

* Lack of experience.  Automation's a new concept.  We have little experience
  dealing with the human factor in automation design.

* Economics-oriented vs. safety-oriented design.  It's always possible to fund
  a study that supports the manufacturer's viewpoints.  Even if backhanded tech-
  niques aren't used, genuine divisions exist within the industry on the place
  and role of automation.

* Lack of human-factors influence in system design (letting the engineers have
  at it).

* Too much human-factors influence in system design (psychologists pushing
  pet theories in safety-critical situations).

(the above really boils down to lack of *operations* influence in system
design, but even that's not always a help--consider a system designed to pilot
specifications, which leaves the pilots with precious little to do).

* Relatively low computer literacy among pilots (the belief that the more
  microprocessors the better), and consequent over-reliance on automation
  (Cf. my article in RISKS 9.13).

* Unfounded fault-probability claims by manufacturers (both in program
  verification and hardware reliability).

* Lack of standardization.  How one airplane does something is not necessarily
  how another one does it (look at airspeed displays on the 747-400, Fokker
  100, A320, and A310).  Not even COLORS are standardized.


In my opinion, not nearly enough muckraking is being done on the risks of
automation in aviation (and other disciplines).  When computers are marketed
on the mere basis that they're high-tech, and purchased, respected, and put in
safety-critical situations solely on that basis, it's time to worry.  I think
RISKS serves a purpose by bringing such problems to our attention.  And I, for
one, do not "get a laugh" out of reading about how yet another implementation
has been botched up.


"Drive-by-wire": What about bicycles?

Anne Paulson <PAULSON@INTELLICORP.COM>
Wed 23 Aug 89 12:31:23-PST
Another opinion on "drive-by-wire" systems:

I, like many other submitters, am against them, but for a reason not yet
brought up.  Designers of automated traffic-control systems have an unfortunate
tendency to design for cars, and forget about other road users, like cyclists
and pedestrians.

A case in point is the "smart" traffic light, which is actuated by automotive
traffic.  There are a lot of these menaces where I live.  They work fine for
cars, sure.  But try to make a left turn when you're riding a bike.  You get in
the left-turning lane, right on top of the sensor, in hopes that it will notice
you.  If you are riding a steel bike, you have about a 70% chance that the
light will turn for you.  The sensors, I believe, work by magnetic inductance,
so if you're riding an aluminum bike (as I do) you have a less than 50% chance
that the light will turn for you.  Of course, if the light doesn't turn, you
are forced to run it.  This is more dangerous than if there were no light
there, because the drivers going straight through think they have the right of
way, and aren't expecting turning traffic.  Another problem for cyclists is
that "smart" lights often go green for only an instant, so that if the road is
crowned, it's difficult to get through the intersection while the light is
still green (or, in some egregious cases, while it's still yellow).
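The failure mode described above can be caricatured in a few lines.  The threshold and the inductance-change figures below are entirely invented; real loop detectors vary, and the 70%/50% chances in the text suggest the behavior is marginal rather than deterministic:

```python
# Caricature of an inductive-loop vehicle detector: conductive metal over
# the loop lowers its inductance, and the controller trips when the drop
# exceeds a fixed threshold.  All numbers here are invented illustrations.

TRIP_THRESHOLD = 0.02  # assumed fractional inductance drop needed to trip

# Illustrative relative inductance drops for different road users:
SIGNATURES = {
    "car": 0.10,
    "steel_bike": 0.021,     # barely over threshold -- sometimes detected
    "aluminum_bike": 0.012,  # under threshold -- light never turns
    "pedestrian": 0.0,
}

def light_triggers(user):
    """Return True if the detector trips for this road user."""
    return SIGNATURES[user] >= TRIP_THRESHOLD

for user in SIGNATURES:
    print(user, light_triggers(user))
```

A threshold tuned for the large metal mass of a car inevitably sits near, or above, the faint signature of a bicycle, which is why lowering the sensitivity for one intersection is a per-site maintenance job rather than a design fix.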

These smart lights often endanger pedestrians, too.  A block and a half from my
office is a "smart" T intersection.  For a long time, the light was adjusted so
that left-turning traffic and pedestrians would be in the same place at the
same time.  (I think the designers forgot that this was a T intersection, so
that all the traffic would be turning left.)  Again, it was particularly
dangerous because the drivers thought that the pedestrians were jaywalking,
when in fact the pedestrians had a green light.  Finally, after at least three
years, the problem was "fixed" by barring pedestrians from crossing there at
all.

Technology lovers might argue that these are design flaws that could be fixed.
This is true, but they haven't been.  (I, and other members of my bicycle club,
routinely call the cities and counties when we find lights that don't turn for
us, but we rarely get satisfaction.  And why should it be up to us to make sure
the lights work, anyway?)  I don't believe that "drive-by-wire" systems would
be any better.

By the way, where I live, the number of cyclists is not insignificant.  In
Santa Clara County, twice as many people commute by bicycle as commute by mass
transit.

I would be happy to hear from Donald Norman, or other people working or
consulting for companies that are designing "drive-by-wire" systems, on how
such systems will allow for cyclists.

-- Anne Paulson,    Intellicorp,    1975 El Camino Real,    Mountain View, CA


"Drive-by-wire": What about bicycles?

Donald A Norman-UCSD Cog Sci Dept <norman%cogsci@ucsd.edu>
Wed, 23 Aug 89 14:34:51 PDT
My long diatribe in RISKS has generated the proper result: numerous people have
written me and several, as this piece from Anne Paulson indicates, have
elaborated in the spirit of my remarks.

Although my piece in RISKS argued that technology was not all bad, my main goal
in life is to convince the technologists to consider the human side of things.
There is a tendency to let the technology dominate, forgetting the
inconveniences this causes for people.  How do I convince anyone?  Well, I
write, preach, and evangelize.  In my consulting, my entire emphasis is on ways
of understanding the needs of the users, and then finding ways to bend the
technology to fit the people (instead of the other way around).

Consider Paulson's point, that designers of traffic lights (intersections and
traffic flow) do not properly consider pedestrians and bicyclists.  Actually, I
think she underestimated the problem: I suspect these designers consider
pedestrians and bicyclists a nuisance best gotten rid of.  The best
approach I ever saw to these concerns was in a letter to the editor in a local
newspaper.  The author asked why we had "pedestrian crossings" at streets.
Shouldn't pedestrians come first?  We should have "car crossings."  That is,
the pedestrian should automatically be considered to have the right of way, and
cars would have to have special places where they could cross the pedestrian
stream.  The same argument goes for bicycles.

Paulson concludes by saying: "I would be happy to hear from Donald Norman, or
other people working or consulting for companies that are designing
"drive-by-wire" systems, on how such systems will allow for cyclists."  That
isn't the right approach.  The correct approach is to change the entire mindset
of the city planners and people who purchase these devices to put in the proper
emphasis.  The engineers and designers are often very good, with proper
motives, but they can't overcome the mindset of cities that compute traffic
flow, and purchase from the lowest bidder -- even if the equipment thus
purchased is inferior and the system considerations neglected.

     [In addition, I think that automobiles, bicycles, and pedestrians have
     such different characteristics that they simply should not be in the
     same traffic streams.  We need special bicycle ways (separated from
     roadways by more than a painted line) and special walkways.  (Some
     European cities -- especially Scandinavian cities -- seem to take this
     approach.)  And we should separate different kinds of vehicles as well.
     And as we get more and more elderly drivers who tend to drive slowly,
     we will need either efficient point-to-point mass transit or special
     driving lanes for these elderly "super cautious" drivers (reaction and
     decision times slow with age, and attention is limited, with less
     ability to divide attention among several tasks.  Those of us in the
     attention business sometimes say that the elderly have "less
     attentional resources.").]

The problem faced by RISK readers, designers, and users, is that society puts
cost and efficiency first, and cost and efficiency are measured by local,
monetary variables.  Real cost and efficiency would take into account accident
rate, long-term pollution, long term recycling, long-term learning, and
employee comfort and job satisfaction.  In the end, I think proper attention to
these factors increases morale, decreases sickness, increases efficiency,
lowers turnover (which thereby lowers training costs), and lowers the cost of
cleanup and technological fixes by society.  But society is driven by
short-term views.  It will be hard to change this mindset.  But I am
optimistic.

Don Norman, Department of Cognitive Science D-015, University of California,
San Diego, La Jolla, California 92093 USA


Re: Autopilots

Brinton Cooper <abc@BRL.MIL>
Wed, 23 Aug 89 16:50:23 EDT
mrotenberg@cdp.uucp quotes a recent article by Carl Levin.  Briefly,

  "Airlines are starting to fly a new generation of highly automated
  jets, raising concerns among safety researchers that pilots will rely too
  much on the technology and will lose or never learn the sharp skills and
  reflexes needed in emergencies."

Most of the article quotes the fears of "experts" and cites recent examples
where a pilot's skill overcame emergency conditions, saving many lives.
However, the article does not objectively cite balancing situations in which
automated systems handled sets of conditions that were too complex and changing
too fast for humans ever to respond satisfactorily.

The article includes an irrelevant reference to the old saw that calculators
prevent pupils from learning the "basic principles of mathematics," neglecting
the fact that one does not do "mathematics" on calculators.  One does
"arithmetic" on calculators!

Only near the end of the article are cockpit simulators discussed.

  "A pilot in a simulator can practice flying after losing various computer and
  control systems...Still, while simulator training could help for some kinds
  of emergencies, others... are considered so remote that pilots do not train
  for them on simulators."

One wonders, "Why not?"  Can the great minds of the aircraft industry not
postulate virtually any kind of emergency?  Can not the simulators be
designed/programmed to handle any kind of emergency?  This sounds very much
like a failure of will, not of ability.

Much of the problem expressed is not new.  For years, the US Air Force has been
flying high performance tactical jets using significant amounts of automation.
If the automated system fails, the game is over.  The demands made upon the
pilot/airplane system are too great ever to be met by humans alone.  These pilots
are among the best in the world, and their training heavily depends upon
cockpit simulators.

The entire article is a good example of the warped treatment afforded a
complex, scientific topic by the popular press.  That example, how the public
is told about these issues, is probably the best contribution made by such an
article to the dialog in the Risks Digest.  Perhaps, instead of writing to one
another so much, the well-informed among us might write to such as the NY
Times, giving more accurate pictures of the issues.
                                                              _Brint


Re: Automated Highways

"George H. Feil" <gf08+@andrew.cmu.edu>
Wed, 23 Aug 89 10:57:54 -0400 (EDT)
After reading much discussion on the subject, I'd like to call to attention the
cancelled "Skybus" project that would have replaced the old streetcar lines in
Pittsburgh.  [A feature story on the subject was aired on KDKA-TV Eyewitness
News last night; some of the facts presented here came out of the story.]

20 years ago, Westinghouse designed and built a prototype "Skybus" system,
which involved driverless, electric rubber-tire vehicles on an exclusive track.
This system was billed as being the cheapest and most efficient public
transportation system available.  However, many politicians and citizens were
opposed to the idea of robot vehicles operating on tracks, "not being able to
tell whether an object on the road is a newspaper, a concrete block, or a human
body".  Of course, this doesn't take into account the fact that the roads were
elevated and, for the most part, isolated from the rest of the world.

As a result of the public hysteria, the system was canned, and the new
LRV system was built, at five times the cost of Skybus.

There is no doubt in my mind that such a system can and will work,
with the proper safeguards taken into account.  By isolating the
track, there is little need to risk collision with humans, although
there would be a need to scan for foreign objects that could serve as
obstacles.  As for the spacing and switching of vehicles, this is
already an automated function for many railroads.  What recent
railroad accidents can be directly blamed on computer failure?

The point of this story is that the problem can go both ways.  We can easily
get burned by relying too much on automated systems that can be prone to
failure.  By the same token, we can get burned economically by refusing a
technological advancement that might present a few risks, but at the same time
would solve many others (as I feel the Skybus project would have done).

In fact, since then, Westinghouse has teamed up with a German firm (the name
evades me at the moment), and has implemented Skybus-like systems in many
locations, including Morgantown, WV.  In fact, the new mid-field terminal at
Greater Pitt. Intl. Airport will be implementing such a system when the
terminal opens in '92.

George (HAL) Feil, Carnegie-Mellon University     bitnet:  gf08%andrew@CMCCVB


Roads made safer or not?

"Pete Lucas - NERC Computer Services U.K." <PJML@ibma.nerc-wallingford.ac.uk>
Wed, 23 Aug 89 09:31:08 BST
`Drive-by-wire' - YES, I agree that the `human controlled' system causes
fatalities at an unacceptable level - YES, I agree that the druggies/alcoholics
should be kept off the roads, YES, I agree that there is a place for civilised
public transport (tramcars, buses), and YES, I agree that there's a place for
automated navigation systems.  BUT I don't think they will solve the world's
transport problems.  Most journeys (in Europe at least) are of less than 10
miles - taking the kids to school, collecting groceries, popping out for a
pizza - and it is the case in the UK that most accidents happen within 5 miles
of your home.  These are the very situations where the `blocks' of automated
vehicles travelling close up just will not work - they are in most cases urban
areas with too many hazards (stop lights, school buses, intersections).

The instances when the bunched, automatic vehicles WOULD be of use are on
long-distance journeys. Paradoxically, on a basis of fatalities per thousand
miles, the motorways, autobahns and autoroutes of Europe are the safest roads.
For the long journeys, there are trains and planes.  Having only been to the
States once, and only driven on minor roads, I can't really get a feel for
Stateside freeway conditions.  But how can a `civilised' country have a 55MPH
limit?

Come over to Europe, drive round the London orbital motorway (where the speed
limit is 70MPH, but the police turn a blind eye to anybody doing less than 100)
and sharpen up your driving skills!  The human brain, if it is working
properly, is still the best real-time adaptive guidance system we have.  What's
more, it comes fitted as a standard, no-cost feature in most people.  Its
catastrophic failures are thankfully rare, and if it does suffer one, then you
aren't going to be worrying any more in any case!

                                          Pete L.


Training & Software Engineering, a reply...

"Edward A. Ranzenbach" <Ranzenbach@DOCKMASTER.NCSC.MIL>
Wed, 23 Aug 89 13:30 EDT
I would like to echo some of the sentiment expressed by Tim Shimeall in
RISKS 9.13.

I am a non-degreed "software engineer".  I started programming as a kid in 1968
and although the term was not used at that time, I was a "hacker".  After
graduating high school in the early seventies I began attending a community
college, the best that I could afford.  Unfortunately the draft was on and when
I got that compelling notice I enlisted.  I honed my craft while in the
service, bypassing the service's programmers school because of very high test
scores.  Within six months, I was appointed the system administrator at a
scientific research and development facility.  When not handling system matters
I was tasked with helping others with their debugging problems.  Interestingly
enough, there were very few "uneducated" enlisted folk at this site and most of
the college trained junior officers were my best clients.  They could not
program their way out of the proverbial paper bag.  When I decided to leave the
service, it was decided that my job would be converted to a GS civilian slot.
I applied for the job and was turned down because of my lack of a college
degree.  The base personnel office told me that I was unqualified for a job
that I had been doing so well that I was recommended by my supervisor and
selected for commendations, as well as NCO of the quarter.  I eventually went
to work for a large computer vendor who sold me back to the Government to fill
my old job.  They got $100,000/year for my services and the facility management
people who I worked for as an NCO were thrilled with my encore performance.

In the past twelve years in this business I have held the following titles:

  Computer Programming Specialist,   Systems Analyst,   Senior Systems Analyst,
  Associate Software Engineer,   Software Engineer,   Senior Software Engineer,
  Principal Software Engineer,   Computer Security Engineer,  Systems Engineer,
  Senior Computer Scientist.

I have always felt proud of the work that I have accomplished.  I am a
published author and have attended invitational conferences.  I have always
been employed based on my reputation, not on my formal education.  I have
managed people with far more formal education (and less talent) than I.

I have tried to obtain a degree over the last twelve years but it has not been
an easy road.  My work has caused me to travel a great deal and to relocate
often.  At one point I was asked to work abroad for a couple of years.  Each
time I have moved I have attempted to matriculate at a new university, only to
lose a substantial number of credit hours to a version of the "not invented
here" syndrome.  This has made obtaining my "credentials" not only nearly
impossible but very costly.  I have taken three different courses in "operating
system design" from three different schools and gotten an "A" each time.  In
one course I corrected the professor's misstatements (discreetly of course)
about the operation of the virtual memory demand paging algorithms of a system
that I helped to maintain.  Don't get me wrong, I feel that formal education is
a worthwhile experience.  But, for those who must pursue this science without
the benefit of a college education, self education is a viable alternative.

I think the software engineering community places too much stock in formal
education and not enough in the proven ability to do the job.  As Mr. Shimeall
pointed out in RISKS-9.13, apprenticeship has long been recognized in this
country as an extremely reliable measurement of one's ability to perform.  An
office mate of mine who holds an advanced degree in computer engineering once
told me that obtaining a degree just proved that you could "play the game".

I apologize for the personal nature of this transaction but Mr. Jones' comments
brought to a head the frustration that I have felt for a number of years.  It
seems that people in this field are more interested in where you went to school
than in what your opinions on various design issues are.  I remember an
incident a couple of years ago when I submitted a paper to a conference.  It
was reviewed and I received a letter of acceptance in the mail along with a
request for a biography to be published with the paper.  Shortly after sending
my biography I received a curt reply explaining that my paper was being dropped
from the schedule and would not be published.  The explanation was that this
was a "professional" conference and surely I would understand.  My protests
went unanswered.  I was recently refused voting membership in the IEEE because
they questioned my professional status, since I am non-degreed.

I guess my point here is that the degree doesn't make the engineer...
-ear
