The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 10 Issue 52

Wednesday 17 October 1990


o Re: "Pilot error" and Human Factors
P.F. Spelt
o Be careful of what you give away!
M. Freeman
o Re: Technophilia-induced problem at Educom?
Benjamin Ellsworth
o Passwords and chess
Steve Bellovin
o "Expert Systems in the Loop" explained
Martyn Thomas
o Info on RISKS (comp.risks)

Re: "Pilot error" and Human Factors

SPELT P F <sfp@stc06.CTD.ORNL.GOV>
Wed, 17 Oct 90 09:50:46 EDT
In his article posted in RISKS forum of 15th October 1990, Robert
Dorsett made a comment about the A320/human interface which triggered a
"respond NOW" action in me.  I am a psychologist working at ORNL in
human factors — the study of the way people and their machinery (for
work or play) interact.  Dorsett said:

>I suggest (again) that the way the airplane interacts with the pilot is
>at LEAST as important as component-wise reliability.

I say:  YOU BET!!!  My work in human factors (HF) for various projects,
some involving computerized interfaces, some not, has yielded various
comments.  The worst kind is:  "HF is just common sense."  Oh, yeah?
Then why have we had SO MANY instances of poorly designed devices
creating "human error", aka "pilot error" in the cases of the A320 and
other aircraft crashes?  Another major problem, also suggested in
Dorsett's posting, is the use of HF consultation.  The prevailing modus
operandi has traditionally been to design the system, call in the HF
consultants for evaluation, then have them design a training program to
"train around" the problems designed into the system.  Such training
will "work" adequately until a major off-normal event (like TMI), when
the operator is unable to react properly to (interact properly with?)
the mis-designed system.

As we come to design and install more and more complex computerized
interfaces between the machinery and the humans using it, we run the
serious risk of making even greater design errors, many of which will
not show up at all until a major off-normal occurrence comes along.  The
introduction of artificial intelligence (AI) into these interfaces adds
an additional dimension along which design errors will propagate.  These
concerns have been very adequately covered in the postings on the Aegis
system (Expert System in the Loop postings), although there WAS no ES in
that system.

Several of us at ORNL are involved in research into the use of AI in
"operator associates" for various settings.  The potential for using
intelligent computerized interfaces is already being explored in a variety of
settings, but many issues remain to be settled, as the Aegis discussion
has highlighted.  These issues need BASIC research directed to answer
the questions raised.  In this era of increasingly tight budgets,
however, finding support for that basic research is very difficult.
However, if we don't address these issues, we will continue to see an
increasing number of "operator error" accidents analogous to the A320
"pilot error" crashes.

The usual disclaimers apply: These opinions are my own, and do not necessarily
reflect those of ORNL, the Department of Energy, or Martin Marietta Energy
Systems.

Phil Spelt   bldg 6025, ms 6364 POBox 2008 Oak Ridge, TN 37831-6364

Be careful of what you give away!

17 Oct 90 10:33:07 EDT
>From CompuServe's Online Today Forum Data Libraries:

                       MONITOR MONTH IN REVIEW
                            September 1990

    FEDS SEIZE COMPUTERS IN KY. TOWN (Sept. 2): Federal agents over the
    weekend seized computer equipment from a Nancy, Ky., business office
    when it was learned that the computers might contain secret
    government files. The owner of Challenger Ltd., Charles Hayes, said
    federal marshals came 70 miles from the US attorney's office in
    Lexington, Ky., to seize nine computer terminals, a computer memory
    device and other equipment which were purchased from the government
    for $45.

This shows a Risk from computer equipment you are trying to get rid of.  Make
sure you are only getting rid of the equipment, and not giving away copies of
your data!  A tape bulk-eraser probably does a nice job on old tapes and
hard disks.
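
For media you cannot bulk-erase, an overwrite before disposal is the
software equivalent.  A minimal sketch (Python; a simple multi-pass
overwrite for illustration, not a certified sanitization procedure):

    import os

    def wipe_file(path, passes=3):
        # Overwrite the file in place with random bytes several times,
        # forcing each pass out to disk, then unlink it.  Illustrative
        # only; real sanitization standards demand more than this.
        size = os.path.getsize(path)
        with open(path, "r+b") as f:
            for _ in range(passes):
                f.seek(0)
                f.write(os.urandom(size))
                f.flush()
                os.fsync(f.fileno())
        os.remove(path)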

Mark Freeman                       Microcomputer Technology Specialist/Analyst
CompuServe                                        M.Freeman@CSI.CompuServe.COM

Re: Technophilia-induced problem at Educom?

Benjamin Ellsworth <>
Wed, 17 Oct 90 10:02:19 pdt
> The system must have used some kind of voice-recognition algorithm,
> because no human typist that I know could have kept up with the
> speaker at times.

I very strongly doubt this.  I would bet a substantial sum of money
that there was a stenographer and not a computer capturing the words.

> The weakness of the voice-recognition system was made painfully
> obvious...

There is a RISK in assuming that all failures are technologically induced.  It
could very well be that the stenographer hired was simply not very
good.  The good ones are expensive, and to do "real-time" stenography
takes a good stenographer.

There is a plausible explanation involving computer RISKs however.  The
translation from the steno notation to full English words was in all
likelihood automated.  In stenography there are a number of dialects
(usually called theories).  Some dialects, especially the older ones,
are not particularly suitable for machine translation.  There are also
more than a few translation programs.  Between stenographic dialects
and computer translators there can be a significant compatibility
problem.  It could be that the stenographer was extremely capable in
the courtroom (where the translations are done off-line by a human),
while at the same time using a style/dialect/theory which was
incompatible with the machine translator.
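
To see the compatibility problem in miniature: the machine side of the
translation is essentially dictionary lookup, stroke by stroke, so a
stroke the writer's theory defines differently from the software's
dictionary comes out as raw steno.  A toy sketch (Python, with invented
strokes and dictionaries):

    # Toy steno-to-English translator: pure dictionary lookup, stroke
    # by stroke.  Strokes and both dictionaries are invented here.
    DIALECT_A = {"STKPWHR": "question", "-T": "the", "KAT": "cat"}
    DIALECT_B = {"KW-": "question", "-T": "the", "KAT": "cat"}

    def translate(strokes, dictionary):
        # Unknown strokes fall through as bracketed raw steno -- exactly
        # the garbage an audience sees when writer and software disagree.
        return " ".join(dictionary.get(s, "[" + s + "]") for s in strokes)

    notes = ["STKPWHR", "-T", "KAT"]      # written in dialect A
    print(translate(notes, DIALECT_A))    # -> question the cat
    print(translate(notes, DIALECT_B))    # -> [STKPWHR] the cat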

There has been an interesting interaction between technology and court
recording in the last couple of decades.  My mother, for instance, is
in the process of re-learning her stenography in a computer compatible
dialect.  It reminds me of pilots who have to learn to fly in a
computer compatible way (training around system weaknesses).

Benjamin Ellsworth         All relevant disclaimers apply.

Passwords and chess

Tue, 16 Oct 90 22:46:39 EDT
Well, since we're talking about chess, here's a tidbit from Saturday's
NY Times, in an article about the Kasparov-Karpov match:

    Trying to meet a noon deadline yesterday for invoking the
    time-out, Lajos Portisch, a Hungarian grandmaster who is Mr.
    Karpov's second, telephoned Geurt Gijssen, a Dutchman who is
    chief arbiter of the match, at 11:53 A.M.

    How was the arbiter to be sure it really was Mr. Portisch on
    the line?

    The Hungarian, who had considered a singing career early in
    life — a fact known to some chess experts — suggested singing
    something in his distinctive voice.  Mr. Gijssen agreed, and
    Mr. Portisch burst forth with several bars of a Hungarian song.

    The arbiter granted the postponement, although the written
    request for the time-out arrived late, at 12:07 P.M.

Sounds like they need some sort of challenge/response scheme; that
password is blown...
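
For the curious, a minimal sketch of challenge/response (Python, using a
keyed hash; the shared secret is hypothetical): the verifier issues a
fresh random challenge and the caller proves knowledge of a pre-arranged
secret without revealing it, so no recording of an earlier call -- or
song -- helps an impostor.

    import hmac, hashlib, os

    SHARED_SECRET = b"agreed on in advance"   # hypothetical

    def make_challenge():
        # Fresh random nonce per attempt, so responses can't be replayed.
        return os.urandom(16)

    def respond(challenge, secret=SHARED_SECRET):
        # Caller proves knowledge of the secret without sending it.
        return hmac.new(secret, challenge, hashlib.sha256).digest()

    def verify(challenge, response, secret=SHARED_SECRET):
        expected = hmac.new(secret, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)

    c = make_challenge()
    assert verify(c, respond(c))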

        --Steve Bellovin

"Expert Systems in the Loop" explained

Martyn Thomas <>
Wed, 17 Oct 90 18:28:08 +0100

Randall Davis writes:

>As for the title of this whole discussion — "Expert systems in the loop":

>2) There aren't any and there never were any.
> ... ...
>So until otherwise informed, let's be clear about this: it was a problem of
>"Instruments in the loop".  That by itself may be worth discussing, but it is
>not and never was an expert system.  And it might be interesting to ask, Why
>the rush to label it an expert system?

The original article was mine, and referred to a report of a new research
project in the UK to develop an expert system to advise commanders in
tactical situations which are too complex to analyse without assistance.

This report *explicitly* referred to an expert system. The point of my
original posting was that an expert system which provides advice, in
circumstances where a decision must be made and there is insufficient time
for the commander to analyse the situation him/herself, is effectively
making the decision. Many who followed up agreed with this viewpoint. I
apologise for mentioning the USS Vincennes - it distracted attention from
the major point, and wasted a lot of net bandwidth. So far as I recall,
no one, throughout the discussion, suggested that Aegis is an expert system.
