The RISKS Digest
Volume 6 Issue 42

Sunday, 13th March 1988

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator



o A legal problem — responses sought
Cathy Reuben
o Computers on Aircraft
Robert Dorsett
o High-Tech Trucking
Rick Sidwell
o Re: Programs crying wolf
Peter da Silva
o Pay cut
Martin Taylor
o Dangers of Wyse terminals
o Burnt-out LED
G. L. Sicherman
o Re: Display self-test
Peter da Silva
o Calculator Self-tests: HP34C has a full functional self-test
Karl Denninger
o Trying harder on complex tasks than on simpler tasks
Robert Oliver
o Police using computers - License plate matches - etc, etc.
Ted G. Kekatos
o Info on RISKS (comp.risks)

A legal problem — responses sought

    [Forwarded-From: John W Manly <JWMANLY%AMHERST.BITNET@MITVMA.MIT.EDU>]

    I am writing a law school paper on the proper allocation of rights
in software between programmers and their employers.  I am curious to
know how well the legal standards I've uncovered line up with the way
people in the industry perceive the equities of the situation.

    Below is a hypothetical which lays out the basic problem.  Please
send me your reactions.  I don't need anything extensive, just a short
statement of where you personally come out and why, and from what
perspective (i.e. programmer, employer, student, etc.) you're
approaching the problem.  I'm not interested in what you think the law
is, only what you feel it should be.  Many thanks!

(Please be sure to respond directly to me [and NOT TO RISKS]):

        Cathy Reuben, Harvard Law School, REUBEN@HULAW1.BITNET

 - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

    In 1981 Mr. John Allan receives a Master's degree in computer
science from the University of Massachusetts.  At that time, Allan
delivers a paper entitled "No More Manuals:  The Use of Touch and
Sound Sensitive Hardware to Promote Accessibility to Computer

    Shortly after that time, Allan is recruited by a representative
from Medicomp, Inc., a small company servicing hospitals.  Medicomp's
primary product is MEDSTORE, a database for storing patient
information.  Medicomp seeks to enhance MEDSTORE with an on-line,
touch-sensitive help system.

    Allan accepts a programming position with Medicomp.  During his four
years there, he develops modules for a touch-sensitive help facility.
These modules are incorporated into MEDSTORE.  Largely due to
MEDSTORE's remarkable ease of use, Medicomp quickly becomes the leading
supplier of patient information database systems for hospitals.

    In 1985, Allan leaves Medicomp.  At that time, he teams up with a
lawyer to create TAXELF, do-it-yourself tax preparation software for small
businesses.  TAXELF utilizes Allan's now famous touch-sensitive help
utility, and is projected to be a huge commercial success.

    Shortly before TAXELF is due to be released,  Medicomp files suit
against Allan.  Their underlying argument is simple:   "As the investor in
touch-sensitive help, Medicomp deserves the fruits of its success.  You,
Allan, basically stole something that belongs to us."

    Allan's answer to Medicomp's argument is also straightforward and
compelling:  "You hired me as an expert in help utilities, and you got
what you paid for.  Any further benefits from the system should flow to me
as creator."

Questions: (for use as a guide only)

        Should Allan have the right to reuse the touch sensitive
help utility he developed while at Medicomp?

        a.  Right to copy the actual code?
        b.  Right to rewrite the code from memory?
        c.  Right to use the program structure and organization?
        d.  Right to use touch sensitive help in general?

        What rights, if any, should Medicomp retain in the utility
which they hired Allan to produce?

        a.  Right to use the utility in MEDSTORE?
        b.  Right to use the utility in other Medicomp products?
        c.  Right to prevent Allan from using the utility?
        d.  Right to prevent Allan from using touch sensitive help?

        Should Allan's rights to use the modules, or the ideas they
embody, be any greater than those of the general public?

        Has the act of answering these questions changed your first
impression of what is just in this case?  If so, why did you back
down?!  Should you have?

                 [I trust that Cathy will share her results with RISKS.  PGN]

Computers on Aircraft [RISKS-6.41]

Robert Dorsett <>
Sun, 13 Mar 88 04:47:05 CST
>  I don't believe that pilots are expected to believe computers over
>indications given by other sources.  

What other sources are they supposed to use?  Consider the standard
navigational equipment on the 747-200:

    Horizontal Situation Indicator--computer processed display.
    Flight Director--computer generated flying instructions.
    Autopilot--analog/digital computer.
    Flight Performance Computer/Flight Management System--computer 
          used for flight management, calculating fuel consumption, etc.
    Inertial Navigation System--computer used for "blind" navigation.

The INS is usually linked to the HSI and autopilot; there are a variety of
configurations that the pilot may select.  The FMS, when installed, can 
link into the network as well, and fly the airplane efficiently from
take-off to landing.

On the 747-400, Airbus A320 (and the forthcoming A340), MD-11 (the DC-10
derivative) and, to a lesser degree, the Boeing 757 and 767, the pretense of
electromechanical instruments has been done away with altogether, and replaced
with CRT displays, under the assumption that the CRT displays are less prone
to failures.  The problem here is that the *means* of display may itself
contribute to error: for example, the current vogue is to replace the
traditional line of instruments displaying a "clock" airspeed, artificial
horizon, and altimeter with a computer-displayed "tape" airspeed and tape
altimeter bracketing the horizon.  The immediate sacrifice is the loss of "trend"
information:  tape instruments are only marginally better than a digital LED
display.  Research on these issues is continuing, but what I've read indicates
that NASA is advising caution, while Boeing and Airbus are producing their
own, contrary figures.

The point must be made that, in modern aircraft, all of the pilot's inputs
are preprocessed by computers.  The Boeing philosophy thus far has been to
simplify overall design and efficiency by introducing automation; the Airbus
philosophy has been to redefine the role of the pilot in the cockpit while
simultaneously changing the way information is displayed.  It is clear that
Boeing has considered following in Airbus' footsteps during the design 
phase of the (suspended) 7J7.

On the navigation issue: airlines have little say in how their pilots actually
navigate: it's largely up to the background of the individual pilot.  While
one pilot may double- or triple-check sources, another might prefer to read
the newspaper: consider the worst-case scenario, the incompetent captain and
the resentful and uninterested first officer.  There is a great tendency in
modern airplanes to rely on the INS/autopilot link, to great detriment, as
evidenced by the China Airlines flip over California in 1985, or the KAL 007
tragedy.  A recent conference sponsored by the Flight Safety Foundation, held
in Tokyo, advocated a return to the attitudes of the early 1960's, and a
return to basic skills.  It is clear that highly automated cockpits serve to
insulate the pilot from the airplane, and thus increase boredom and stress.
Design engineers, on the other hand, see the pilot error problems, and try to
insulate the pilot yet further, creating more automated and "safe" systems.
Modern cockpits, such as the A320's, run contrary to the recommendations of
organizations such as the Flight Safety Foundation: the reasons most often
cited are minimising training and maintenance costs, and reducing "pilot
workload", all at the expense of long-term pilot welfare.

Robert Dorsett     Internet:
UT Austin          UUCP: {ihnp4,allegra,ihnp4}!ut-emx!!mentat

High-Tech Trucking

Rick Sidwell <sidwell@commerce.UCI.EDU>
Sat, 12 Mar 88 08:13:37 -0800
Here is an article from a report sent by California State Senator John Seymour
to all of his constituents.  The issue has been discussed before in RISKS; this
is a fresh example.


  "Under state and federal law, truck drivers are required to keep handwritten
  logs to record the number of miles and hours they're on duty.  These logs are
  easily tampered with and are often a work of fiction as some drivers try to
  circumvent highway safety laws designed to prevent accidents.

  "The result has been a dramatic increase in truck-related accidents, injuries
  and deaths on our highways.  According to the California Highway Patrol, last
  year alone, 678 Californians died and more than 16,000 were injured in truck-
  related accidents.  Since 1982, truck-involved fatalities are up over 40
  percent and truck-related injuries are up more than 32 percent.

  "In his continued leadership role in highway safety, Senator Seymour has 
  introduced legislation to require large commercial trucks to install 'black
  boxes.'  The 'black box' is an onboard computer that automatically records
  drive time, speed, distance traveled as well as other important functions
  that reveal how a driver handles his rig.

  "'More and more, truck drivers are pushing themselves and their equipment
  beyond their limits,' said Seymour.  'Driver fatigue, equipment failure and
  speeding are killing hundreds of innocent people every year on our highways.
  By requiring the use of "black boxes," heavy commercial truck drivers will
  be forced to more closely adhere to highway safety laws.'"

When I first read this, I noticed that there was a potential invasion of
privacy in that a highway patrolman could look at the electronic log and
see if the trucker had been speeding, and give him a ticket if so.  Then
it dawned on me that this is the very purpose of requiring the "black
boxes" to be installed!  It would be interesting to know what the "other
important functions that reveal how a driver handles his rig" are.
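The tampering problem the article describes has a standard software answer:
chain each log record to a hash of the record before it, so any later
alteration is detectable.  This is only an illustrative sketch of the idea;
the record fields and function names are invented, and nothing suggests
Seymour's "black boxes" would actually work this way:

```python
import hashlib
import json

def append_entry(log, entry):
    """Append a trip record, chained to the previous record's hash.
    Altering or deleting any earlier record changes every later hash,
    so tampering is detectable (though not preventable)."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps(entry, sort_keys=True)
    digest = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    log.append({"entry": entry, "hash": digest})

def verify(log):
    """Recompute the whole chain; False means some record was altered."""
    prev_hash = "0" * 64
    for record in log:
        payload = json.dumps(record["entry"], sort_keys=True)
        if hashlib.sha256((prev_hash + payload).encode()).hexdigest() != record["hash"]:
            return False
        prev_hash = record["hash"]
    return True

log = []
append_entry(log, {"hours": 9.5, "miles": 520})
append_entry(log, {"hours": 11.0, "miles": 610})
assert verify(log)
log[0]["entry"]["hours"] = 8.0    # the "work of fiction" edit
assert not verify(log)
```

An auditor who keeps only the most recent hash can later detect any rewrite
of the history that preceded it.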

Re: Programs crying wolf (RISKS DIGEST 6.38)

Peter da Silva <peter@sugar.UUCP>
11 Mar 88 08:48:29 GMT
Once upon a time a programmer who regularly used both MS-DOS and UNIX systems
sat down at an MS-DOS system and typed "format<CR>". The program replied:


The programmer stuck the floppy in the machine, hit <CR>, and formatted his
hard disk. What's wrong with this picture?

    (1) The UNIX format program took a reasonable default if executed
        with no parameters: the floppy drive. The MS-DOS format program
        took a stupid default: the current drive.

    (2) The MS-DOS format program printed an incredibly stupid "warning"
        message. "Please insert floppy disk in this hard drive".

I understand that the situation has been corrected since then.
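The design lesson generalizes: a destructive command should take no default
target, and should make the operator restate exactly what is about to be
destroyed.  A minimal sketch of that rule (a hypothetical function, not
MS-DOS's actual behavior):

```python
def format_drive(target, confirm=input):
    """Destructive-command discipline: no default target, and the
    operator must retype exactly what is about to be erased."""
    if target is None:
        return "usage: format <drive:>  (no default target)"
    target = target.upper()
    if len(target) != 2 or not target.endswith(":") or not target[0].isalpha():
        return f"format: {target!r} is not a drive letter"
    reply = confirm(f"About to ERASE ALL DATA on {target} -- "
                    f"retype the drive letter to proceed: ")
    if reply.strip().upper() != target:
        return "format: confirmation mismatch, nothing done"
    return f"formatting {target}"

assert "usage" in format_drive(None)
assert format_drive("a:", confirm=lambda prompt: "A:") == "formatting A:"
assert "nothing done" in format_drive("a:", confirm=lambda prompt: "c:")
```

Note that the confirmation echoes the target rather than asking a yes/no
question; a habitual "y<CR>" cannot destroy the wrong disk.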

Peter da Silva  `-_-'  ...!hoptoad!academ!uhnix1!sugar!peter

Pay cut

Martin Taylor <mmt@zorac.ARPA>
Fri, 11 Mar 88 17:29:25 est
I'm not sure for whom this is a risk, but today's Toronto Globe and Mail
reports that an ex-cabinet minister was placed in charge of a new agency
which was expected to be quite important.  But the politics of the situation
changed and the agency had very little to do, so the minister asked that his
pay should be halved.  The possibility of reducing someone's pay had not
been programmed, and the computer reported, and someone publicised, that his
pay had been doubled.  Very embarrassing for him and for the government of
the day.  (This happened some years ago).

Martin Taylor  (

Dangers of Wyse terminals

A.Cunningham <cstjc@ITSPNA.ED.AC.UK>
Fri, 11 Mar 88 15:46:08 GMT
The department of computer science at Edinburgh University has a collection of
Sun workstations for use by first year undergraduates.  Connected to the Suns
via pads are a number of Wyse75 terminals.  Recently mail was sent to users
which had the following effect:

    1). The user's keyboard was locked and his screen blanked.
    2). His terminal was put into reflect mode (input to terminal
        was reflected back to the host).
    3). The nasty bit. Files permissions were changed and processes
        were killed.

The first year students involved were caught and now face disciplinary
proceedings. A few questions were raised that may be of interest to other users
of the terminals.

    1). Why are the features in the terminal in the first place? I can
        only assume that Wyse put them in as security features: if a hacker
        accesses your system, you can lock out the terminal.
    2). Has anyone had similar experiences? I've only been reading this
        group for a year, while we've known of the possibilities of the Wyse
        for at least two. At first it was limited to changing a friend's
        screen to inverse mode. We never envisaged it being used so
        destructively.
    3). Is there a modification to the Wyse to stop it?  We need this to stop
        next year's CS1 from doing the same thing again.

           [This is another tip-of-the-iceberg problem.  All of the control
           characters, escape sequences, and function keystrokes that are 
           used (constructively) by software driving your terminal can also be
           MISUSED by any programs running as if they were you, Trojan horses,
           etc.  Recall that an early example was Trojan Messages, which 
           when READ (not interpreted) would GETCHA.  PGN]
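PGN's warning points at the usual countermeasure: never pass untrusted bytes
straight to an escape-programmable terminal.  A sketch of the standard
caret-escape filter (the function name is my own):

```python
def neutralize(text):
    """Keep printable ASCII, newline, and tab; render every other
    control byte (ESC, ENQ, etc.) as a visible caret escape so the
    terminal never interprets it."""
    out = []
    for ch in text:
        if ch in "\n\t" or " " <= ch <= "~":
            out.append(ch)
        else:
            out.append("^" + chr(ord(ch) % 128 ^ 0x40))   # ESC -> ^[
    return "".join(out)

nasty = "hello\x1b[2J\x05"      # clear-screen sequence plus ENQ
assert neutralize(nasty) == "hello^[[2J^E"
```

A mail reader that ran its message bodies through such a filter would have
displayed the Edinburgh attack harmlessly instead of executing it.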

Burnt-out LED (Re: RISKS-6.39)

g.l.sicherman <gls@odyssey.ATT.COM>
12 Mar 88 05:43:00 GMT
Al Stangenberger's lament points up the vulnerability of LED digits to burnout
errors.  Maybe we should redesign the digits to look like this?

     —            —    —            —          —     —    --
    |  |   |  |      |     |   |  |   |     |        |         |  |
            —     —    —     —     —    —    —     —    --
    |  |   |  |   |        |      |      |  |  |  |  |   |  |     |
     —            —    —     —     —    —           --

It's ugly but at least it detects single errors.  (Surely somebody has 
thought of this already?  Are arabic numerals technologically obsolete?)
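The single-error claim about ordinary digits can be checked mechanically.
Using the conventional seven-segment assignments (segments labeled a through
g), this sketch lists every pair in which one burnt-out segment silently
turns one digit into another:

```python
# Conventional seven-segment encodings, segments labeled a-g.
SEGMENTS = {
    '0': 'abcdef', '1': 'bc',     '2': 'abdeg', '3': 'abcdg',   '4': 'bcfg',
    '5': 'acdfg',  '6': 'acdefg', '7': 'abc',   '8': 'abcdefg', '9': 'abcdfg',
}

def burnout_confusions(encoding):
    """Pairs (shown, seen): losing one lit segment of `shown`
    yields exactly the pattern of a different digit `seen`."""
    pattern = {d: frozenset(s) for d, s in encoding.items()}
    pairs = []
    for digit, segs in pattern.items():
        for seg in segs:
            damaged = segs - {seg}
            for other, other_segs in pattern.items():
                if other != digit and damaged == other_segs:
                    pairs.append((digit, other))
    return sorted(pairs)

# A single dead segment can turn 8 into 0, 6, or 9; the ordinary
# encoding has seven such undetectable single errors in all.
assert burnout_confusions(SEGMENTS) == [
    ('6', '5'), ('7', '1'), ('8', '0'), ('8', '6'),
    ('8', '9'), ('9', '3'), ('9', '5'),
]
```

Running the same check against a redesigned encoding like the one above
would verify whether it really leaves the list empty.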

A recent issue of _Industrial Design_ (Jan. 1974) presents an entire
alphabet in this format.  Imagine the potential for transmission errors!
(In fact, the article goes even further: it presents a four-stroke
alphabet.  How's that for low resolution?)

Col. G. L. Sicherman   ...!ihnp4!odyssey!gls

   [The visual confusion between 6 and 8 is a bit awesome, and the
   unnaturalness of 1 and 7 is also.  (The GE check code is a little
   easier to deal with — people can ignore it.)  But putting in display
   self-checks that try to GET-THE-LED-OUT seems much more acceptable.  PGN]

Re: Display self-test (RISKS-6.39)

Peter da Silva <nuchat!peter@uunet.UU.NET>
13 Mar 88 15:45:26 GMT
Many calculators [have some sort of self-test]. They come up with all
segments lit. That way you can tell when they're bad.  Gas pumps do this
too... ever noticed digital gas pump displays showing 8888.88 before you
start pumping?

Calculator Self-tests: My HP34C has a full functional self-test

Karl Denninger <ames!lll-crg!lll-winken!ddsw1!>
Fri Mar 11 11:05:24 1988
The HP34C has a sequence, which you ask for by hitting <STO> <ENTER>, which
does a full functional self-test.  You get all segments lit if all is ok,
or an error code (or a dead unit) if it fails.  The manual claims that it
is a full computational and functional test (and it does take a couple of
seconds to run).

I use it every time I power the thing on.

Karl Denninger             |  Data: +1 312 566-8912
Macro Computer Solutions, Inc. | Voice: +1 312 566-8910
...ihnp4!ddsw1!karl        | "Quality solutions for work or play"

Trying harder on complex tasks than on simpler tasks

Robert Oliver <rabbit1!>
10 Mar 88 20:45:26 GMT
My experience indicates that we often DO try harder on complex tasks than on
simple ones.  In working on a large on-line transaction processing system, it
was observed by various people (notably those responsible for testing and
quality assurance) that whenever we completed major overhauls of the system,
it often passed the tests with little trouble and did not "crash" when
eventually run live.  New versions which contained simple fixes or minor
modifications inevitably acted mysteriously during testing or catastrophically
when put on-line.

What this implied was that complex changes garnered more of our attention  
than simple changes when we were analyzing the problem, designing and 
implementing the change, and testing the final product.  This is not to imply 
that we were simply careless when making simple changes.  On the contrary, 
we were much more careful than most software groups I have seen.  However, 
the simple changes did not elicit that keen level of awareness needed to 
adequately foresee hidden problems and to test for such possible cases.  

Careless, no.  Less careful, less alert, less interested, maybe.  It's not 
only a very gray area, but it's also a tough problem to correct.  One can 
state that, "when making simple changes, remember to be just as 
alert and think just as clearly as when making complex changes," but the 
very nature of the problem will often undermine this maxim.

Robert Oliver           
Rabbit Software Corp.       (215) 647-0440
7 Great Valley Parkway East     ...!ihnp4!{cbmvax,cuuxb}!hutch!robert
Malvern, PA  19355      ...!psuvax!burdvax!hutch!robert

Police using computers - License plate matches - etc, etc.

Ted G. Kekatos <moss!ihuxv!>
9 Mar 88 22:27:44 GMT
All this talk about innocent people vs. police computers reminds me of the
movie "Brazil".  If you have not seen it, it is available on videotape.

The same RISKS question comes up again: if the "computer system" helps the
police find one (1) genuinely "bad" person, and also snares one (1) genuinely
innocent person, are we willing to deal with the consequences?

Ted G. Kekatos    backbone!ihnp4!ihuxv!tedk (312) 979-0804 
AT&T Bell Laboratories, Indian Hill South, IX-1F-460 Naperville & Wheaton Roads
Naperville, Illinois. 60566 USA

     [If you are looking for one person and you find two, you have some
     incentive to probe further.  The problem is when you get only one,
     and it is the wrong person.  But ultimately it is how the query response
     is handled that matters.  PGN]
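PGN's closing point can be made concrete: a query interface should return
every candidate and flag ambiguity, never silently act on the first hit.
A sketch of that rule (the record layout and names are invented):

```python
def match_plate(plate, records):
    """Return all matching records, never just the first: the caller
    must see how many candidates exist before acting on any of them."""
    hits = [r for r in records if r["plate"] == plate]
    return {"plate": plate, "hits": hits, "ambiguous": len(hits) > 1}

records = [
    {"plate": "1ABC234", "owner": "J. Doe"},
    {"plate": "1ABC234", "owner": "R. Roe"},   # duplicate: probe further
    {"plate": "9XYZ876", "owner": "A. Poe"},
]
assert match_plate("1ABC234", records)["ambiguous"]
assert len(match_plate("9XYZ876", records)["hits"]) == 1
assert match_plate("0ZZZ000", records)["hits"] == []
```

An interface built this way cannot hide the two-hit case that should
trigger further probing, and makes the zero-hit case explicit rather than
returning a plausible wrong person.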
