The RISKS Digest
Volume 9 Issue 80

Friday, 13th April 1990

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator



Risks of Daylight Savings Time
Chuck Weinstock
Authentication via User-Defined Fields
Jim Kimble
Rites of consumption
Phil Agre
Franklin Resources Computer Glitch
John Murray
Risks of computerized publishing
Henry Spencer
Re: Computer generated letters
Benjamin Ellsworth
Nathaniel Borenstein
Software: A320 vs. shuttle
The C3 legacy, Part 5: Subsystem I
Les Earnest
COMPASS '90 program and registration information
John Cherniavsky
Info on RISKS (comp.risks)

Risks of Daylight Savings Time

Chuck Weinstock <weinstoc@SEI.CMU.EDU>
Mon, 09 Apr 90 16:34:47 EDT
With the time change a week ago, I personally experienced the results
of a programming error.  On Saturday, March 31, some friends and I
left Chicago on the Amtrak train Southwest Chief bound for Flagstaff,
AZ.  Because of the time change that occurred that night, the train
was automatically an hour late at 3am on the 1st.  This meant that we
would arrive in Flagstaff at 10:24pm Sunday instead of the scheduled
9:24pm.  However, by Albuquerque, thanks to some excellent train
handling by the Santa Fe Railroad, the train had made up most of that
hour, and by Winslow, the first stop in Arizona, we were on time.

Unfortunately the Amtrak ticket computer had not been programmed
correctly to deal with daylight savings time in Arizona (there isn't
any: Arizona retains Mountain Standard Time, which in summer equals
Pacific Daylight Time), and the incorrect train departure time had
been printed on the tickets.
The conductor decided we couldn't leave Winslow until the time on the
tickets, so we sat in the station for an hour and eventually arrived
in Flagstaff an hour late.
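The Arizona exception is easy to get wrong in scheduling code.  A minimal
sketch using Python's zoneinfo module (a modern illustration, certainly not
Amtrak's actual system) shows what the ticket computer missed:
America/Phoenix keeps UTC-7 all year, matching Pacific Daylight Time in
summer:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+

# A summer instant, localized in three western zones.
noon = datetime(1990, 7, 1, 12, 0)
phoenix = noon.replace(tzinfo=ZoneInfo("America/Phoenix"))
los_angeles = noon.replace(tzinfo=ZoneInfo("America/Los_Angeles"))
denver = noon.replace(tzinfo=ZoneInfo("America/Denver"))

# Arizona stays on MST (UTC-7) year-round...
assert phoenix.utcoffset() == timedelta(hours=-7)
# ...which in summer reads the same as Pacific Daylight Time,
assert phoenix.utcoffset() == los_angeles.utcoffset()
# while the rest of the Mountain zone has sprung forward to UTC-6.
assert denver.utcoffset() == timedelta(hours=-6)
```

A scheduler that applies the Mountain-zone DST rule uniformly to every stop,
as Amtrak's ticket printer apparently did, shifts every Arizona time by an
hour.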

Chuck Weinstock

Authentication via User-Defined Fields

The Programmer Guy <>
Mon, 9 Apr 90 15:43:01 CDT
I was picking up a prescription at my local pharmacy when I was told by the
clerk that the order had not yet been phoned in by the doctor.  I was asked to
wait for a few minutes while the clerk rifled through a stack of FAXed
prescriptions.

After the clerk found the FAX from my doctor's office and passed it on to the
pharmacist, I began thinking how easy it would be to forge a prescription by
using your favorite desktop publishing software, a scanner, and a PC-FAX board.
This might have taken a little skill and a lot of ingenuity a few years ago,
but as extremely easy-to-use software proliferates into the general public,
this task becomes almost trivial.

I asked the clerk how they went about verifying that the FAX transmissions were
legitimate.  She said the sending FAX machine transmits the name of the doctor's
office and its phone number; the clerk simply ensures that the phone number listed is
correct.  The obvious flaw in this system is that the FAX machine itself does
NO VERIFICATION of your name or phone number — you simply "roll your own"
banner as you see fit!  The hopper holding the received documents was sitting
in plain sight so obtaining a "legitimate" banner wouldn't be at all difficult.

I point this out as an example that the only thing worse than *no*
verification of authenticity, is *assumed* authenticity.
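The general fix is to treat any sender-supplied identifier as a lookup key,
never as proof.  A hypothetical sketch (the directory, names, and numbers are
invented for illustration): the pharmacy keeps its own list of prescribers
and calls back the number on file, ignoring whatever number the fax banner
claims:

```python
# Hypothetical trusted directory maintained by the pharmacy itself.
TRUSTED_OFFICES = {
    "Dr. Example, M.D.": "512-555-0100",
}

def callback_number(banner_name):
    """Return the number on file to call back for confirmation,
    or None to reject.  The banner's own phone number is never
    consulted: the sender controls it completely."""
    return TRUSTED_OFFICES.get(banner_name)

assert callback_number("Dr. Example, M.D.") == "512-555-0100"
assert callback_number("Anyone With A FAX Board") is None
```

The design point is that verification data must come from a channel the
verifier controls, not from the message being verified.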

--Jim Kimble, TCP/IP Development, IBM Austin, TX

Rites of consumption

"Phil Agre" <>
Mon, 9 Apr 90 17:35:33 199
The articles about automation in fast food restaurants brought to mind a recent
article in the New York Times about the McDonald's in Moscow.  It seems that
the Russians did not yet have the skills necessary to eat at McDonald's.  In
particular, though they all have extensive experience at waiting in line, they
were accustomed to lines parallel to the counter, not orthogonal to it.  And
they were entirely unaccustomed to the precise, stylized form of dialogue that
is involved in purchasing a meal at such a restaurant.  The exuberant managers
of the restaurant had taken this as a kind of challenge, it seems, and declared
their intention to "McDonaldsize" the Russians.  For the managers of
McDonald's, this wasn't just a matter of helping the poor Russians learn to
order a hamburger, nor was it simply a matter of running an efficient
restaurant.  It was symbolic, both for the people who own McDonald's and for
the New York Times — the perspectives of the two of them were quite
indistinguishable throughout — of the larger process of `converting' the
Russians to a capitalist way of life.  It was explicit in their explanations of
this process that living in a capitalist social and economic system is a
*skill*, something located not nearly so much in the mind as in the body, in a
cultivated habituation to the various forms and interfaces of efficient
consumption.  It would seem that these Russians are getting their remedial
lessons just in time, given that the process of buying fast food is about to
become even more of a skill.  The touch screens will be confusing at first,
given that everyone has been so reliant upon the small space of `slop' in the
human contact — minimal, to be sure, but real — between consumer and counter
attendant.  But soon we will all get used to it, and the skill of sorting one's
desires into the combinatorial space of a touch-screen menu will become part of
each new citizen's socialization, another technological boundary joining our
own bodies to the rational machinery of a system without a face or a meaning
beyond the pure excitation of standardized desires.

Phil Agre, University of Chicago

Franklin Resources Computer Glitch

John Murray <>
9 Apr 90 21:22:19 GMT
Franklin Resources of San Mateo CA was criticized recently in the San Francisco
Chronicle over its sales commission policy. The company runs a family of
"loaded" mutual funds, meaning that customers pay an up-front commission charge
to invest. The criticism related to how Franklin charges an additional, hidden
commission on those dividends which are reinvested in the fund, rather than
paid out in cash. It turns out that the fine print contains a provision which
allows an investor to avoid this hidden commission by paying a minimum annual
fee of $50. However, Franklin doesn't promote this discount feature.
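The break-even arithmetic is simple.  The article gives the $50 fee but not
the reinvestment load, so the rate below is a hypothetical 4% purely for
illustration:

```python
reinvest_load = 0.04  # hypothetical hidden commission rate (not stated in the article)
annual_fee = 50.00    # the fine-print flat fee

# The flat fee wins once the load on a year's reinvested dividends
# would exceed $50.
breakeven = annual_fee / reinvest_load
print(breakeven)  # 1250.0 -- above $1250/yr of reinvested dividends, pay the fee
```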

According to the Chronicle (April 7 1990), the company will soon kill the
special arrangement. Chuck Johnson, VP of corporate development, says "It was
never a deliberate policy, it was a data processing snafu which we ignored for
a while. But an increasing number of brokers caught on, and it's reached the
point where we have to address it."

Some points to ponder:

1) A prospectus is the legal document on which an investor is supposed
   to make financial decisions. But what's to stop Franklin, or anyone
   else, from arbitrarily ignoring ANY of its provisions, and blame a
   data processing snafu?

2) What sort of 'snafu' could possibly cause a variety of prospectuses
   to contain a complete description of a non-existent feature? A text
   processing error seems most likely, but that doesn't say much for
   Franklin's proof-readers.

Mr. Johnson makes Franklin seem particularly sleazy. The company waited
until "an increasing number" of brokers (not just one or two) drew
attention to the error, or "caught on" as he puts it. Then, instead of
sending out correction notices, the company just "ignored it for a while".

In any case, it seems that Franklin will no longer let you pay them to
avoid having to pay them for getting what you originally paid them to
do up-front in the first place!

- John Murray, Amdahl Corp. (My own opinions, etc.)

Risks of computerized publishing

Thu, 12 Apr 90 23:20:19 EDT
The April issue of Locus, the major trade magazine of science fiction and
fantasy writing and publishing, contains an interesting news item.  Some
excerpts:

    _The_Fall_of_Hyperion_, the new book by Dan Simmons, appeared
    with a critical page missing.  All 20,000 copies of the hardcover
    and trade editions were shipped before bookseller and Locus
    reviewer Tom Whitemore called Bantam/Doubleday/Dell to point out
    that page 305 is missing, and page 306 appears twice.  Their
    initial statement is not printable...

    The page was correct in the advance galleys...  The mistake was
    in the film shooting at the printer.  Now that publishers and
    printers are computerized, there seem to be more mistakes, not
    less.  Last year, Tor's edition of _Angel_Station_ by Walter
    Jon Williams had several sections turned into gibberish by a
    computer glitch.  Again, the checked galleys were correct...

The piece goes on to mention that such problems, of course, are not new.
There have been some major disasters — entire chapters or crucial scenes
omitted — even in pre-computer days.

Henry Spencer at U of Toronto Zoology              uunet!attcan!utzoo!henry

Re: Computer generated letters

Benjamin Ellsworth <>
Mon, 9 Apr 90 17:25:06 pdt
> [great story about an erroneous computer generated letter and bogus
> signature]... [These tales continue to come out of the woodwork.  Please
> don't take this as a challenge to submit still more of them.  Thanks.
> PGN!]

Yes, but in the interest of good science (and having a good laugh ;-) we
should definitely collect an archive of as many as possible.  You never
know when such data will be needed by someone doing a "study." ;-)

This is a call for such stories.  If you have a story, please send it to  If the incident happened to you, make your subject
"Letter story, first-hand."  If you just heard the story, the subject
should be "Letter story, hearsay."  Mailings using other subject lines
are not guaranteed to make it into the archive.

Once the mailing slows down a little, I'll find a way to make the
archive generally available.

Benjamin Ellsworth      |

Re: Wonderfully mistaken letter generators

Nathaniel Borenstein <>
Tue, 10 Apr 90 00:15:58 -0400 (EDT)
My mother-in-law is the President of a small synagogue, Congregation
Beth El in Potsdam, NY.  She frequently receives some very amusing
examples of this form, most notably ones that conclude from the "Beth"
that the recipient is female, e.g.:

Dear Miss. El,

Dear Beth,

And of course, they make frequent references to "the El family" which is
particularly amusing when you realize that "El" in this context means
nothing less than God Himself (Beth El is Hebrew for "House of God").

In a related matter, one of my relatives, whose last name is Glasser,
decided to throw away $17 on the "Glasser family newsletter".  This
turned out to be a 20 page Macintosh-generated newsletter that was
obviously completely generic — globally replacing "Glasser" with
"Smith" would have yielded an equally reasonable Smith family
newsletter.  Anybody know how much money such companies manage to make
by selling such dubious computer-personalized products?

Software: A320 vs. shuttle (RISKS-9.79)

Wed, 11 Apr 90 10:36:53 EST

(Dan Brahme) writes  [...]
>A look at how many times the space shuttle launch had to be postponed
>due to some software problem should shed some light.

I don't think you can validly compare the two systems; the shuttle software was
a generation or more of technology behind that of the Airbus.  I remember reading
an article in Communications of the ACM about the shuttle software (1982 I
think), and in many ways it was a disaster waiting to happen.  The method of
mutual exclusion used in the shuttle was called "alibis" and involved proving
that at the instant your process wanted to access a variable, no other process
would.  This was dubious for a few processes, but when the system complexity
increased, trying to calculate this n^2 problem for tens to hundreds of processes
was ridiculous.  One of the reasons for delay was that each time someone wanted
to change a process, they had to recalculate all its alibis.

One of the first flights was delayed due to weather, and the crew went
back to base to practice more in the simulator.  (Real shuttle software/hardware
attached to an environment simulator.)  They ran a scenario called "Trans-Atlantic
Abort" in which something happens to thrust so that they are too high to return
to base, and too low to go into orbit.  The idea was to glide to Spain.  In the
middle of this, all CRTs went into an error mode indicating they had not
received any updated info from the computers in 5 seconds, and that their data
was invalid.  The astronauts were understandably shaken, and the software
engineers traced the problem to a bad "goto" into the middle of code without
reinitialising the data.
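Python has no goto, but the same class of bug (a control path reaching a use
of data whose initialization was skipped) can be sketched; the function and
field names here are invented for illustration:

```python
def display_update(path):
    # One path performs the initialization...
    if path == "normal":
        frame = {"age_seconds": 0}
    # ...but the "abort" path reaches the use of `frame` without
    # it, much as the shuttle's bad goto jumped into the middle of
    # code past its setup.
    return frame["age_seconds"]

print(display_update("normal"))      # 0
try:
    display_update("abort")
except UnboundLocalError as err:
    print("uninitialized data:", err)
```

Languages and analyzers that forbid such jumps, or reject any path on which a
variable may be read before being written, rule this failure out statically.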

The shuttle software was written in a goto-oriented sequential language on
redundant IBM computers of the same design.  The A320 has 3 or 4 separate
computers with CPUs from different manufacturers, each running software from
different companies, written with different compilers for high-level languages
that include process management and security.  Gotos are forbidden in these
languages.

This makes it likely that the problem (if any) in the airbus software is in the
problem specification/design rather than the implementation.

I have been appalled at Airbus' behaviour regarding the A320.  It is not
unlikely that there are some software errors, but until now Airbus has had an
excellent reputation for installing good safety features not available on other
aircraft (e.g., completely fire-retardant cabin materials with non-toxic fumes,
years ahead of other manufacturers).  It seems silly to risk that reputation by
not being as open as possible.


The C3 legacy, Part 5: Subsystem I

Les Earnest <LES@SAIL.Stanford.EDU>
11 Apr 90 1635 PDT
(Continuing from RISKS 9.74)

Of the dozens of command and control system development projects that were
initiated by the U.S. Air Force in the early 1960s, none appeared to
perform its functions as well as the manual system that preceded it.
I expect that someone will be willing to argue that at least one such
system worked, but I suggest that any such claims not be accepted
at face value.

All of the parties involved in the development of C3 systems knew that their
economic or power-acquisition success was tied to the popular belief that the
use of computers would substantially improve military command functions.  The
Defense Department management and the U.S. Congress must bear much of the
responsibility for the recurring fiascos because they consistently failed to
insist on setting rational goals.  Goals should have been specified in terms of
information quality or response time for planning and executing a given set of
tasks.  The performance of these systems should have been predicted in the
planning phase and measured after they were built so as to determine whether
the project was worthwhile.

Instead, the implicit goal became "to automate command and control," which
meant that these systems always "succeeded," even though they didn't work.
Despite a solid record of failure in C3 development, I know of just one such
project that was cancelled in the development phase.  That was Subsystem I,
which was intended to automate photo-interpretation and was developed for the
Air Force by Bunker Ramo, as I recall.

The "I" in the name of this project supposedly stood for "Intelligence" or
"Interpretation."  This cryptic name was apparently chosen to meet the needs of
the prospective users in the intelligence community, who liked to pretend that
nobody knew what they were doing.  This pretense occasionally led to odd
conduct, such as when they assigned code names to various devices and tried to
keep them secret from outsiders.  For example, a secret name was assigned to
one of the early U.S. spy satellites — as I recall it was Samos — but when
that name somehow showed up in the popular press they tried to pretend that no
such thing existed.  In support of this claim, everyone in the intelligence
community was directed to stop using that name immediately.

When I attended a meeting in the Pentagon a few days after this decree and
mentioned the forbidden word, the person operating the tape recorder
immediately said "Wait while I back up the tape to record over that!"  This was
a classified discussion, so there was no issue of public disclosure involved,
just the belief that there should be no record of the newly contaminated name.

Sometime in the 1981-82 period, the Air Force decided to terminate the
development of Subsystem I.  A group of about 30 people from various parts of
the defense establishment, including me, was invited to visit the facility in
suburban Los Angeles where the work was going on to see if any of it could be
used in other C3 systems.  We were given a two day briefing on the system and
its components, the principal one being a multiprocessor computer.

The conceptual design of this Polymorphic Computer, as they called it, was
attributed to Sy Ramo, who had earlier helped lead Hughes Aircraft and
Ramo-Wooldridge (later called TRW) to fame and fortune.  The architecture of
this new machine was an interesting bad idea.  The basic idea was to use many
small computers instead of one big one, so that the system could be scaled to
meet various needs simply by adjusting the number of processors.  The problem
was that these units were rather loosely coupled and each computer had a
ridiculously small memory — just 1K words.  Each processor could also
sequentially access a 1K buffer.  Consequently it was very awkward to program
and had extremely poor performance.

I sought out the Subsystem I program manager while I was there and asked if our
group was the only one being offered this "free system."  He said that we were
just one of a number of groups that were being flown in over several months'
time.  When I asked how much they were spending on trying to give it away, he
said about $9 million (which would be equivalent to about $38 million today).
The Air Force Systems Command seemed to be trying desperately to make this
program end up as a "success" no matter how much it cost.  When I asked why the
program was being cancelled, I got a very vague answer.

I did not recommend that my group acquire any of that equipment and as far as I
know nobody else did.  The question of why Subsystem I was cancelled remained
unresolved as far as I was concerned.  It is conceivable that it was because
they figured out that it wasn't going to work, but neither did the other C3
systems, so the reason must have been deeper (or shallower, depending on your
perspective).  My guess is that they got into some kind of political trouble,
but I will probably never know.

(Next part: the Foggy Bottom Pickle Factory)

    -Les Earnest (

COMPASS '90 program and registration information

John Cherniavsky <jcc@cs.UMD.EDU>
Thu, 5 Apr 90 10:58:07 -0400
           COMPASS '90 Monday June 25
                    PRE-CONFERENCE TUTORIALS

  0800 Registration; Coffee
0900      Software Safety by Archibald McKinlay of McDonnell-Douglas Corporation

1300      Software Verification and Validation for High Risk Applications
          by Janet R. Dunham and Linda Lauterbach of Research Triangle Institute
  1700 Close of tutorials

              COMPASS '90 PROGRAM:  Tuesday June 26

  0800 Registration; Coffee
0900      Opening, General Chair, Dolores R. Wallace,
      National Institute of Standards and Technology

0915      Honorary Chair Address, Honorable Mike Parker,
      U.S. House of Representatives

0930      Keynote Address, "Safety in Numbers?"   Air Marshal M J D Brown,
      RAF, Director General of Strategic Electronic Systems,
      UK Ministry of Defence

1030      Program Chair, H. O. Lubbes, Naval Research Laboratory
  1100 Break
1130      Systems Engineering: The Forgotten Discipline, Dario De Angelis,

          Verification and Validation: A Systems Engineering Discipline
      for Producing Quality Software Systems, Roger Fujii, Logicon
  1300 Lunch
1400      A Vital Digital Control System with a Calculable Probability of an
      Unsafe Failure, David Rutherford, Rail Transportation Systems, Inc.

1430      Using Symbolic Execution to Aid Automatic Test Data Generation,
      A. Jeff Offut, E. Jason Seaman, Clemson University

1500      A Case Study: Production Problems in an Application Running on the
      Prodigy Service, Marnie L. Hutcheson, Prodigy Services Company
  1530 Break
1600      Fast Static Analysis of Real-time Rule-Based Systems to
          Verify Their Fixed Point Convergence, Albert Mo Kim Cheng,
      Chih-Kan Wang, University of Texas at Austin

1630      Real-Time Software Failure Characteristics, Janet R. Dunham,
      Research Triangle Institute and George Finelli,
      NASA Langley Research Center

1700      Uncovering Redundancy, Rule-Inconsistency and Conflict in
      Knowledge Bases via Deduction, James McGuire, Lockheed AI Center

  1900 Banquet; On Quality, W. Earl Boebert, Secure Computing
       Technology Corporation

             COMPASS '90 PROGRAM: WEDNESDAY June 27

  0830 Registration; Coffee
0900      Mathematics for Digital Systems Engineering, Donald Good,
          Computational Logic, Inc.

1000      The Rigorous Specification and Verification of the Safety Aspects of a
      Real-time System, Derek P. Mannering, Rex, Thompson & Partners, LTD

1030      An Analysis of Ordnance System Software Using the MALPAS Tools,
          Ken Hayman, Department of Defence, Australia
  1100 Break
1130      Proving Proof Rules: A Proof System for Concurrent Programs,
          David Goldschlag, Computational Logic, Inc.

1200      A Formal Approach to Railway Signalling, W. J. Cullyer and W. Wong,
          University of Warwick

1230      A Structured Approach to Code Correspondence Analysis, J.W. Freeman
          and R. B. Neely, Ford Aerospace Corporation
  1300 Lunch
1400      Maintaining Abstractions with Verification, William D. Young and
          Warren A. Hunt, Jr., Computational Logic, Inc.

1430      Using CSP to Develop Trustworthy Hardware, Andrew P. Moore,
          Naval Research Laboratory
  1500 Break
1530      Trusted Computer System Standards, Chuck Pfleeger, Trusted
          Information Systems

1630      Panel: Are Trusted Computer System Standards Useful for the
          Development of Systems Whose Criticality Is Other Than Security?
      Chair, H. O. Lubbes, Naval Research Laboratory

              COMPASS '90 PROGRAM: THURSDAY June 28

  0830 Registration; Coffee
0900      Rationale for the Development of the UK Defence Standards for
          Safety-Critical Software, Air Marshal M J Brown, RAF, Director
      General of Strategic Electronic Systems, UK Ministry of Defence

1000      Safety-related Programmable Electronic Systems: Guidelines and
          Standards, R. Bell, Health and Safety Executive, UK

1030      DRIVE-ing Standards - A Safety Critical Matter, T. F. Buckley,
          P. H. Jesty, K. Hobley, and M. West, University of Leeds
  1100 Break
1130      Panel: Should Government Regulate Medical Software?
          Chair: Diane Jackinowski, Varian Associates

1230      The Computer-Related Risk of the Year: Distributed Control,
          Peter Neumann, SRI International
  1300 Lunch
1400      SAFETYNET 89, Digby Dyke, Charter Technologies, LTD
1430      Product Liability in the UK - Issues for Developers of Safety-
          Critical Software, Ranald Robertson, Stephenson Harwood Solicitors
  1530 Break
1600      Panel: Are Certification and Accreditation Useful Concepts for Safety
          Critical Systems? Chair, H. O. Lubbes, Naval Research Laboratory

1700      Verifiable Microprocessors: Architectures and Verification
          Techniques, John Kershaw, Royal Signal and Radar Establishment
      and John Wise, Charter Technologies, LTD

          Dolores Wallace    301-975-3340

SPONSORS: IEEE AESS, IEEE NCAC, in Cooperation With: ACM SIGSOFT.  Co-sponsors:
Advanced Ordnance Technology Incorporated, Charter Technologies, Ltd.,
Computational Logic, Incorporated, Computer Sciences Corporation, Georgetown
University, Logicon, Incorporated, National Institute of Standards and
Technology, Naval Research Laboratory, Naval Surface Warfare Center, Research
Triangle Institute, Tri-Service Software System Safety Working Group, Trusted
Information Systems
