The RISKS Digest
Volume 10 Issue 74

Thursday, 3rd January 1991

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Vicious elevator door failure recovery
Curtis Jackson
Dehumanization by old Cobol programs; how to get more junk mail
Darrell Long
Computer data putting history out of reach
Jay Elinsky
Re: Computer Age Causes Key U.S. Data To Be Lost Forever
Joe A. Brownlee
Re: computer "warfare"
John Abolins
Re: "Computer Models Leave U.S. Leaders Sure of Victory"
Jeff Griffen
David Holland
John C Slimick
David Wright
Trojan in MS-DOS 4.01?
John Chapman Flack
Re: Organizational Aspects of Safety
Nick Szabo
A RISKy video store kiosk
R. Aminizade
Call for papers, VDM '91
Hans Toetenel
Info on RISKS (comp.risks)

Vicious elevator door failure recovery

Curtis Jackson <jackson@adobe.UUCP>
2 Jan 91 22:48:02 GMT
While vacationing in Honolulu last week, my fiance and I had the misfortune to
run into a very vicious failure recovery mode in an elevator in our hotel.  As
near as I could tell, the elevator doors did not have the usual leading-edge
vertical strip which, when depressed horizontally, causes the doors to open so
as not to crush a person or appendage.  Instead, this function was relegated
entirely to an "electronic eye" beam.

One particular elevator was malfunctioning — closing only partially before
jerking open as if someone had broken the beam.  But regardless, if the doors
on any elevator attempted to close four times without success due to the
beam being broken (or due to a faulty perception that the beam was being
broken), it would buzz loudly as it closed the doors slowly and completely.
During this final closure, presumably to prevent someone from standing in the
doorway and monopolizing the elevator on one floor, the doors would close
regardless of what the electronic eye told them.  I tested this by placing my
shoulder between the doors (an action that normally opened the doors
because I broke the beam), and the doors continued to close onto my shoulder
and then made a meaningful attempt to crush my shoulder.  The DOOR OPEN button
had no effect when the elevator was in this close-at-all-costs recovery mode.
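
The recovery behavior described above amounts to a small state machine. A
minimal sketch (hypothetical logic, not the actual controller code; the
four-attempt threshold is taken from the observed behavior):

```python
# Hypothetical sketch of the elevator's door-recovery logic described above.
# After MAX_ATTEMPTS blocked closings, the controller enters a
# "close-at-all-costs" mode that ignores the beam and the DOOR OPEN button.

MAX_ATTEMPTS = 4

def close_doors(beam_broken_each_try):
    """Return how the doors finally closed, given the beam state per attempt."""
    for attempt, beam_broken in enumerate(beam_broken_each_try, start=1):
        if not beam_broken:
            return "normal close"  # beam clear: close normally
        if attempt >= MAX_ATTEMPTS:
            # Buzz loudly, close slowly and completely, ignoring the beam
            # and the DOOR OPEN button.
            return "forced close (ignores beam)"
        # Otherwise: jerk the doors back open and try again.
    return "doors held open"

print(close_doors([True, True, True, True]))
```

Note that without the traditional leading-edge strip, no physical input can
interrupt the final branch.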

My first thought was what would happen to someone who slipped on the doorsill
of the elevator and injured themselves.  The doors would attempt to close
several times, then buzz at the poor sod as they attempted to crush him/her in
the doors.  Even if the elevator had not been programmed to ignore the beam in
this mode, I would still find the lack of a physical means to override the door
closure (the traditional leading-edge strip) to be a severe safety hazard.

Curtis Jackson @ Adobe Systems in Mountain View, CA  (415-962-4905)
                                    uucp: ...!{apple|decwrl|sun}!adobe!jackson


Dehumanization by old Cobol programs; how to get 4x as much junk mail

Darrell Long <darrell@sequoia.ucsc.edu>
Wed, 26 Dec 90 10:29:14 PST
My dear mother blessed (or perhaps cursed) all of her children with two middle
initials, in my case "D" and "E".  This has caused me a good deal of trouble,
as you can imagine.

It seems that TRW (and now we learn Lotus) sells certain parts of your credit
information, such as your name and a demographic profile.  Well, I recently
got a new credit card from Gottchalks and found to my chagrin that my name
had been truncated to "Darrell D. Long".  I went to the credit manager and
patiently explained my situation, and was assured that things would be fixed
and they were very sorry etc, etc.

Well, two things happened: I got a new credit card, this time as "Darrell E.
Long", and TRW now has an annotation in my file to the effect "File variation:
middle initial is `E'".  Soon after this I started getting mail for "Darrell E.
Long" (along with the usual "Darrell Long" and "Darrell D. Long" and the
occasional "Darrell D. E. Long").

I called up the credit bureau and it seems that the programmer who coded up the
TRW database decided that all good Americans are entitled to only one middle
initial.  As the woman on the phone patiently told me "They only allocated
enough megabytes [sic] in the system for one middle initial, and it would
probably be awfully hard to change."

I know I'm not the only one with more than one middle initial — some of my
European friends have several.  I wonder what they do with a name like
"Ananthanarayanan" — do they randomly truncate it?  I suppose I should ask my
friend.
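
The truncation the operator described is what a fixed-width record layout
produces. A minimal sketch (field sizes and layout are invented here; the
real TRW record format is unknown):

```python
# Hypothetical fixed-width name record with room for exactly one middle
# initial, as the credit bureau operator described.

def pack_name(first, middle_initials, last, width=30):
    # Only the first middle initial survives the fixed field.
    mi = middle_initials[0] if middle_initials else ""
    name = f"{first} {mi + '.' if mi else ''} {last}".replace("  ", " ")
    return name[:width].ljust(width)  # truncate or pad to the field width

print(pack_name("Darrell", ["D", "E"], "Long").strip())
```

Which initial the record keeps then depends on which source last wrote the
field, consistent with mail arriving under several variants of one name.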

I'm afraid it's going to get worse before it gets better though.  With Lotus'
product such name mutilation will only spread.                             DL

Dr. Darrell D. E. Long, University of California at Santa Cruz


"Computer data putting history out of reach"

"Jay Elinsky" <LINSKY@YKTVMZ.BITNET>
Wed, 2 Jan 91 14:11:38 EST
This is an excerpt, extracted by Charlie Hart in IBM Raleigh and appended to
the IBM internal NEWSCLIP FORUM, from an article with the above title from the
Raleigh News & Observer 1/2/90 (Associated Press):

* A slice of American history has become as unreadable as Egyptian hieroglyphics
  before the discovery of the Rosetta stone.
* More historic, scientific and business data in danger of dissolving into a
  meaningless jumble of letters, numbers and computer symbols.
* Americans pay billions to collect the data and may pay millions more to
  preserve it.
* Much information from past 30 years stranded on computer tape from primitive
  or discarded systems - unintelligible or soon to be so.
* Detection of disease, environmental threat or social shift could be delayed
  because data was lost.
* Examples:
  - 200 reels of 17 year old Public Health Service tapes were destroyed last
    year because no one could find out what the names and numbers on them meant
  - Agent Orange task force unable to use Pentagon's tapes containing date,
    site, and size of every herbicide bombing in Vietnam.
  - Extensive record of U.S. WW II vets exists only on 1600 reels of microfilm
    of computer punch cards - no money or manpower to return data to computer.
  - Census data from the 1960s & old NASA data exist only on old tapes - some
    may have decomposed; others may fall apart if run through the balky
    equipment that survives from that era.
* Director of National Archives states it would take 25 years to process 20
  years of old data if money and manpower existed.
* One of the biggest headaches is sloppy record keeping - no written record of
  programs or data formats. "Generally it's the last thing you do and pay the
  least attention to" according to assistant Census director Gerald Cranford.


"Computer Age Causes Key U.S. Data To Be Lost Forever"

jbr@cblph.att.com <j.a.brownlee>
3 Jan 1991 14:43 EST
[text of the same article read by the previous contributor deleted ...]

While the risks of storing important data on media with short life-spans or in
undocumented formats are fairly obvious, I suppose that it should not be a
surprise that the U.S. government is having such problems.  After working at
companies that do government work and seeing the many rules in place to
``protect'' the American taxpayers, this is almost predictable.  Because of the
procurement process and the length of time it takes to solicit proposals and
bids for a system, often by the time the implementation begins, the requirements
can be several years old — a long time for computer systems.  Changing
requirements to be more reasonable can mean up to a year of red tape.  Also,
the government has been known to buy some rather non-standard systems.

All in all, this is a rather startling article when you consider the type and
amounts of important data that are probably already lost forever.

Joe Brownlee, Analysts International Corp. @ AT&T Network Systems, 471 E Broad
St, Suite 1610, Columbus, Ohio 43215; (614) 860-7461 E-mail: jbr@cblph.att.com


Re: computer "warfare"

John Abolins <jabolins@well.UUCP>
Tue, 1 Jan 91 20:41:21 pst
In RISKS-10.70, Sanford Sherizen wrote...

> I would like to gather any *hard* evidence that viruses have been used for
> political/military purposes.
> It is possible that the Jerusalem virus was first set off to commemorate a
> Palestinian event but has there been any way to verify this?

The best person to speak for the virus case at Hebrew University in early 1988
would be Y. Radai.  In BITNET e-mail "chats" with him in 1988, Radai
emphasized that the virus was NOT politically motivated.

Unfortunately, the claim of political origins was circulated in various
reports, including an article in the New York Times.  The author of the New
York Times article claimed that Friday, May 13, 1988 (when the virus
would wreck many files) was connected to May 14, 1988, the 40th anniversary of
the establishment of the State of Israel.  The author thus interpreted May
13, 1988 as the "last day of Palestine".  (More correctly, it was the last day
of the BRITISH Palestine Mandate.)  For the Middle East (where when something
goes "boom", many hands reach for a phone to claim the act), it was too "cool".
No individual or group claimed credit for this virus. No messages were embedded
in the code.  The only reason people outside of the Middle East interpreted the
virus as a political act was that 1) it was causing problems for an Israeli
institution, 2) some computers used by Mossad were allegedly also affected by
the virus, and 3) it happened in the Middle East so near Israel's 40th
anniversary.  By this reasoning, any auto accident or heart attack in
Washington, DC must be politically caused. :-)

Sanford Sherizen continues...

> Are there other viruses that have been specifically distributed or
> directed to harm a political foe? ...
> Is the virus a potential "small nation's weapon"?  Can viruses become
> terrorist surrogates, disrupting an enemy nation without leaving direct
> fingerprints (strings?) traceable back to the ultimate sponsor?  What roles
> could viruses play in future small scale intensive conflicts as well as
> major wars?

While it is possible that unreported cases with actual political origins exist,
the reported, substantiated cases of political attacks against computers have
been physical, not logical: for example, the bombings of computer facilities in
Europe.  By these trends, it would appear that most computer-related attacks
will be against the hardware or against the people working with the computers
themselves.

I said "by these trends" because there is no guarantee that those trends will
hold out forever.  The physical means have been favored because they are
spectacular, producing fear in the general society and playing well in the
media.  Computer "warfare" is more difficult to exploit because to many people
the effects of a virus are abstract. (A newscaster trying to describe a virus
grabs less attention than fire and blood.) But as societies of the West and
many other areas of the world become more dependent upon computers, computers
may become more enticing targets.

The most enticing aspect of computer targets is the quality of civilization
that terrorism most seeks to destroy: trust. (Although many people talk about
computer errors and bugs, how many people actually stop using credit cards, ATMs,
airlines, etc.?)  A savvy group could exploit the computer environment to erode
trust in a society's systems. Gumming up the works of financial systems, air
traffic controls, etc. could make people uneasy about commerce or travel. It is
also possible for a group to attack very specific types of systems (military,
government, etc.) for the purpose of incapacitation pure and simple.  One of
the difficulties in this type of attack is the ability to get a suitable virus
on suitable targets.  In most cases, it would take someone getting inside (or
being able to get an unwitting person to get the code inside). This aspect
increases the possibility of detection.

Another possibility is general harassment: just letting more generalized
computer viruses go their way.  This mode is hard to detect, as evidenced by the
difficulty of tracking down virus writers in general. This mode is good for
discouraging people from sharing files but, as seen from the past two years of
non-political viruses, not all that disruptive. (And even profitable for some
software companies. ;-] )

Regarding the possibility of viruses as a "small nation's weapon", it does
exist.  However, similar possibilities exist for other weapons, especially
biological.  By various reports, a simple biological arsenal (nothing exotic)
could be started for much less than the cost of most aircraft or armored
vehicles. Or chemicals, in a crude way, could be used. (The old "LSD in the
city's water supply" scenario.)  Fortunately, for one reason or another
(perhaps human inertia), these things have not yet happened.

By the way, making a computer virus can be done by one person.  Thus, the
usual indicators of offensive development (e.g., satellite photos of
installations, movement of materials and personnel, etc.) may be nonexistent.
Lest one dismiss the possibility of some Third World group being "smart
enough" to produce a computer virus, many terrorist or partisan groups have
better-educated people than popular stereotypes claim. Also, since a computer
virus can be easily transferred on disk, tape, etc., the virus can be produced
practically anywhere. And it does not require a member of the group, or even
someone sympathetic to it.  False flagging is always possible by convincing a
technically competent person that the code is being made for another cause,
one favored by the person.

With computer technology, an important change occurs in terrorism/partisan
warfare: an individual might be able to inflict continuous harm beyond the
capabilities of groups using traditional methods. The reason for this is that
with groups, the risks of infiltration, defection, disclosure, etc. increase
with the number of members.  The advantage of groups is their people resources.
But if a terrorist is skilled with certain technologies and chooses to use
them, countermeasures may be much more difficult.

In my thoughts about such possibilities for computer viruses, I prefer to move
away from the usual category of "terrorism" and use the broader category of
"partisan conflict". Partisan conflicts can include conflicts not usually
considered warfare or terrorism but which could, nonetheless, involve computer
viruses. Such conflicts could be those involving animal rights, tax protest,
abortion, etc. (In short, anything that draws intense conviction and
sentiment.)

But in many of these partisan conflicts, computers may be used in generally
legitimate applications such as communications services (e.g., BBSs), desktop
publishing, etc.  In some cases, computer "warfare" could occur in the form of
using computers to monitor opponents or targets, to increase the effect of
"black propaganda" and forged materials (e.g., fake pamphlets in the name of the
target group), or possibly to plant misinformation in institutional systems.

Viruses as terrorist tools might be more attractive to Third World
entities on the basis that "blowback" is of less danger than for
industrialized entities. That is, entities that don't have computers don't
have to worry about the virus affecting their own systems. Their main problem
is access to the technology to make the virus in the first place.

One of the chief disadvantages with viruses for such warfare is the danger of
side effects.  For example, indiscriminate disruption of a country's military
C3I could lead to firing of weapons in panic with unpredictable results.
Also, the disrupted C3I may be part of the same system that is needed to give
orders for surrender and negotiations.  Considering these things, perhaps a
nihilistic partisan group may be more likely to unleash computer viruses.
(Just as a group that has no goals and just wants to inflict harm may be more
partial to biological agents.)

J. D. Abolins   301 N. Harrison Str. #197   Princeton, NJ 08540  USA
jabolins@well.sf.ca.us                                  609-633-0740


Re: "Computer Models Leave U.S. Leaders Sure of Victory"

Griffen <griffenj@ncube.com>
Mon, 24 Dec 90 11:52:50 PST
In the June-July 1990 issue of Fire & Movement (a wargamer's magazine),
there is a _Forum_ section with several articles from various authors
regarding battlefield simulation as practiced by the US armed forces.

Probably the most applicable article of these is "The Right Tool Wrongly
Used", by Eric M. Walters.  In it, he mentions that the simulations and
wargames used by the armed forces are regularly modified "in order to
achieve training objectives," and not necessarily to promote realism.

He mentions several examples, such as the modification of a rule to allow
M60A1 tanks to engage enemy vehicles - in force - at ranges over 3 km.
Walters (Captain in US Marines) states that while hits are possible at
those distances, the number of hits in a real war against a "thinking,
moving enemy" would be statistically insignificant.  The tanker lobby won.

I'll close with the following quote from the article:

  ...Red Force electronic warfare is reduced or eliminated because its
  success causes a complete breakdown in exercise force command and
  control (with a corresponding "loss of staff training time"); and so on.
  Thus, game "reality" is molded to accommodate Blue Force plans and
  intentions - not vice versa.

- Jeff


re: "Computer models leave U.S. leaders sure of victory"

David Holland <achilles@pro-angmar.UUCP>
Mon, 24 Dec 90 01:57:35 EST
Someone referred to the wargames undertaken by the Japanese prior to the
battle of Midway; I quote, from _Miracle_At_Midway_, by Gordon W. Prange,
Penguin Books, 1983, ISBN 0 14 00.6814 7:

     Ugaki presided with a firm hand, and carried through this grandiose scheme
  on tabletop with a sunny lack of realism. As he sincerely believed that no
  situation could exist in which the Japanese would not be in complete control,
  he allowed nothing to happen which would seriously inconvenience the smooth
  development of the war games to their predestined conclusion. He did not
  scruple to override unfavorable rulings of other umpires.  (Ch. 4, pg. 31.)

Also:

      [Ugaki] cautioned Nagumo that the possibility of an enemy breakthrough
  must be taken into consideration. Yet Ugaki himself promptly nullified any
  good his warning might have done. For during the table maneuvers, the
  theoretical American forces broke through and bombed Nagumo's carriers while
  their aircraft were away from their mother ships attacking Midway - the very
  situation which had concerned Ugaki. Lieutenant Commander Masatake Okumiya,
  the umpire, ruled that the enemy had scored nine hits, sinking both _Akagi_
  and _Kaga_. But Ugaki would not suffer such *lese majeste*, and immediately
  overruled Okumiya, allowing only three hits, with _Kaga_ sunk and _Akagi_
  slightly damaged. And later, when conducting the second phase practice, he
  blandly resurrected _Kaga_ from her watery grave to participate in the New
  Caledonia and Fiji invasions.  (Ch. 4, pp. 35-6)

Now, considering that all that took place 48 years ago, without any computers,
where all the participants could see and understand the workings of the
simulation, how much worse will it be in the Pentagon - after all, even if all
the generals understand the RISKS of computer simulations, they *still* don't
know what algorithms are being used and can't tell if the computer has been
engaging in this sort of fudging.

All I can say is I hope they don't have to learn the hard way.

David A. Holland  pro-angmar!achilles@alphalpha.com  aeneas@blade.mind.org


re: "Computer models leave U.S. leaders sure of victory"

John C Slimick <slimick@unix.cis.pitt.edu>
26 Dec 90 20:54:09 GMT
The usual reference to the wargaming in the Imperial Navy during the planning
of the Midway operation was that on one roll of the dice, the value indicated
that the attack force would lose three aircraft carriers.  The attack force
team immediately appealed to the referee that such an event was impossible. The
referee agreed and apparently the next toss was more acceptable. Note: in
reality, the attack force lost four carriers.

This is usually cited as an example of the "Victory Disease" that swept over
Japan from 1940 through late 1942, where everyone was convinced that the war
was won and the Japanese forces were the best (and that's why they won) and so
on.  My own interpretation is that such games can produce the desired results,
and that little has changed since early 1942.
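
The effect of overriding unfavorable rolls can be illustrated with a small
simulation (the loss probabilities below are invented for illustration; they
are not the Imperial Navy's actual odds):

```python
import random

# Illustration: rerolling "unacceptable" results biases predicted losses down.
random.seed(42)

def carrier_losses():
    # Assume each of 4 carriers is sunk with probability 0.5 (made-up number).
    return sum(random.random() < 0.5 for _ in range(4))

def umpired_losses(max_acceptable=1):
    # Reroll until the result is "acceptable" to the attack-force team.
    while True:
        losses = carrier_losses()
        if losses <= max_acceptable:
            return losses

trials = 10_000
honest = sum(carrier_losses() for _ in range(trials)) / trials
umpired = sum(umpired_losses() for _ in range(trials)) / trials
print(f"mean losses, honest dice:  {honest:.2f}")
print(f"mean losses, with rerolls: {umpired:.2f}")
```

With these made-up numbers the honest mean is about 2.0 carriers and the
umpired mean is well under 1.0: the game reports whatever result the players
will accept.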

John Slimick, University of Pittsburgh at Bradford   slimick@unix.cis.pitt.edu


Re: "Computer Models Leave U.S. Leaders Sure of Victory"

David Wright <wright@stardent.com>
Sat, 29 Dec 90 14:26:28 EST
This discussion caused me to recall the professor in my simulation course,
circa 1980.  He had consulted for the Army and was extremely skeptical of some
of their simulation work.  To paraphrase his words, "They use a computer model
that runs for a week and then provides a single number as an answer.  How can
you possibly have any confidence in that?"

David Wright, Stardent Computer Inc                      uunet!stardent!wright


Trojan in MS-DOS 4.01?

John Chapman Flack <76066.1006@compuserve.com>
31 Dec 90 21:06:03 EST
After replacing one SCSI host adapter with another, I found that I was unable
to boot my system from my SCSI disk.  Knowing that changing host adapters
should have no effect on the accessibility of data on a SCSI device, I decided
to boot from my original MS-DOS 4.01 distribution diskette (OEMed by AST
Research).

I popped the diskette in the drive and hit reset.  The system beeped, accessed
the diskette, and, without warning or pause, formatted my hard disk.  (It
ordinarily presents a menu offering to install DOS on a disk or diskette, or to
exit to the command level).

After going through the required stages (disbelief, denial, anger, guilt,
restoration from backups), I experimented to learn what had happened.  When the
DOS installation program first gets control, it checks to see if there is a
hard disk.  If there is a disk, and it has no partition set up (as would be the
case with a new system), the familiar menu is presented.  If the user chooses
to install DOS on the hard disk, the program creates a partition table, and
then forces a reboot so the table will be loaded.

If, on bootstrap, the program sees a hard disk which is PARTITIONED but not
FORMATTED, it assumes it is continuing the process above.  So, without any
further interaction with the user, it formats the disk and copies the DOS
system files onto it.
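
The decision the install program makes on bootstrap, as reconstructed above,
can be sketched as follows (hypothetical pseudologic, not Microsoft's actual
code):

```python
# Hypothetical sketch of the MS-DOS 4.01 install program's bootstrap logic
# as reconstructed above.  The dangerous branch: "partitioned but not
# formatted" is taken to mean "resuming an interrupted install", so the
# format proceeds with no confirmation.

def install_boot(disk):
    if not disk["partitioned"]:
        # New system: present the familiar menu.
        return "show menu (install DOS / exit to command level)"
    if not disk["formatted"]:
        # Assumes we are continuing a previous install: no prompt at all.
        return "FORMAT disk and copy DOS system files"
    return "normal boot / menu"

# A disk whose apparent geometry changed with the new host adapter looks
# partitioned but unformatted:
print(install_boot({"partitioned": True, "formatted": False}))
```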

Changing the host adapter of course has no effect on the data maintained by the
drive and its controller.  However, SCSI devices are addressed by logical block
number, and the IBM BIOS disk functions use physical cylinder, head, and sector
numbers, so each host adapter needs to map the actual logical addresses into
fake physical addresses, and different adapters have different algorithms for
doing that.  The disk, whose fake physical layout appears to have changed (but
with data intact), evidently looks to the install program like a partitioned
but unformatted disk.
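
The geometry mismatch can be made concrete: two adapters mapping the same
logical block through different invented geometries produce different
"physical" addresses (the head and sector counts below are illustrative
only):

```python
# Illustrative only: two SCSI host adapters translating the same logical
# block address (LBA) through different fake CHS geometries.

def lba_to_chs(lba, heads, sectors_per_track):
    cylinder, rem = divmod(lba, heads * sectors_per_track)
    head, sector = divmod(rem, sectors_per_track)
    return cylinder, head, sector + 1  # BIOS sector numbers are 1-based

lba = 100_000
print("adapter A:", lba_to_chs(lba, heads=16, sectors_per_track=32))
print("adapter B:", lba_to_chs(lba, heads=64, sectors_per_track=32))
# Same block, different cylinder/head/sector: the layout the BIOS sees has
# "changed" even though no data on the drive moved.
```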

It's easy to see why the installation program was designed as it was.  On the
other hand, the documentation nowhere mentions that the program might format
the disk without consulting the user.  (In fact, the only reference in the
installation instructions to formatting the disk *at all* is an instruction to
see the Command Reference manual for information on how to format a disk, and
the line "once your hard disk is formatted and partitioned correctly, SELECT
completes the installation of MS-DOS....")

So the "feature" meets the definition of a Trojan horse, and in destructive
power ranks right up with the biggies (loss of all hard disk data).  And it
could have been easily avoided with the addition of "Continue the Install
process by formatting the disk (Y/N)?"


Re: Organizational Aspects of Safety (RISKS-10.71)

Nick Szabo <szabo@sequent.uucp>
28 Dec 90 23:31:53 GMT
Charles Martin proposes the rule, "it costs 1/100th as much to do it right as
it costs to do it over."

Often (most of the time?), nobody knows with 100% certainty what is "right".
Where knowledge is lacking, and cannot be inexpensively obtained, doing it over
— and over and over again, until it is right — may be far cheaper and faster
than trying to do it right the first time.

Nick Szabo          szabo@sequent.com


A RISKy video store kiosk

<r.aminzade@lynx.northeastern.edu>
Thu, 3 Jan 91 14:18:15 EST
Last weekend I went to the newly-opened Empire Video store here in Burlington,
VT.  Empire uses both standard classifications like comedy, sci-fi, and foreign,
and clever classifications like "feel-good", "tearjerkers", "Bogart", etc.  This
made it hard to find a film we were looking for.  Knowing that most of these
stores have a computerized database to help find titles (and show which are
currently out), I looked around.  Sure enough, I found a kiosk with paper
catalogs, books of movie reviews, and a CRT aimed out at the floor, clearly for
customers.  The terminal was off, but we found the ON switch.

"It's broken," said my friend.  Sure enough, the machine was displaying only a
'>' prompt.  Hmm.  I tried "DIR".  No luck.  "ls" didn't work either.  Tried
several other UNIX, VAX, and DOS commands.  EXIT seemed to work!  It put me
into a menuing system...but it didn't seem to list films, it looked like the
entire store-management database!

Before I could stop myself, I had looked up my own file information (nothing
overdue; can't remember if they had my VISA card number, but I think they did).
I wandered around a bit until I realized that I was doing something
not-very-ethical.  I turned the machine off before checking which of my
neighbors had checked out dirty movies.

Sure hope the next person to turn on the CRT and try some random commands
thinks through the ethical implications.  I'll talk to the store manager this
week.


Call for papers, VDM '91

Hans Toetenel <winfabi@dutrun.tudelft.nl>
3 Jan 91 14:52:37 GMT
                               Call for Papers
                                   VDM '91
                      Formal Software Development Methods
                        Noordwijkerhout, The Netherlands
                              October 21-25, 1991

This symposium is the fourth in a series addressing model-oriented
approaches to formal software specification and development. The first
three symposia concentrated on specification and design notation and
techniques, featuring approaches such as VDM and Z.

The fourth symposium, VDM '91, will concentrate on formal *development*.
It will be organised as two days of tutorials and three days of conference,
with two parallel tracks throughout: one dedicated to practice, and one
dedicated to theory. The symposium will also include tools demonstrations.

After many years of research into and application of model-oriented
methods like VDM, Z, RAISE and B, the time is now ripe to record facets
of development in more detail, as well as the role of formal development
methods in the larger context of problem domain modelling, software
engineering, tool development and management. One can identify a spectrum
of formality offered or required by various methods, as well as a set of
paradigms and principles, such as invent-and-verify, transformation, and
design-calculi.

On this basis, papers (to be fully refereed) are welcomed in the following
and related areas:
- stepwise development of architectural requirements
- stepwise development of software designs
- development by transformation
- data reification
- rigorous justification
- proof of correctness
- recording of validation and verification conditions
- links between formal development and pragmatic aspects of software
  engineering (such as requirements tracing, version control, configuration
  management, change request control, test case generation and validation)
- principles of support tools

Also, project reports recording industrial experience and ongoing tool
development and research are welcomed.

Important dates:                           Program Committee

Submission deadline:                       Patrick Behm (France)
   March 1, 1991                           Andrzej Blikle (Poland)
                                           Hans Langmaack (Germany)
Notification of acceptance:                Peter Lucas (U.S.A.)
   June 17, 1991                           Soeren Prehn (chairman) (Denmark)
                                           Hans Toetenel (The Netherlands)
Camera ready papers due:                   Jim Woodcock (U.K.)
   August 16, 1991

Please direct all mail and inquiries to: Hans Toetenel, Delft University
of Technology, Faculty of Technical Mathematics and Informatics, PO Box 356,
NL-2600 AJ Delft, The Netherlands; E-mail: toet@dutiab.tudelft.nl.
