The RISKS Digest
Volume 8 Issue 11

Thursday, 19th January 1989

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Risks of no backup systems for critical applications
Yoram Eisenstadter
Computer malfunction downs traffic lights, one killed, one injured
Scott Campbell
Chaos Theory Predicts Unpredictability
PGN
China accused of software piracy
PGN
Friday the 13th Again
PGN
Computer error locks out politicians
D. Steele
Re: Losing Systems
Jerome H. Saltzer
Technical brilliance v. commercial acumen
Jerry Harper
National Credit Information Network
Sidney Marshall
Re: Ethics of the Internet
John Gilmore
RISKs of reading newspapers: Credit card fraud is not hacking.
Mike Van Pelt
Counting engines
Don Alvarez
Info on RISKS (comp.risks)

Risks of not having backup systems for critical applications

Yoram Eisenstadter <yoram@garfield.cs.columbia.edu>
Thu, 19 Jan 89 00:16:37 EST
The following article, which appeared in the "Metropolitan Diary"
section of today's New York Times, illustrates the risk of not having
backup systems for super-critical computerized applications.

         The other day, Gloria Ross was late for an appointment at a
    company on the Avenue of the Americas.  She holds herself
    blameless for being tardy and in defense she offers this
    explanation:
     The high-technology building where the company has its
    offices has a computerized directory.  To find the floor of the
    person you wish to visit, you push a button with the first letter
    of the last name.
     Aware of this procedure, Ms. Ross pressed the button marked
    "O" on one of the computer monitors mounted on a large black
    column.  Nothing happened.  A guard told her to try the next
    column.  Again, nothing.  The computer was down.  Her next stop:
    the information desk in the lobby.
     "I get my information the same way you do, lady," the man at
    the desk said, informing her that even he did not have a printed
    directory...

The article goes on to describe the chaos that ensued in the building, with
"dozens of people desperately cruising from floor to floor" looking for the
right offices.  Let's hope that the building's managers learned the obvious
lesson from this incident.


Computer malfunction downs traffic lights. One killed, one injured.

Peter Neumann <neumann@csl.sri.com>
Wed, 18 Jan 1989 22:48:49 PST
One child was killed and another injured [Mon 9 Jan 1989] when they were hit
by a truck after entering a crosswalk where the pedestrian signals were not
working.  The malfunction was caused by a computer error that affected
traffic signals at 22 school crossings.  The pedestrian signal cycles failed
to switch to the school schedule.  The cause reportedly may have been a
breakdown in the radio communications between a computer in Colorado Springs
and an atomic clock in Boulder.  [Colo Spgs Gazette Telegraph, 10 and 11 Jan
1989; contributed by Scott Campbell, PAR Gov't Sys Corp, Colo Spgs.]


Chaos Theory Predicts Unpredictability

Peter Neumann <neumann@csl.sri.com>
Wed, 18 Jan 1989 22:39:33 PST
A physicist who applied the new mathematics of `chaos theory' to the Star Wars
missile shield found that the equations pointed again and again to crisis and
war or — at best — a continued and precarious balance of terror.  ``The
question is not really Star Wars, but what do you do if all you can predict is
unpredictableness?'' Alvin M. Saperstein of Wayne State University asked [at
the AAAS meeting in San Francisco].  [From an article by Charles Petit, SF
Chronicle, 18 Jan 1989, p.  A18]


China accused of software piracy

Peter Neumann <neumann@csl.sri.com>
Wed, 18 Jan 1989 22:32:31 PST
Beijing (Washington Post, 18 Jan 1989) --
American companies are losing "many millions" of dollars in potential 
business in China because the companies' computer software has been widely
pirated...  China has no copyright law of its own...  


Friday the 13th Again

Peter Neumann <neumann@csl.sri.com>
Wed, 18 Jan 1989 22:28:34 PST
There were various reports of Friday-the-13th virus deletions in Britain,
attacking MS-DOS systems.  The so-called virus "has been frisky and
hundreds of people, including a large firm with over 400 computers, have
telephoned with their problems," according to Alan Solomon, director of S
and S Enterprises, a data recovery center in Chesham.  The virus reportedly
bore similarities to the Friday the 13th Israeli virus (13 May 1988, the
previous Friday the 13th).  [Source: SF Chronicle, 14 Jan 1989, p. B1]


Computer error locks out politicians

D. Steele <uivkey@NADC.ARPA>
Thu, 19 Jan 89 09:27:15 EST
    Just to show that computer systems play no favorites in politics,
local news reports are blaming a computer error for denying Pennsylvania
Republicans tickets and access to many of the Presidential inauguration balls
and festivities.  The politicians are complaining "it's like being all
dressed up with no place to go".

Submitted by Scott Berger, Naval Air Development Center, Warminster, PA


Re: Losing Systems

Jerome H. Saltzer <Saltzer@LCS.MIT.Edu>
Thu, 19 Jan 89 12:31:05 GMT
The question as to why there are so many losing systems may have a simpler,
more fundamental answer than has been suggested in the contributions over the
last couple of weeks.  So far, those contributions have (1) suggested
incompetence in management or technical ability, and (2) questioned some of the
currently fashionable magic bullets, such as structured programming.

I believe that the more fundamental answer is that the pace of improvement of
hardware technology in the computer business has, for 35 years now, simply been
running faster than our ability to develop the necessary experience to use it
effectively, safely, and without big mistakes.

The losing systems almost always contain some elements of newness; in fact on
close inspection they usually contain several such elements.  (If someone
claims there is nothing new in a project that involves software development,
then ask why they aren't just using previously existing software.  It is the
attraction of taking advantage of new possibilities, usually as the result of
hardware being either more functional or cheaper than it used to be, that leads
to new software systems.)  If these new elements were to arrive on the scene
one at a time, and spaced far enough apart that thorough experience could be
assimilated with each previous new element, then I submit that traditional
engineering practice, as applied to pyramids, cathedrals, bridges, consumer
electronics, and even airplanes, would lead to higher success probabilities.
Mistakes would still be made, but they would tend to occur on the far-out
projects that are expected to carry an element of risk, rather than the ones
that intuitively seem like they ought to be routine, such as automating the
county records.

Arguing that managers should become computer wizards, or offering structured
programming to fix the problem, just doesn't seem to me to get to the heart of
this more fundamental issue.

When the technology ground rules change at a rate that is ten times faster than
in other engineering disciplines, it would seem that unless one can figure out
how to accumulate and assimilate experience also at a ten-times-faster rate,
system failures are an expected result.  Perhaps a more interesting question is
how it is that some computer systems manage to be successful.  I observe two
related things that are often associated with successful systems:

     1.  Those systems that are successful are usually conservative, with
     somewhat simpler objectives than the state of technology would have
     permitted.

     2.  Systems that are successful often had the management advantage
     of a system dictator who had the absolute power to say NO to
     ideas that didn't seem to fit in.  A dictator is one of the
     few mechanisms that can keep an implementation conservative in
     the face of pressures to be state-of-the-art.

My conclusion from these observations is that since (1) it is hard to be
conservative in the face of tempting technology advances, and (2) appointing
dictators isn't a common management practice, successful systems aren't very
common either.  And having conservative goals and a dictator doesn't guarantee
that the system will be winning or that its future users will like it; it just
sets the stage for that possibility.
                             Jerry Saltzer


Technical brilliance v. commercial acumen

Jerry Harper <jharper@euroies.UUCP>
Thu, 19 Jan 89 15:34:31 GMT
Steven C. Beste made the point that managers are trying to come to grips with
computer technology more so now than ever before; I would generally agree with
this, subject to the caveat that the degree of managerial immersion in the
technology will never match that of the technical expert.  One of the last
companies I was a consultant to actually lost sales because the management
didn't understand either the product or the market, and knowing both was
especially important as the company was making the transition from conventional
DP through Cobol to providing a logic programming environment on a mainframe.
The permanent technical staff couldn't have sold their souls for ice pops, and
the management were having fearsome difficulty in making the paradigmatic shift
from Cobol-inspired projects to AI (bespoke expert-system applications).  Just
as you thought the management was grasping the core issues, Sisyphus would pop
up and roll progress back.  Even more lamentable was the salesforce, who knew
sweet f.a. about either methodology.  Because AI was "sexy" the salespeople
were inclined to promise the earth (one salesman reckoned he had a contract for
a complete CASE system for a major motor manufacturer in the UK even though
neither he nor the company had any experience in this area) and take umbrage
when it was explained that the company simply couldn't deliver.  The net result
was that the company became unsatisfactory for quite a number of the technical
people, who carried their skills elsewhere.  Nevertheless, observing the
company's progress from a distance, it seems to be doing quite well and the
management have climbed the learning curve.


National Credit Information Network

<marshall.wbst@Xerox.COM>
18 Jan 89 15:50 EST
I just received in the mail, as part of the BYTE magazine package of postcards
from manufacturers etc., a postcard selling a program capable of accessing the
National Credit Information Network (if I qualify).  Here is the text of the
postcard (the typography of the card was ragged, and this is as exact as I
could make it):


NATIONAL CREDIT INFORMATION NETWORK
ON-LINE ACCESS PACKAGE

AVOID SLOW PAY - NO PAY     HIRE QUALITY EMPLOYEES

SAVE $200.00         $498.00 *       SAVE $200.00


IF YOU QUALIFY FOR ACCESS...THIS INFORMATION IS IDEAL FOR:


FREE ON-LINE DEMO

"MONEY-BACK GUARANTEE
 IF YOU DO NOT QUALIFY


 After connection, slowly press the [ENTER] key  4 times.
When prompted for a Username: type DECK4  then press [ENTER]

Is this scary or what?

--Sidney Marshall
