The RISKS Digest
Volume 15 Issue 33

Tuesday, 7th December 1993

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Review: "Digital Woes" by Lauren Ruth Wiener
Pete Mellor
Review: "Terminal Compromise" by Winn Schwartau
Rob Slade
Info on RISKS (comp.risks)

Review: "Digital Woes" by Lauren Ruth Wiener

Pete Mellor <pm@csr.city.ac.uk>
Tue, 7 Dec 93 18:44:00 GMT
Title: Digital Woes
Subtitle: Why we should not depend on software
Author: Lauren Ruth Wiener
Publisher: Addison-Wesley Publishing Company
ISBN: 0-201-62609-8
Price: US $22.95
245 pages, hardback

This is a highly entertaining and easily readable book on a subject which is
dear to all our hearts. Without stretching the technical knowledge of the
average reader in the slightest, the author has managed to present a
comprehensive account of the problems that beset modern software-based
systems.

She sets the scene in "Attack of the Killer Software", by presenting 13
incidents in which disaster was caused, or nearly caused, by the misbehaviour
of software, including Mariner 1, Apollo 11, the Patriot's Scud-chasing
problems, the Bank of New York glitch, Therac-25, the ozone-layer fiasco, and
a few others.

If these sound familiar to readers of RISKS, this is no coincidence. The
majority of the raw material was taken from its archives, so providing a handy
counter-example of the sensible use of computing equipment.

Chapter 2 "Why Does Software Have Bugs?" describes the basic causes of the
"Software Crisis" (except that the author points out in her conclusions that
there is really no such thing: it is simply in the nature of software to
contain bugs). I had a strong feeling of "deja vu" reading this chapter,
mainly because it would make a good set of revision notes for the 1st-year
undergraduates on the Software Engineering course that I teach. The author
describes very well the discontinuity of behaviour that makes digital systems
so treacherous compared to more conventional analogue systems. She makes the
points that software is always part of a larger system, and so subject to the
vagaries of the system environment, that it suffers from complex execution
paths, timing problems (which make it truly non-deterministic), and unforeseen
interactions due to its high level of connectivity in distributed systems.
(These latter problems are well illustrated with reference to the AT&T
collapse of 1990.) The problems of novelty of design and the lack of
engineering discipline as applied to software (at least until recently) could
possibly have been given more attention, and it would have been good to see
the concept of programming as a subtractive exercise developed at greater
length.
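
To make the "discontinuity" point concrete, here is a minimal sketch of my
own (in Python, purely for illustration; nothing like it appears in the
book): a small error in an analogue quantity gives a small error in the
result, while the same small error at a software decision point can flip
the outcome entirely.

    # Sketch of the discontinuity argument: analogue responses degrade
    # gracefully, a branch in software does not.

    def analogue_gain(v):
        # Continuous response: a tiny error in v gives a tiny error out.
        return 2.5 * v

    def digital_mode(v):
        # Discrete decision: the same tiny error can flip the outcome.
        return "SHUTDOWN" if v >= 1.000 else "NORMAL"

    print(analogue_gain(0.999), analogue_gain(1.001))   # roughly 2.4975 vs 2.5025
    print(digital_mode(0.999), digital_mode(1.001))     # NORMAL vs SHUTDOWN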

The problem of resources is the subject of Chapter 3. This kicks off with the
classic 1979 US GAO study of 9 Federal Projects, and then goes on to
demonstrate that not much has changed, using a table of remarks quoted from a
1967 article in Fortune magazine set against similar statements from later
papers and articles from 1971 to 1991. Problems arising from the development
organisation (corporate hierarchies are not ideal organisational structures
for creating software), from the development process (getting the requirements
right, design, development, testing, "gutless estimation" of effort and
schedule and the resulting overruns), what passes for "maintenance" of
software, etc., are well covered, along with the value of prototyping, and an
account of the kind of "warranties" traditionally offered by software vendors
(i.e., disclaimers!).

Safety-critical systems are the subject of Chapter 4, with an account of the
applicable software engineering techniques, including the use of software
standards, formal specification and mathematical verification. The author is
not hopeful that any of these methods can "deliver the goods". Software
standards are more honoured in the breach than in the observance. There is the
eternal problem of whether to mandate aspects of the development process or
the resulting product. There is no evidence that the stipulated techniques
actually result in safer systems. (Indeed, some standards may be
counter-productive.) Formal specifications are difficult to write and usually
the domain experts cannot understand them. These problems have led (for
example) to Boeing specifying the 777 flight control system in English. Proof
of correctness is even more arduous and expensive, and may not detect
specification faults. The author cites the classic paper by De Millo, Lipton
and Perlis in support of these arguments, as well as the unhappy experience of
Ontario Hydro in attempting to prove one of their reactor shut-down systems
correct. Several formal methods specialists are quoted to the effect that
formalism alone is insufficient.
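
To give a flavour of what even the simplest "specification plus checked
conditions" looks like, here is a deliberately trivial sketch of my own (in
Python, not drawn from the book, and far removed from a real formal method
or a flight control system): the plain-English requirement "return the
integer square root" already reads rather differently once stated as pre-
and post-conditions.

    # Toy illustration of pre/post-conditions; real formal specifications
    # (Z, VDM, etc.) are considerably less approachable than this.

    def isqrt(n):
        # Precondition: n is a non-negative integer.
        assert isinstance(n, int) and n >= 0
        r = 0
        while (r + 1) * (r + 1) <= n:
            r += 1
        # Postcondition: r*r <= n < (r+1)*(r+1)
        assert r * r <= n < (r + 1) * (r + 1)
        return r

    print(isqrt(35))   # 5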

Certifying software engineers to work on safety-critical systems offers no
easy solution either, due largely to the lack of consensus on what core
curriculum should be studied in order to qualify. The main effect of requiring
certified personnel will be increased cost of software (and it will still have
bugs!).

The rest of the chapter introduces the basic concepts of risk analysis,
including HAZOP, FMEA, FTA, etc., and discusses the use of redundancy and
planned failure paths in fault-tolerant systems, so that they can "fail-safe"
or "fail-soft". The author points out that, paradoxically, it is often the
"safety" features of a system that lead to disaster.

The author next deals with something that probably does not get the attention
it deserves as a topic in its own right in the average Software Engineering
course. This is the temptation to "think big" where software systems are
concerned, partly due to their ease of connection (since they are all
digital). Chapter 5 discusses the issues of connectivity at length, the use of
communications networks of various kinds (particularly multi-media), "data as
a commodity" and the attendant risks to privacy (e.g., from the misuse of
distributed databases) and other social implications of digital technology.
Distribution is cheap - security is expensive. The fact that digital
information is volatile raises various questions of authenticity.  What is the
value of a digitised signature? Which version of an electronic book is an
author's genuine work? How can electronic voting affect the democratic
process? (Answer: badly!) Is a digitally recorded video admissible as evidence
in court, or could it have been skilfully forged? How do computer generated
simulations of an accident influence a jury?

The tendency to do everything-"by-wire" is well covered (fly-, sail-, drive-,
etc.) and the opportunity this provides for the introduction of massive
control systems, such as GPS navigation, the Advanced Automation System for
air traffic control, and the proposals for even more radical Advanced Vehicle
Control Systems running "platoons" of cars at 90mph with 10 inches separation
between the bumpers. Needless to say, the author is less than enthusiastic
about most of these proposals. In the meantime, Thailand is setting up a
central database to record all essential information about all of its 55
million inhabitants, and has received an award from the Smithsonian for its
efforts.

In the final chapter the author summarises her argument and proposes seven
questions that we should ask about any proposed system, starting with "Do we
need it?". We should counterbalance the enthusiasm for big technical projects
by asking: "Is it the right system?", "What is at risk when (not if) it goes
wrong?", "How big and complex will it have to be?", "How will it fit in with
what already exists?", "What will it require of its users?", "Will it require
extensive security?", "Is there a back-up?".

She also makes two "modest proposals" for projects which would be a sensible
use of resources: the "Educational Software Foundation" and "Congress
On-line". The first would inject funds into the exploitation of computing
power to educate children, taking advantage of the possibilities of
interactive graphical systems and virtual reality. The second would
decentralise Congress, so that the lawmakers could live among their
constituents, and conduct business by computer bulletin board.  Washington,
D.C., would be redundant: no more big expenses, no crowds of fat-cats lobbying
for big corporations! (The only slight objection I had here was that lobbying
by e-mail would be just as easy as law-making by e-mail.)

I do not know how much computers are already available in US schools, but
certainly many schools in the UK are making good use of them at all levels of
primary and secondary education, thanks largely to the imaginative initiative
by the BBC in conjunction with Acorn. There is also the Nuffield mathematics
project. Of course, the use of computers is uneven, as is funding for other
aspects of education.

Trying to place the second proposal in a UK context, I was unable to imagine
many British MPs being willing to give up their perks (not to mention the
opportunity for a bit of toe-sucking on the quiet! :-) for a life in front
of a computer terminal in the Orkneys. The Lords Spiritual and Temporal would
presumably have to find another way of justifying their daily attendance
allowance, and I doubt that logging in to read their e-mail would suffice.
Precisely how the Queen's speech would be handled, and what the electronic
equivalent of Black Rod would be, I leave to more imaginative readers! :-)

As will be seen from this synopsis, the author's tone is sceptical throughout.
She has little patience with misplaced enthusiasm for grandiose schemes in the
field of IT. One by one, they all take a hammering; in some cases, such as the
Strategic Defence Initiative, several hammerings. Most of the well-known
software disasters of recent years make their appearance at some point. The
author's message is that we have choices about what systems we implement, that
these choices will fundamentally affect the quality of our lives, and should
be made with our eyes open to the disadvantages as well as the potential
gains.

As a step in educating the lay public in the social issues of information
technology, the book is excellent, and should be required reading in any
foundation course on software engineering, or any course with a title such
as "Computers and Society".

The level of difficulty is just about right for the non-technical reader.  The
style is witty and the pace is maintained. At the same time, the book is
scholarly, and all technical points or disaster stories in the text are
supported by notes and references at the end of each chapter. The specialist
reader will find it useful as a source of references to more technical
descriptions.

There is a glossary of computing terms with some good explanations.
The definition of a "system" as a "set of systems ..." in order to emphasise
its recursive nature struck me as being particularly neat. The remark in
the definition of "natural language" that (because of the difficulty of
teaching natural language to a computer) "It is hard to avoid the conclusion
that children have special wiring." is a nice example of the humour of the
book.

As far as I could judge, the descriptions of disasters are accurate. The
author avoids urban myths (e.g., she gives a good account of the loss of
Mariner, on which misleading information is extremely persistent). There are
technical inaccuracies, however.

The author is wrong in several places about my favourite aeroplane. For
example (p.17), the Airbus which was downed as a result of the Vincennes'
crew's inability to read a timetable fast enough, and Aegis' inability to
report whether it was climbing or descending, was either an A300 or an A310,
but definitely not an A320. Software (p.25) does *not* take the place of the
hydraulics on the A320 and other aircraft of its generation. The control
surfaces are still (mostly) moved hydraulically, although the signals to the
hydraulic actuators are electrical, and are output by the computers in the
Electrical Flight Control System (EFCS). (This mistake is repeated on p.87.)
She also confuses the functions of the EFCS and the Flight Management and
Guidance System in the same paragraph.

The end-of-chapter note states that on the A320, there is one small control
surface on each wing under the pilot's direct control. This might have come
from the reference cited (Stix, Gary: "Along for the Ride?", Scientific
American, July 1991, pp. 94-106), but in any case it is wrong. The "manual
back-up" in the event of total EFCS failure consists of pitch control by
movement of the Trimmable Horizontal Stabilizer using the trim wheels, and
control of direction using the rudder, which can be moved directly by pedals.

In some places, the need to address a lay audience has led to
oversimplification of a technical point. For example, in the discussion of
fly-by-wire (p.88), it is stated that the "safe flight envelope" is defined as
"the limits of structural integrity of the aircraft". There is lot more to it
than that.

The description of the Shuttle computer architecture on p.30 is a bit
confusing, and is not helped by the use of "operational" where what is meant
is obviously "fail-operational". Also, I was left wondering if "fail-safe" and
"fail-operational" had been applied correctly.

The description of digital representation on p.44 is confused by the fact that
the binary pattern "0100011" is mistakenly described as the representation of
the number "67" instead of "35".

Analogue images (p.154), like digital images, possess only finite resolution.

Just to be really pedantic, I doubt that Augusta Ada Byron, Countess of
Lovelace, ever referred to herself as "Ada Lovelace" (p.16)! :-)

I found myself objecting to one or two of the definitions. I have always
understood a "patch" to be a modification to executable code, not a hasty
modification to source code. Also a "bug" is defined as "Behaviour on the part
of a computer program ..." and "failure" as "An inability of some part of a
system ...". The standard usage is that a "bug" is a "latent fault", i.e., a
*state* of the system, and a "failure" is an *event* in which a system does
not perform a required function (one of the causes of which may be the
"activation" of a bug).

A glossary of acronyms would be a useful addition. I could only guess from the
context that "UPS" is a parcel delivery service, and the only words I could
attach to "VPL" were "Visible Panty Line", though this might have been due to
the proximity of a passage describing the potential pornographic uses of
virtual reality! :-)

These minor criticisms do not detract from the overall value of the book.

To the author's two "modest proposals" in Chapter 6, I would like to add a
third of my own. Before the number of safety-critical software-based systems
exceeds the number of inhabitants of Thailand, could we not set up a central
database (either one international one, or several national ones) on which to
register these and record their dependability in service? To those who object
to the effort involved, I would say: how much effort did it take to register
every computer in the UK which holds confidential personal information, as
required by the Data Protection Act? Are we prepared to do as much to protect
our safety as to protect our privacy?

Peter Mellor, Centre for Software Reliability,
City University, Northampton Square, London EC1V 0HB
Tel: +44 (71) 477-8422, Fax.: +44 (71) 477-8585,
E-mail (JANET): p.mellor@csr.city.ac.uk


Review: "Terminal Compromise" by Winn Schwartau

"Rob Slade, Ed. DECrypt & ComNet, VARUG rep." <roberts@decus.arc.ab.ca>
4 Dec 93 23:58 -0600
BKTRMCMP.RVW   931002

Inter.Pact Press
11511 Pine St. N.
Seminole, FL   34642
813-393-6600
fax: 813-393-6361
"Terminal Compromise", Schwartau, 1991, 0-962087000-5, U$19.95/C$24.95
wschwartau@mcimail.com p00506@psi.com

"Terminal Compromise" was first published in 1991, and was enthusiastically
promoted by some among the security community as the first fictional work to
deal realistically with many aspects of data communications and security.
Although still available in that form, it has recently been "re-issued" in a
softcopy "shareware" version on the net.  (It is available for ftp at such
sites as ftp.uu.net, ftp.netsys.com, soda.berkeley.edu and
wuarchive.wustl.edu.  Use archie to look for TERMCOMP.)  Some new material has
been added, and some of the original sections updated.  Again, it has been
lauded in postings on security related newsgroups and distribution lists.

Some of you may be old enough to recall that the characters current in
"Outland" sprang from a previous Berke Breathed cartoon strip called "Bloom
County".  Opus, at one point, held the post of movie reviewer for the "Bloom
County Picayune".  I remember that one of his reviews started out, "This movie
is bad, really bad, abominably bad, bad, bad, bad!"  He considers this for a
moment, and then adds, "Well, maybe not *that* bad, but Lord! it wasn't good!"

A fairly large audience will probably enjoy it, if such trivialities as
language, characterization and plot can be ignored.  For once the "nerds"
don't get beat on; indeed, they are the heroes (maybe).  The use of computers
is much more realistic than in most such works, and many ideas that should
have greater currency are presented.  The book will also appeal to paranoiacs,
especially those who believe the US federal government is out to get them.

Consistency is the hobgoblin of little minds — but it does make for a
smoother "read".  "Terminal Compromise" would benefit from a run through a
style checker ... and a grammar checker ... and a spelling checker.
Constructions such as "which was to be the hypocenter of the blast if the
Enola Gay hadn't missed its target" and "National Bureau of Standards which
sets standards" are understandable, although awkward.  In other places it
appears words might be missing, and you have to read over sentences several
times to puzzle out the meaning.  (The softcopy/shareware version comes off a
little worse here, with fragments of formatting codes left in the text.)

On second thought, forget the spelling checker.  Most of the words are spelled
correctly: they are simply *used* incorrectly.  A reference to an "itinerant
professional" has nothing to do with travelling.  (Maybe he meant
"consummate": I couldn't think of a synonym starting with "i".)  The "heroine"
trade was probably intended to refer to white powder rather than white
slavery.  There are two automobile "wreak"s.  "Umbrage" is used twice.  An
obscure seventeenth century usage did once refer to shelter given by islands
to a harbour, but it's stretching the language a bit to make it refer to a
covering for the naughty bits.  Umbrage usually refers to offence, suspicion,
doubt or rage, as in "I take umbrage at what I suspect is a doubtful use of
the language".

Characterization?  There isn't any.  The major characters are all supposed to
be in their forties: they all, including the President of the United States,
speak like unimaginative teenage boys whose vocabulary contains no adjectives
other than obscenities.  This makes it difficult at times to follow the
dialogue, since there is nothing to distinguish one speaker from another.  (The one
exception is the president of a software firm who makes a successful, although
surprising, translation from "beard" to "suit", and is in the midst of the
most moving and forceful speech in the book, dealing with our relationship to
computers, when the author has him assassinated.)

The book is particularly hard on women.  There are no significant female
characters.  None.  In the initial introduction and background of the hero
there is no mention of a significant other.  It is something of a shock later
to discover he is married, then that he is divorced.  Almost all of the
females are simply bedroom furniture.  The portrayals remind one of the
descriptions in "Don Quixote" of women "so gay, striking and beautiful that
the sight of her impressed them all; so vividly that, if they had not already
seen [the others], they would have doubted whether she had her match for
beauty".

Which raises another point.  All of the hackers, except some of the Amsterdam
crew, are fit, athletic and extremely attractive to the female of the species.
Even among the I-Hack crowd, while there may be some certifiable lunatics,
nobody is unkempt or unclean.  These urbane sophisticates drink "Glen Fetitch"
and "Chevas" while lounging in "Louis Boston" suits on "elegant ... PVC
furniture".  Given that the hackers save the day (and ignoring, for the
moment, that they caused the trouble in the first place) there seems to be
more than a touch of wish fulfillment involved.

(Schwartau tries to reiterate the "hackers aren't evil" point at every
opportunity.  However, he throws away opportunities to make any distinctions
between different types of activities.  Although the different terms of
phreaks, hackers and crackers are sprinkled throughout the story they are not
well defined as used by the online community.  At one point the statement is
made that "cracking is taking the machine to its limit".  There is no
indication of the divisions between phreaks, hackers and crackers within their
various specialties, nor the utter disdain that all three have for virus
writers.  Cliff Stoll's "Hanover (sic) Hacker", Markus Hess, is described as a
"well positioned and seemingly upstanding individual".  This doesn't jibe with
Stoll's own description of a "round faced, slightly overweight ... balding ...
chain smoking" individual who was "never a central figure" with the Chaos
Computer Club, and who, with a drug addict and a fast buck artist for partners
"knew that he'd screwed up and was squirming to escape".)

What little character is built during the story is unsteady.  The author seems
unable to decide whether the chief computer genius is one of the good guys or
the bad.  At times he is mercenary and self-centred; at others he is poetic,
eloquent and visionary; in yet other scenes he is mentally unbalanced.  (He
also appropriates the persona and handle of another hacker.  We are never told
why, nor are we ever informed of what happened to the original.)  Following
the characters isn't made any easier by the inconsistency of naming: in the
space of five paragraphs we find that our hero, Scott Byron Mason (maybe) is
the son of Marie Elizabeth Mason and Louis Horace Mason.  Or possibly Evelyn
Mason and Horace Stipton Mason.  The main academic studying viral programs is
Dr. Les (or Arnold) Brown (or Sternman) who is a professor at Sheffield (or
MIT).  (Interestingly, there is an obvious attempt to correct this in the
later "softcopy" version of the book.  At times the "corrections" make the
problem worse.)

For a "thriller", there is very little tension in the story.  The unveiling of
the plot takes place on a regular step by step basis.  There is never any hint
that the hero is in the slightest personal danger: the worst that happens is
that one of his stories is quashed.  Indeed, at the end of the book the
computer attacks seem basically all to have succeeded, credit card companies
are bankrupt, banks are in a mess, airlines are restricted, phone systems are
unreliable and the bad guys are in charge.  Yet our heroes end up rich and
happy on an island in the sun.  The author seems to be constantly sounding the
alarm over the possibility of this disaster, but is unwilling, himself, to
face the tremendous personal suffering that would be generated.

Leaving literary values aside, let us examine the technical contents.  The
data security literate will find here a lot of accurate information.  Much of
the material is based on undisputed fact; much of the rest brings to light
some important controversies.  We are presented with a thinly disguised
"Windows", a thinly disguised Fred Cohen (maybe two?), a severely twisted
Electronic Frontier Foundation, and a heavily mutated John Markoff.  However,
we are also presented with a great deal of speculation, fabrication and
technical improbabilities.  For the technically adept this would be
automatically disregarded.  For the masses, however (and this book seems to
see itself in an educational light), dividing the wheat from the chaff would
be difficult if not impossible.

As with names, the author appears to have problems with the consistency of
numbers.  In the same paragraph, the softcopy version has the same number
quoted as "over 5000", "almost 5000" and "three thousand".  (It appears to
have been "corrected" or updated from the original version without reading the
context).  A calculation of the number of hackers seems to be based upon
numbers pulled out of the air, and a computer population an order of magnitude
larger than really exists.  The "network", seemingly referring to the
Internet, has a population two orders of magnitude too large.  Four million
legal copies, with an equal number of pirate copies, of a virus infected
program apparently result in only "between 1 and 5 million" infections.  (I
*knew* a lot of people had bought Windows but never used it!)  Not the most
prolific virus we've ever seen.

Schwartau seems uncertain as to whether he wants to advertise real software or
hide it.  At various times the characters, incessantly typing to each other
across the (long distance) phone lines use "xtalk" (the actual filename for
Crosstalk), "ProCom" (ProComm, perhaps?), "ComPro" and "Protalk".  They also
make "4800 BAUD" connections (technically unlikely over voice grade lines, and
even if he meant "bits per second" 4800 is rather an odd speed) and
communicate with "7 bits, no parity, no stop bits" parameter settings.  (The
more common parameter settings are either 8 bits, no parity or 7 bits, even
parity.  You *must* have stop bits, usually one.  And to forestall the obvious
criticism, there is no indication in the book that a "non-standard" setting is
being used for security reasons.)
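
To see why stop bits are not optional on an asynchronous line, consider the
framing arithmetic (a sketch of my own, in Python, not from the book): each
character carries a start bit, the data bits, any parity bit and at least
one stop bit, and that overhead is part of the throughput calculation.

    def chars_per_second(bps, data_bits=7, parity_bits=1, start_bits=1, stop_bits=1):
        bits_per_char = start_bits + data_bits + parity_bits + stop_bits
        return bps / bits_per_char

    print(chars_per_second(2400))                                # 240.0 with 7E1 framing
    print(chars_per_second(9600, data_bits=8, parity_bits=0))    # 960.0 with 8N1 framing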

We are, at places in the text, given detailed descriptions of the operations
of some of the purported viral programs.  One hides in "Video RAM".  Rather a
stupid place to hide since any extensive video activity will overwrite it.
(As I recall, the Proto-T hoax, which was supposed to use this same mechanism,
started in 1991.  Hmmm.)  Another would erase the disk the first time the
computer was turned on, which leads one to wonder how it was supposed to
reproduce.  (This same program was supposed to be able to burn out the printer
port circuitry.  Although certain very specific pieces of hardware may fail
under certain software instructions, no printer port has ever been numbered
among them.)  One "hidden file" is supposed to hide itself by looking like a
"bad cluster" to the system.  "Hidden" is an attribute in MS-DOS, and
assignable to any file.  A "bad cluster" would not be assigned a file name and
therefore would never, by itself, be executed by any computer system.  We also
have a report of MS-DOS viri wiping out a whole town full of Apple computers.

Schwartau is not averse to making up his own virus terminology, if necessary.
("Stealth" is also reported as a specific virus.)  At one point the book
acknowledges that viral programs are almost invariably detected within weeks
of release, yet the plot relies upon thousands of viri remaining undetected
for years.  At another point the use of "radio broadcasts" of viral programs
to enemy systems is advocated, ignoring the fact that the simplest error
checking for cleaning "noise" from digital radio transmissions would eliminate
such activity.
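
To make that point concrete, here is a small sketch of my own (in Python,
using an ordinary CRC, not anything from the book): a receiver that checks
even a simple frame checksum discards corrupted data rather than acting on
it, so bits injected as "noise" never reach execution.

    import zlib

    def send(payload):
        return payload + zlib.crc32(payload).to_bytes(4, "big")

    def receive(frame):
        payload, crc = frame[:-4], int.from_bytes(frame[-4:], "big")
        if zlib.crc32(payload) != crc:
            return None                    # noise detected: frame rejected
        return payload

    frame = send(b"legitimate data")
    print(receive(frame))                  # b'legitimate data'
    corrupted = bytes([frame[0] ^ 0x01]) + frame[1:]   # one flipped bit
    print(receive(corrupted))              # None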

A number of respected security experts have expressed approval of "Terminal
Compromise".  This approbation is likely given on the basis that this book is
so much better than other fictional works whose authors have obviously had no
technical background.  As such the enthusiasm is merited: "Terminal
Compromise" raises many important points and issues which are currently lost
on the general public.

Unfortunately, the problems of the book, as a book, and the technical excesses
will likely restrict its circulation and impact.  As a fictional work the lack
of literary values is going to restrict both its appeal and longevity.  As an
exhortative or tutorial work, the inability to distinguish between fact and
fiction will reduce its value and effectiveness in promoting the cause of data
security.

copyright Robert M. Slade, 1993   BKTRMCMP.RVW   931002

              ==============604-984-4067==============
Vancouver Institute for Research into User Security  Canada V7K 2G6
ROBERTS@decus.ca Robert_Slade@sfu.ca rslade@cue.bc.ca p1@CyberStore.ca
