The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 18 Issue 26

Friday 19 July 1996

Contents

o ``Primary Colors'' and computer evidence
Peter G. Neumann
o The increasing complexity of everyday life
Don Norman
PGN
o "Computer Buff Raids Marks & Spencer Security Secrets"
David Kennedy
o ICEE voice-mail breakin
Thomas Insel
o NSA response to key-length report
Matt Blaze and Whit Diffie
o Re: 56-Bit Encryption Is Vulnerable, Says Zimmermann
Dave Tweten
A. Padgett Peterson
o New ATMs considered harmful
Carl Resnikoff
o Safety-Critical Computer Systems, by Neil Storey
o Info on RISKS (comp.risks)

``Primary Colors'' and computer evidence

"Peter G. Neumann" <neumann@csl.sri.com>
Wed, 17 Jul 96 18:32:47 PDT
In RISKS-17.75, Peter Wayner noted the computer study done by Professor
Donald Foster at Vassar College that attributed the writing style of the
anonymously authored novel ``Primary Colors'' to Joe Klein, a Newsweek
columnist and CBS commentator.  Recently, Maureen Casey Owens, past
president of the American Academy of Forensic Sciences, studied the
handwritten notes on the amended typescript pages for the novel and
concluded that the handwriting was most certainly that of Joe Klein.
[Source: an article by David Streitfeld in *The Washington Post*, 17 July
1996]  (An NPR program I heard at lunchtime on 17 Jul noted that other
investigators had turned up the fact that Klein had recently paid cash for
half of the price of his new house, seemingly having struck it rich in a
short period of time.)  On the same day, Random House finally admitted that
Klein was indeed the author.

Peter Wayner earlier had suggested in RISKS-17.75 that if Joe Klein were
really trying to hide his identity, he would have disguised his writing
style more assiduously.  But we might suspect that Joe did not want to hide
his identity completely, because the suspense has undoubtedly increased
sales and paid for his house, and having been identified will now
dramatically increase his name-recognition index.  On the other hand, Klein
is now taking a lot of flak relating to his integrity as a journalist,
because until now he had lied in denying authorship.  Is it really true that
all publicity is good publicity, even if it is bad publicity?

  So, you may ask, where is the RISKS relevance in this case?

 * RISKS readers are by now accustomed to being suspicious of purported
computer evidence.  Here, the winnowing out of Joe Klein's identity by
Professor Foster is in retrospect very impressive.  A case in which such
evidence turns out to be actually correct is certainly worth mentioning,
particularly because RISKS is sometimes criticized for including so many
negative computer-related cases (a situation that occurs largely because we
so seldom see real successes).  Thus, there can be risks in *doubting* that
digital evidence is truthful.  But, above all, there are always risks in
*believing* that digital evidence is truthful.

 * There is the risk of believing that you can reliably hide your identity,
even in the presence of altered writing styles, veiled attacks on oneself
(the novel contains ``an unflattering portrait of a reporter who resembles
Klein,'' according to Streitfeld), various forms of steganography, spoofed
e-mail, and a trustworthy senior editor.  You must also beware of telephone
records, credit-card records, airplane reservation databases, library
records, nosy realtors, snoopy neighbors, coincidental encounters, etc.

 * The art of lying is very difficult.  If you are going to attempt it, you
must be prepared to be absolutely consistent forever, because otherwise
inferences can be drawn that can smoke you out.  However, absolute
consistency is in general impossible, especially if the cover story is in
any way inconsistent with perceivable reality.  Cover stories with plausible
deniability are best when they are also legitimate.  However, with
webservers and cross-linked databases, it is increasingly difficult to hide
the rest of the story.  Also, covert activities in the intelligence world
are such that you must lie in denying the mere existence of a covert
operation (which, *by definition* "does not exist"), and in the name of
national security you must perjure yourself whenever you are challenged.
So, beware of what can be gleaned from computer-related inferences --
especially when some of the information is perfectly above-board but perhaps
not completely correct.

In this case, the Klein battle to remain anonymous may seem to have turned
inside out, but the situation is now really a Klein bottle in which secrecy
and full disclosure are both on the same surface and ethics have become
blurred with nonethics.  Stay tuned for the Klein re-buttle.  I'm not taking
bet-tles.

The only consistent course is clearly to avoid putting yourself in such a
position in the first place.  Although that might seem to preclude April
Fools' spoofs, note that in our most famous cases (for example, the
Chernenko and Spafford e-mail spoofs discussed in the RISKS archives and in
my RISKS book), the prankster has unabashedly 'fessed up when confronted
(Piet Beertema and Chuq von Rospach, respectively), and generally received
admiration for his cleverness.  (The negative responses that "Chernenko"
took are also worth noting.)  Similarly, Robert Morris never denied his
involvement in the Internet Worm experiment that went seriously awry.  So,
perhaps it would have been OK for Klein to publish anonymously if he had
admitted his authorship when first challenged?  However, to expect that he
could remain anonymous forever is totally unrealistic in our
information-laden world, and that realization may color [!] future authors
seeking similar subterfuges.  On the other hand, to retain anonymity in
order to further increase sales is either (1) morally reprehensible, or (2)
just consistent with the emerging American Way -- anything is OK as long as
you can get away with it, or (3) both 1 and 2.

  [Peter Wayner found the following quote in Media Circus on-line,
  referring to Maureen Casey Owens' analysis of the Klein typescript:

    Says Joshua Sostrin, "The analyst later concluded that the Declaration
    of Independence, as has long been suspected, was indeed penned by Bob
    Dole."]


The increasing complexity of everyday life

Don Norman <dnorman@apple.com>
Mon, 15 Jul 1996 10:09:42 -0800
  Musings on the ever-increasing complexity of everyday life, triggered by
  the ever-increasing size of the "end-of-the-digest" announcement of the
  RISKS Digest:

I am alternately amused and terrified by the ever-increasing complexity of
everyday life.  Technology provides more and more functions essential to our
life. More and more artifacts pervade our lives and make themselves
essential to our lifestyles. Many of the new technologies involve
communication networks that interconnect large numbers of systems. These
lead to an increase in the complexity of societal interactions and the sheer
number of contacts among people. As a result, the number of potential weak
points increases, and with it the dangers. The ever-increasing number of
interactions among people, institutions, and governments gives ever-more
opportunities for disaster, and the natural tendency of governments and
institutions is to tackle each known problem by instituting rules,
regulations, and laws to control the abuses. But these well-intentioned (and
sometimes not so well intentioned) efforts simply proliferate to add yet
more complexity to our lives.

Those in computer science know how difficult it is to disentangle the
interactions of a rule-based system.  What happens when the rules are those
of an institution or government, designed by multiple people over decades --
centuries? We have rules that interact in unplanned ways, rules that are
inconsistent. Rules that are vague and ambiguous. Rules that were relatively
clear and precise in the era they were developed, but become outmoded or
imprecise with the passage of time and the invention of new technologies.
We all know what the result is: unstable, unpredictable systems.

Even my own home grows rapidly in complexity. 8+ remote controls to operate
my home theatre/TV. An ever-increasing set of manuals for the
ever-increasing number of home appliances. A dozen or so electric clocks to
be reset when power fails and during the biannual summertime/regular time
switch over. Items to be lubricated, adjusted, dusted, tested. Batteries to
be checked and replaced on a regular basis. Security precautions in the home
and at work: identification badges, more secure driver's licenses, more
secure $100 bills, and the ever-growing proliferation of passwords -- each to
be a non-word, each to be changed at frequent intervals.

Even the RISKS Digest itself is not immune. Look at the end-of-the-digest
announcement.  Once this was a few lines, at the beginning of the digest.
Then, people like me had problems with the FTP instructions, so the
instructions were made more complete and precise -- but thereby longer.
Then people like me kept asking for permission to reprint, so the
announcement was modified to spell out the policy (which, of course,
required inventing a policy).  Soon, the number of items got so large that
the whole thing was put at the rear of the digest, where it now takes up
over a page of printed text. How long before it is two pages?  How long
before it rivals the size of the digest itself?  [SEE NEXT ITEM.  PGN]

What's the RISK?  Well, as life gets more and more complex, the number of
interactions increases (as N-squared? faster?).  The number of unexpected
interactions also increases, usually with unknown impact.  We have already
seen how the interconnectedness of the phone systems and the electric
utility systems means that single-point failures can sometimes bring down a
large region of the country.  I expect these instances to increase, both in
number and in magnitude, as the number of interconnections increases.
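Norman's "N-squared? faster?" question has a concrete answer: pairwise links
among n interconnected systems grow quadratically, while the number of
possible multi-party interactions grows exponentially, i.e., strictly faster
than N-squared. A minimal sketch of the arithmetic (the system counts are
illustrative, not figures from the digest):

```python
from math import comb

# Pairwise connections among n systems: n*(n-1)/2 -- quadratic growth.
# Subsets of two or more systems that could interact: 2**n - n - 1 --
# exponential growth, which overtakes any polynomial for large n.
for n in (10, 100, 1000):
    pairs = comb(n, 2)
    print(f"{n:>5} systems -> {pairs:>7} pairwise interactions")
```

So even if only pairwise interactions mattered, a tenfold growth in
interconnected systems means roughly a hundredfold growth in potential
interactions to get wrong.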

The real question is: are there alternatives or are we doomed to
ever-increasing complexity?

Don Norman.  VP, Apple Research Labs  dnorman@apple.com


Re: The increasing complexity of everyday life

RISKS List Owner <risko@csl.sri.com>
Fri, 19 Jul 96 9:33:45 PDT
Don, I am really glad you raised this issue.  It is worthy of considerable
discussion in RISKS.

Not incidentally, a problem in the past is that RISKS gets distributed to
net-lame places that cannot use the web or FTP or much else, or in some
cases cannot even reply by e-mail.  The full risks.info message kept growing
in part to stave off all of the victims of noncompliant Internet service
providers and sites without webservers.  I too have been annoyed at the
increasing volume of the risks info message.  So, thanks to your urging,
from now on new subscribers will continue to get the full info page
initially when their subscription is acknowledged, and the regular
end-of-issue item will be BRIEF.  The full info message can be obtained by
e-mail to risks-request@CSL.sri.com with the one-line message INFO, by FTP, and as a
web page, and the brief message (see the end of this issue) states that "All
contributors are assumed to have read the full info file."  I hope that
suffices.  MANY THANKS for the prompt.  PGN


"Computer Buff Raids Marks & Spencer Security Secrets"

David Kennedy <76702.3557@CompuServe.COM>
10 Jul 96 19:08:36 EDT
PA News  7/10/96 12:39 PM

<>  A computer buff who downloaded the Marks & Spencer's security
<>file containing pin numbers when he visited a London store to
<>carry out maintenance, was ordered to complete 70 hours
<>community service work today.
<>   Former Olivetti computer engineer Edward Yearley, 29, of
<>Vicarage Lane, Bovingdon, Hemel Hempstead, Herts, was convicted
<>in June of gaining unauthorised access to computer material in
<>October 12 1994, under the Computer Misuse Act 1990.

o   Yearley posted the file to the "Gates of the Underworld" BBS, where it
was noticed in November.  Police duly executed a search warrant on his
home, seizing his PC and disks.  Yearley denied belonging to the BBS, but his
computer showed where he had uploaded the file under the pseudonym "Mr. Ed."

<>   After reading pre-sentence reports magistrate Paul Clark
<>said: "There is an element of breach of trust here. It is fair
<>to say a certain amount of knowledge and expertise is needed to
<>commit an offence such as this."

[DMK: I don't know whether to sigh or barf.]

<>   He added: "I'm satisfied there was no question of personal
<>profit for you, no gain and no loss to Marks & Spencer.  But
<>the scope for other people to misuse the information that was
<>downloaded is hard to judge."

o   Yearley was ordered to perform 70 hours of community service and pay
UKP170 in court costs.  His computer will be returned to him.

<>   "Although your own personal computer was seized and evidence
<>gathered from it, your own computer was not used for committing
<>the offence or facilitating it."

[DMK: Stealing the data in the first place and violating the Parker
principle of Possession or posting it to the BBS and violating the principle
of Confidentiality?  Maybe I'll barf after all.]

Dave Kennedy [CISSP] InfoSec Recon Team Chief, National Computer Security Assn


ICEE voice-mail breakin

Thomas Insel <tinsel@jaka.ece.uiuc.edu>
Thu, 11 Jul 1996 02:52:24 -0500
An article in the 10 Jul 1996 *San Francisco Chronicle* (p. A13 of the East
Bay edition) describes a group of high-school students who broke into a
drink manufacturer's voice-mail system, erased information, changed
passwords, created new accounts for their own use, and eventually crashed
the system through overuse.  The article reports that the company had to
spend $40,000 to bring in an outside technician and upgrade their software.

A few questions remain.  Should superuser functions like the creation of new
users be allowed via the phone at all?  How was the infiltration so
pervasive that they couldn't just shut the system down, erase the bogus
accounts, and change the privileged passwords?

I'm reminded of a similar, more limited, incident which occurred five or six
years ago when my high school set up a voice mail system to keep parents
informed of their children's homework assignments.  There was a system-wide
password to update the information, which was changed monthly for security
reasons -- in January, it was 1111, in February it was 2222, and in December
it was 1212.  After the system was broken into, its administrator decided
against changing the password.  Her reasoning: it was near the end of the
month, and the password would be changed soon, anyway.
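The "security" of that scheme can be captured in one line: if the pattern ran
as described (1111 in January, 2222 in February, 1212 in December), the
password was a pure function of the calendar. A sketch, with the intermediate
months inferred from the stated pattern rather than reported in the article:

```python
# Hypothetical reconstruction of the voice-mail password scheme: repeat
# the month number and truncate to four digits. Anyone who knows the
# month knows the password -- no guessing, let alone searching, required.
def voicemail_password(month):
    return (str(month) * 4)[:4]

print([voicemail_password(m) for m in (1, 2, 12)])
# ['1111', '2222', '1212']  -- matching the three values given above
```

Note, too, that under this rule January and November would even share a
password.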

Tom Insel


NSA response to key-length report

Matt Blaze <mab@research.att.com>
Thu, 18 Jul 1996 12:08:21 -0400
18 July 1996

There is currently being circulated, to members of Congress and possibly
elsewhere, a four-page document entitled ``Brute-Force Cryptanalytic
Attacks'' that calls into question some of the conclusions of the ``Minimum
Key Lengths for Symmetric Ciphers'' white paper [1].  The document bears no
author or organization attribution, but we are told that it originated from
NSA.

The NSA document argues that ``physical realities'' make parallel key search
much more expensive and time consuming than our white paper estimated.
However, the NSA document appears to have been written from the perspective
of general parallel processing or cryptanalysis rather than exhaustive key
search per se.  It ignores several elementary principles of parallel
processing that apply specifically to exhaustive key search machines of the
type that our white paper considered.

In particular, NSA argues that interconnections, heat dissipation,
input/output bandwidth, and interprocessor communication make it difficult
to ``scale up'' a key search machine by dividing the task among a large
number of small components.  While these factors do limit the scalability of
more general purpose multiprocessor computers (such as those made by Cray),
they do not apply at all to specialized exhaustive key search machines.  The
NSA argument ignores the most fundamental feature of brute-force key search:
the processors performing the search have no need to communicate with other
components of the system while they perform their share of the search, and
therefore the system has no need for any of the global interconnections that
limit scaling.  Indeed, there is no reason that all the components of a
parallel search machine must be located even within the same city, let alone
the same computer housing.  We note that one of our co-authors (Eric
Thompson, of Access Data, Inc.)  designs and builds medium-scale FPGA-based
key search machines with exactly this loosely-coupled structure, and
regularly uses them to recover keys for clients that include the FBI.
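The loose coupling Blaze and Diffie describe is easy to demonstrate: each
worker receives only its slice of the keyspace plus the target ciphertext,
and never talks to its neighbors while searching. A toy sketch, in which the
keyed-hash "cipher," the 20-bit keyspace, and the key 123456 are all
illustrative stand-ins for a real DES search machine:

```python
import hashlib
from multiprocessing import Pool

# Toy stand-in for a block cipher: a keyed hash of a known plaintext.
def encrypt(key, plaintext):
    return hashlib.sha256(key.to_bytes(8, "big") + plaintext).digest()

KEY_BITS = 20                          # tiny keyspace so the demo finishes
PLAINTEXT = b"known plaintext"
TARGET = encrypt(123_456, PLAINTEXT)   # ciphertext whose key we must find

def search_shard(shard):
    """Exhaust one slice of the keyspace; no talking to other shards."""
    start, end = shard
    for k in range(start, end):
        if encrypt(k, PLAINTEXT) == TARGET:
            return k
    return None

if __name__ == "__main__":
    n_workers = 4
    step = (1 << KEY_BITS) // n_workers
    shards = [(i * step, (i + 1) * step) for i in range(n_workers)]
    with Pool(n_workers) as pool:                 # the shards could as well
        results = pool.map(search_shard, shards)  # run in different cities
    print([k for k in results if k is not None])  # [123456]
```

Because the shards share nothing until a hit is found, doubling the workers
halves the wall-clock time; that is why interconnect, bandwidth, and heat
arguments against general-purpose multiprocessors do not carry over.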

The NSA document also calls into question our cost estimates for ASIC
components, suggesting that ASIC chips of this type cost NSA approximately
$1000.00 each.  However, our $10.00 per chip estimate is based on an actual
price quote from a commercial chip fabrication vendor for a moderate-size
order for an exhaustive search ASIC designed in 1993 by Michael Wiener [2].
Perhaps NSA could reduce its own costs by changing vendors.

Finally, the NSA report offers estimates of the time required to
perform exhaustive search using a Cray model T3D supercomputer.  This
is a curious choice, for as our report notes, general-purpose
supercomputers of this type make poor (and uneconomical) key search
engines.  However, even the artificially low performance results for
this machine should give little comfort to the users of 56 bit keys.
According to NSA, 56 bit keys can be searched on such a machine in
less than 453 days.  ``Moore's law'' predicts that it will not be long
before relatively inexpensive general-purpose computers offer similar
computational capability.
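It is worth spelling out the arithmetic behind that closing remark. NSA's own
453-day figure fixes the T3D's implied search rate, and a "Moore's law"
doubling every 18 months (our assumption, not a figure from either document)
says how soon a machine of the same cost could do the job in a month:

```python
import math

KEYSPACE = 2 ** 56
SECONDS = 453 * 24 * 3600                  # NSA's quoted search time
rate = KEYSPACE / SECONDS
print(f"implied rate: {rate:.2e} keys/s")  # about 1.8e9 keys/s

# Assuming cost/performance doubles every 18 months, a same-cost
# machine finishes the search in 30 days after roughly:
years = math.log2(453 / 30) * 1.5
print(f"{years:.1f} years")                # about 6 years
```

Under those assumptions, even NSA's deliberately unfavorable platform puts
56-bit keys within casual reach in well under a decade.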

/s/  Matt Blaze
     Whitfield Diffie

References:

[1] Blaze, M., Diffie, W., Rivest, R., Schneier, B., Shimomura, T.,
    Thompson, E., and Wiener, M.  ``Minimum Key Lengths for Symmetric
    Key Ciphers for Commercial Security.''  January 1996.  Available
    from ftp://ftp.research.att.com/dist/mab/keylength.txt

[2] Wiener, M.  ``Exhaustive DES Key Search.''  Presented at
    Crypto-93, Santa Barbara, CA.  August 1993.

=========================================================================
[Transcription of document circulated to various members of Congress
and others in June 1996, apparently by NSA]

BRUTE-FORCE CRYPTANALYTIC ATTACKS

Two published theoretical estimates of cost versus time to perform
brute-force hardware attacks on selected cryptography key lengths
differ between themselves and differ significantly from what we find
when we buy or build computers to carry out such attacks.

The differences lie in assumptions made in the theoretical estimates,
which are not fully spelled out by the authors, and in scaling up
hypothesized small machines to ever larger ones without accounting for
physical realities.

The factors not accounted for are:

  o R&D costs for the first machine, typically on the order of $10
    million.

  o As more and more chips are added to a machine, two effects occur:

      o Interconnections increase and increase running time;
      o Heat from the chips eventually limit [sic] the size of a
        machine.

  o Memory costs are not included.

  o When get [sic] to the very fast processing speed estimates,
    machines can become Input/Output bound; so [sic] it cannot achieve
    the estimated speed.

  o Assuming every algorithm can be tested in same amount of time and
    key length is the only difference.

Table 1 are [sic] the average time estimates made for a given cost
done by Michael Wiener of Bell Northern Research in 1995.  These are
published in Bruce Schneier's Applied Cryptography book.

Note that these are average times, one-half of the total exhaust time.

Table 2 are [sic] the estimates for total exhaust times using Field
Programmable Gate Arrays (FPGA) and Application Specific ICs (ASICs)
done for the Business Software Alliance by Blaze, Diffie, Rivest,
Schneier, Shimomura, Thompson, and Wiener in 1996.  In addition to the
above factors not accounted for they have assumed ASICs cost as low as
$10.  We find ASICs more typically cost $1000 and their capabilities
can vary considerably depending upon the specific task.

Table 3 are [sic] our estimates based on our experience with a Cray T3D
supercomputer with 1024 nodes.  This machine costs $30 million.

[Tables 1, 2, and 3 not transcribed here.]


Re: 56-Bit Encryption Is Vulnerable, Says Zimmermann (Edupage, 18.25)

Dave Tweten <tweten@gilmore.nas.nasa.gov>
Fri, 12 Jul 1996 13:09:53 -0700
The "Edupage Editors" make a critical mistake of logic in an item in
RISKS-18.25.  The item reports on Philip Zimmermann's testimony before the
Senate.  In his testimony, Zimmermann discussed a well-known Michael Wiener
paper on the feasibility of building a DES cracking machine.

The RISKS item correctly states that a $100 million version of the machine
could (according to Wiener) crack a DES key in about two minutes.  Wiener's
hypothetical machine is composed of a parallel array of custom designed DES
cracking chips.  It is by no means a general purpose computer.  Still, the
RISKS item says, "Zimmermann's testimony contradicted a recent statement by
U.S. Attorney General Janet Reno that even with a 'top of the line
supercomputer, decoding a 56-bit key would take over a year and the evidence
would be long gone.'"

There is no contradiction at all.  One is a "machine" that may not even
qualify as a programmable computer.  A "top of the line supercomputer" should
not be expected to be anywhere near as effective a DES cracking engine as
would be a machine such as Wiener's, built for the task out of custom chips.
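The point generalizes: a brute-force machine built from independent search
chips scales linearly, so cost and time trade off directly. A back-of-envelope
sketch starting from the $100 million / two-minute figure (the smaller budgets
are our extrapolation, not numbers from Wiener's paper):

```python
# Linear scaling of exhaustive-search hardware: k times the chips buys
# 1/k the search time, starting from the quoted $100M design point.
base_cost, base_minutes = 100e6, 2.0
for cost in (1e6, 10e6, 100e6):
    minutes = base_minutes * base_cost / cost
    print(f"${cost / 1e6:>5.0f}M -> {minutes:g} minutes per key")
```

This linearity is exactly what distinguishes the custom machine from a "top of
the line supercomputer," whose price does not buy proportionally more DES
operations.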


Re: 56-Bit Encryption Is Vulnerable, Says Zimmermann (Edupage, 18.25)

"A. Padgett Peterson" <PADGETT@hobbes.orl.mmc.com>
Sat, 13 Jul 1996 9:54:14 -0400 (EDT)
I love how politicians can find ways to say what they want people to believe
-- actually, both are right. It *would* take a supercomputer a year. However,
as long-time readers know, neither I nor Michael has been suggesting
supercomputers; instead, boolean sieves (which could use DSPs) made up of
cascadable arrays of single-bit processors are what I would use. They are
similar to parallel processors, but much simpler and hence faster.

The state of the art today is about 300 million keys per second for a single
sieve. Of course, for a few more dollars you can set up a parallel array of
sieves, as many as you wish (the initial values can be distributed, and once
cranking you need not be concerned about intercommunication; a single bit
would do to signal success).
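Padgett's figures make for easy arithmetic: taking his 300-million-keys-per-
second rate (and assuming nothing else), the full 56-bit exhaust time falls
linearly with the number of sieves in the array:

```python
KEYSPACE = 2 ** 56
RATE = 300e6                 # keys/s for one sieve, per the text above
for n_sieves in (1, 100, 10_000):
    days = KEYSPACE / (RATE * n_sieves) / 86400
    print(f"{n_sieves:>6} sieves: {days:,.1f} days to full exhaust")
```

One sieve needs nearly eight years; a hundred sieves need about a month --
which is why the "few more dollars" for a parallel array matter so much.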

Actually, the hard part is testing for success -- of course, if you have known
plaintext, as most cryptographers always assume... (I can think of several
ways to avoid needing that).

The other problem is that you must also know the exact algorithm being used --
DES of course is fixed (FIPS PUB 46(A|B|C)), but a DES machine would not work
for COAST -- you would need a different one. Not difficult, just different.
Of course, if you knew the target were using ENTRUST...

Now, I am just an amateur cryptographer, and I have not done any serious
digital design for years, but I still have a pretty good idea what sub-micron
lithography is capable of, so I know the numbers above are supportable. I do
disagree with the designated heroes of the MIT 7 on one point: they got the
economics wrong -- while their per-wafer price is possible, the total design
would cost a bit more, and there would be overhead involved (maybe
governments do not worry about that, but I do).

My current feeling is that 56-bit DES is OK today for a corporation so long
as every message is encrypted (including the trivial ones), each key is used
only once, and a good random key generator is used. I know that relying on
high volume to raise cost-to-break vs. value-of-break above the cost of
buying an employee is "Security by Obscurity," but it is a real cost/benefit
trade-off. More (64 bits) is better, and many of today's computers have
granularities of 32 or 64 bits (something I never see mentioned -- the design
steps beyond 64 are 96 and then 128).

Padgett

P.S. I was confining my thoughts to symmetric message keys - asymmetric keys
   and algorithms that may be used for key exchange are an entirely different
   subject.  (Have seen the two confused. Often.).


New ATMs considered harmful

Carl Resnikoff <carl@weblogic.com>
Sun, 14 Jul 1996 21:11:24 -0700
My local grocery store recently installed a kiosk with the newest generation
of automatic teller machine from Wells Fargo Bank. This ATM has a high
resolution color graphic display, with a touchscreen and virtual keypad for
entering the PIN.

I was standing in line about 6 feet behind the person currently using it
when I noticed that each time they entered a digit on the keypad, that digit
and only that digit lit up, so I could clearly make out each digit of their
PIN as they pressed it, as could anybody walking by.  Somehow I think this
negates the value of using a PIN, when anybody within a 10-foot radius can
read it without even meaning to.

Carl Resnikoff   WebLogic, Inc


Safety-Critical Computer Systems, by Neil Storey

N Storey <neil@eng.warwick.ac.uk>
Mon, 15 Jul 1996 14:48:46 +0100 (BST)
Addison-Wesley, ISBN 0-201-42787-7
http://www.eng.warwick.ac.uk/~neil/safebook.htm

This is an introductory text covering all aspects of the development of
Safety-Critical Computer Systems. It is intended for undergraduate and
postgraduate students, and for engineers who use microcomputers within
real-time embedded systems. It assumes no prior knowledge of safety, or of
any specific computer hardware or programming language.

The book covers all phases of the life of a safety-critical system from its
conception and specification, through to its certification, installation,
service and decommissioning.  It provides information on how to assess the
safety implications of projects, and determine the measures necessary to
develop systems to meet safety needs.  It gives a thorough grounding in the
techniques available to investigate the safety aspects of computer-based
systems and the methods that may be used to enhance their dependability.

The book uses case studies and worked examples from a wide range of
industrial sectors, including the nuclear, aircraft, automotive and consumer
products industries.  The approach taken is equally suited to engineers who
consider computers from a hardware, software or systems viewpoint.
