The RISKS Digest
Volume 14 Issue 61

Saturday, 15th May 1993

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Opel Corsa Stops for Mobile Phones
Øystein Gulbrandsen
Re: Analog vs Digital Speedometers
Roger Crew
Re: New NIST/NSA Revelations
Dorothy Denning
Clipper - A Trojan Horse
Bill Nicholls
Testimony to Boucher's House Science Subcommittee
Whitfield Diffie
Info on RISKS (comp.risks)

Opel Corsa Stops for Mobile Phones

Øystein Gulbrandsen <oysteing@taskon.no>
Fri, 14 May 93 09:09:52 +0200
  From Motor, the Norwegian AAA monthly:

  While testing the new Opel Corsa, we brought along a mobile phone.  One
  conversation was enough to kill the motor completely.  The phone was of the
  new Nordic 450 system, and the antenna was in the front seat.  At first we
  took it for a coincidence, but it proved to be repeatable: when calling out
  or receiving a call, the car abruptly halted.  The signals from the phone
  apparently confused the electronics in the car.

  When we later moved the antenna to the back seat, we only felt a slight
  "tug" from the engine, and with the antenna on the roof, there was no ill
  effect.  The importer has passed the message to Opel, Germany, and we expect
  it to be fixed ASAP, he says.

The translation is by me, but I hope the meaning gets through.  [TNX. PGN]

Øystein Gulbrandsen   Taskon A/S


Re: Analog vs Digital Speedometers

Roger Crew <crew@CS.Stanford.EDU>
Wed, 12 May 93 00:14:22 PDT
Digital speedometers have other interesting properties.

I am reminded of one night when I was out driving on one of the more
desolate portions of [...pick your favorite interstate...], noticed
another car going absurdly fast and decided to follow it for a bit to
see just how fast (...well okay, I had just recently bought my car and
was about to yield to the temptation to "push the envelope" a bit...).

The speed turned out to be 85 mph.  Oddly enough, the speed remained quite
constant.  For the first few minutes, I attributed this to the other car
having a very good cruise control.  I thought it was rather stupid to be
setting a cruise control that high, but then, who was I to talk?

A bit more thought and observation revealed that my speed was quite
independent of what I was doing with my accelerator pedal or the general
terrain I was driving on.  I let up on the gas quite abruptly and this time
could actually feel the car slowing down --- and still I was going 85 mph.

At this point I noticed the tachometer descending through 4000 rpm and decided
I had definitely had enough fun for one night.  And then I remembered...

My car is a 1982 model.  In 1982 there was a federal (US) law in force
forbidding speedometers from showing speeds higher than 85 mph.  This is all
well and good if you have an analog speedometer and can SEE the needle PEGGED
against the upper end.

It had never occurred to me until that moment that someone would actually
build a digital speedometer that pegs.
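
As a toy illustration of the failure mode (my own sketch, not from the
original post): a digital readout that clamps at the legal display limit is
indistinguishable from a genuinely steady speed, whereas an analog needle
resting against its stop is visibly saturated.

    # Hypothetical sketch: why a "pegged" digital speedometer misleads.
    # The clamp value and the function below are assumptions for illustration.
    DISPLAY_LIMIT_MPH = 85   # 1982-era US cap on displayed speed

    def digital_readout(actual_speed_mph):
        """Speed shown by a display that pegs at the legal maximum."""
        return min(int(actual_speed_mph), DISPLAY_LIMIT_MPH)

    for actual in (70, 85, 95, 110):
        print(f"actual {actual:3d} mph -> display reads {digital_readout(actual)} mph")
    # 95 and 110 both read "85" -- exactly the rock-steady value that was
    # mistaken for a very good cruise control.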

(Very) shortly thereafter I had gotten down to a relatively sedate 50 mph with
2000 rpm on the tachometer.  The rest is left as an exercise for the reader...

Roger Crew crew@CS.Stanford.EDU


Re: New NIST/NSA Revelations

Dorothy Denning <denning@cs.cosc.georgetown.edu>
Wed, 12 May 93 10:20:42 EDT
David Sobel, CPSR Legal Counsel, wrote in RISKS DIGEST 14.59:

    The proposed DSS was widely criticized within the computer
    industry for its perceived weak security and inferiority to an
    existing authentication technology known as the RSA algorithm.
    Many observers have speculated that the RSA technique was
    disfavored by NSA because it was, in fact, more secure than the
    NSA-proposed algorithm and because the RSA technique could also
    be used to encrypt data very securely.

This is terribly misleading. NIST issued the DSS proposal along with a public
call for comments as part of their normal practice with proposed standards.
The community responded, and NIST promptly addressed the security concerns.
Among other things, the DSS now accommodates longer keys (up to 1024 bits).
As a result of the revisions, the DSS is now considered to be just as strong
as RSA.
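
For context, a toy sketch (mine, not part of Denning's message): the DSS is
built on the DSA signature algorithm, and the "key size" at issue is
essentially the bit length of the prime modulus p, which the revised standard
allows to reach 1024 bits.  The parameters below are deliberately tiny and
insecure; they only show where p, q, and the keys enter the computation.

    # Toy DSA sketch with deliberately tiny, insecure parameters; a real
    # DSS key uses a prime modulus p of up to 1024 bits.
    import secrets

    p, q, g = 23, 11, 4        # q divides p - 1; g has order q modulo p
    x = 7                      # private key, 1 <= x < q
    y = pow(g, x, p)           # public key

    def sign(h):
        """Sign a message hash h (already reduced mod q)."""
        while True:
            k = secrets.randbelow(q - 1) + 1          # fresh per-message secret
            r = pow(g, k, p) % q
            s = (pow(k, q - 2, q) * (h + x * r)) % q  # q prime, so k**(q-2) = k**-1
            if r != 0 and s != 0:
                return r, s

    def verify(h, r, s):
        w = pow(s, q - 2, q)
        u1, u2 = (h * w) % q, (r * w) % q
        return (pow(g, u1, p) * pow(y, u2, p) % p) % q == r

    h = 5                      # stand-in for SHA(message) mod q
    r, s = sign(h)
    assert verify(h, r, s)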

Dorothy Denning


Clipper - A Trojan Horse

Bill Nicholls <billn@bix.com>
12 May 1993 13:11:05 -0400 (EDT)
I've been following the discussion in RISKS about the Clipper chip with great
interest, and have learned a lot about what is planned as well as what
weaknesses may exist, and why.  But I have gradually become aware of an uneasy
feeling, as though I was missing something important about this chip.

After some thought about the situation, I have come to the
conclusion that Clipper is in fact a Trojan Horse, in the classic
sense.  This morning I was thinking about why anyone would use
the Clipper chip when existing software security could be much
better than what Clipper offers.  I can see why NSA and FBI are
unhappy about the reality of not being able to crack a really
secure system such as the RSA or DES with a 128 bit key, or a one
time pad on a floppy disk or CD-ROM.
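
To make the comparison concrete (a minimal sketch of my own, not from the
posting): the one-time pad mentioned above is nothing more than XORing the
message with an equal-length run of truly random bytes that is shared out of
band and never reused.

    # Minimal one-time-pad sketch; the pad would be pre-shared, e.g. on a
    # floppy disk or CD-ROM, used once, and then destroyed.
    import secrets

    def otp_xor(data, pad):
        assert len(pad) >= len(data), "pad must be at least as long as the message"
        return bytes(d ^ p for d, p in zip(data, pad))

    message = b"meet at midnight"
    pad = secrets.token_bytes(len(message))     # stands in for the shared pad
    ciphertext = otp_xor(message, pad)
    assert otp_xor(ciphertext, pad) == message  # decryption is the same XOR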

In fact it is quite clear that current techniques are more than
adequate to secure messages beyond even NSA's ability to break
economically.  The simple fact is that fast microprocessors can
generate complexity faster than even huge parallel processors can
break it.  In my opinion, the balance of privacy has gone over to
the individual who uses one of these techniques, and this is a
sea change - one that cannot be reversed.
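
Some back-of-the-envelope arithmetic (my own, with an assumed attacker
throughput) shows the scale of that asymmetry:

    # Exhausting a 128-bit keyspace at an assumed 10**12 trial keys per second.
    keyspace = 2 ** 128
    keys_per_second = 10 ** 12              # generous assumption for the attacker
    seconds_per_year = 365.25 * 24 * 3600

    years = keyspace / keys_per_second / seconds_per_year
    print(f"{years:.1e} years to try every key")   # roughly 1.1e19 years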

I think NSA, being expert in this domain, recognized this before
anyone else.  Clipper is clearly an attempt to turn back the
clock to an era when they could crack most any system if there
was sufficient incentive.  Like all such attempts, this too will
fail.  Why then would it be proposed?

It was proposed because hidden inside the Clipper chip proposal
are a number of Trojan Horses, some we are meant to find, and
some we are not.

Many people jumped on the key security issue, and I believe that
was intended to attract our attention and tie up our resources.
Several good points have been made about that item, which I will
not repeat here.  Another Trojan that we were meant to find is
the 'weak' crypto issue, the Clipper algorithm being unknown and
not available for public review.  It may well have a hidden
trapdoor that would not be found easily, yet if it were public,
it would be found eventually.  If it were found, the public
outcry could well cause a big shift in political power.

Beyond those Trojans there are others which I believe we were not
meant to find.  Consider how the proposal surfaced.  After
lengthy *secret* meetings between NIST and NSA, a Presidential
order appears putting Clipper into play.  What is notable about
this?

Clipper is the first proposal for encryption that *requires*
registration of the keys.  Why?  Because, as I said earlier, our
ability to encrypt has vastly outstripped the government's
ability to decrypt.  Registration puts this even, from the
government's point of view.  So where's the Trojan?  The Trojan
is the required registration of keys, mandated by presidential
order.  Once this precedent is established, the government could
say "Sure, use any method you like, but register the algorithm
and escrow the keys".  This new requirement to register methods
and keys now has the force of law without ever having been
through the proper process - that is the passing of such a law by
our elected representatives in the House and Senate, plus a
public signing by the President.

I think it is clear that no such law would pass, since it is an
infringement on our liberty (IMHO), one that a vague hand wave and
a repeat of the 'National Security' mantra does NOT suffice to
justify.  The first hidden Trojan is the attempted end run around
the legislative process.

There is a second Trojan in the Clipper proposal that I have not
seen discussed.  It is the establishment of a fixed (but unknown)
method, with a fixed (but knowable) key as the basis for a
'secure' communication.  On the surface, this looks acceptable if
the key security issues can be resolved.  But it really isn't
acceptable if you think about how the real world operates.
Entirely aside from the security of the keys is the nature of how
people handle security.

Given even the suspicion of compromised security, most people
will change their passwords instantly, just to be safe.  But what
do you do when the change requires justification to management
that funds for a new instrument, or more than one, be expended
for replacement on the basis of a suspicion?  Even if you *know*
that an instrument has been decoded, in many cases management
will simply accept the government's word that the keys were
destroyed rather than replace the instrument(s).

What I suspect would really happen is that compromised
instruments would generally remain in place because of the human
tendency to underestimate the potential threat.  It may well be
that compromising one instrument can make other instruments
compromised through some trapdoor we don't know about.  So a
single compromised instrument could compromise *unknowingly and
beyond the key safeguards* any other instrument that had a
conversation with it.  It seems possible that one legal wiretap
could cascade into a lot of illegal wiretaps.

Given the ease with which secure communications can be done
today, both voice and data, for no more than the cost of a
mid-range PC and some software, I can understand why the FBI and
NSA are concerned.  I can easily sympathize with their problem
because they are carrying heavy responsibilities for the nation.
But I believe the attempt to turn back the clock is the wrong
answer.

When the Enigma machine was compromised and better methods
invented, no one suggested that everyone be forced to use Enigma
machines so the government could still break them if needed.  But
this is exactly what the Clipper chip proposal is attempting to
do, and it has no more chance of succeeding than forcing Enigma
on everyone would.  Worse, it obscures the fact that a sea change
has occurred and ignores the need to deal with the new reality.
The new reality, for some years now, is that methods exist that
are secure from everyone from a *technical breaking* point of
view.

The clear conclusion I draw from this is that the FBI and NSA
must give new attention to other methods of access to the data
they require for security and law enforcement.  This may well
mean a return to the days of (heavens) spies, listening devices,
photographic/video recon and other methods.  While the cold war
was still on, this would probably be unacceptable, but now I
think we can accept the possible handicap to law enforcement and
national security rather than yield our own privacy.

In addition, whatever we do as a nation does not prevent other
nations from using unbreakable methods, so in any event, both NSA
and the FBI need to address the 'other methods' issue to remain
effective.

Clipper Chip:
    It isn't needed, we already have better secure methods.
    It isn't wanted, except by the government.
    It isn't effective, since anyone can use better methods.
    It isn't useful because it doesn't address the real problem.

Let's toss this Trojan horse, this bad idea, on the scrap heap of
obscurity.


Testimony to Boucher's House Science Subcommittee, 11 May 1993

<whitfield.diffie@eng.sun.com>
Thu, 13 May 1993 at 14h15
    The Impact of a Secret Cryptographic Standard
      on Encryption, Privacy, Law Enforcement
            and Technology

            Whitfield Diffie
            Sun Microsystems
              11 May 1993

    I'd like to begin by expressing my thanks to Congressman Boucher, the
other members of the committee, and the committee staff for giving us the
opportunity to appear before the committee and express our views.

    On Friday, the 16th of April, a sweeping new proposal for both the
promotion and control of cryptography was made public on the front page of the
New York Times and in press releases from the White House and other
organizations.

    This proposal was to adopt a new cryptographic system as a federal
standard, but at the same time to keep the system's functioning secret.  The
standard would call for the use of a tamper resistant chip, called Clipper,
and embody a `back door' that will allow the government to decrypt the traffic
for law enforcement and national security purposes.

    So far, available information about the chip is minimal and to some extent
contradictory, but the essence appears to be this: When a Clipper chip
prepares to encrypt a message, it generates a short preliminary signal rather
candidly entitled the Law Enforcement Exploitation Field.  Before another
Clipper chip will decrypt the message, this signal must be fed into it.  The
Law Enforcement Exploitation Field or LEEF is tied to the key in use and the
two must match for decryption to be successful.  The LEEF in turn, when
decrypted by a government held key that is unique to the chip, will reveal the
key used to encrypt the message.
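
    A rough sketch of that flow follows, based only on the limited public
description above; the field names, the escrow database, and the XOR stand-in
cipher are illustrative assumptions, not the actual Clipper/Skipjack design.

    # Rough escrow-flow sketch; toy_encrypt stands in for the secret
    # Skipjack algorithm, and all names and layouts are assumptions.
    from dataclasses import dataclass

    def toy_encrypt(data, key):               # XOR stand-in cipher (self-inverse)
        return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

    @dataclass
    class LEEF:                                # "Law Enforcement Exploitation Field"
        chip_id: bytes                         # says which escrowed unit key to fetch
        wrapped_session_key: bytes             # session key encrypted under the unit key

    def clipper_send(message, session_key, chip_id, unit_key):
        leef = LEEF(chip_id, toy_encrypt(session_key, unit_key))
        return leef, toy_encrypt(message, session_key)

    def law_enforcement_decrypt(leef, ciphertext, escrow_db):
        unit_key = escrow_db[leef.chip_id]                       # from the escrow agents
        session_key = toy_encrypt(leef.wrapped_session_key, unit_key)
        return toy_encrypt(ciphertext, session_key)

    escrow_db = {b"chip-42": b"unit key material"}
    leef, wire = clipper_send(b"hello", b"session key bits", b"chip-42",
                              escrow_db[b"chip-42"])
    assert law_enforcement_decrypt(leef, wire, escrow_db) == b"hello"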

    The effect is very much like that of the little keyhole in the back of the
combination locks used on the lockers of school children.  The children open
the locks with the combinations, which is supposed to keep the other children
out, but the teachers can always look in the lockers by using the key.

    In the month that has elapsed since the announcement, we have studied the
Clipper chip proposal as carefully as the available information permits.  We
conclude that such a proposal is at best premature and at worst will have a
damaging effect on both business security and civil rights without making any
improvement in law enforcement.

    To give you some idea of the importance of the issues this raises, I'd
like to suggest that you think about what are the most essential security
mechanisms in your daily life and work.  I believe you will realize that the
most important things any of you ever do by way of security have nothing to do
with guards, fences, badges, or safes.  Far and away the most important
element of your security is that you recognize your family, your friends, and
your colleagues.  Probably second to that is that you sign your signature,
which provides the people to whom you give letters, checks, or documents, with
a way of proving to third parties that you have said or promised something.
Finally you engage in private conversations, saying things to your loved ones,
your friends, or your staff that you do not wish to be overheard by anyone
else.

    These three mechanisms lean heavily on the physical: face to face contact
between people or the exchange of written messages.  At this moment in
history, however, we are transferring our medium of social interaction from
the physical to the electronic at a pace limited only by the development of
our technology.  Many of us spend half the day on the telephone talking to
people we may visit in person at most a few times a year and the other half
exchanging electronic mail with people we never meet in person.

    Communication security has traditionally been seen as an arcane security
technology of real concern only to the military and perhaps the banks and oil
companies.  Viewed in light of the observations above, however, it is revealed
as nothing less than the transplantation of fundamental social mechanisms from
the world of face to face meetings and pen and ink communication into a world
of electronic mail, video conferences, electronic funds transfers, electronic
data interchange, and, in the not too distant future, digital money and
electronic voting.

    No right of private conversation was enumerated in the constitution.  I
don't suppose it occurred to anyone at the time that it could be prevented.
Now, however, we are on the verge of a world in which electronic communication
is both so good and so inexpensive that intimate business and personal
relationships will flourish between parties who can at most occasionally
afford the luxury of traveling to visit each other.  If we do not accept the
right of these people to protect the privacy of their communication, we take a
long step in the direction of a world in which privacy will belong only to the
rich.

    The import of this is clear: The decisions we make about communication
security today will determine the kind of society we live in tomorrow.


    The objective of the administration's proposal can be simply
stated:

    They want to provide a high level of security to their
    friends, while being sure that the equipment cannot be
    used to prevent them from spying on their enemies.

Within a command society like the military, a mechanism of this sort that
allows soldiers' communications to be protected from the enemy, but not
necessarily from the Inspector General, is an entirely natural objective.  Its
imposition on a free society, however, is quite another matter.

    Let us begin by examining the monitoring requirement and ask both whether
it is essential to future law enforcement and what measures would be required
to make it work as planned.

    Eavesdropping, as its name reminds us, is not a new phenomenon.  But in
spite of the fact that police and spies have been doing it for a long time, it
has acquired a whole new dimension since the invention of the telegraph.
Prior to electronic communication, it was a hit or miss affair.  Postal
services as we know them today are a fairly new phenomenon and messages were
carried by a variety of couriers, travelers, and merchants.  Sensitive
messages in particular, did not necessarily go by standardized channels.  Paul
Revere, who is generally remembered for only one short ride, was the American
Revolution's courier, traveling routinely from Boston to Philadelphia with his
saddle bags full of political broadsides.


    Even when a letter was intercepted, opened, and read, there was no
guarantee, despite some people's great skill with flaps and seals, that the
victim would not notice the intrusion.

    The development of the telephone, telegraph, and radio have given the
spies a systematic way of intercepting messages.  The telephone provides a
means of communication so effective and convenient that even people who are
aware of the danger routinely put aside their caution and use it to convey
sensitive information.  Digital switching has helped eavesdroppers immensely
in automating their activities and made it possible for them to do their
listening a long way from the target with negligible chance of detection.

    Police work was not born with the invention of wiretapping and at present
the significance of wiretaps as an investigative tool is quite limited.  Even
if their phone calls were perfectly secure, criminals would still be
vulnerable to bugs in their offices, body wires on agents, betrayal by
co-conspirators who saw a brighter future in cooperating with the police, and
ordinary forensic inquiry.

    Moreover, cryptography, even without intentional back doors, will no more
guarantee that a criminal's communications are secure than the Enigma
guaranteed that German communications were secure in World War II.
Traditionally, the richest source of success in communications intelligence is
the ubiquity of busts: failures to use the equipment correctly.

    Even if the best cryptographic equipment we know how to build is available
to them, criminal communications will only be secure to the degree that the
criminals energetically pursue that goal.  The question thus becomes, ``If
criminals energetically pursue secure communications, will a government
standard with a built-in inspection port stop them?''

    It goes without saying that unless unapproved cryptography is outlawed,
and probably even if it is, users bent on not having their communications read
by the state will implement their own encryption.  If this requires them to
forgo a broad variety of approved products, it will be an expensive route
taken only by the dedicated, but this sacrifice does not appear to be
necessary.

    The law enforcement function of the Clipper system, as it has been
described, is not difficult to bypass.  Users who have faith in the secret
Skipjack algorithm and merely want to protect themselves from compromise via
the Law Enforcement Exploitation Field, need only encrypt that one item at the
start of transmission.  In many systems, this would require very small changes
to supporting programs already present.  This makes it likely that if Clipper
chips become as freely available as has been suggested, many products will
employ them in ways that defeat a major objective of the plan.
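
    A sketch of that bypass, under the same illustrative assumptions as the
earlier escrow sketch (the names and the XOR stand-in are mine, not the real
protocol): the cooperating applications superencrypt only the small LEEF
preamble with a key they agree on out of band, so the escrowed unit key no
longer yields the session key, while the Clipper chips themselves operate
unmodified.

    # Bypass sketch: mask only the LEEF before transmission; the escrow
    # agents' copy of the unit key then no longer reveals the session key.
    import secrets

    def xor_mask(data, key):                    # stand-in for the extra user-level layer
        return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

    def mask_leef(leef_bytes, shared_secret):
        """Sender side: encrypt the small LEEF before it goes on the wire."""
        return xor_mask(leef_bytes, shared_secret)

    def unmask_leef(masked, shared_secret):
        """Receiver side: strip the mask before handing the LEEF to the chip."""
        return xor_mask(masked, shared_secret)

    shared_secret = secrets.token_bytes(16)      # agreed between the users, out of band
    on_wire = mask_leef(b"<chip id + wrapped session key>", shared_secret)
    assert unmask_leef(on_wire, shared_secret) == b"<chip id + wrapped session key>"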

    What then is the alternative?  In order to guarantee that the government
can always read Clipper traffic when it feels the need, the construction of
equipment will have to be carefully controlled to prevent non-conforming
implementations.  A major incentive that has been cited for industry to
implement products using the new standard is that these will be required for
communication with the government.  If this strategy is successful, it is a
club that few manufacturers will be able to resist.  The program therefore
threatens to bring communications manufacturers under an all encompassing
regulatory regime.

    It is noteworthy that such a regime already exists to govern the
manufacture of equipment designed to protect `unclassified but sensitive'
government information, the application for which Clipper is to be mandated.
The program, called the Type II Commercial COMSEC Endorsement Program,
requires facility clearances, memoranda of agreement with NSA, and access to
secret `Functional Security Requirements Specifications.'  Under this program
member companies submit designs to NSA and refine them in an iterative process
before they are approved for manufacture.

    The rationale for this onerous procedure has always been, and with much
justification, that even though these manufacturers build equipment around
approved tamper resistant modules analogous to the Clipper chip, the equipment
must be carefully vetted to assure that it provides adequate security.  One
requirement that would likely be imposed on conforming Clipper applications is
that they offer no alternative or additional encryption mechanisms.

    Beyond the damaging effects that such regulation would have on innovation
in the communications and computer industries, we must also consider the fact
that the public cryptographic community has been the principal source of
innovation in cryptography.  Despite NSA's undocumented claim to have
discovered public key cryptography, evidence suggests that, although they may
have been aware of the mathematics, they entirely failed to understand the
significance.  The fact that public key is now widely used in government as
well as commercial cryptographic equipment is a consequence of the public
community being there to show the way.

    Farsightedness continues to characterize public research in cryptography,
with steady progress toward acceptable schemes for digital money, electronic
voting, distributed contract negotiation, and other elements of the computer
mediated infrastructure of the future.

    Even in the absence of a draconian regulatory framework, the effect of a
secret standard, available only in a tamper resistant chip, will be a profound
increase in the prices of many computing devices.  Cryptography is often
embodied in microcode, mingled on chips with other functions, or implemented
in dedicated, but standard, microprocessors at a tiny fraction of the tens of
dollars per chip that Clipper is predicted to cost.

    What will be the effect of giving one or a small number of companies a
monopoly on tamper resistant parts?  Will there come a time, as occurred with
DES, when NSA wants the standard changed even though industry still finds it
adequate for many applications?  If that occurs will industry have any
recourse but to do what it is told?  And who will pay for the conversion?

    One of the little noticed aspects of this proposal is the arrival of
tamper resistant chips in the commercial arena.  Is this tamper resistant part
merely the precursor to many?  Will the open competition to improve
semiconductor computing that has characterized the past twenty years give way
to an era of trade secrecy?  Is it perhaps tamper resistance technology rather
than cryptography that should be regulated?


    Recent years have seen a succession of technological developments that
diminish the privacy available to the individual.  Cameras watch us in the
stores, x-ray machines search us at the airport, magnetometers look to see
that we are not stealing from the merchants, and databases record our actions
and transactions.  Among the gems of this invasion is the British Rafter
technology that enables observers to determine what station a radio or TV is
receiving.  Except for the continuing but ineffectual controversy surrounding
databases, these technologies flourish without so much as talk of regulation.


    Cryptography is perhaps alone in its promise to give us more privacy
rather than less, but here we are told that we should forgo this technical
benefit and accept a solution in which the government will retain the power to
intercept our ever more valuable and intimate communications and will allow
that power to be limited only by policy.


    In discussion of the FBI's Digital Telephony Proposal --- which would have
required communication providers, at great expense to themselves, to build
eavesdropping into their switches --- it was continually emphasized that
wiretaps were an exceptional investigative measure only authorized when other
measures had failed.  Absent was any sense that were the country to make the
proposed quarter-billion-dollar investment in intercept equipment, courts
could hardly fail to accept the police argument that a wiretap would save the
people thousands of dollars over other options.  As Don Cotter, at one time
director of Sandia National Laboratories, said in respect to military
strategy: ``Hardware makes policy.''

    Law, technology, and economics are three central elements of society that
must all be kept in harmony if freedom is to be secure.  An essential element
of that freedom is the right to privacy, a right that cannot be expected to
stand against unremitting technological attack.  Where technology has the
capacity to support individual rights, we must enlist that support rather than
rejecting it on the grounds that rights can be abused by criminals.  If we put
the desires of the police ahead of the rights of the citizens often enough, we
will shortly find that we are living in a police state.  We must instead assure
that the rights recognized by law are supported rather than undermined by
technology.


    At NSA they believe in something they call `security in depth.'  Their
most valuable secret may lie encrypted on a tamper resistant chip, inside a
safe, within a locked office, in a guarded building, surrounded by barbed
wire, on a military base.  I submit to you that the most valuable secret in
the world is the secret of democracy; that technology and policy should go
hand in hand in guarding that secret; that it must be protected by security in
depth.


            Recommendations

    There is a crying need for improved security in American communication and
computing equipment and the Administration is largely correct when it blames
the problem on a lack of standards.  One essential standard that is missing is
a more secure conventional algorithm to replace DES, an area of cryptography
in which NSA's expertise is probably second to none.

    I urge the committee to take what is good in the
Administration's proposal and reject what is bad.

      o The Skipjack algorithm and every other aspect of this proposal
    should be made public, not only to expose them to public
    scrutiny but to guarantee that once made available as
    standards they will not be prematurely withdrawn.
    Configuration control techniques pioneered by the public
    community can be used to verify that some pieces of equipment
    conform to government standards stricter than the commercial
    ones, where that is appropriate.

      o I likewise urge the committee to recognize that the right
    to private conversation must not be sacrificed as we move
    into a telecommunicated world and reject the Law Enforcement
    Exploitation Function and the draconian regulation that would
    necessarily come with it.

      o I further urge the committee to press the Administration
    to accept the need for a sound international security
    technology appropriate to the increasingly international
    character of the world's economy.
