The RISKS Digest
Volume 11 Issue 51

Monday, 22nd April 1991

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Re: Dutch crackers and irresponsible officials
Tom Blinn
Re: Dutch Intruders
Louis Todd Heberlein
Government control of information
Jerry Leichter
Letter to Senators on SB 266
Edward Engler
Withholding cryptographic keys
H. Keith Henson
Encryption backdoor rule: hiding data
Ross Williams
Encryption
Tony Buckland
Comment on "US Gov't is not careful with its supplies"
Haynes
Re: Educating the Camiroi
George L Sicherman
Info on RISKS (comp.risks)

RE: Dutch crackers and irresponsible officials (RISKS-11.50)

"Dr. Tom @MKO, CMG S/W Mktg" <blinn@dr.enet.dec.com>
Mon, 22 Apr 91 15:23:35 PDT
I certainly agree with Fernando Pereira that there appear to be irresponsible
officials involved in this incident, but I beg to differ with him about just
who and where they are.

If the published accounts are accurate, then the systems being "cracked" are
many of the same ones that have been "cracked" in the past, and the security
loopholes that are being exercised are the self-same ones that have been used
in past episodes.

The irresponsible officials are not at the University of Utrecht, as Fernando
Pereira would have us believe; rather, they are the bureaucrats managing the
systems who haven't closed the well-understood loopholes.

The risk?  The continued belief that "security through obscurity" can work,
that by prohibiting access (and making the open flow of unclassified data a
crime) we can somehow eliminate the need to secure our systems.

This is *not* a matter of blaming the victim, when the victim is suffering
from his (or her) own negligence.  Mr. Pereira is correct:  "Should a site
whose officials show this kind of disregard for the common good of the
network-using community be allowed to stay on the Internet?"  Of course not,
but it is NOT a problem in Utrecht, it is a problem in the systems that were
compromised through their own lack of diligence in implementing known fixes
for known security problems.
                                               Tom

Dr. Thomas P. Blinn, Digital Equipment Corporation, Digital Drive, Merrimack,
New Hampshire 03054  ...!decwrl!dr.enet.dec.com!blinn (603) 884-4865


Re: Dutch Intruders (RISKS-11.50)

Louis Todd Heberlein <heberlei@fuji.eecs.ucdavis.edu>
Mon, 22 Apr 91 16:13:53 PDT
I think before we can seriously worry about intruders using countries outside
the United States (and other countries with computer crime laws) as places to
hide, we need to take a close look at ourselves first.

I, along with a number of other organizations, have been working in the field
of intrusion detection.  I remember how excited I was when I detected my first
honest to goodness intrusion.  Several months later, and several HUNDRED
intrusions later, I now feel like a guest on the hackers' networks.  Why should
I worry about hackers from other countries when we have thousands of them here
at home?
                                        Todd


Government control of information

Jerry Leichter <leichter@lrw.com>
Sat, 20 Apr 91 12:28:48 EDT
The recent discussions of proposed regulations on cryptographic equipment have
approached a deep issue that I think has been inadequately discussed and
thought through.  People continue to discuss the specific issue of cryptography
entirely through the traditional approaches of the First and Fifth amendments
and ideas about personal privacy.  There's a risk in doing this: The risk of
missing new risks caused by application of ideas in new and different
circumstances.

Societies have always claimed the right to control various objects and
substances that they consider dangerous.  Drugs, plants that produce drugs,
explosives, guns, and various kinds of precursors for dangerous things that are
not in themselves dangerous have all been regulated for many years.  We are
in the midst of an ongoing debate in the United States about the degree to
which we should control access to guns - but in fact we've had some controls
on them (machine guns and sawed-off shotguns are illegal) for a long time.

Many people have argued that we are in transition to an "information-based"
society, that in the future the source of wealth and power will be, not
material goods, but information itself.  If this is the case, I posit that we
will inevitably find that, just as we have found it necessary in the industrial
age to control access to certain substances and devices, in an information age
we will find it necessary to control access to and dissemination of certain
classes of information.

In fact, we already do this.  No one I know argues that you should have the
right to distribute other people's charge account numbers; no one seems upset
when credit card thieves are prosecuted for selling such numbers.  (We may call
the charges brought against them different things, but in practice what we are
trying to prevent is trafficking in a certain class of information).  We're
concerned about companies that sell data - information - about us without our
consent.  The companies might argue that they are just exercising their rights
of free speech, but somehow that argument doesn't seem enough.

The government has a massive establishment for classifying information and
keeping classified information secret.  Generally, if you come upon classified
information from a non-classified source, you are free to do with it what you
wish.  It's little-known that there are two exceptions to this in the law:
Certain information regarding nuclear weapons and cryptography is "born
classified" - you are bound by restrictions even if you discover the
information yourself.  (There have been few attempts to enforce these laws; I
don't think any attempt has been fought out to a Supreme Court decision on the
obvious constitutional questions that arise.)  You cannot export programs that
implement cryptographic algorithms from the United States - you may think they
are "information", but the government essentially classifies them as munitions.
(And, no, "public domain" or "free" distributions are treated no differently
from for-profit distributions.)

When NASA was created, it was required to make all its plans and designs
available to anyone who asked.  These days, there's increasing concern that a
Libya or an Iraq can easily obtain full engineering drawings, specifications,
manufacturing techniques, even lists of suppliers, for tested missiles with
intercontinental range.  Duplicating that work from scratch would be a
formidable, if not impossible, undertaking for most countries in the world.
(Actually building the missiles is still difficult, but it's much, much easier
to build from complete plans.)

As information, in and of itself, becomes more and more central to our
economies, our military, and every aspect of our lives, the clash between free
speech and safety and privacy will inevitably increase.  I believe it's naive
to think that we can ignore this clash, or continue to claim that "openness" is
ALWAYS the best policy.  (In fact, I know of few people who really believe
that, even if they say it: If you claim you believe openness is always the best
approach, ask yourself whether you believe the store you rent VCR tapes from
should have the right to make public information about what you view.  It all
comes down to whose ox is gored, doesn't it?)  (BTW, your VCR data IS private,
by law.  This little special-case law was, as I recall, passed in response to
outrage about newspaper reports on Robert Bork's viewing habits while his
Supreme Court nomination was being considered.)

The general issue of control of information has actually been discussed in
science fiction for years.  Larry Niven's novels have various references to
agencies with the job of controlling the dissemination of information - that's
one of ARM's jobs, for example.

The earliest, and still one of the best, discussions is in a story written by
Isaac Asimov in the late fifties or early sixties.  The title is something like
"The Dead Past"; it appears in a collection titled "Earth Is Room Enough".  I
highly recommend it to anyone who thinks that these problems have trivial
solutions.
                            — Jerry


Letter to Senators (Re: Senate 266)

Edward_Engler@transarc.com <Ed Engler, ere@transarc.com>
Mon, 22 Apr 1991 10:55:09 -0400 (EDT)
After reading many posts about Senate bill 266, I have decided that this issue
is worth writing to my senators about.  I have drafted the enclosed letter, and
I urge everyone to send a copy of it to their senators as well.  Regardless of
the immediate impact of this bill, the long term effects of the statement in
question can only be to compromise both personal freedom and national security
in the United States.

Dear Senator [Your senator's name],

Recently, Senator Biden introduced a counter terrorism bill containing a very
distressing provision.  In Senate 266, section 2201 titled "Cooperation of
telecommunications providers with law enforcement", the proposition is put
forth that "It is the sense of Congress that providers of electronic
communications service equipment shall ensure that communications systems
permit the government to obtain the plain text contents of voice, data, and
other communications when authorized by law."

I do not believe that requiring communications 'gear' (whether it be
transmission equipment or communications software) to allow anyone 'authorized
by law' to read data transmitted by such means will serve your constituents or
our national interest.

I understand that the proposition is not specifically requiring manufacturers
to do anything at this time.  What it does is give the secret bureaucracies of
the executive branch of our government free rein to create any such
regulations that they see fit to write.

The only outcome of such power can be the enactment of one or more regulations
diminishing the ability to transmit secure data via electronic means of
communication.  No other form of communication carries a built-in, universal
ability to read whatever it transmits.  I can send an
encrypted letter through the mail, and the only way anyone other than the
intended recipient can read that letter is by asking one of the two
participants what it says.  Why should electronic means of communication be any
different?

Consider the following scenario: The Department of Defense makes heavy use of
electronic communication both on and off the battlefield.  If all communication
systems were to have a method whereby an individual other than the intended
recipient could read the message, an unfriendly organization could gain access
to the contents of some (or all) of our military communication.  This is a very
serious breach of national security as well as an affront to the personal privacy
of millions of United States citizens.

By adopting that statement, the Congress will be adopting a stance that will
inevitably result in the widespread use of communications systems highly
susceptible to compromise not only by authorized government agencies, but by
outlaws and antagonistic foreign nationals as well.

You should ensure that the section in question is rewritten to read "It is the
sense of Congress that communications via electronic means shall be protected
from being read by any but the intended recipient by all reasonable means
available to the transmitting device or authority."  If you feel that it is
important that the government have access to the data passed between
individuals, then you may want to introduce a bill stating "The government may,
with the permission of a judge, subpoena the contents of electronically
transmitted messages.  Anyone willfully destroying subpoenaed electronic
information is subject to (some term of imprisonment and/or some fine)."

By all means let's protect ourselves from terrorism, but let's not give up any
of our fundamental liberties in doing so.

Sincerely,

[your name]


Withholding cryptographic keys (Re: SB 266, Boudrie, RISKS-11.49)

<hkhenson@cup.portal.com>
Sat, 20 Apr 91 18:05:38 PDT
I am not a lawyer, and off-hand would have agreed with Bob's analysis.
However, at the recent Computers, Privacy, and Freedom Conference, I had lunch
with several lawyers (defense and prosecution).  I brought up this very
problem, and they delved into the reasoning a judge would use when confronted
with law enforcement agents trying to pry a key out of someone.  It was their
solid conclusion--on Constitutional grounds--that a person could not be forced
to disclose a crypto key, because the *resultant* decrypted data would be the
"product" (in some legal sense) of the key and the encrypted data, and would
thus invoke the self-incrimination protection of the Constitution.  I guess you
can be forced to turn over physical items, and even safe combinations, but not
crypto keys which act directly upon information to make it usable, and
possibly incriminating.

For all the reassurance, I am not eager to be a test case!

H. Keith Henson (hkhenson@cup.portal.com)


Encryption backdoor rule: hiding data

Ross Williams <ross@spam.ua.oz.au>
21 Apr 91 17:54:08 GMT
> Although US citizens have a right against self incrimination, it is unlikely
> that this right extends to withholding cryptographic keys encoding potentially
>...
> Establish a well documented, and publicized, policy of encoding numerous
> "dummy files" (perhaps using man pages as source text) using your standard
> encryption algorithm, with random keys which are not recorded.  This will
> produce a *DOCUMENTED SITUATION* in which you are not able to produce
> decryption keys for the majority of your files, even if you wanted to.

Or how about this. Design a cipher system that allows n files along with some
white noise to be jointly encrypted into a single file using several different
keys. Use data compression and white noise to hide how many files and how much
data is actually there. Then, when asked to decrypt, one can give any key or
keys one pleases so as to decrypt to one of many documents, innocent or
otherwise.
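The deniability idea above can be illustrated with a toy sketch built on the
one-time pad's malleability (an illustration of the principle only, not the
cipher system Williams describes): under XOR encryption, any ciphertext
"decrypts" to any equal-length plaintext, so one can hand over whichever key
suits the occasion.

```python
import secrets

def key_for(ciphertext: bytes, plaintext: bytes) -> bytes:
    """Derive the key under which 'ciphertext' XOR-decrypts to 'plaintext'.

    Because XOR is its own inverse, such a key always exists for any
    equal-length plaintext -- the heart of the deniability trick.
    """
    assert len(ciphertext) == len(plaintext)
    return bytes(c ^ p for c, p in zip(ciphertext, plaintext))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """XOR decryption (identical to XOR encryption)."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

real  = b"MEET AT MIDNIGHT"      # the message you care about
decoy = b"GRANDMA'S RECIPE"      # an innocent cover story, same length

# The stored file is indistinguishable from white noise.
ciphertext = secrets.token_bytes(len(real))

# One file, two keys: reveal whichever one pleases.
assert decrypt(ciphertext, key_for(ciphertext, real)) == real
assert decrypt(ciphertext, key_for(ciphertext, decoy)) == decoy
```

With padding and compression to mask lengths, the same trick extends to
several plaintexts of different sizes hidden in one file.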

Or hide your data:
   1) In large executable files.
   2) In the grammar of large computer programs.
   3) Using data compression techniques in other (non-compressed) data.

The possibilities for hiding data are endless. One way to police this sort of
thing might be to demand that a ciphertext decrypt to a document which, when
compressed, is about the same length as the ciphertext.

Ross Williams.                                            ross@spam.ua.oz.au


Encryption

<Tony_Buckland@mtsg.ubc.ca>
Fri, 19 Apr 91 15:19:12 PDT
 For those wanting secure encryption, a true one-time pad
 (with the key as long as the text), really used only once,
 remains as unbreakable as it ever was.  It's tedious as
 hell to use, because you have to courier enough keys to
 your recipients in advance to cover all the messages you
 ever expect to send, but it is, if applied with absolute
 diligence, really safe.  That is, it reduces spying back
 to the good old low-tech methods of sex, blackmail,
 burglary, corruption and personal violence.
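The pad itself is only a few lines of code (a minimal sketch; here
`secrets.token_bytes` stands in for the truly random key material that would
be couriered in advance):

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each plaintext byte with the corresponding key byte.

    Unbreakability requires the three conditions the note stresses:
    the key is truly random, at least as long as the text, and never
    reused for a second message.
    """
    assert len(key) >= len(plaintext), "key must be as long as the text"
    return bytes(p ^ k for p, k in zip(plaintext, key))

# Decryption is the same XOR with the same key.
otp_decrypt = otp_encrypt

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))   # couriered ahead of time

ciphertext = otp_encrypt(message, key)
assert otp_decrypt(ciphertext, key) == message
```

The tedium the note mentions lies entirely in key logistics, not in the
cipher: every byte ever to be sent needs a matching key byte delivered
securely in advance.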


Comment on "US Gov't is not careful with its supplies"

<haynes@cats.UCSC.EDU>
Sat, 20 Apr 91 00:11:55 PDT
That story reminded me of something that happened nearly 30 years ago,
but the government is often accused of using data processing
technology that is 30 years behind the times.

The truck division of one of the automakers had a regional parts warehouse.
Dealers ordered parts from the warehouse by filling out a form on a machine
that made punched paper tape.  The tape was then transmitted to the warehouse
with a telephone and modem setup.  There was no checking, parity or anything
else.

One night the transmission made a single-bit error, changing an ASCII zero to a one.
This turned an order for 17 dipsticks into an order for 1017 dipsticks.  The
warehouse computer processed the order, even generating a letter to the dealer
to the effect that there were only 234 dipsticks in stock, and they were
sending those and would order the rest from the factory and send them when they
came in.
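The arithmetic of the failure is easy to reproduce (the four-digit field
"0017" below is a guess at the order format, which the story does not give):

```python
# ASCII '0' (0x30) and '1' (0x31) differ only in the lowest bit,
# so a single flipped bit turns a leading zero into a one.
assert chr(ord('0') ^ 0b0000001) == '1'

order = "0017"                              # 17 dipsticks (assumed format)
corrupted = chr(ord(order[0]) ^ 1) + order[1:]
assert corrupted == "1017"                  # suddenly 1017 dipsticks

# Even the simplest check -- one parity bit per character, which the
# link lacked -- would have caught this particular flip, since any
# single-bit error changes a character's parity.
def parity(ch: str) -> int:
    """Even/odd count of 1-bits in the character's ASCII code."""
    return bin(ord(ch)).count("1") % 2

assert parity('0') != parity('1')
```

Parity only detects odd numbers of flipped bits per character; a checksum
over the whole order would have been stronger still, but either would have
stopped this dipstick avalanche.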

The mistake was discovered when the guy packing the order for shipment wondered
why that small-town dealer needed so many dipsticks and brought it to the
attention of his boss.  Perhaps by now packaging and shipping technology has
improved to the point that the 234 dipsticks could be shipped without anyone
having a chance to notice.


Re: Educating the Camiroi

George L Sicherman <gls@corona.att.com>
Sat, 20 Apr 91 20:11:10 EDT
In Risks 11:49, David Lamb alludes to a science-fiction story which he
misattributes to Cyril Kornbluth. "Primary Education of the Camiroi"
was written by R. A. Lafferty and published 25 years ago in _Galaxy._
The story does indeed mention a child who reads too fast:

        "Only the other day there was a child in the third grade who
    persisted in rapid reading," Philoxenus said. "He was given an
    object lesson.  He was given a book of medium difficulty, and he
    read it rapidly.  Then he had to put the book away and repeat
    what he had read.  Do you know that in the first thirty pages he
    missed four words? ..."

As a mere Earthling, Mr. Lamb may be forgiven his error; but Philoxenus
later mentions that at Camiroi schools slow learners are executed.  Usenet
is more humane — slow learners are merely ex-communicated!

Col. G. L. Sicherman                     gls@corona.att.COM
