The RISKS Digest
Volume 10 Issue 25

Monday, 27th August 1990

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Justice Department computers vulnerable
Rodney Hoffman
A Step Backward (Interactive Phone Service)
Theodore Lee
How to Lie with Statistics [once again]
Jerry Hollombe
Re: Something good about Automatic Bank Tellers
Jerry Hollombe
Mark Lomas
Re: Electronic house arrest units
Philip L Harshman
Jim Campbell
Amos Shapir
Brinton Cooper
Brian Tompsett
Mike Bell
Willis H. Ware
Re: Object Code Copyright Implications
Willis H. Ware
A. Harry Williams
Lars Poulsen
Gene Spafford
Re: Proposed ban on critical computerized systems
Pete Mellor
Comment on Markoff article on NCSC
Robert H Courtney via Bill Murray
Info on RISKS (comp.risks)

Justice Dept. computers vulnerable

Rodney Hoffman <Hoffman.ElSegundo@Xerox.com>
Fri, 24 Aug 1990 07:58:40 PDT
Condensed from a story in the 23 Aug 90 'Los Angeles Times' by Ronald J.
Ostrow:

The General Accounting Office says that a lack of adequate computer
security in the Justice Department's new "state-of-the-art" data center in
Rockville, MD will permit unauthorized remote users to enter and exit the
system undetected, endangering highly sensitive information such as
identities of undercover operators and confidential informants.

The GAO report says that the data center is accessible over phone lines and
commercial computer networks, making it vulnerable to remote users who
could "introduce viruses and other disruptive software ... into vulnerable
computer systems."  Investigators found that contingency plans to be
implemented when computer services are disrupted had either not been
prepared or not tested and that no mandatory computer security training was
being given to all employees.  They said, "Systems programmers with extensive
knowledge of hardware and operating procedures had unescorted access to the
data center and were capable of issuing critical computer commands that
should have been limited to computer operators."  Guards were not stationed
in some places, and surveillance devices such as cameras or motion sensors
were also lacking.

The GAO blames the security weaknesses on "a lack of effective leadership
and oversight by the justice management division."

A Justice Dept. public relations official said, "... we have requested
additional resources from Congress for [computer security audit] and have
not received them."  A dept. official also said "a lot of corrective action
[has already been] taken and more ... is under way."


A Step Backward (Interactive Phone Service)

Theodore Lee <lee@TIS.COM>
Fri, 24 Aug 90 16:49:03 EDT
(I am sending this to both RISKS and TELECOM.  I think it is germane to one
or the other, perhaps even both, but I'll leave that decision up to the
respective editors.)

     I have been using a "pay-by-phone" service to pay bills out of my
savings account for longer than I can remember; I think at least ten years.
The service has been simple and straightforward: after entering your account
number and PIN in response to some minimal prompting (I don't remember now
whether it echoed the account number; it did not echo the PIN) you repeated
a simple cycle — (a) "enter payment code" (b) enter code, terminated by #
(c) "payment code 

How to Lie with Statistics [once again] (RISKS-10.22)

The Polymath <hollombe@ttidca.tti.com>
24 Aug 90 00:38:40 GMT
I agree with the recommendation of Huff's book, but for different reasons.

As a former statistician, I get very tired of the constant misperception that
there is something inherently unreliable or evil about statistics and
statistical techniques.  Statistical analysis is a perfectly legitimate
mathematical discipline.  The techniques are well understood and they work.  By
definition, you can't lie with statistics without misusing those techniques.

By all means, read Huff's book to learn how to recognize when the
techniques are being abused.  But don't let that lead you to reject all
statistics out of hand.

As one of my former stats profs used to say: "You can lie with statistics,
but not to a statistician."


Something good about Automatic Bank Tellers (Mellor, RISKS-10.22)

The Polymath <hollombe@ttidca.tti.com>
24 Aug 90 00:38:40 GMT
This has been standard procedure for our (Citicorp) ATMs from the first --
about 15 years.  The only ATMs I can think of that don't do this are some of
the older Diebold machines that drop dispensed cash in a bin (a mind-
bogglingly bad design, IMHO).  The time-delay, BTW, is usually set by the bank.
Ten seconds is rather short, by our standards.

Many ATMs do similar things with cards left in card readers.

Jerry Hollombe, Citicorp(+)TTI, 3100 Ocean Park Blvd., Santa Monica, CA 90405
{csun | philabs | psivax}!ttidca!hollombe (213) 450-9111, x2483


Re: Something good about Automatic Bank Tellers (RISKS-10.22)

<tmal@computer-lab.cambridge.ac.uk>
Fri, 24 Aug 90 11:38:27 +0100
In RISKS DIGEST 10.22 Pete Mellor mentioned a useful timeout in a National
Westminster Bank cashpoint machine; these machines recover money that has
not been withdrawn within a short period.  I have seen a similar timeout
that involves the card rather than money.

I was standing in a queue at a Barclays Bank cashpoint machine behind two
women who were deep in conversation.  After a while the machine beeped
loudly and asked whether the customer wished more time to consider the
transaction.  A little later it beeped again and displayed a more strongly
worded message to the effect of `If you do not press a button within the
next thirty seconds your card will be confiscated'.  Thirty seconds later
a third message appeared `Your card has been retained, please inquire inside
the bank for further information'.  The perspex shutter then closed over the
display and keyboard; only this noise seemed able to distract the women
from their conversation, and they went into the bank cursing machines.
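
The escalation described amounts to a simple two-stage timeout.  A minimal
sketch in Python (the timings are assumptions, and the messages paraphrase
those quoted above; this is not Barclays' actual logic):

    import time

    STAGES = [  # (timeout in seconds, warning shown) - assumed values
        (30, "Do you wish more time to consider the transaction?"),
        (30, "If you do not press a button within the next thirty "
             "seconds your card will be confiscated."),
    ]

    def await_keypress(timeout):
        """Stand-in for polling the keypad; this customer never responds."""
        time.sleep(timeout)
        return False

    def idle_session():
        for timeout, message in STAGES:
            print(message)                 # beep and display the warning
            if await_keypress(timeout):
                return                     # customer responded; resume
        print("Your card has been retained, please inquire inside the bank.")
        # ...close the shutter and keep the card...

    idle_session()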

I was pleased to see that if I forget to retrieve my card it is likely to
be retained by the bank rather than by a passerby.  I was also pleased to note
that if you want more time to consider your transaction then you can prolong
the timeout.  It was worth the inconvenience of waiting in the queue to see
the look on their faces after losing the card.

A final thought: nobody in the queue asked the women to hurry up.  Would
people in other countries wait patiently or is this a British characteristic?
I probably ought to have drawn their attention to the second message but I
was too interested to see the behaviour of the machine.

    Mark Lomas (tmal@cl.cam.ac.uk)


Re: Electronic house arrest units (RISKS-10.24)

Philip L Harshman <philip@hubcap.clemson.edu>
24 Aug 90 12:46:05 GMT
The detainee doesn't have to break open the device to read out the RAM.  All
he has to do is poll the thing himself and record its responses.  After feeding
this info into a handy tabletop "device", he can go about his business.  Is
that a big enough hole?
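
For illustration, a sketch of that attack in Python, assuming (as in the
proposal under discussion in RISKS-10.24) that the bracelet answers each
poll with the next chunk of a stored random string, so replies depend only
on how many polls have been answered; the names and chunk size here are
hypothetical:

    import os

    class Bracelet:
        """Answers each poll with the next chunk of its stored random pad."""
        def __init__(self, pad):
            self.pad = pad
            self.pos = 0
        def poll(self, n=4):
            chunk = self.pad[self.pos:self.pos + n]
            self.pos += n
            return chunk

    bracelet = Bracelet(os.urandom(64))

    # The detainee polls the bracelet himself and records the responses...
    recorded = [bracelet.poll() for _ in range(16)]

    # ...then leaves a "tabletop device" at home to replay them, answer
    # for answer, to the monitoring system's polls.
    class TabletopDevice:
        def __init__(self, answers):
            self.answers = iter(answers)
        def poll(self, n=4):
            return next(self.answers)

    stand_in = TabletopDevice(recorded)
    print(stand_in.poll())   # indistinguishable from the real bracelet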

Philip Harshman, Clemson University             uucp: ... !gatech!hubcap!philip
                                bitnet: philip@clemson    phone: (803) 656-3697


Re: Electronic house arrest units

"Jim Campbell" <jimc@ralvm11.iinus1.ibm.com>
Fri, 24 Aug 90 07:25:18 EDT
In Risks Digest 10.24, King says that in the event of a unit failure, the
parolee calls his/her probation officer and says "come and inspect me and
bring me a new bracelet".  Herein lies a weakness of the proposed system.

Background:  About a year ago, a neighbor installed a burglar alarm system
in his house.  For a variety of reasons, it went off (alarmed) two or three
times a week.  The alarm included a rather loud bell outside his house.
The first few times this happened, some of us went to see if he needed
help, or see if there really was a burglar.  Soon we realized that these
were false alarms, and began to first ignore, then resent this intrusion.
Had there been a real burglary then, no one would have paid any attention.

Back to the subject at hand.  The parolee regularly calls his/her probation
officer and says "come and inspect me ...".  Also, the parolee should on
occasion be out of range when polled, but not so far out of range that
he/she can't get back before the parole officer arrives.  Soon the
parole officer begins to mistrust the system, and then to ignore it.  Then
the parolee can begin to test the limits.

Jim Campbell


Re: Electronic house arrest units (Gong, RISKS-10.22)

Amos Shapir <amos@taux01.nsc.com>
24 Aug 90 12:37:54 GMT
Just one hole, but a big one: this scheme assumes perfect reception - no
echoes, interference, etc.  The urban landscape is just about the worst
environment for that.

Amos Shapir, National Semiconductor (Israel) P.O.B. 3007, Herzlia 46104, Israel
amos@taux01.nsc.com, amos@nsc.nsc.com  Tel. +972 52 522331 fax: +972-52-558322


Re: Electronic house arrest units (Gong, RISKS-10.22)

Brinton Cooper <abc@BRL.MIL>
Fri, 24 Aug 90 10:22:23 EDT
<king@kestrel.edu> proposes a house arrest unit, the design of which is
based upon the principles of radio location.  Some observations follow:

Were the radio transmissions only one-way, base-to-user, it could
probably be quite inexpensive — certainly no more difficult to implement
than personal paging services.

The difficult part is that the bracelet replies.  Now, instead of a few
high-power transmitters to cover an urban area (say, within the
"beltway" of the typical city), a large network of receivers is needed.
These receivers must be connected centrally for coordination, "voting,"
and the ellipsoidal computations.  This quickly becomes expensive, and,
although it may be of lower cost than incarceration, political leaders
seem to feel that "We already have a jail; we'd have to buy this
system."

In addition, the urban environment provides many unintentional
reflectors for radio signals.  In voice communications, the receiver
simply selects the strongest component and ignores the others.  In a
position location application, the direct component must be selected.
Because of reflections and masking, the direct component may not even
reach the receiver.

Also, the risk of computational and logical error seems large.  I don't
believe that this application is "off the shelf," so the software is an R&D
project subject to the error discoveries about which we often read in this
Digest.
                                                  _Brint Cooper, BRL


Re: Electronic house arrest units (King, RISKS-10.24)

Brian Tompsett <bct@tardis.computer-science.edinburgh.ac.uk>
Mon Aug 27 14:18:41 GMT 1990
Risks 10.24 contains a proposal for an electronic house arrest unit which
is based on distance triangulations using timing over the phone lines.
This method has a hole in it the size of a barn door. The problem is that
one cannot rely on calls going via a repeatable route between any two
locations at any time.  The phone company may route your call in any way it
chooses, whenever it chooses.  It is almost impossible to distinguish phone
company re-routing from hacking re-routing.
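
Some back-of-the-envelope arithmetic supports this.  Assuming a signal
speed of about two-thirds of c in cable (these figures are rough
assumptions, not measurements), the delay attributable to the detainee's
distance from the exchange is a few microseconds, while a re-route adds
orders of magnitude more before even counting switching time:

    PROP_SPEED = 2.0e8        # m/s, assumed signal speed in cable

    def one_way_us(distance_km):
        """One-way propagation delay in microseconds."""
        return distance_km * 1000.0 / PROP_SPEED * 1e6

    for d in (5, 10, 50):     # plausible distances to a local exchange
        print(f"{d:>3} km from the exchange: {one_way_us(d):7.1f} us")

    # A call re-routed through an exchange a few hundred km away picks
    # up far more delay than that, before counting switching time:
    print(f"400 km detour adds about {one_way_us(400):7.1f} us")
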
   Brian

Brian Tompsett. Secretary BCS Edinburgh Branch. E-mail bct@uk.ac.ed.cs.tardis.
Tel: 031 554 9424 (Office) 031 441 2210 (Home)         briant@uk.co.spider


Re: Electronic house arrest units (Gong, RISKS-10.22)

Mike Bell <mb@sparrms.ists.ca>
Mon, 27 Aug 90 10:08:38 EDT
king@kestrel.edu proposes an "uncrackable" design for a house arrest unit based
on physical propagation delays for position location and a stored key string
maintained in a bracelet.

Suggested improvement:

1> Store a DES cipher key in the bracelet, and send an encrypted version
   of a random "probe" signal. This (a) avoids having to store a very
   long string of "random" bits inside the unit, and (b) reduces the
   risk that a "forged" bracelet could produce the correct response.
   (If the response is only 1 bit long, I have a 50% chance of guessing
   it right, don't I?)
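
A minimal sketch of point 1>.  The post suggests DES; since any keyed
one-way function serves the purpose, this stand-in uses HMAC-SHA256 from
the Python standard library, with illustrative names throughout.  Note
that a long keyed reply also disposes of the one-bit guessing worry:

    import hashlib, hmac, os

    SECRET_KEY = os.urandom(16)   # installed in the bracelet at issue time

    def bracelet_reply(probe):
        """The bracelet keys the probe; no long random string is stored."""
        return hmac.new(SECRET_KEY, probe, hashlib.sha256).digest()

    def base_station_poll():
        probe = os.urandom(16)        # fresh, unpredictable probe signal
        reply = bracelet_reply(probe)  # over the radio link in reality
        expected = hmac.new(SECRET_KEY, probe, hashlib.sha256).digest()
        return hmac.compare_digest(reply, expected)

    print(base_station_poll())        # True only for the genuine bracelet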

Cracking/hacking the scheme...

2> The "people system" is probably the weakest point. Convince your
   parole officer that the propagation of the signal is poor, and that
   any alarm is probably a false alarm. Do this by: heating up /cooling
   down the unit, static discharge, RF flooding, putting electro-magnetic
   shielding in parts of the house, etc. Keep calling him/her out at
   3:00am because the local CB radio is interfering with your device.
   Chances are this will create some flexibility in dealing with
   system problems (how many maintenance engineers will be on call at
   3:00am?) Will they take you back into custody, replace the
   bracelet, or wait until 9:00am the following morning - enough time
   for you to commit whatever heinous crime you had in mind?

3> Use some active ECM against the *entire system*. (I'm assuming here
   that the half dozen transmitting/receiving stations are city
   wide - on grounds of cost). Try and jam or confuse the entire
   system for a period of time. What are "they" going to do? Check on
   *every* person under house arrest?

Whether either the cost/complexity of the scheme or the cost/complexity of
overcoming it is justified is another question...

-- Mike Bell — <mb@sparrms.ists.ca>


Re: Electronic house arrest units (Gong, RISKS-10.22)

"Willis H. Ware" <willis@rand.org>
Fri, 24 Aug 90 10:38:59 PDT
This technique is well developed within the military and is usually called
TDOA - time difference of arrival.  Systems exist that can listen to tens
of thousands of pulses, sort them out, calculate ground location, and
report same — and all within a very nominal size machine.


Re: Object Code Copyright Implications (Biddle, RISKS-10.24)

"Willis H. Ware" <willis@rand.org>
Fri, 24 Aug 90 10:38:59 PDT
I'd like to resurface an observation that I've made from time to time in
various places in regard to protection of intellectual property.

A central issue that must be sorted out somewhere along the way is the
essential difference between information per se and the representation of
such information.  Example:  One's name is information; it can be
represented in ASCII, as a written signature, as a spoken phrase, in Morse
code, as magnetic domains, as electrical signals, as modulation on a
carrier, etc.

The legal community will have to appreciate the distinction, and establish
the principle within both the judicial and the legislative systems.  Until it
does, we'll be creating laws that deal with bits and pieces of the problem
and generate increasing confusion.
                    Willis H. Ware


Re: Object Code Copyright and reverse engineering.

"A. Harry Williams" <ARRY@MARIST.BITNET>
Thu, 23 Aug 90 23:45:01 EDT
There was an article by Pamela Samuelson in the Jan 1990 issue of IEEE Software
discussing Software copying, especially as it applies to reverse engineering.
While a letter from an IBM corporate lawyer in the July issue disputed her
findings, Ms. Samuelson made a case that reverse engineering is not a
violation of copyright.  It makes for interesting reading.
                                                  /ahw


Object Code Copyright Implications (Biddle, RISKS-10.24)

Lars Poulsen <lars@spectrum.cmc.com>
Fri, 24 Aug 90 04:44:35 GMT
I would agree that a decompilation of a copyrighted program is a derived
work, but I believe that deriving this for personal use is "fair
use" - subject to the same limitations as the original object code.

Every time this topic comes up, I am impressed with how well the notions
of copyright law seem to agree with the field. And every time computer
people discuss this with lawyers, I am disgusted with how impossible it
seems to find common ground with the lawyers when it comes down to
interpreting the copyright law.

>Now there are some interesting points here.
>1) If object code is copyrightable, what *exactly* is it that is subject
>   to the copyright? Magnetic patterns? Ones and Zeros? Source code?
>2) Of course, program behaviour in the past is *not* sufficient to
>    determine how a program will behave in the future.

I would surmise that it is the behaviour of the program that should be
copyrightable. However, for practical purposes, that behaviour needs to
be determined from an inspection of some representation of the
algorithms. This may require decompilation.

Hence, fair use should allow you to decompile the code, but compiling it
for another machine should be considered a "copy". Likewise, a
transcription of the source code with every comment deleted/replaced, and
every variable name changed (and possibly transliterated into a
different programming language) should be considered a "copy" even if no
tokens are the same.

On the other hand, I think that "look-and-feel" goes too far. The Lotus
suit and the Macintosh suit get very close to the kind of issues that
belong in patent law, rather than copyrights.

If the lawyers understood the field better, would they be more likely to be
"reasonable" ? Is this an issue of poor "cross-cultural" communication, or of
crossed goals ? Of different perceptions of fairness ?

Lars Poulsen, SMTS Software Engineer,    CMC Rockwell  lars@CMC.COM


Re: copyright on object code (RISKS-10.24)

Gene Spafford <spaf@cs.purdue.edu>
24 Aug 90 17:08:36 GMT
I'm not a lawyer, so don't take the following as legal advice, but....
Any lawyers out there will be certain to set me straight (or try!).

The purpose of copyright is to protect the commercial interest of the
copyright holder (author or publisher).  A copyright on an object code
version of a program is intended to prevent you from selling copies of
the program.  It certainly seems reasonable to say that reverse
engineering the object code preserves the copyright — trying to sell
a copy of the program in any form would infringe the economic
advantage of the program author/publisher, whether that copy is the
object code itself, or a new version of the program derived from
reverse-engineered code.

The key here is what you do with copies.  If I buy a book and make
ten copies of it, the problem comes about if I sell those copies or
give them away to others, thus depriving the copyright holder of those
sales.  If I buy a computer program and make ten copies of the program
that I lock up in ten different places because I am worried about
loss, I have not deprived the copyright holder of additional sales as
those copies are not being used, and are therefore not depriving the
publisher of potential sales. (I would probably be violating the
standard license that comes with most software that limits the number
of backup copies that can be made, but that isn't copyright.)

Reverse-engineering the code to see what it does is a problem if I use
that reverse-engineered version to sell or give away copies of the
program, or use the material to write a new program incorporating part
of the old one, or to make copies that I will use internally on which
I do not pay copyright royalties.  Those infringe the rights of the
copyright holder.  Translating the copy into something I find easier
to read, for my own private use, does not infringe those rights.

I don't know about New Zealand (Robert Biddle's subject), but here
copyright law is "enforced" in civil courts.  The copyright holder
needs to sue for damages and show how the copyright was abused.
(There is a criminal violation of copyright, but that requires certain
proof of criminal intent, etc.)  If the plaintiff cannot show that any
loss or damage occurred as a result of the copying, and that all use
was for private purposes, I don't see that there could be a negative
judgment.  In all probability, the case would never even be allowed to
come to trial.  Also, it is unthinkable that any copyright holder
would go through the aggravation and expense of pursuing such a case
over private decompilation of object code.

If the law currently under consideration in New Zealand attempts to
broaden copyright to prevent private copies that do not deprive the
copyright holder of royalties, and if that is what the lawyers are
recommending, then I would indeed be concerned.

Gene Spafford, NSF/Purdue/U of Florida  Software Engineering Research Center,
Dept. of Computer Sciences, Purdue University, W. Lafayette IN 47907-2004
Internet:  spaf@cs.purdue.edu   uucp:   ...!{decwrl,gatech,ucbvax}!purdue!spaf


Re: Proposed ban on critical computerized systems

Pete Mellor <pm@cs.city.ac.uk>
Fri, 24 Aug 90 14:54:21 PDT
In RISKS-10.24 <cameron@argosy.UUCP> quotes:

> Computers are inherently flawed and too unreliable for critical tasks, say Tom
> Forester of Griffith University and Perry Morrison of the University of New
> England, both in New South Wales.  In the British academic journal _Futures_,
> they write that computer systems cannot be designed without the threat of
> life-endangering malfunctions,...

and asks:

> I'd appreciate it if someone would dig up this issue of _Futures_ and post or
> summarize this paper.

I have a copy of the paper. If anyone wants a snail-mail copy, let me have your
postal address.

I may summarise it for RISKS in the next week or two if I have time.

Pete Mellor


RHC to New York Times, re: Markoff Article

<HMurray@DOCKMASTER.NCSC.MIL>
Mon, 27 Aug 90 11:25 EDT
Forwarded with permission.

Date:  Thursday, 23 August 1990 08:24 edt
From:  rhcx%beta at LANL.GOV (Robert H Courtney)
Subject:  NYT Article
To:  WHMURRAY at DOCKMASTER
                                         August 19, 1990

Mr. Max Frankel,
Executive Editor
The New York Times
229 West 43rd Street
New York, NY 10036

Dear Mr. Frankel:

Your article, "Washington is Relaxing Its Stand on Guarding Computer Security",
by John Markoff, August 19, reflects a serious misinterpretation of both the
intent and the probable effect of the new Presidential directive on computer
security.

The new directive replaces NSDD #145, which was issued by the Reagan
administration in 1984. With the authority of that older directive, and because
they were not willing to accept the utterly mundane, unexciting nature of the
data security problems in most agencies, the National Security Agency (NSA)
distorted the data security implementations of many federal civil agencies and
reduced the effectiveness of their computer security programs.

NSA's computer security efforts were oriented exclusively toward the protection
of classified data from disclosure to those who did not have appropriate
security clearances. Their development program did not address the need for
data to be complete, accurate, timely and available. They were concerned only
with the confidentiality of data and wholly unconcerned about their usefulness
to their proper owners.

It has been an unfortunate NSA assumption that those with appropriate security
clearances can be trusted to the level of their clearances. This ignores the
damage which has been done in recent years by Messrs Walker, Pelton, Pollard,
Boyce, Smith, Miller, et al, all of whom were cleared for access to the data
which they delivered to those who appeared, until recently, to be the enemy.
There seems to be no basis for a belief that comparable damage has been done
through technically-oriented, foreign-directed penetrations of our systems
containing classified data.

Fortunately, the new directive relieves the civil agencies from a requirement
that they continue to accept misleading guidance in computer security from NSA.
Unfortunately, it was not issued until significant damage had already been
done.

The Computer Security Act of 1987 gives the National Institute for Standards
and Technology (NIST) responsibility for providing technical guidance in
computer security to the civil agencies and DoD for the protection of their
unclassified data. It is regrettable that NIST is very poorly funded for work
in the computer security area and, at the current funding levels, cannot
provide any significant amount of the technical leadership in computer security
so badly needed by the civil agencies.

Only a small portion of funds previously available to NSA for computer security
would permit NIST to provide the needed guidance. Whether those funds are
provided or not, the new and wisely conceived directive will not result in
relaxation of the security afforded data by either DoD or the civil agencies.
The new directive rectifies a serious error of the previous administration and
makes it probable that data security in the civil agencies will improve - not
as much as it would if NIST had adequate funding and not as much as it should,
but it will be improved. The contrary impression conveyed by your reporter is
unfortunate.
                                     Sincerely, Robert H. Courtney, Jr.
