The RISKS Digest
Volume 16 Issue 3

Thursday, 5th May 1994

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Spelling correction
Phil Agre
Sigh — security through obscurity is NOT security
Alan Wexelblat
Bellcore cracks 129-digit RSA encryption code
Steven Tepper
Risk of Non-Computerization?
Klaus Brunnstein
Computers Blamed For FAA Woes
Mark Thorson
Brief note re DIA fiasco
Paul Green
Followup on credit card policies (re: "Streetwise Guide ...")
Rob Slade
ABC Nightline re LaMacchia
Mich Kabay
Risks of electronic door locks for automobiles
Paul Wallich
Info on RISKS (comp.risks)

Spelling correction

Phil Agre <pagre@weber.ucsd.edu>
Tue, 3 May 1994 15:10:22 -0700
We've had plenty of notes about spelling correctors, but I find this one
particularly interesting.  The otherwise excellent April 1994 issue of Z
Magazine contained a particularly horrible editing error in an article by Sara
Diamond about the "American Center for Law and Justice" (ACLJ).  The ACLJ was
created by conservative Christians in order to oppose the American Civil
Liberties Union (ACLU) in court cases over issues like school prayer.
Everyone has assumed that the similarity of acronyms is deliberate; ACLJ seems
part of a fairly systematic conservative strategy of positioning
public-interest groups as "liberal" by creating conservative mirror images of
them.  Well, as the Z editors explained in their May issue, their spelling
correction program included ACLU but not ACLJ, with the result that every
instance of "ACLJ" in Diamond's article got changed to "ACLU" except, somehow,
for the very first one, which occurred (much as it does above) in parentheses
after the first mention of "American Center for Law and Justice".  Apparently
there was an uproar, with some Z readers calling up the ACLU to ask why it had
suddenly reversed its positions.  The risk is subtle: in politics, things are
often designed to outwardly resemble their opposites, or to invite confusion
or sharply defined contrast (or both) with their opposites.  As a result, it
becomes impossible to define "close enough" in (for example) a spelling
corrector without a great deal of specific background knowledge.
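The risk can be made concrete with a toy corrector. The sketch below is an assumed implementation, not Z Magazine's actual software: it replaces any word not in its dictionary with the nearest dictionary entry by edit distance. Since "ACLJ" is a single substitution away from "ACLU", any corrector that treats one edit as "close enough" silently rewrites it.

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]

def correct(word, dictionary, threshold=1):
    """Replace 'word' with any dictionary entry within 'threshold' edits."""
    for entry in dictionary:
        if edit_distance(word, entry) <= threshold:
            return entry
    return word

# "ACLJ" is one substitution away from "ACLU", so it gets "corrected".
print(correct("ACLJ", ["ACLU"]))  # → ACLU
```

No amount of tuning the threshold fixes this: the two acronyms really are one edit apart, by deliberate design of the ACLJ's founders. Only background knowledge ("ACLJ is a real, distinct organization") can save the corrector.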

Phil Agre, UCSD


Sigh — security through obscurity is NOT security

"Alan (Miburi-san) Wexelblat" <wex@media.mit.edu>
Tue, 3 May 94 17:26:46 -0400
Peter Ladkin introduces his post with the polite phrase:

>  a vandal exploiting a not-unknown security hole

or, in American English:

Some lazy person at the target site failed to plug a known hole, probably
reasoning along the lines of "Oh, no one will know about this hole so I
don't have to deal with it."

There is no excuse for vandalism--but there is also no excuse for sysops'
failure to plug known holes.  If I were a user at that site, I'd be more
pissed at the person who failed to prevent the vandalism when such prevention
was possible.

--Alan Wexelblat, Reality Hacker, Author, and Cyberspace Bard, Media Lab -
Advanced Human Interface Group  wex@media.mit.edu 617-258-9168


Bellcore cracks 129-digit RSA encryption code

Steven Tepper <greep@datatools.com>
Tue, 3 May 94 18:46:36 PDT
      [Old news to some of us.  Oddly no one reported it to RISKS until now,
      and I did not get around to entering anything on it.  PGN]

Excerpts from an article on page 14 of the May 2 issue of Network World:

   A team led by Bell Communications Research last week announced
   it has cracked a 129-digit encryption code, which scientists had
   predicted would take "40 quadrillion years" to break.  ...

   Breaking the RSA-129 public key required that the team find the two
   prime numbers that, when multiplied by each other, resulted in the
   129-digit key number.  ...

   This mathematically arduous task was accomplished in eight months by
   600 volunteers in 24 countries who used their organizations' spare
   computing capacity.  ...
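The asymmetry behind the story is easy to demonstrate with toy numbers: verifying a proposed factorization is a single multiplication, while finding the factors is the expensive part. The sketch below uses naive trial division, which is hopeless at 129 digits (the volunteers used far more sophisticated sieving methods) but shows the shape of the task.

```python
def trial_factor(n):
    """Find the smallest prime factor of n by trial division.

    The first divisor found is necessarily prime; returns (n, 1) if n
    is prime.  Runs in roughly sqrt(n) steps -- astronomically slow for
    a 129-digit modulus, which is the whole point of RSA.
    """
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1

p, q = trial_factor(3337)   # toy "key": 3337 = 47 * 71
assert p * q == 3337        # verification is one multiplication
print(p, q)                 # → 47 71
```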


Risk of Non-Computerization?

Klaus Brunnstein <brunnstein@rz.informatik.uni-hamburg.d400.de>
Wed, 4 May 1994 11:16:34 +0200
In a TV report (magazine FRONTAL, 2nd German TV channel ZDF, Tuesday May 3,
1994), the recent motorsport accidents (3rd round of the Formula 1 World
Championship, Imola/San Marino) were discussed and analysed by experts (e.g.,
former World Champion Niki Lauda/Austria).  Besides general questions about
deficiencies in safety and protective measures, the question was discussed
whether the recent decision to forbid the computerized ground-distance control
may have contributed to the strange behaviour of both racing cars, which after
a curve went straight into a concrete wall, without visible signs of steering.
In both cases (Austrian driver Roland Ratzenberger, in training for his 2nd
race, and Brazilian driver Ayrton Senna, three-time World Champion), the cars
crashed at almost 300 kilometers per hour, and both drivers suffered fatal
injuries.

Until the end of 1993, Formula 1 racing cars were equipped with a computerized
control system aimed at maintaining a fixed but very small distance between
the car's bottom and the track's surface, to allow for maximum contact and
therefore maximum transfer of power to the wheels.  Following arguments that a
defect in the computer system might lead to serious crashes, the international
authority for the sport forbade this computer system from 1994 on.  This
decision may now have caused the problem on the very high-speed track of
Imola, where the cars reach maximum velocities of over 300 km/h.  According to
the discussion quoted, the hydraulic steering support may be affected when the
car's bottom contacts, at high speed, an uneven part of the track; in such a
situation, the driver would no longer be able to steer the car and might even
lose the ability to brake.  Evidently, the decision to forbid the computer
system was neither accompanied by an adequate risk analysis, nor were any
additional measures taken to diminish the risks, e.g., by reducing the cars'
speed or by enlarging the minimum distance between the car's bottom and the
track surface.  Only now are discussions taking place to reduce speeds in
certain parts of the track.
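The banned system's basic job can be sketched as a feedback loop. The code below is an assumed, greatly simplified illustration (a bare proportional controller, not the actual F1 active-suspension software, which was far more elaborate): each control cycle it drives the measured ground clearance toward a small fixed target.

```python
# Assumed illustration of ride-height control -- NOT the actual banned
# F1 system.  A proportional controller nudges the suspension so that
# the measured ground clearance approaches a small fixed target.
def ride_height_step(measured_mm, target_mm=30.0, gain=0.5):
    """Return the suspension adjustment (mm) for one control cycle."""
    error = target_mm - measured_mm
    return gain * error

# Car riding too high: controller commands a downward adjustment.
print(ride_height_step(measured_mm=50.0))  # → -10.0
```

The RISKS point cuts both ways: a defect in such a loop could slam the car into the track, but removing the loop leaves the clearance uncontrolled at 300 km/h, with the consequences described above.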

Klaus Brunnstein (May 4, 1994)


Computers Blamed For FAA Woes

<mmm@cup.portal.com>
Wed, 4 May 94 01:06:17 PDT
This morning (3 May 1994), Transportation Secretary Federico Pena appeared on
at least one morning TV show in addition to the _MacNeil-Lehrer_ show to inform
the public about the shocking state of the nation's air-traffic control
technology.  On both appearances that I saw, Pena said that the FAA needed to
be a private corporation so it could acquire technology more quickly, outside
of federal regulations.

He used as an example the humble vacuum tube.  He said that the FAA is the
world's largest buyer of vacuum tubes.  That I can believe.  But then by way
of comparison, he held up a vacuum tube and a computer chip.  The tube was a
big one, about the size of a #30, which is much bigger than the tubes used on
any tube computer like ENIAC, whose tubes were about the size of a 5U4.  He
compared it against a computer chip (something packaged like an Intel 486),
and claimed the latter could replace 3.5 million vacuum tubes!

Apparently, Secretary Pena either: a) believes the FAA is running enormous,
obsolete vacuum tube computers containing millions of tubes, or b) doesn't
mind telling bald-faced lies to the American public to get support for his
objectives.  I'm not sure which is worse.

Mark Thorson (mmm@cup.portal.com)


Brief note re DIA fiasco

Paul Green <Paul_Green@vos.stratus.com>
Wed, 4 May 1994 21:14:59 GMT
As someone who has been involved in fixing a number of terrible situations
involving computers, and as someone who has not been involved in the DIA
fiasco, other than as a fascinated observer, I'm sure that when the book is
written on this, we'll find that there is ample blame for all of the parties
involved in the project.  Small groups of people make small messes.  To get a
really large mess, you need a large group of people working for many
organizations.  Anyone, or any reporter, claiming to know the source of the
problems at this stage is either quite naive or has an axe to grind.

I think Bear Giles misses the point (in RISKS 16.01) about simple software
errors leading to massive, unintended consequences.  The issue is not (just)
syntax checkers.  Sure, we've got 'em... So what?  The issue is that in a
mechanical or analog system a small error in the input or operation
generally leads to a small **and traceable** error in the output.  But in a
digital system, especially a software-based digital system that can experience
memory or data corruption, a small error in the input or operation can lead
to huge **and virtually untraceable** errors in the output.  I seem to recall
that it was indeed a small error that brought down the AT&T long-distance
switching system.  I have to agree with the writer on this one.
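The digital-versus-analog point can be shown in two lines. This is a toy illustration (not the AT&T bug itself): in an analog system the size of an input error bounds the size of the output error, but flipping one bit of a stored value changes it by an amount that depends entirely on which bit was hit.

```python
# Toy illustration: a single-bit error in a digital value has no
# "proportionality" -- the damage depends on the bit position, not on
# the physical size of the disturbance that caused it.
def flip_bit(value, bit):
    """Return value with one bit inverted."""
    return value ^ (1 << bit)

reading = 1000
print(abs(flip_bit(reading, 0) - reading))   # low bit: off by 1
print(abs(flip_bit(reading, 20) - reading))  # high bit: off by 1048576
```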

Finally, judging by the number of bar-coded labels that I have to rip off my
luggage, many airports must use bar-code readers. I have to believe that this
piece of airport technology is reasonably mature.  And common sense says
it has got to be more reliable to bar-code the luggage than the carts.

Paul Green, Sr. Technical Consultant, Stratus Computer, Inc., Marlboro, MA
01752    Paul_Green@vos.stratus.com, PaulGreen@aol.com    (508) 460-2557


Followup on credit card policies (re: "Streetwise Guide ...")

<"Rob Slade, Ed. DECrypt & ComNet, VARUG rep, 604-984-4067">
Thu, 05 May 1994 15:21:10 -0600 (MDT)

PGN asked me to summarise the responses to my comments in the review of "The
Streetwise Guide to PCs" by Jerome/Taylor.  I had suspected that it might be a
bit controversial, but I was surprised that the only substantive comments I
received were in regard to the difference between Canadian and American law
regarding credit card purchases.  The most apposite information was from Bear
Giles <bear@tigger.cs.Colorado.EDU> who wrote:

========

I don't remember the exact language, but in the U.S. consumers have the
right to refuse charges made more than 50(?) miles from their home address.
The refusal must be in writing, within 30 days, possibly with an explanation,
etc.  The bank must complete its actions within 60 days of receipt of this
letter, cannot charge interest or late fees on the disputed amount, etc.
(I would imagine the exact details are in the FAQ of various consumer
advocacy groups.)

The presumption is that the purchase was made over the phone or while on
a trip, either situation making it difficult to impossible for the consumer
to evaluate the quality of the merchandise prior to purchase.

There is no corresponding law applying to purchases "near home"; this
distinction is often lost on people.  And of course the existence of
an American law has no significance to a Canadian making a domestic
purchase.

BTW, another restriction (either U.S. law or merchant agreement) which
is often overlooked is a requirement that charges not be placed against
your account until the merchandise is actually shipped.  I believe
companies can still "reserve" credit, but the only effect the customer
sees from this is a reduced available credit — she does not have to
make payments against it.

==========

Much the same was said by Andrew Klossner <andrew@frip.wv.tek.com> who added:

==========

Holders of U.S. credit cards can find these procedures detailed on the
back of each monthly account statement.

==========

So check the back of your statements for further details.

As I said, I suspected there would be some controversy; I expected it,
however, to be in regard to my statements about the Better Business Bureau.  I
guess that most people have such low expectations on that score that they
simply couldn't be bothered to comment.  :-)

Vancouver Institute for Research into Security Vancouver  Canada V7K 2G6
ROBERTS@decus.ca  Robert_Slade@sfu.ca  rslade@cue.bc.ca   p1@CyberStore.ca


ABC Nightline re LaMacchia

"Mich Kabay [NCSA]" <75300.3232@CompuServe.COM>
04 May 94 06:20:39 EDT
On 02 May 1994, ABC News _Nightline_ aired a program entitled "Law and Order
on the Information Superhighway," which focused on the case of David
LaMacchia, an MIT student accused of wire fraud for allegedly having run a BBS
which permitted traffic in stolen software.

The host was Chris Wallace, substituting for Ted Koppel.

I ordered a transcript of the program from Journal Graphics, Inc. / 1535 Grant
Street / Denver, CO 80203 / phone 303-831-9000 / fax 303-831-8901.

The transcript is copyright (c) 1994 American Broadcasting Companies, Inc. and
therefore I can provide only an abstract.

   [I received the entire abstract from Mich, who also provided me with an
   annotated version, which is included here.  My apologies if I omitted
   anything necessary for understanding.  However, because Mich included
   chunks of the abstract before each of his annotations, I could not run
   both the full abstract and the annotated version.  PGN]

U.S. Attorney Donald Stern accuses David LaMacchia of having tolerated the
exchange of stolen software worth more than $1 million.  Prof. Laurence Tribe
of Harvard University Law School questions the implications of such an
accusation; he argues that LaMacchia's BBS should be accorded the legal status
of a common carrier, thus exculpating the owner of crimes committed through
his communications channel.

Analysis of the Nightline program about David LaMacchia and the software
exchange BBS:

[set flame = on]

<<LaMacchia's attorney, Harvey Silverglate, argues that providing a venue for
the commission of a crime is not a crime: "The question is whether someone who
activates the system should be liable for what anyone else does, or do you
look to punish the person who actually commits the wrong, who actually copies,
steals the software?">>

Silverglate's legalistic attitude is understandable (he _is_ a lawyer, after
all) but fails to address the question of values.  Many BBS operators exercise
control over content of communications or files on their systems--not out of
fear of legal prosecution, but because they don't want to be tacit or active
supporters of bad things.  As Sysop in the NCSA's security forum on
CompuServe, I have removed abusive messages and written to their authors with
reminders of the standards of decorum my fellow Sysops and I expect on the
forum.  Free speech doesn't enter into it: CompuServe is a privately owned
network and Sysops have a responsibility to maintain an environment in which
participants address the issues without descending to abusive, ad hominem
attacks.  We do these things because they fit our _values_.  We do not assume
that if something is not illegal it must be right.

....

<<Silverglate claimed that Congress does not want to penalize BBS operators for
criminal acts carried out with their systems, "... because nobody would then
operate these systems, and the communications highway, the information highway
would grind to a halt.">>

These predictions are wrong.  Many people, including the Sysops on CompuServe
and many moderators in Internet news groups, regularly impose restrictions on
content on moral grounds and out of respect for possible legal entanglements.
For example, although I regularly post extracts of the RISKS FORUM DIGEST on
the NCSA Forum, I don't post what appear to be entire copyrighted articles.

....

<<Corley replied, "...On the Internet there are no tangibles.  You can ...
steal something and they no longer have it, or you can go on the computer
system or computer network and copy something, but it's a copy.  It's not the
same as stealing.  I think that's the first thing we have to get over.  We
can't use the same analogies that exist in the real world, because this is no
longer the real world.">>

Corley exemplifies an attitude prevalent among criminal hackers:
dehumanization of cyberspace.  It seems to them that there are no people on
the other end of the communications channel, only things.  Stealing software
in this context seems victimless; in reality, companies are losing money, and
people are suffering monetary losses.  Some small companies can go out of
business entirely if they cannot get sufficient revenue to cover the months of
grinding, often poorly-paid work by their employees.  Human beings have
smaller salaries or even lose their jobs because of software theft.
Cyberspace is as much a part of the real world as speech, books, and
telephones.  No one would argue that stealing a book is a victimless crime--or
would they?  Some unscrupulous bookstores rip the covers off books, return the
covers to the publisher for a credit, and then sell the mutilated books--with
not a penny going to the _people_ who wrote, edited, printed, bound, and
marketed the book.

....

<<Wallace protested, "... what's the difference between copying a program which
I'm then going to appropriate and use, and stealing?"

Corley continued, "It's not stealing, because you're not depriving somebody of
it....  we're presuming that people are not selling this material.  This is
something that someone is simply copying to either examine, to analyze, what
have you.  There's different reasons why somebody would legitimately copy
something.">>

Criminal hackers and software thieves miss the key point about intellectual
property or confidential information.  It isn't the copying that causes harm,
it's the owner's loss of control over a valuable resource.  Software companies
sell a license to use a product, not the product itself.  Software thieves steal
the ability to enter into a contract with a user; it is the lost revenue that is
in question, not the extra copy itself.

....

<<Jim Settle, retired member of the FBI Computer Crime Squad, argued that it is
the software creator who determines whether software shall be freeware,
shareware, or commercial software.

Corley said, "The problem with that is, though, that somebody can put a
particular value on a piece of software and say, `This costs $10,000,' and then
somebody copies it with a single keystroke and they're charged with stealing
$10,000....">>

Corley implies that it is illegitimate to apply whatever price the creator
wishes to a product or service.  But a producer normally charges what the
market will bear: a price which maximizes profit.  If the price is too high
for the perceived value, the market chooses alternatives--or less greedy
competitors develop them.  Too low, and the producer risks going out of
business.

The other peculiar (wrong) suggestion in this passage is the idea that the
ease with which an act is performed somehow mitigates responsibility.  So a
single-keystroke copy is more of an excuse for stealing software than a
20-character command?  If a vandal demolishes a building by attacking it with
an ice pick over a 5 year period, are we expected to perceive his act as worse
than if he had blown it up by pressing a button to explode a nuclear bomb?
Once again, Corley confuses technology with morality.

....

<<In a discussion on the extent of copyright violation, Corley explained his
view of why programs get copied: "... computer programs are designed to be
copied, so naturally people are going to copy them.  If we don't want people
to copy them, then we make them so that they are uncopyable, or so that people
want the original.">>

Ah, the familiar fallacy of moral possibility.  This error consists of
believing that if something is easy it must be right.  The corollary is that
the victim is to blame for being victimized.  Thus if a bank robber uses a
howitzer to blow a hole in the wall of a bank, it serves the bank properly for
having failed to encase its building in three-foot thick titanium beryllium
alloy.  So software manufacturers should impose the endless nuisances of copy
protection on their products--even though these methods increase costs, enrage
consumers, and don't work anyway because little creeps find ways of countering
every single programmatic technique invented to date.

Just because it's _possible_ to steal doesn't make it right.

<<Corley explained that many people tape record albums illegally; "from the
band's point of view, though, it's a good thing that their music is being
distributed, you see.  And that's how I think we should look at copywritten
software.  The software is getting out.  I don't think Microsoft is going to
go out of business, and they're probably the company that is copied the
most.">>

Here we have Corley taking up the unpaid position of marketing director for
musicians as well as software companies.  Decisions on how to market a product
or service are rightfully the responsibility of the owners of that product or
service.  Criminal hackers have no moral justification for interfering with
those decisions.

<<Settle reminded Corley that Microsoft doesn't offer DOS for free
distribution.  Corley replied, "Well, they know that there are people that
will pay for it, and there are lots of people that will pay for any particular
program, there are companies that will pay the corporate rate.">>

So Corley recognizes that honest people either agree to abide by the software
license agreement _or they don't use the product_.  So what are we supposed to
do about the creeps who jack _our_ price up by stealing the same products?
Corley's attitude reminds me of the idiots who race past traffic jams using
the emergency lanes; these people are contemptuous of the rest of us, sitting
in orderly lines, waiting to take our turn past the obstruction causing the
holdup.  But no, the scofflaws, like Corley, feel that they don't have to
abide by such rules.

<<Settle repeated his point: "But they're not saying, `Do you choose to pay or
not pay?'  They're saying, `You owe me X number of dollars for this product.'"

Corley replied, "Right, but that's not very realistic.  There are college
students that want to use computers, and according to organizations like SPA,
they cannot have access if they cannot afford whatever price the software
company says they have to pay, and the same thing for people in other
countries."  He then made an analogy between programs and books, saying that we
encourage literacy with libraries but don't encourage computer literacy.>>

Ah, a new right.  Many criminal hackers espouse the view that they have a
_right_ to use any software they please without paying for it.  The reasoning,
such as it is, seems to be "I wannit--so it's mine."  No one on this planet
has a _right_ to use the database I have designed to track my projects.  I
wrote it, I debugged it, I refined it.  It's mine and mine alone.  If I
_choose_ to let Albert use it and not Bob, that's my choice.  If I _choose_ to
sell it only if the buyer agrees not to shoot sparrows, this eccentric term in
the license agreement can be accepted or rejected by a prospective buyer, but
they have _no right_ to use the product without that contractual agreement.
Can't afford MS-Word?  Use another word processor. Don't like the terms of a
license that doesn't let you make a copy for your portable computer?  Buy
another product or do without.  But wanting to use somebody else's tool is not
moral justification for stealing it.

[set flame = off]

Michel E. Kabay, Ph.D. / Dir Education / National Computer Security Assn


Risks of electronic door locks for automobiles

Paul Wallich <pw@panix.com>
Wed, 4 May 1994 21:56:20 -0400 (EDT)
The underlying risk of electronic car door locks is that the state of the
lock depends on what a microprocessor believes rather than whether
someone has turned a key or pushed a latch button. In addition to the
obvious failure modes (do you need a working battery to unlock the car?)
the manufacturer can also program in more complex lock behavior. Drivers
and passengers may find out the full range of lock states only when
bitten by a previously unknown "feature".

For example, last week I drove a two-door Chevy Cavalier that unlocks both
doors when the ignition is turned off. Makes it harder to lock keys in the
car, but could also pose a risk of theft if you don't notice.  Compared to the
Buick Century I was driving for a few days prior to that (and you'll
understand why shortly), the Cavalier's behavior is positively benign.

During a sudden spring blizzard at 2,500 meters in Northwest New Mexico, I
discovered the Buick's quirk. I went onto the shoulder to avoid a pickup and
trailer that had decided to stop in the middle of the road during a brief
zero-visibility whiteout, and found myself stuck in a newly-minted snowbank.
So I turned on the hazard flashers and went to see how hard it would be to dig
out. Since I had left the engine running, the doors locked automatically
behind me (I later verified that this is a "well-known" behavior to the rental
agents in Albuquerque). The risks of standing outside a locked car in driving
snow on a lightly-traveled mountain road (wearing clothing more suitable for
low-altitude desert) should be obvious. Without the timely passage of two
other tourists (bound from a monastery near Abiquiu to a commune at Taos) this
posting might not have been possible.
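The auto-lock "feature" above can be sketched as a tiny state machine. This is an assumed model, not GM's actual firmware: the controller acts only on what its sensors report, and "engine running" is treated as proof that the driver is inside--exactly the assumption the blizzard incident violates.

```python
# Assumed model of an auto-locking door controller -- NOT GM's actual
# firmware.  The controller can only act on sensor inputs; it has no way
# to know the driver stepped out with the engine still idling.
class DoorController:
    def __init__(self):
        self.locked = False

    def update(self, engine_running):
        if engine_running:
            self.locked = True   # auto-lock; no "but I'm outside" input
        return self.locked

car = DoorController()
car.update(engine_running=True)          # driving along: doors lock
print(car.update(engine_running=True))   # step out, engine idling: True
```

The microprocessor's model of the world ("engine on, therefore driver inside") is what actually determines the lock state, not the driver's intent.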

I was somewhat taken aback to note that it took the tow-truck operator less
than a minute to unlock the car, equipped only with a small pry bar and a bent
steel rod. So the electronic locking mechanism does not seem to add security.

During the mechanical era, auto manufacturers figured out various ways to make
it difficult or impossible to lock your keys in a car; it's not a good sign
that they seem to be relearning those lessons from scratch.
