The RISKS Digest
Volume 14 Issue 43

Wednesday, 24th March 1993

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Dutch computer hacker arrested under new Dutch Law
Ralph Moonen
Minnesota phone fraud
Vic Riley
Follow-up on the Lehrer Case from RISKS-14.14
David Lehrer
Re: Conspiracy trial ends in ... acquittal
Olivier MJ Crepin-Leblond
Peter Debenham
Re: Buy IBM and get fired
Ross Anderson
Michael J Zehr
[Re: Buy IBM] Smart card/electronic cash security
Niels Ferguson
Re: RISKS of brain interference: not as tabloid as you'd think
David Honig
Software Warranties
Paul Robinson
Info on RISKS (comp.risks)

Dutch computer hacker arrested under new Dutch Law

<rmoonen@ihlpl.att.com>
Wed, 24 Mar 93 10:00 GMT
The "Volkskrant", a leading Dutch newspaper carried an article last saturday
about the latest computer hacker arrest in the Netherlands.  It was the first
arrest made under the new Dutch Computer Crime Law.  The hacker was arrested
while sitting at a terminal in the computer room of the Amsterdam University.
When the police arrived late in the afternoon, he was supposedly caught
red-handed. A couple of hours later, the police searched his parents' house
and confiscated some equipment and papers. The hacker had also been arrested
last year, but no charges were brought against him at that time. This time he
probably will have a harder time getting out of it, because of the new law. It
is unclear what charges he was arrested on exactly, or what he was doing at
the time of his arrest.  Watch this space for more news.....

--Ralph Moonen


Minnesota phone fraud

"Riley Vic" <riley_vic@msmail.src.honeywell.com>
24 Mar 1993 10:19:07 -0600
The Majority Leader of the Minnesota House of Representatives has just
resigned amid disclosures that his son and nephew obtained his access code for
the state long distance service.  The service involves calling an 800 number
and entering a personal code for making long distance calls related to state
business.  Although the son and nephew reportedly made less than $50 worth of
calls themselves, they shared the codes with their friends.  The total bill is
now estimated at around $85,000 worth of calls.  For now, taxpayers appear to
be liable for the bill.

Victor Riley, Honeywell Systems and Research Center, Minneapolis
riley@src.honeywell.com


Beacon article follow-up: The Mark Lehrer Case (RISKS-14.14)

David Lehrer <71756.2116@compuserve.com>
23 Mar 93 14:55:45 EST
Akron Anomaly BBS trial issue:

Distributed with permission of The Akron Beacon Journal, David Lehrer

Police Say They Were Taking a Byte out of Crime.  Munroe Falls Man Was
Arrested for Having X-Rated Pictures on His Computer Bulletin Board; His
Parents Believe the Sting Operation Was Politically Motivated.
Akron Beacon Journal (AK) - MONDAY March 22, 1993
By: CHARLENE NEVADA, Beacon Journal staff writer

  [For those of you interested in following up on the strange developments
  in the Mark Lehrer case, previously discussed in RISKS-14.14 by his father,
  David Lehrer, the entire Akron Beacon article is obtainable from the CRVAX
  RISKS: directory in a file named RISKS-14.43LEHRER, or from David.  PGN]


Re: Conspiracy trial ends in ... acquittal (Bowen, RISKS-14.42)

Olivier MJ Crepin-Leblond <o.crepin-leblond@ic.ac.uk>
Wed, 24 Mar 1993 10:32:29 +0000
I think what needs to be mentioned is the reason why he was acquitted (or at
least, the reason that journalists propagated): "He was addicted to hacking".
That, I find, is the weakest explanation I have ever heard about a crime. Just
imagine a society where serial killers are acquitted "because they are
addicted to killing" !

While, IMHO, a prison sentence would have been tough on Mr. Bedworth, a fine
or some sort of community service would have been welcome.  Why have laws if
they are not properly applied?

Olivier M.J. Crepin-Leblond, Digital Comms. Section, Elec. Eng. Department
 Imperial College of Science, Technology and Medicine, London SW7 2BT, UK
       Internet/Bitnet: <foobar@ic.ac.uk> - Janet: <foobar@uk.ac.ic>


Re: Conspiracy trial ends in ... acquittal (Bowen, RISKS-14.42)

Peter Debenham <PPXPMD@ppn1.nott.ac.uk>
24 Mar 93 15:29:34 GMT
This posting creates a slightly false impression by not giving the grounds on
which the acquittal was obtained.  The court deemed that Mr Bedworth's
behaviour was uncontrollable by him due to a total addiction to hacking.  In
other words, he was unable to form the "guilty mind" that is necessary to get
a conviction in a British court on such charges.  This is very much like an
acquittal on any other charge due to being unable to know what you are doing
(be the cause mental illness, a temporary effect from prescribed drugs, etc.).

Despite the Independent's leader, no new precedent has been set in this case.
It is just the application of a well-established piece of British law to a new
statute.

Peter Debenham, Physics Dept., University of Nottingham, UK. NG7 2RD
P_Debenham@ppn1.nott.ac.uk   + 602 515151 x8323 (wk)   +602 730487 (hm)


Re: Buy IBM and get fired, response (Arnold, RISKS-14.42)

<Ross.Anderson@cl.cam.ac.uk>
Wed, 24 Mar 93 12:55:03 GMT
In reply to Todd Arnold's posting (RISKS-14.42):

(1) My primary source was `Waiting for Taurus' by J Green-Armitage in Computer
Weekly March 4 1993 pp 28 - 29. This article states that the considerable
delays and cost overruns were due to a number of problems, including the
security subsystem, management hassles and regulatory delays. To quote the
article `IBM must accept a modicum of blame because it needed an extra three
months in 1992 to finish its solution'.

This article appeared a few days before the project was cancelled and the
chief executive of the stock exchange resigned.

(2) There will be a lot of lawyers picking over this disaster. Two hundred
banks and brokers have lost over half a billion dollars between them, and IBM
seems to be one of three possible defendants (the others are Coopers and the
Stock Exchange itself).

If, as IBM now say, their system was finally signed off a few days before the
project meltdown, then they may get lucky. But they're obviously still
worried.  Why else did they not just keep quiet and let the matter die? If
they hadn't tried to argue the matter, my initial posting to sci.crypt would
have been forgotten by now.

Ross Anderson


Risks of automatic signature inclusions

<tada@Athena.MIT.EDU>
Tue, 23 Mar 93 22:43:39 -0500
In his article on IBM's security for the Taurus project, Todd W. Arnold,
tarnold@vnet.ibm.com, IBM Cryptographic Facility Development, Charlotte, NC
wrote:

>I've been asked to post the following "official" description of the situation
   [...]
>Disclaimer: This posting represents the poster's views, not those of IBM

To which PGN added:
>   [I normally suppress all disclaimers and cover them blanket-wise in the
>   masthead.  This one is intriguing, because the posting explicitly
>   contains an "official" description, which would seem to disclaim the
>   disclaimer!  PGN]

There are all sorts of risks present.  Perhaps IBM should be expanding its
signature verification software to verify that you really want to add a
particular signature to your posting?

There are semantic and syntactic risks too.  The word "official" is
surrounded by quotes, indicating that perhaps it doesn't retain the
normal meaning.

Or perhaps Todd's final line was in OO syntax.  The message "disclaimer"
meaning denial, repudiation, is being sent to the text string denying
the authority of the posting.  Computers routinely know what to do with
double negatives which humans tend to contract into a single negative.

One speculates whether PGN will append a message, "I normally suppress all
humor except my own.  This one is intriguing,  ..." :-)

-michael j zehr
                  [Zehr Gut.  Even though we are coming up on 1 April,
                  I will not append anything to Michael's message.  PGN]


Smart card/electronic cash security (Re: Yee, RISKS-14.42)

Niels Ferguson <Niels.Ferguson@cwi.nl>
24 Mar 93 10:57:50 GMT
[Was: Buy IBM and get fired, Bennet Yee]

I'm not sure what 'digicash' you are talking about. The smartcard based
electronic cash system that Chaum's _company_ DigiCash is marketing does not
require any on-line processing during payment.  The payment protocol uses a
commit-challenge-response structure which can eliminate the need for on-line
communication with a central server. In contrast to other smartcard based
systems there is no system-wide 'master' key stored in a tamper-resistant box
at each shop. The recipient can validate the money by a simple signature
verification using a public key. The system does rely on the tamper-resistance
of the smartcards for security; if anybody 'breaks' a smartcard and extracts
all the keys, then this will allow a limited amount of fraud.
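
For readers unfamiliar with the three-move structure mentioned above, here is
a minimal sketch in Python of a commit-challenge-response exchange (a
Schnorr-style identification with deliberately tiny, insecure parameters).
This is emphatically not the DigiCash protocol, which adds blind signatures
and double-spending protection; it only illustrates how a recipient can verify
a response locally against a public key, with no on-line call to a central
server.

    # Toy commit-challenge-response (Schnorr-style identification).
    # Tiny parameters, for illustration only -- utterly insecure.
    import secrets

    p, q, g = 23, 11, 2        # g generates a subgroup of order q modulo p
    x = 7                      # payer's secret key (lives inside the smartcard)
    y = pow(g, x, p)           # payer's public key, known to everyone

    # 1. Commit: payer picks a random nonce r and sends t = g^r mod p
    r = secrets.randbelow(q - 1) + 1
    t = pow(g, r, p)

    # 2. Challenge: payee picks a random challenge c
    c = secrets.randbelow(q)

    # 3. Response: payer answers s = r + c*x mod q
    s = (r + c * x) % q

    # Off-line verification by the payee: g^s == t * y^c (mod p)
    assert pow(g, s, p) == (t * pow(y, c, p)) % p
    print("response verified off-line against the public key")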

The use of master keys in many smartcard based applications is of course a
serious risk. If the master key is ever compromised for any reason, then the
whole system collapses. It is questionable if storing such a master key in
thousands of tamper-resistant boxes is secure enough. Especially for
electronic cash applications, the fraud potential once the master key is known
is unlimited.

There are several other purely cryptographic systems for electronic cash. The
early systems were indeed on-line to prevent a user from spending the same
piece of money twice. Later systems achieve an off-line payment protocol
without any compromise regarding the privacy. (For more details, see the
Chaum-Fiat-Naor paper from Crypto '88.) All of these systems do NOT require any
tamper-resistant device anywhere, except in the central bank. The security
depends only on the cryptography used. Up to now these kinds of systems have
not been used due to their complexity and inefficiency. Recent improvements
have led to systems which are much more efficient and simpler.
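
As a crude illustration of the on-line/off-line distinction above (this is a
toy sketch, not the Chaum-Fiat-Naor construction, whose details are in the
cited paper): an on-line system simply has the central party remember every
coin it has accepted, so a second deposit of the same coin is refused on the
spot; the off-line systems instead arrange that spending a coin twice reveals
the spender's identity after the fact.

    # On-line double-spending check: the bank remembers spent coin serials.
    spent_serials = set()          # held at the bank / central server

    def deposit(coin_serial):
        """Accept a coin only if its serial number has never been seen."""
        if coin_serial in spent_serials:
            return "rejected: double spend"
        spent_serials.add(coin_serial)
        return "accepted"

    print(deposit("coin-0001"))    # accepted
    print(deposit("coin-0001"))    # rejected: double spend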

There is a general construction to make any of these systems transferable.
With electronic cash the transferability is less important than with physical
payment systems; it only takes a phone call to hand in the received money and
withdraw some new cash. The transferability has some disadvantages, one of
which is that the size of the money (in bits) MUST grow as it is passed from
one user to another. Another disadvantage is that any user can always
recognise any piece of money that passed through her hands, which reduces the
level of privacy.

The current state of affairs is that electronic cash without tamper-resistant
user devices is technically feasible.  We are considering implementing one of
the newer protocols to provide electronic payments via e-mail.

Niels T. Ferguson, CWI, Amsterdam, Netherlands      e-mail: niels@cwi.nl


Re: RISKS of brain interference: not as tabloid as you'd think

David Honig <honig@ruffles.ICS.UCI.EDU>
Tue, 23 Mar 1993 19:58:41 -0800
"Mich Kabay / JINBU Corp." <75300.3232@compuserve.com> wrote in RISKS-14.42,

> This morning the Globe and Mail (Canada) reported that Fujitsu of Japan is
> working on a brain-wave interface for computers. ...

Actually, it's been known for some time that activity in a certain piece of your
brain reliably predicts eye and motor movements, by as much as 1/3 second.
This is of interest both to human factors people (the military has been
interested in this for faster response for firing missiles by a jet pilot) and
philosophers; Daniel Dennett writes of this in his recent tome, _Consciousness
Explained_, which came out last year.  Dennett is a cognitive-and-neuroscience-
aware philosopher (a rare and precious thing) at Tufts I believe.


Software Warranties [longish]

Paul Robinson <tdarcos@access.digex.com>
Mon, 8 Mar 1993 00:19:52 -0500 (EST)
A split version of this message has been sent to the Ethics-L
list.  It will be sent as one long message to the Risks List.

A user who did not put his E-Mail address in his message wrote the following
on the Ethics-L list (ETHICS-L@vm.gmd.de):

> Should the author of shareware be held liable for possible
> damages caused by the use of their product, and if so to
> what extent?  Should a shareware author be held liable for
> the performance of their product in every hardware and software
> configuration?

First off, I'm not a lawyer but I do know a little about the laws relating to
the operation of a business in general.

What you are (in general) referring to are the laws relating to warranty and
"fitness of purpose".  These laws place the general burden of operation for a
product upon the manufacturer or seller, subject to whatever provisions they
warrant; and in addition, each state has some laws of its own relating to
warranty protection.

In general, a manufacturer is supposed to provide a product which meets the
"fitness of purpose" test, i.e. an iron should heat and it be warm enough to
press clothes.  It should not explode when plugged in, nor should the person
be electrocuted when touching any metal parts.  If the person is burned by
touching the face plate, that's a consequence of misuse of the product for
which the manufacturer is not responsible.

If one makes a program to do accounts payable, it should accept transactions,
print checks and keep a list of checks it printed, among other things.  It
should not crash when started, it should not damage other programs or files on
the system, it should be able to print checks without printing 50 pages of
garbage (and wasting 50 checks).  If the person's checks are ruined because
they put them in upside down, or the printer jammed, that's a consequence of
misuse of the product for which the manufacturer is not responsible.

What the law should do is hold the manufacturer of a piece of software
responsible for (1) any bugs he knew about, and (2) any failure to test under
reasonable conditions of the marketplace, e.g., running without a printer,
with the printer unconnected, or without a floppy disk inserted, and to make
sure it can recover without damage.  A program would also be expected to
refuse to run under conditions it can't handle, for example, on any version
of DOS less than 2.0, or under MS-Windows except in DOS mode, or whatever.
If it requires VGA it should quit with a message if one isn't present.  If a
program requires a 286 or better, running it on an 8086 should produce a
message and the program should quit to the operating system, not crash or
lock up the machine.
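
As a minimal sketch of that principle in a present-day setting (the specific
checks below are stand-ins for the DOS-version, VGA and 286 checks mentioned
above, not anything from the original posting): a program can test its
prerequisites up front and quit with a clear message, instead of crashing or
locking up the machine.

    # Check prerequisites first; refuse to run with a clear message otherwise.
    import os
    import sys

    def check_prerequisites():
        if sys.version_info < (3, 8):
            sys.exit("This program requires Python 3.8 or later; "
                     "you are running " + sys.version.split()[0] + ".")
        if not os.access(os.getcwd(), os.W_OK):
            sys.exit("This program needs a writable working directory; "
                     "please run it from a directory you can write to.")

    if __name__ == "__main__":
        check_prerequisites()
        print("All prerequisites satisfied; continuing.")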

These, and perhaps other requirements, are reasonable minimum standards by
which a program should be required to act.  Under no circumstances should a
program damage other programs' files, nor should it be damaging to the
hardware (unless the hardware is extremely delicate and can be damaged simply
by misprogramming, such as with the older Hercules Graphics boards).  Is it
reasonable to assume that someone should have known that something should be
done a certain way?  Crashing and damaging other people's files are evidence
of negligence.  If this is the result of other programs contributing to it,
that's another issue.

However, holding a software manufacturer liable for damages is going to be
difficult.  I ain't seen a program publisher who didn't put down a
laundry-list of disclaimers, exclusions and warranty negations, even up to the
point of declaring the software to be "as is".  That's a legal term meaning
the seller makes no representations about the particular item at all; in the
context of computers it could be anything from a fantastic product that does
everything you can imagine and even fulfils needs that you haven't even
realized you had, to a worm that slags the motherboard and roaches the hard
drives, then proceeds to jump to other computers via the modem and network
cards and do the same.

There is legal precedent where a man was placed on trial for doing exactly
that: he put in a program patch that destroyed everything in sight when he was
removed from the payroll.  I believe that man is now having an extended
vacation with free room and board as a guest of the People of the State of
Texas.

> Should the author of shareware be held liable for possible
> damages caused by the use of their product, and if so to
> what extent?  Should a shareware author be held liable for
> the performance of their product in every hardware and software
> configuration?

> If they can then it is unlikely that they ever could write
> program and not get sued.  Should a shareware author be treated
> differently if they make their living off shareware?

Technically a person can be sued if they provide a software package for free;
technically anyone can be sued for anything at all; whether someone else can
win and collect from them is another thing altogether.  If the seller is some
guy who makes, say, $5,000 a year or less selling programs, nobody is going to
sue him because he probably doesn't carry Errors & Omissions insurance and
he's judgement-proof, meaning there's no money for the shyster to go after
even if he does win.

Now let's say some large company with assets is found.  Using the term from
the vernacular, let's call them "Deep Pockets Software, Inc."  Since many
companies are only incorporated in their home state, you have to sue them
there; unless they are doing business in other states (and merely providing
programs to a multistate distributor such as Egghead, Babbages, Waldenbooks,
GTSI, Comp USA, or Micro Center in the area of retail stores, and Cost Plus,
Programmers Shop, Dustin Discount Software or 47th Street Computer in the Mail
Order arena does not necessarily constitute "doing business"), you might not
be able to get service upon them.  You might be able to sue the seller, but if
the company simply put the software out on racks and people buy it, or simply
put a description in a catalog, there may not be grounds for a claim against
the seller since they've made no promises.  If the in-store seller makes it
possible to try the program on their machine first, then the claims are even
weaker.

Also, contributory negligence comes into play: You look stupid when the
defense attorney asks, "Did you have recent backups of your computer system's
files?  Aren't you aware you're supposed to have backups?  Did you not read
the message on the package that said 'Do not install this program without
backing up everything'?"  In some states, if the plaintiff (the one suing) is
even 1% at fault, (s)he gets nothing.

> For example:  A shareware author writes a disk defragmentation
> software which is tested and works perfectly on several computer
> setups.  A person gets the software and runs it on their computer.
> The software destroys several important files which cost the person
> a large sum of money.  Should the person be able to sue the author
> of the software?

First of all, even if the buyer of the software patched the program so that it
was defective, and ran it despite having no backups, and then saw his hard disk
mangled, he can still sue the (1) maker of the hard drive (2) seller of his
computer (3) seller of the software (4) manufacturer of the software (5)
company that made the car that drove him there, etc.

The question at hand is whether he can collect damages from any of these
parties.  He can always sue even if he has no case at all; whether he can
collect damages is another thing altogether.

You have to look at (1) the purpose of the product at hand (2) intended
audience of the product (3) potential damage which can be caused by the
product and (4) is this something the manufacturer could have foreseen?

There are disk defragmenters sold for many different computers, including VAX
systems.  I'll review this for MS DOS systems as I have more familiarity with
those.

The purpose of a disk defragmenter is to reduce the number of separate
segments of a file which are stored on a disk, presumably down to one
contiguous set of segments for each file.  As such, a defragmenter program
works at the lowest level of the file system, even manipulating the File
Access Table on an MSDOS computer.  This places it in the class of Maintenance
programs, of a level of dangerousness as a computer virus.  (In fact, you have
to disable any virus checking software to run this sort of thing because it
has to have WRITE ACCESS to the FAT table of the hard disk.)  This is probably
the most dangerous class of user accessible programs around.

The best method for constructing a disk defragmenter is to (1) move one or
more file block(s) to an empty spot to either make the space contiguous for a
file or to open up empty space in front of a file in order to make it
contiguous, (2) update the FAT with the new location, and (3) do the next
block.

What this does is ensure that the FAT, the most critical part of the file
system, is only updated after the data has actually been moved.  You don't
update the FAT until AFTER you move the file.  This way, even if the worst
happens (a reboot, a power failure), the file system is still intact and no
damage is done, even if the process dies half-way through.
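
Here is a toy model of that ordering (an illustration only, not real FAT
code): the "disk" is a Python list of blocks and the "FAT" is a small table
mapping each file to its block numbers.  The data is copied to its new
location first; only then is the FAT entry changed, so a crash between the
two steps leaves the FAT pointing at the intact old copy.

    # Toy move-then-update-FAT defragmentation step.
    disk = ["A1", None, "A2", None, "B1"]     # block contents; None = free
    fat  = {"A": [0, 2], "B": [4]}            # file -> ordered block numbers

    def move_block(filename, index, new_block):
        """Relocate one block of a file, updating the FAT only afterwards."""
        old_block = fat[filename][index]
        disk[new_block] = disk[old_block]     # step 1: copy the data
        # -- a crash here is harmless: the FAT still points to old_block --
        fat[filename][index] = new_block      # step 2: commit in the FAT
        disk[old_block] = None                # step 3: free the old copy

    # Make file "A" contiguous by moving its second block next to its first.
    move_block("A", 1, 1)
    print(disk, fat)   # ['A1', 'A2', None, None, 'B1'] {'A': [0, 1], 'B': [4]}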

Now, the question at hand is what type of damage occurred, and why it
happened.  First, if the manufacturer failed to tell people to take
precautions (1) disable virus checkers (2) disable disk caching (3) disable
multitasking (4) disable TSRs or anything that keeps a file open (5) run the
program from the DOS prompt, then there would be cause to raise negligence.

In fact, if someone wrote a disk defragmenter to run under MS-Windows, or
failed to tell people to shut it down first, I'd assume that to be almost
automatic negligence.  (In fact, I think people should be told to remove MS
Windows before attempting to get any serious work done.  Not just remove it
from memory, remove and erase it from their system, but that's another story.
:))

Now, the question comes up about why the user didn't have a backup of that
critical file or files.  Floppy disks sell for 53c each at a computer store or
less for anything up to 1.44 meg.  (I just bought 100 of them from the local
store, which sells them for 17c each.  If I wanted to go in and buy just one,
it would have been 17c plus tax.)  There is no excuse for failing to have
critical files backed up.

The user can raise the issue that he did not know this.  Well, if the program
showed a screen stating these facts and asked if the user wanted to continue,
and he did, it would be hard to prove that the user didn't know.

Let's go further; let's say the user ran the defragmenter, and he does have
his files backed up, but it trashes everything because of a bug in the system.
Then, he could have a case for liability to the extent necessary to pay the
cost of reparation of damages, i.e. the amount of work needed for restoring
his system.  Considering that a diskette takes about one minute to read, plus
maybe 30 seconds to change disks, the time value of someone restoring the
system, plus perhaps the lost time spent discovering which files had been
damaged, would be what they could expect to recover, if it was reasonable for
this to be expected.  Perhaps
the maximum liability might be $1 per megabyte damaged.  But this would be for
every system damaged.
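
For concreteness, a rough back-of-the-envelope restore-time figure using
assumed sizes (a 100 megabyte disk and 1.44 megabyte diskettes; only the
per-diskette times come from the paragraph above):

    # Back-of-the-envelope restore time from diskettes (assumed figures).
    import math

    disk_mb, diskette_mb = 100, 1.44
    minutes_per_diskette = 1.0 + 0.5          # read time plus disk swap

    diskettes = math.ceil(disk_mb / diskette_mb)        # about 70 diskettes
    restore_minutes = diskettes * minutes_per_diskette  # about 105 minutes
    print(diskettes, "diskettes, roughly", round(restore_minutes),
          "minutes to restore")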

If a manufacturer does not state what his program should not be run on (for
example, IDE or MFM or SCSI or ESDI disks, or whatever), or does not warn of
the consequences of use (you must make sure you have current backups, you have
to disable TSRs, you have to turn off MS-Windows) then they have a problem.

The question is reasonableness: what could be reasonably expected to happen in
ordinary use?  If the manufacturer discovers, after 100,000 copies have been
shipped, that there is a bug such that, if the defragmenter is run between
11:59 pm and midnight on the last day of the year on an IDE drive, it will
damage the FAT and exit, then it would be liable to anyone injured by this
until the public had knowledge of it.  Sending a letter to every owner of the
package along with a correction is one way, but the real question is whether
they knew about the problem or could reasonably foresee it.

The question of a company's liability for erroneous software is more-or-less
moot at present, because nobody provides warranties for software.  But this
may change as software usage becomes so prevalent that the current attitude of
"caveat emptor" (let the buyer beware) stops being tolerated for software
packages.  Then the question of what warranty protection the buyer is entitled
to comes into play.  The "standing" of a software company with respect to what
type of work it does is also a question to be answered.

The "standing" of a company depends on whether it claims to be professional or
technical.  A professional does not guarantee the work he does, but he does
guarantee that he meets the minimum standards set by the industry he works in,
or by law or both.  For example, a doctor, a lawyer, an engineer all adhere to
the "professional" doctrine: you meet the minimum educational qualifications
needed to be licensed, i.e. you at least know how to do the work in question.
When a professional operates, in the absence of negligence, he is not liable
if his actions fail to fix the problem, as long as his technical
qualifications are intact.  That's why you can't sue a lawyer if your side
loses.  You still have to pay an architect for his drawings even if you don't
like the way the building looks.

On the other side is the technical person.  He does not guarantee his
experience, but he does guarantee the technical performance of his work.  For
example, a construction company will use the required materials and
workmanship to construct a bridge according to the architect's specifications.
If the bridge fails, as long as the materials and workmanship were
satisfactory to complete the task as described, the builder isn't liable.  He
may have just graduated from hod carrier to owner of a construction company,
but he does guarantee he will follow the requirements and not use materials of
less quality than is required for the task.

In the software industry, we are attempting to exclude both classifications of
qualification: we do not claim to have the professional background (in some
cases, people in the industry cannot do so, since many of them may not even
have the qualifications), so we cannot stand on professionalism, i.e., we
can't guarantee our ability to do the work.  And because we can't know that
the programs will work, nor can we guarantee that we know the best methods out
there, we can't guarantee the product.  Since we cannot guarantee our
qualifications, and we can't guarantee our product, places that make computer
programs are refusing to guarantee {anything}.

I think the ability to do this will be limited in the future as common
practice and legislation force the makers of software to take some
responsibility for their creations.  Incidents like the attempt by the State
of New Jersey a couple of years ago to license programmers are one such
foretaste of things to come.

Paul Robinson — TDARCOS@MCIMAIL.COM
