The RISKS Digest
Volume 20 Issue 67

Tuesday, 7th December 1999

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Crack in GSM cell-phone encryption scheme
NewsScan
Medical errors kill tens of thousands annually, panel says
Keith A Rhodes
Modern fire-alarm systems
Steven M. Bellovin
Why Computers are Insecure
Bruce Schneier
Jail for possessing a debugger? More on DVD encryption cracked
Daniel A. Graifer
Quicken cannot roll back transactions, and even lacks an Undo feature
Tom Welsh
Microsoft Works not saving spreadsheets
Shez
Inadvertent attachments with MS Outlook 98
Jon Freivald
Counterfeit Japanese coins and resulting risk...
John F. Opie
Coppermine bug stops PC shipments
Sam Kasseman
Jane's article on cyberterrorism hype
Martin Minow
Stock performance charts
Jeremy Epstein
Railtrack timetable server has Y2K problems?
Christopher St.John
Worm.Mypic: Will Y2K provide cover for worm/viruses?
Mich Kabay
Y2K compliance
Identity withheld
Re: Irish telephone network outage brings Y2K fears
Henry Spencer
Risks of US-Euro date conversion
Ben Hines
Re: Mars climate orbiter
Michael Detambel
Re: Sarah Flannery
Timothy A. McDaniel
Info on RISKS (comp.risks)

Crack in GSM cell-phone encryption scheme

"NewsScan" <newsscan@newsscan.com>
Tue, 07 Dec 1999 07:11:00 -0700
Researchers Alex Biryukov and Adi Shamir of the Weizmann Institute in Israel
have cracked the A5/1 encryption scheme that protects communications made
over wireless phones using the GSM standard. The GSM protocol is employed
in more than 215 million digital phones worldwide, including ones sold by
Pacific Bell and Omnipoint. Calling the researchers' claim "ridiculous," an
Omnipoint executive says, "What they're describing is an academic exercise
that would never work in the real world. What's more, it doesn't take into
account the fact that GSM calls shift frequency continually, so even if
they broke into a call, a second later it would shift to another frequency,
and they'd lose it."  But UC-Berkeley computer security expert David Wagner
believes that the discovery is "a big deal" that puts calls "within the
reach of corporate espionage." (Source: *The New York Times*, 7 Dec 1999,
http://www.nytimes.com/library/tech/99/12/biztech/articles/07code.html;
from NewsScan Daily, 7 Dec 1999, reprinted with permission)


Medical errors kill tens of thousands annually, panel says

"Keith A Rhodes" <rhodesk.aimd@gao.gov>
Wed, 01 Dec 1999 07:57:08 -0500
[NOTE: The privacy issues surrounding this new center will be massive. I
believe it will be a real test of the government's will to apply the
necessary resources to protect patient privacy. The IOM news release is at
http://www4.nationalacademies.org/news.nsf/isbn/0309068371?OpenDocument and
the purchase information is at http://books.nap.edu/catalog/9728.html]

More people die each year in the United States from medical errors than from
highway accidents, breast cancer or AIDS, according to a report of a
National Academy of Sciences' Institute of Medicine panel.  Between 44,000
and 98,000 people die per year because of mistakes by medical professionals
(and that is "probably an underestimate" because of unreported errors
and uncovered areas of care).  [Source: CNN, 30 Nov 1999]


Modern fire-alarm systems

"Steven M. Bellovin" <smb@research.att.com>
Wed, 01 Dec 1999 21:43:49 -0500
The following note was sent to all occupants of my building:

> Today at 9:35 AM we experienced a Fire Alarm system failure.  You may have
> heard the bells or saw lights going off in certain parts of the building.
> The program that controls the different parts of the alarm system appeared
> to fail.  We have completed repairs to the system and will have a system
> programmer here tonight to perform testing.

A systems programmer to fix a fire alarm?  Sigh.


Why Computers are Insecure

Bruce Schneier <schneier@counterpane.com>
Mon, 29 Nov 1999 10:30:20 -0600
Almost every week the computer press covers another security flaw:  a virus
that exploits Microsoft Office, a vulnerability in Windows or UNIX, a Java
problem, a security hole in a major Web site, an attack against a popular
firewall.  Why can't vendors get this right, we wonder?  When will it get
better?

I don't believe it ever will.  Here's why:

Security engineering is different from any other type of engineering.  Most
products, such as word processors or cellular phones, are useful for what
they do.  Security products, or security features within products, are
useful precisely because of what they don't allow to be done.  Most
engineering involves making things work.  Think of the original definition
of a hacker:  someone who figured things out and made something cool
happen.  Security engineering involves making things not happen.  It
involves figuring out how things fail, and then preventing those failures.

In many ways this is similar to safety engineering.  Safety is another
engineering requirement that isn't simply a "feature."  But safety
engineering involves making sure things do not fail in the presence of
random faults:  it's about programming Murphy's computer, if you will.
Security engineering involves making sure things do not fail in the
presence of an intelligent and malicious adversary who forces faults at
precisely the worst time and in precisely the worst way.  Security
engineering involves programming Satan's computer.

And Satan's computer is hard to test.

Virtually all software is developed using a "try-and-fix" methodology.
Small pieces are implemented, tested, fixed, and tested again.  Several of
these small pieces are combined into a module, and this module is then
tested, fixed, and tested again.  Small modules are then combined into
larger modules, and so on.  The end result is software that more or less
functions as expected, although in complex systems bugs always slip through.

This try-and-fix methodology just doesn't work for testing security.  No
amount of functional testing can ever uncover a security flaw, so the
testing process won't catch anything.  Remember that security has nothing
to do with functionality.  If you have an encrypted phone, you can test
it.  You can make and receive calls.  You can try, and fail, to
eavesdrop.  But you have no idea if the phone is secure or not.
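
A toy illustration of the point (mine, not part of Schneier's essay): both
password checks below pass every functional test you could write, correct
passwords accepted and wrong ones rejected, yet the first leaks timing
information through its early exit, and no functional test will ever tell
the two apart.

  import hmac

  def check_password_insecure(supplied, stored):
      # Early exit: running time reveals how many leading characters match,
      # which a patient attacker can exploit.
      if len(supplied) != len(stored):
          return False
      for a, b in zip(supplied, stored):
          if a != b:
              return False
      return True

  def check_password_better(supplied, stored):
      # Functionally indistinguishable from the version above, but the
      # comparison takes constant time.
      return hmac.compare_digest(supplied.encode(), stored.encode())

  # Functional tests: both versions pass, so testing alone cannot tell the
  # secure implementation from the insecure one.
  for check in (check_password_insecure, check_password_better):
      assert check("hunter2", "hunter2") is True
      assert check("hunter1", "hunter2") is False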

The only reasonable way to "test" security is to perform security reviews.
This is an expensive, time-consuming, manual process.  It's not enough to
look at the security protocols and the encryption algorithms.  A review
must cover specification, design, implementation, source code, operations,
and so forth.  And just as functional testing cannot prove the absence of
bugs, a security review cannot show that the product is in fact secure.

It gets worse.  A security review of version 1.0 says little about the
security of version 1.1.  A security review of a software product in
isolation does not necessarily apply to the same product in an operational
environment.  And the more complex the system is, the harder a security
evaluation becomes and the more security bugs there will be.

Suppose a software product is developed without any functional testing at
all.  No alpha or beta testing.  Write the code, compile it, and ship.  The
odds of this program working at all — let alone being bug-free — are
zero.  As the complexity of the product increases, so will the number of
bugs.  Everyone knows testing is essential.

Unfortunately, this is the current state of practice in security.  Products
are being shipped without any, or with minimal, security testing.  I am not
surprised that security bugs show up again and again.  I can't believe
anyone expects otherwise.

Even worse, products are getting more complex every year:  larger operating
systems, more features, more interactions between different programs on the
Internet.  Windows NT has been around for a few years, and security bugs
are still being discovered.  Expect many times more bugs in Windows 2000;
the code is significantly larger.  Expect the same thing to hold true for
every other piece of software.

This won't change.  Computer usage, the Internet, convergence, are all
happening at an ever-increasing pace.  Systems are getting more complex,
and necessarily more insecure, faster than we can fix them — and faster
than we can learn how to fix them.

  Acknowledgements:  The phrase "programming Satan's computer" was
originally Ross Anderson's.  It's just too good not to use, though.  A
shortened version of this essay originally appeared in the November 15
issue of _Computerworld_, and also in the November Crypto-Gram.

Bruce Schneier, CTO, Counterpane Internet Security, Inc.  Ph: 612-823-1098
3031 Tisch Way, 100 Plaza East, San Jose, CA 95128       Fax: 612-823-1590
     Free Internet security newsletter. See: http://www.counterpane.com


Jail for possessing a debugger? More on DVD encryption cracked (20.66)

"Daniel A. Graifer" <dgraifer@cais.com>
Fri, 03 Dec 1999 13:54:39 -0500
By reading the law like programmers, we may be in danger of going off
half-cocked.  Let me illustrate by analogy:

I recently served on the jury for the trial of a man caught outside a
health gym in possession of credit cards stolen from that gym both that
day and earlier, and a switchblade knife and a set of filed keys.  Among
the charges he faced was "possession of burglarious tools".  Burglarious
Tools are implements used or intended to be used to commit a burglary.
The intended use in this case was pretty clear: how else did he come
into possession of the credit cards if he hadn't broken into the lockers
they came from?  Some implements help demonstrate intent by themselves:
What else would you use filed keys for?  His actions made the knife one
too; even a screwdriver is a burglarious tool if that's what you are
going to use it for.  So yes, you CAN go to jail for possessing a
screwdriver!  Only an attorney could tell you if the new copyright
protection laws are truly analogous.  Anybody know an intellectual
property lawyer willing to submit his comments?

We are also assuming that the intent of these encryption devices is to
absolutely stop all piracy of the protected material.  It's not, any more
than you expect the catch lock on the front door of your house to
impede a professional burglar.  They are only intended to stop casual
opportunism, and establish legally that the perp had to know he was
making an unauthorized entry.  We all make decisions about what level of
security our wealth and neighborhood prudently require (catch lock,
deadbolt or sophisticated alarm system).  In some cases, legislation or
case law has established a standard.  In other cases, the standard is
set contractually, i.e., by your insurance company.  Some lawyer has told
the entertainment industry that prudence and due diligence require the
use of encryption serious enough to foil a casual pirate in order to
maintain their copyright.  They have failed at this by opting for
security through obscurity rather than publicly reviewed techniques,
just as nobody in a U.S. urban area prudently depends on a catch lock
that can be defeated with a credit card.  Prudence requires something
stronger.

Daniel A. Graifer, Parker & Company  1-888-426-6548 <dgraifer@cais.com>
Andrew Davidson & Company, 520 Broadway, 8th FL, NY 10012, (212)274-9075


Quicken cannot roll back transactions, and even lacks an Undo feature

Tom Welsh <tom@draco.demon.co.uk>
Tue, 23 Nov 1999 12:18:18 +0000
Intuit's Quicken financial package, widely used by individuals and small
businesses to keep track of their bank accounts, reflects a surprising set
of design values. While adding countless user interface features and even
Internet access over a series of releases, this program still denies users
the fundamental ability to roll back transactions. It does not even have the
"Undo" feature which, starting in word processors and text editors, has
come to be expected  in all well-written Windows applications.

When someone has been using Quicken for a number of years, the register for
each bank account contains thousands of transactions. Each has a date
associated with it. Unfortunately, Quicken's editing facilities are quirky
to say the least, and it is quite easy when trying to change part of a date
field to wind up with a date from an entirely different year. If someone is
entering a string of transactions - perhaps copied from a pile of receipts -
and typing ahead rapidly, an incorrect date may be overlooked. Once the
return key has been struck to enter the transaction, however, Quicken
provides no facilities whatsoever to cancel it, or even to find it.

Consider a register that contains over 3,000 transactions covering a period
of five years up to the present. A mistyped transaction may end up with a
date sometime in 1995 or 1996. Finding it again - even if the error is
recognised - is like looking for a needle in a haystack. But the register
will be impossible to reconcile properly until that rogue entry is found and
set right.

The solution is quite simple. First of all, Quicken should provide an "Undo"
feature. Second, the user should be able to commit or roll back any number
of transactions once they have been entered. This should extend to being
able to Quit and lose all changes made during the session.  (Staggeringly,
this option does not currently exist). Third, Quicken should keep a log or
journal of all transactions entered. This could be provided as an option,
and the length of the log could be made user-adjustable.
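
As a rough sketch of how little machinery such features require (Python,
purely illustrative; nothing here reflects Quicken's actual internals), a
register with a journal and an undo stack might look like this:

  import copy

  class Register:
      def __init__(self):
          self.transactions = []   # committed entries
          self.journal = []        # permanent log of every action
          self.undo_stack = []     # snapshots taken before each change

      def enter(self, date, payee, amount):
          self.undo_stack.append(copy.deepcopy(self.transactions))
          self.journal.append(("enter", date, payee, amount))
          self.transactions.append({"date": date, "payee": payee,
                                    "amount": amount})

      def undo(self):
          # Roll back the most recent change, if there is one.
          if self.undo_stack:
              self.transactions = self.undo_stack.pop()
              self.journal.append(("undo",))

  # A mistyped year becomes a one-keystroke correction instead of a
  # needle-in-a-haystack search through five years of entries.
  reg = Register()
  reg.enter("1999-11-22", "Grocer", -42.50)
  reg.enter("1995-11-23", "Grocer", -17.00)   # oops: wrong year
  reg.undo()                                  # the rogue entry is gone
  assert len(reg.transactions) == 1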

Integrity features like these would outweigh any number of cosmetic user
interface "enhancements", wizards, and multimedia gimmickry.


Microsoft Works not saving spreadsheets

Shez <newsreply@nospam.demon.co.uk>
Fri, 3 Dec 1999 06:15:39 +0000
Microsoft Works v. 4.5a for Windows does not properly save spreadsheets when
you give the Save command during work. Instead it leaves the disk file open
until you either Close the file window or Exit the application.

Whether data is written to disk at all prior to the file being closed is
unclear - Explorer shows it as being 0 KB and won't allow access as the
file is still open. This problem does not affect Works database and word
processor files, which are saved properly at once.

The Risk is that if the system hangs before you exit Works, you may lose
all your spreadsheet data even though you have pressed "save" from time
to time during the session.

Workarounds:
Either (a) close and re-open the spreadsheet each time you save it;
or (b) tick the "make backup" box; after this the spreadsheet will be saved
properly if you issue the save command twice.

This bug was discovered by <bfriesen@my-deja.com>, and I have
subsequently verified it for myself. It is not known at this point whether
other versions of Microsoft Works are affected.
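
The underlying risk is the old one that "saved" data is not necessarily on
disk until the buffers are flushed.  A minimal sketch of the defensive
pattern a save routine can follow (Python, purely illustrative; this is of
course not how Works is implemented):

  import os

  def save_spreadsheet(path, rows):
      # Write to a temporary file, force it to disk, then atomically swap
      # it into place: a crash leaves either the old file or the new one,
      # never an empty or half-written one.
      tmp = path + ".tmp"
      with open(tmp, "w") as f:
          for row in rows:
              f.write("\t".join(str(cell) for cell in row) + "\n")
          f.flush()               # push the program's buffer to the OS
          os.fsync(f.fileno())    # ask the OS to push it to the disk
      os.replace(tmp, path)

  save_spreadsheet("budget.tsv", [["item", "cost"], ["rent", 500]])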


Inadvertent attachments with MS Outlook 98

"Jon Freivald" <jon@freivald.org>
Thu, 25 Nov 1999 11:33:48 -0500
On two occasions now I have been the recipient of e-mail with attachments
that were not meant to be sent to me.

The e-mail was intended for me, but the attachments were not.

On both occasions, the sender was using MS Outlook 98 and had sent the
attachments to their intended recipients earlier the same day.

On both occasions the attachments were of a highly sensitive nature (one
was a strategic planning document, the other a detailed expense report).

On going to their sent items folder, neither sender showed the attachments
with the message to me, and could not find any other indication (other than
my contacting them about the attachments) that the attachments had been sent
to anyone but their intended recipients.

One of the senders is configured to use a POP server and sent a new
message. The other is configured to use the same Exchange 5.5 server that I
am, and did a reply to a message I had sent him. The only commonality I am
able to establish between the two incidents is MS Outlook 98 being used as
the mail client. When I get a few moments to breathe, I'm going to try to
make the event reproducible.

Shall I even bother to say — the risks are painfully obvious!

Jon Freivald — jon@freivald.org — http://www.freivald.org/~jon


Counterfeit Japanese coins and resulting risk...

"John F. Opie" <jfo@feri.de>
Mon, 29 Nov 1999 17:36:49 +0100
In today's issue (29 November 1999) of the *Nikkei Weekly* there was an item
on a problem developing in Japan with the relatively new 500 yen coin. This
coin was introduced to replace a 500 yen note and is relatively simple:
no bicolored metals, no special edging, etc.  500 yen was worth around
$4.80 on 29 November.

It turns out that this coin can be easily counterfeited using a 500 won
(Korean currency) coin worth only 50 yen.  The won coin is virtually
identical to the 500 yen coin, and needs only a couple of seconds under a
grinder to remove a small amount of metal to make the weights identical.

So what, you may ask?  Well, coin machines in Japan are a tad more popular
than they are elsewhere, and if you toss in one of these obviously
adulterated and counterfeit coins, the machine accepts it as if it were a
500 yen coin.  But rather than choosing a product, the crooks in this case
press the coin-return button, and the machine returns a genuine 500 yen coin.

That's right, it's first in, first out.  The sorting mechanism does not hold
the coins until the item is chosen, but rather assumes that no one is going
to pass counterfeit coins and passes them into the general bin.  The system
then, when returning money, doesn't return the coins actually entered, but
rather a generic coin from a general supply.

It's turning into a major nightmare for the coin-operated machinery
industry, since you can run large amounts of these coins through machines in
a rather short time (they are, of course, pre-loaded with a set amount of
cash in order to make change for large bills, and of course the 500 yen coin
is popular for this reason...).

The lesson for comp.risks?  The assumption that a coin is a coin.  Had the
machines held inserted coins in escrow until a purchase was actually made
(or at least returned coins last in, first out), a counterfeit coin handed
back to a crook would be useless and there would be no multi-million-yen
loss.  By passing counterfeits straight through into the general coin bin,
the designers created opportunities for abuse that will require re-equipping
all coin-operated vending machinery with new sensors or other ways of
validating coins, not to mention the general aggravation when you make a
legitimate purchase and the machine gives you a handful of worthless slugs
as change...
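
A minimal sketch of the two refund designs (Python, purely illustrative; it
claims no resemblance to any real vending-machine firmware):

  class GeneralBinMachine:
      # Inserted coins go straight into the change float, so a refund hands
      # back a genuine coin even when a slug went in.
      def __init__(self, change_float):
          self.change_float = list(change_float)   # pre-loaded genuine coins

      def insert(self, coin):
          self.change_float.append(coin)

      def refund(self):
          return self.change_float.pop(0)          # oldest (genuine) coin

  class EscrowMachine:
      # Inserted coins are held in escrow until a purchase completes; a
      # refund returns exactly the coins that were put in.
      def __init__(self, change_float):
          self.change_float = list(change_float)
          self.escrow = []

      def insert(self, coin):
          self.escrow.append(coin)

      def refund(self):
          coins, self.escrow = self.escrow, []
          return coins                             # the crook gets the slug back

      def purchase(self):
          self.change_float.extend(self.escrow)    # only now accepted as cash
          self.escrow = []

  slug, genuine = "ground-down 500 won", "genuine 500 yen"
  m1 = GeneralBinMachine([genuine]); m1.insert(slug)
  assert m1.refund() == genuine                    # the scam pays off
  m2 = EscrowMachine([genuine]); m2.insert(slug)
  assert m2.refund() == [slug]                     # no profit here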

John F. Opie, Senior Economist, Feri GmbH  jfo@feri.de  Harald Quandt Haus,
Am Pilgerrain 17, Postfach 1454 D-61284 Bad Homburg vor der Hoehe


Coppermine bug stops PC shipments

Sam Kasseman EECE <usa1@Glue.umd.edu>
Thu, 2 Dec 1999 09:21:53 -0500 (EST)
Dell has stopped shipping its Optiplex GX110 corporate desktop because
a bug in the high-end Pentium III Coppermine processors can prevent booting.
[ZDNet]


Jane's article on cyberterrorism hype

Martin Minow <minow@pobox.com>
Wed, 01 Dec 1999 15:44:55 -0800
SlashDot <http://www.slashdot.org/> references an article to be published
in Jane's Intelligence Review on "Cyberterrorism  Hype", posted at
<http://jir.janes.com/sample/jir0525.html>. The article's author,
Johan J Ingles-le Nobel, discussed the subject with SlashDot contributors.


Stock performance charts

"Epstein, Jeremy" <Jeremy_Epstein@NAI.com>
Fri, 3 Dec 1999 05:56:22 -0800
*The Washington Post* business section 22 Nov 1999 included a corrected
chart showing performance of recent Washington-area IPOs, with the following
note:

"The stocks of companies that have gone back to Wall Street to sell shares
in the last two years have not performed as badly as appeared [in the
November 15 issue] ... which were based on inaccurate information in the
Bloomberg News database.  Bloomberg failed to adjust for stock splits in the
post-offering performance calculations .... Bloomberg is aware of the flaw
and plans to fix it. .... The corrected chart ... shows that 18 of 39
secondary issues were trading above their offering prices as of November 12.
That's a 46 percent success rate, substantially better than the 33 percent
cited in [the November 15 issue]."

For a company that makes its living selling financial data, this seems like
a rather substantial oversight for Bloomberg!  GIGO rules again?
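
With hypothetical numbers, the effect of skipping the split adjustment is
easy to see:

  # Hypothetical figures, only to show how a missed split adjustment skews
  # the comparison described above.
  offer_price   = 20.00   # price at the secondary offering
  current_price = 15.00   # quoted price today, after a 2-for-1 stock split
  split_factor  = 2.0     # one pre-split share is now two post-split shares

  unadjusted = current_price / offer_price - 1
  adjusted   = current_price * split_factor / offer_price - 1

  print(f"unadjusted: {unadjusted:+.0%}")      # -25%: looks like a loser
  print(f"split-adjusted: {adjusted:+.0%}")    # +50%: actually a winner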

--Jeremy Epstein, NAI Labs


Railtrack timetable server has Y2K problems?

"Christopher St.John" <chris.st.john@NO.san.SPAM.com.PLEASE>
Wed, 24 Nov 1999 18:16:17 -0000
The UK Rail Operating company, Railtrack, have an online timetable
information system which collates timetables from the various train
operators (see www.railtrack.co.uk).

Recently a suspicious warning has appeared on their timetable form:
   =======================================================
Please read this BEFORE you travel!
We have identified a problem with our online timetable service.

Currently the online timetable may return inaccurate results. Though most of
the information returned by the online timetable is accurate, weekend
journey information and journey information for travel between Christmas Eve
and January 2nd may be inaccurate.

We have identified the cause of this problem and are working on a solution
as a matter of the highest priority. A full and accurate service will be
restored as soon as possible.


Worm.Mypic: Will Y2K provide cover for worm/viruses?

Mich Kabay <mkabay@compuserve.com>
Sun, 5 Dec 1999 22:10:29 -0500
The upsurge in e-mail-enabled worms and viruses appears to be supporting the
predictions of anti-virus experts who said that the Y2K transition would see
a flurry of new viruses and variants that would contribute to confusion
about the source of software problems following New Year's Day 2000.

Nancy Weil, writing in ComputerWorld
<http://www.computerworld.com/home/news.nsf/all/9912035y2kworm>, suggested
that the Worm.Mypic (aka W32/Mypics.worm) demonstrates the kind of problem
we are going to face in coming weeks.  Worm.Mypic arrives as an executable
attachment (Pics4You.exe, 34,304 bytes in length).  If executed, the
program e-mails itself to the usual first 50 names in the MS-Outlook address
list (and continues to try to do so at regular intervals).  As soon as the
date changes to 1 Jan 2000, the resident virus overwrites checksum data for
the computer's BIOS, interfering with the boot sequence.  The virus also
attempts to format C: and D: drives.

As usual, everyone agrees that it is critically important to update
virus-signature files even more frequently than usual as we approach the new
year.

M. E. Kabay, PhD, CISSP / Director of Education
R&D Group, ICSA Labs <http://www.icsa.net>


Y2K compliance

Identity withheld by request <>
Fri, 3 Dec 1999
The IEEE created a document (either a standard, a standard practice, or a
guide; I forget which status it achieved) in which Y2K compliance was
originally defined, essentially, as "the software will work after the start
of the millennium".  It was pointed out that this was ridiculous, since the
software might not work beforehand and shouldn't suddenly start working
afterwards.  So the definition was changed to, essentially, "the software
will work as well after the millennium as it did before".

Increasingly, I am seeing organizations break all kinds of software now, in
their mad scramble to apply Y2K fixes (well, they are called fixes).  If I
apply the definition, it is clear that they are ensuring Y2K compliance by
the following strategy.

Start with a currently operational system that may (or may not) be Y2K
compliant.  Apply Y2K fixes so that the system becomes non-operational.
Now, by the IEEE definition, it is guaranteed to be Y2K compliant: it will
work exactly as well after the millennium as it did before, which is to say
not at all.

The real risk behind the above is, of course, that many organizations
(including those that should know better) have left it *far* too late to
apply and test patches and are now scrambling to become Y2K compliant.  Who
knows what will be broken next, or what vulnerabilities are being opened
up?

  [Yes, we know, 2000 is not the beginning of the next "millennium".  PGN]


Re: Irish telephone network outage brings Y2K fears (Casey, R-20.66)

Henry Spencer <henry@spsystems.net>
Thu, 2 Dec 1999 11:13:50 -0500 (EST)
It sounds like the problem may have manifested itself only when the system
was under load (if it took from night/morning until midafternoon for it to
really make a mess), in which case having the upgrade done on the weekend
might not have been an improvement.

There is also a practical issue: it is often safer to make such changes at
times when your full staff is at work, so that minor problems can be spotted
and handled quickly.  This inherently conflicts with wanting to keep major
failures out of the way of the customers, so it unfortunately requires
guessing how serious the problems are likely to be.  Sometimes you guess
wrong.


Risks of US-Euro date conversion (Re: Ezquibela, RISKS-20.65)

Ben Hines <bhines@san.rr.com>
Mon, 22 Nov 1999 00:38:29 -0700
>All this trust is lost because, in the jump to secure HTTP, they used an SSL
>2.0 certificate from RSA Data Security valid until _12/10/99_ 23:59:59.

Of course, here in the US, where RSA is based, "12/10/99" is December
10th, 1999, so the key is still valid.  You, being in Spain, interpreted
it as October 12, 1999.
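
The ambiguity is easy to demonstrate: the same string parsed under the two
conventions (Python, illustrative only):

  from datetime import datetime

  raw = "12/10/99"
  us_style = datetime.strptime(raw, "%m/%d/%y")   # US: month/day/year
  eu_style = datetime.strptime(raw, "%d/%m/%y")   # Europe: day/month/year

  print(us_style.date())   # 1999-12-10
  print(eu_style.date())   # 1999-10-12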

-Ben  bhines@san.rr.com  <http://members.tripod.com/~tunnels/>


Re: Mars climate orbiter

Michael Detambel <detambel@mail.bfw-oberhausen.de>
Mon, 22 Nov 1999 13:03:57 +0100
Approximately 15 years ago I worked on a project using a Krupp process
computer.  I developed the application software in the real-time programming
language PEARL; a colleague wrote the system software (e.g., a mask generator
and handler for the 25x80-character display terminals) in the computer's
system language, META-S.  For interprogram communication he provided a call
interface over which the field entries and the field attributes could be
exchanged.  The field attributes were set and reset via a byte whose bits
were assigned to individual values: e.g., if bit 1 was on, the field on the
display should flash; if bit 2 was on, the field should appear in reverse
video; and so on.  However, this never worked, and it took us some time to
find the bug: in PEARL the bits are counted from left to right, starting
with the value 1 (the most significant bit first); in META-S they are
counted from right to left, starting with zero...
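
The mismatch is easy to show in a short sketch (Python, purely illustrative;
the flash/reverse assignments follow the description above):

  # PEARL (as described): bits counted left to right, starting at 1, so
  #   "bit 1" is the most significant bit of the attribute byte.
  # META-S (as described): bits counted right to left, starting at 0, so
  #   "bit 0" is the least significant bit.
  WIDTH = 8   # width of the attribute byte

  def pearl_mask(pearl_bit):
      return 1 << (WIDTH - pearl_bit)      # PEARL bit 1 -> 0b10000000

  def metas_mask(metas_bit):
      return 1 << metas_bit                # META-S bit 1 -> 0b00000010

  # The bug in a nutshell: "bit 1" names different positions on each side.
  assert pearl_mask(1) == 0b10000000       # PEARL bit 1: flash (MSB)
  assert metas_mask(1) == 0b00000010       # META-S bit 1: something else

  # The translation both sides needed to agree on:
  def pearl_to_metas(pearl_bit):
      return WIDTH - pearl_bit

  assert metas_mask(pearl_to_metas(1)) == pearl_mask(1)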

Michael Detambel (Translated by Babelfish)


Re: Sarah Flannery (Quisquater, RISKS-20.65)

Timothy A. McDaniel <tmcd@jump.net>
Mon, 22 Nov 1999 11:32:54 -0600 (CST)
Jean-Jacques Quisquater <jjq@dice.ucl.ac.be> gave a number of links giving
further information about Sarah Flannery and her proposed cryptosystem.  It
should be mentioned (as M. Quisquater did not) that the last link, the one
containing her paper,
    http://cryptome.org/flannery-cp.htm
has a postscript showing that her public-key system can be broken and
"there's no hint yet of a repair".  So far as I can tell, none of the other
links he provided, to press releases and further information and such,
mention that either.

Tim McDaniel, tmcd@jump.net
