The RISKS Digest
Volume 20 Issue 13

Thursday, 24th December 1998

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Near-miss at LaGuardia Airport, NYC
Dave Weingart
Runaway train on Capitol Hill
Thomas A. Russ
Another fibre-optic cable cut
Bob Blanchard
British Government admits Y2K missile problem
Phil Pennock
2,000 Texans get false overdraft notes in Y2K test
Bill Bauriedel
Wassenaar Agreement exempts 'public domain' software
Martin Hamilton
Other infrared security crocks
Paul Wexelblat
Re: PalmPilots voiding car locks in Europe
Philip Koopman
E-LIFE'S RISKS? I.R.S. E-FILE!
Andrew Greene
Should pilots trust TCAS?
Andres Zellweger
Airline databases lock in increases better than refunds
Peter
Re: Frequent Flyer miles accessible
Peter
Y2K expansion
Jerry Leichter
Intelligent virus invades NT servers
Edupage Editors
Unexpected date behavior in Windows 95
Daniel Weber
Microsoft Trojan Horse
Frank Markus
Quark XPress hates Unix scripts!
Ben Sherman
Password hint risks
Alexander V. Konstantinou
Risks in incorrect warnings and alerts
Flint Pellett
CFP: 1999 National Information Systems Security Conference
Ed Borodkin
Info on RISKS (comp.risks)

Near-miss at LaGuardia Airport, NYC

Dave Weingart <dweingart@chi.com>
Thu, 10 Dec 1998 09:24:40 -0500
According to reports I heard on the radio this morning, a US Airways Boeing
737 came within 100 feet (some reports say within 50 feet) of landing
smack on top of a small corporate plane at LaGuardia Airport in New York
City on Wednesday, 2 Dec 1998.  The passenger flight was given permission to
land, despite the fact that the smaller plane was waiting to take off on the
same runway.  The near-disaster has been blamed on too few controllers being
too distracted.

Dave Weingart, Strategix Solutions  dweingart@chi.com     1-516-682-1470


Runaway train on Capitol Hill

Thomas A. Russ <tar@ISI.EDU>
17 Dec 1998 13:52:45 -0800
I found the following item in the *Los Angeles Times* 16 Dec 1998.
Especially intriguing are the spokesman's comments.  There is also the
nagging question of why there is an operator on a fully automated system in
the first place.

  ...there IS a runaway train on Capitol Hill.  The automatic brakes on
  the Senate subway between the Russell Office Building and the Capitol
  failed last week, sending the train crashing into a wall and slightly
  injuring the operator and the two other people on board.  In the best
  congressional spirit, a spokesman for the architect of the Capitol
  stressed that "there was no operator fault involved.  It's all
  automatic," said Herb Franklin, "and it's supposed to stop by itself."

Thomas A. Russ,  USC/Information Sciences Institute          tar@isi.edu


Another fibre-optic cable cut

Bob Blanchard <b.blanchard@computer.org>
Wed, 09 Dec 1998 22:07:20 -0500
My office's frame relay link to our corporate WAN was out for a day on 26
Nov 1998.  As it turns out, we were not alone.  A railway backhoe operator
accidentally cut an AT&T Canada fibre cable along the rail line between
Toronto and Windsor, crashing computers, knocking out phone lines, and
generally disrupting communications in southern Ontario.  The main branch of
the Bank of Nova Scotia was computerless.  Rerouting of Internet lines via
the U.S. was slow because of the Thanksgiving holiday.  [Source: Veronique
Mandal, ``Cut Cable Paralyses Network - Severed line brings hi-tech to its
knees'', *Windsor Star, London Free Press, Chatham Daily News*, 27 Nov 1998;
PGN Abstracting]


British Government admits Y2K missile problem

Phil Pennock <phil@athenaeum.demon.co.uk>
Wed, 9 Dec 1998 23:03:55 +0000
From The Times, 9 Dec 1998, p2:

Bug threat to missile

The Ministry of Defense admitted for the first time that the millennium bug
could have left Britain vulnerable to air attack.  It discovered that the
Rapier anti-aircraft missile would have failed to retaliate.  The problem
was identified inside the field equipment which activates the missiles and
it would have made the system inoperable.  The threat to Britain's defenses
posed by the computer bug was outlined by George Robertson, the Defense
Secretary.  [Well, at least the failure mode was to not fire.]


2,000 Texans get false overdraft notes in Y2K test

"Bill Bauriedel" <Bill.Bauriedel@Forsythe.Stanford.EDU>
Fri, 18 Dec 98 10:47:21 PST
2,000 Texans get false overdraft notes in Y2K test
Reuters, Detroit News, 12/17/1998

Bank One Texas was testing their Y2K systems to see if they could send out
overdraft notices after "1/1/00".  They were able to print over 2,000
fabricated notices.  But someone forgot to throw away the printouts, which
were mailed out! <http://www.detnews.com/1998/technology/9812/17/12170189.htm>


Wassenaar Agreement exempts 'public domain' software

Martin Hamilton <martin@net.lut.ac.uk>
Fri, 11 Dec 1998 17:00:48 +0000
Just to note that the Wassenaar Agreement explicitly exempts 'public
domain' software, in its 'General Technology Note':

  Controls do not apply to "technology" "in the public domain", to
  "basic scientific research" or to the minimum necessary information
  for patent applications.

The 'General Software Note' has this to add:

  The Lists do not control "software" which is either:

    1. Generally available to the public by being:

      a. Sold from stock at retail selling points without restriction,
           by means of:

        1. Over-the-counter transactions;
        2. Mail order transactions; or
        3. Telephone call transactions; and

      b. Designed for installation by the user without further
           substantial support by the supplier; or

    N.B. Entry 1 of the General Software Note does not release
       "software" controlled by Category 5 Part 2.

    2. "In the public domain".

See <URL:http://www.wassenaar.org/List/GTNGSN.doc>.  Category 5 is the
section on 'Information Security', of course.  Last updated 10th
December 1998.

"In the public domain" is defined as:

  This means "technology" or "software" which has been made available
  without restrictions upon its further dissemination.

  N.B. Copyright restrictions do not remove "technology" or "software"
  from being "in the public domain".

See <URL:http://www.wassenaar.org/List/Def.doc>.

Martin


Other infrared security crocks

Paul Wexelblat <wex@cs.uml.edu>
Thu, 10 Dec 1998 11:52:02 -0500
The discussion in RISKS-20.10 about theft of infrared codes for door locks
brings to mind two similar issues.

1. Conventional RF garage-door openers left in cars at parking/repair
garages can easily be opened up, so "the baddies" can read the code and
program a "universal" opener to get in — and many folks don't lock the door
between their garage and house.

2. My household internal alarm system uses coded infrared beams.  These are
trivially defeated with one of those X-10-type "control extenders" — a device
(you may have seen them; they're 3-inch pyramid-shaped things) that you zap
your IR remote control at.  It converts the signal to RF that goes through
walls, and the receiver reconverts it and emits IR to the sensor.  (I use one
to control my cable box from the bedroom.)

Using this thing I can just put the pair between the alarm transmitter and
receiver and walk on through.

...wex


Re: PalmPilots voiding car locks in Europe (McCoy, RISKS-20.10)

Philip Koopman <koopman@cmu.edu>
Thu, 10 Dec 1998 03:43:17 GMT
I have heard that one can get stand-alone "diagnostic" boxes to record and
play back transmissions for the RF car interfaces.  In fact, in response to
this some vehicles with RF interfaces already have cryptographic encoding of
transmissions (I designed such a system with Alan Finn back in my industrial
research lab days, and it is in moderately wide use in some US vehicles
today).

One of the RISKS of this area is that in general the market won't currently
bear the cost of high-strength cryptography — they only get a few cents
extra to spend on it, and even something like DES tends to cost too much.
So it is difficult for consumers to evaluate whether they're getting crypto
that is strong enough to be reasonable for the application, or one of the
systems that says it is but really isn't (or one of the systems that really
is but competitors claim it isn't, or ...).
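
Koopman's point is easiest to see next to a rolling-code scheme.  The sketch
below is a toy illustration only, not the system he and Finn designed; the
shared secret, counter width, and acceptance window are all assumptions made
up for the example.

  import hmac, hashlib

  SECRET = b"shared-fob-secret"   # hypothetical key programmed into fob and car

  def fob_transmit(counter: int) -> bytes:
      """Fob sends its press counter plus a short MAC over it."""
      mac = hmac.new(SECRET, counter.to_bytes(4, "big"), hashlib.sha256).digest()[:8]
      return counter.to_bytes(4, "big") + mac

  def car_accepts(message: bytes, last_seen: int, window: int = 16) -> bool:
      """Car accepts only an authentic counter strictly ahead of the last one seen."""
      counter = int.from_bytes(message[:4], "big")
      expected = hmac.new(SECRET, message[:4], hashlib.sha256).digest()[:8]
      return (hmac.compare_digest(message[4:], expected)
              and last_seen < counter <= last_seen + window)

  msg = fob_transmit(counter=42)
  print(car_accepts(msg, last_seen=41))   # True: fresh button press
  print(car_accepts(msg, last_seen=42))   # False: a recorded-and-replayed capture

A fixed-code transmitter fails exactly where the second call succeeds for an
attacker: the "diagnostic" box replays a bit-for-bit copy, and the receiver
has no counter to notice.  Of course, as the posting notes, even this much
crypto has to fit into a few cents of added cost.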

Phil Koopman — koopman@cmu.edu — http://www.ece.cmu.edu/~koopman


E-LIFE'S RISKS? I.R.S. E-FILE!

Andrew Greene <agreene@bitstream.com>
Thu, 10 Dec 1998 11:16:38 -0500
I received a postcard this weekend from the Internal Revenue Service,
notifying me that my wife and I have been selected to (optionally)
participate in a pilot implementation of the new, improved "e-File".  This
is a way of electronically filing our income tax returns next April, where
not only is the information transmitted electronically and the payment or
refund handled via EFT, but we will not even be required to mail in a
signature card.

Instead of the physical signature card, we can simply use the electronic
"signature" number enclosed in a sealed section of the postcard that we
received this weekend. (The number is only five digits, but for a one-time
use that's probably not too bad.... right?)
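
For scale, here is the arithmetic behind that parenthetical (nothing below is
based on how the IRS actually handles bad signature numbers — whether and how
it limits repeated attempts is unknown to me):

  import math

  space = 10 ** 5                     # five decimal digits
  print(space)                        # 100000 possible signature numbers
  print(round(math.log2(space), 1))   # about 16.6 bits of secrecy
  # One guess has a 1-in-100,000 chance; the real exposure in the posting,
  # though, is simpler -- the number travels in ordinary unsolicited mail.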

It's an old story, I know, but one which apparently bears repeating, because
organizations are still doing it. This piece of mail was unsolicited, so had
someone pilfered it from my mailbox and used it to file a false return, *I*
would have been the one in trouble with the IRS.

- Andrew Greene

  [Nice palindromic subject line (ignoring punctuation)!  PGN]


Should pilots trust TCAS?

Andres Zellweger <ZellwegA@cts.db.erau.edu>
Thu, 24 Dec 1998 10:14:08 -0500
In RISKS-20.12 David Wittenberg reported on two situations, one where TCAS
may have created an incident that was "saved" by air traffic controllers and
another where air traffic controllers may have created an incident that was
"saved" by TCAS.  He asks "How are pilots to know which one to trust when
they must make decisions quickly?"

I recently completed a case study on TCAS for a graduate seminar on building
safe systems.  My students and I were convinced, after looking at the
extensive TCAS safety activities over the last 15 years, that TCAS is indeed
a very effective "safety net" for pilots.

After TCAS introduction began, it became apparent that a clear set of TCAS
operational procedures for both pilots and controllers was necessary to get
the full safety benefits of TCAS.  For instance:

- "pilots should follow resolution advisories unless doing so jeopardizes
the safe operation of the flight or the flight crew has definitive visual
acquisition of the intruder."

- "when responding to a resolution advisory hat directs a deviation from an
ATC clearance, pilots should communicate with ATC as soon as possible."

- "after a controller has been informed that an aircraft is responding to a
resolution advisory, the controller should not issue control instructions to
that aircraft that are contrary to the resolution advisory."

and so on ...

I would personally be very upset if pilots, at this stage in the evolution
of TCAS, did not follow TCAS resolution advisories.  The probability of TCAS
errors is very small compared to the likelihood that an air traffic control
countermand to a TCAS advisory will lead to potential problems.

Unfortunately, I have not seen any reports of the FAA's (or anyone else's)
investigation into the former (more interesting, from a safety perspective)
incident. The good news is that analysis of reported incidents like this has
historically continued to lead to TCAS improvements.  The bad news is that
the FAA's investment banker has not seen fit to continue funding of the
TCAS Program at levels sufficient to sustain TCAS improvements.  Indeed,
there are some who have questioned whether the safety analysis of the new
TCAS version 7.0 (developed primarily for use in European airspace) has been
adequate.


Airline databases lock in increases better than refunds

<peter@hightown.demon.co.uk>
Sat, 12 Dec 1998 22:43:47 +0000
Earlier this year I had occasion to return from Johannesburg to London on
business.  I was booked business class for this journey.  On the flight day
the airline was a plane down, so a colleague and I were transferred to
economy on the next plane.  All the paperwork was completed and entered into
the computer at check-in.

When I returned to the office I asked the staff who deal with travel to
verify with our travel agent that the downgrade was credited with a refund
to the business.  This did not happen; full fare was charged.  It took many
months to unsnarl the paper trail, and it was not clear who should have
informed whom about what.  The way the legalese on tickets is written, maybe
it was all our fault for traveling!

It took us many man-hours to resolve the situation, and probably many
business travelers do not bother, assuming that "the system" sorts out the
little details.  But when "the system" comprises the IS systems of several
travel firms and subcontractors, each programmed to bill any upgrades or
additions exactly, it only takes one of them failing to pass on a refund to
give a "heads you lose" situation.  Just like vending machines — they never
dispense free drinks, but there are often stickers on them saying, "it owes
me 17p" (UK pence).


Re: Frequent Flyer miles accessible (RISKS-20.12)

<peter@hightown.demon.co.uk>
Sat, 12 Dec 1998 22:43:47 +0000
I find the quote from NW surprising in view of the fact that the BA
"Airmiles" accounts can only be used from the WWW with an initial PIN sent
by snail mail to the account holder's address. Changing the PIN can only be
done online using https: access.

Unlike in the US, in Britain there is popular reluctance to adopt a universal
ID such as a Social Security number or phone number.


Y2K expansion

Jerry Leichter <leichter@lrw.com>
Sun, 13 Dec 98 08:21:25 EST
As not-particularly-competent (but professionally paranoid) lawyers have
gotten more and more heavily involved, the range of "Y2K critical" dates has
grown.  A story has made the rounds that some software uses "9999" in the
date field to indicate special processing; hence, software is now supposed
to be tested for proper operation on 9 Sep 99.  (Why operation *on that
date* should be affected is beyond me.  Then again, since a date written
with a two-digit year and no separators necessarily has to be stored in at
least a 7-digit field — there being at least one month with a month number
larger than 9 containing at least one day with a number larger than 9 — it's
beyond me why one would expect software to use 9999 as a flag; 9999999 is
much more likely, and completely safe.)

I recently saw proposed language that would require a vendor to certify
proper operation on about 15 dates, starting with 9 Apr 1999 and ending some
time in 2002.  One could make a rough kind of sense of many of the dates --
e.g., they tested proper leap year handling in 2000 and thereafter — but
some of them are real head-scratchers.  The 9 Apr 1999 date is one of those.
Is it that 4 and 9 look alike, so that 9499 (or is it 4999?) might have been
used as a magic flag?  :-)

I worked with a lawyer doing Y2K requirements for a major corporation, and
convinced them not to put *any* explicit date requirements in.  Rather, the
language is all written in terms of proper operation on any dates, and with
any input date data, that will foreseeably arise in normal system operation
during the expected lifetime of the system.  Not only does this avoid
getting into silly "my list of dates is longer than yours, hence you weren't
exercising due diligence" games, but it also avoids a genuine *bug* in the
way most of these provisions have been written: They focus so closely on
dates around the year 2000 that they ignore the possibility of problems with
dates further away.  For example, a vendor could introduce windowing -
continue to store two-digit years but have 50-99 always represent 1950-1999
while 0-49 represent 2000-2049 - to easily comply with most Y2K requirements
I've seen.  That's fine if the field in question represents order dates, but
not so good if it represents dates of birth.
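
A minimal sketch of that windowing trap (the pivot of 50 mirrors the example
above; a real vendor would pick its own):

  def window_year(two_digit_year: int, pivot: int = 50) -> int:
      """Expand a two-digit year: pivot..99 -> 19xx, 0..pivot-1 -> 20xx."""
      if not 0 <= two_digit_year <= 99:
          raise ValueError("expected a two-digit year")
      return 1900 + two_digit_year if two_digit_year >= pivot else 2000 + two_digit_year

  print(window_year(99))   # 1999 -- fine for an order date
  print(window_year(1))    # 2001 -- also fine for an order date
  print(window_year(35))   # 2035 -- wrong for someone born in 1935

Exactly the kind of failure a dates-near-2000 checklist never exercises.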
                            — Jerry


Intelligent virus invades NT servers

Edupage Editors <edupage@franklin.oit.unc.edu>
Tue, 22 Dec 1998 14:22:43 -0500
A new computer virus that attacked 10 MCI Worldcom networks last week is
capable of spreading throughout computer networks and scrambling the
documents on those networks as it goes.  "We've never seen anything this
sophisticated in 10 years of doing this," says Network Associates' general
manager of network security.  "This is a completely new strain of virus and
the first we've seen that propagates itself with no user interaction."  The
"Remote Explorer" virus runs on Microsoft Windows NT servers and affects
common programs like Microsoft Word.  Users clicking on their Word icon
might experience a slight delay, but otherwise would be unable to detect the
presence of the virus; meanwhile, the virus is busy corrupting files and
spreading to other programs.  Microsoft officials say they're "aware of
other viruses that have the same characteristics," and Network Associates
says it has developed a Remote Explorer detector and is working on a
solution to decode the affected files.  (*Wall Street Journal*, 22 Dec 1998,
Edupage, 22 December 1998)

  [Sounds like a worm to me.  PGN]


Unexpected date behavior in Windows 95

Daniel Weber <djweber@sandstorm.net>
Wed, 16 Dec 1998 18:21:46 -0500
I came across the following interesting behavior in Windows 95: if you use
the "Date/Time Properties" dialog box to change the month or day of month,
the system clock is actually set to that date, without the user hitting "OK"
or "Apply."

The risk is that the user is changing system properties without really being
aware of it--if the user hasn't pressed "Apply" he or she will figure that
the change hasn't occurred yet.  Although pressing "Cancel" will restore the
date, any applications that check the system clock in the meantime will read
the altered system time.

As a simple example, consider a mail reader that checks for new mail every
five minutes by comparing system times.  If it checks mail during an
interval when the clock is set forward a day and then reset, the mail reader
will not check mail again for 24 hours.

I don't know how many applications expect and rely on the system time
to be monotonically increasing.
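
A small sketch of the failure mode Weber describes, using a made-up poller
that schedules its next run from the wall clock (the five-minute interval and
the one-day jump are simply the numbers from his example):

  import datetime

  CHECK_INTERVAL = datetime.timedelta(minutes=5)

  def next_check_after(now: datetime.datetime) -> datetime.datetime:
      # Naive scheduler: "wake me when the wall clock passes now + interval".
      return now + CHECK_INTERVAL

  t0 = datetime.datetime(1998, 12, 16, 18, 0)
  due = next_check_after(t0 + datetime.timedelta(days=1))  # clock briefly set ahead a day
  # The clock is then restored, but `due` is still ~24 hours in the future,
  # so comparisons like `datetime.datetime.now() >= due` stay False all day.
  print(due)   # 1998-12-17 18:05:00

  # A clock immune to Date/Time fiddling avoids this; e.g. time.monotonic()
  # (not available in 1998-era tooling, of course) never runs backwards.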

I've been told that other Windows dialogs, besides the date/time, also
prematurely change system settings in similar ways, but I'm not sure what
implications other settings may have.

I tested this on an NT machine and on a Windows 98 machine that was said to
have the recent Y2K patch installed, and noticed the same behavior.

Dan Weber, Sandstorm Enterprises, Inc.  http://www.sandstorm.net/
1-617-547-0011  djweber@sandstorm.net


Microsoft Trojan Horse

"Frank Markus" <fmarkus@pipeline.com>
Thu, 10 Dec 1998 21:33:24 -0500
I am running Win98 with a Microsoft module that automatically notifies me of
"critical updates" and takes me to a Microsoft Updates web site.  The first
item on the list of updates was one that was intended to make Win98 Y2K
compliant.  This module was roughly 1.5MB.  Further down the list was a beta
IE virtual machine (presumably designed to comply with the Sun Java
decision.)  This file was roughly 4MB; I decided to pass on it.  Having
specified what I wanted to download — and install (it is both or nothing)
-- I noticed that the download was going slowly.  And then I discovered the
reason: the file that I was downloading was over 5MB ... and included the
Virtual Machine Beta that I had not checked!  I cancelled the download and
went back to check whether I had made an error.  I had not.  I repeated the
exercise with the same result.

My conclusion is that in an effort to show 'good faith' compliance with the
Sun Java court order, Microsoft is installing the revised Java engine on the
computers of users who have decided against using it.  The cheese that they
are using to bait their trap is the promise of Y2K compliance.

Does anyone who is using Win 98 and the IE 5.0 beta know how to get only the
Y2K update?


Quark XPress hates Unix scripts!

Ben Sherman <ben@2600.com>
Wed, 9 Dec 1998 21:28:10 -0500 (EST)
Recently, we at 2600 Magazine published an article with a few Unix scripts
in it.

A neat feature of XPress is that it allows you to apply XPress-specific
formatting using plain ASCII text, e.g. <i> for italics, <b> for bold, and
so on.  So what do you do if you need to use the > or < symbols (in a
script, for instance)?  Easy: XPress just looks for two > or < in a row.

So when you print something that says:

echo "root2:x:0:1:Root:/:/sbin/sh" >> /etc/passwd

it gets truncated to

echo "root2:x:0:1:Root:/:/sbin/sh" > /etc/passwd

In Unix, >> appends to a file and > replaces it, so any command in a script
intended to append something to the end of a file would actually ERASE the
file and REPLACE it with the one thing that was supposed to be added.
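
A toy model of the de-escaping step (assuming, as described above, that
XPress collapses any doubled < or > into a single literal character):

  import re

  def xpress_render(text: str) -> str:
      """Collapse doubled angle brackets the way the formatter is said to."""
      return re.sub(r"<<|>>", lambda m: m.group(0)[0], text)

  line = 'echo "root2:x:0:1:Root:/:/sbin/sh" >> /etc/passwd'
  print(xpress_render(line))
  # echo "root2:x:0:1:Root:/:/sbin/sh" > /etc/passwd  -- append became clobber

The mapping is lossy: once the doubled form has been collapsed, a reader
typing in the printed script has no way to know an append was intended.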

Eeek!


Password hint risks

"Alexander V. Konstantinou" <akonstan@cs.columbia.edu>
Fri, 11 Dec 1998 12:29:51 -0500
In the process of using the Netscape 4.5 automatic update feature I was
asked to join the Netscape Netcenter.  The form requests that you supply an
account name, password, name, electronic as well as physical address (to be
fair, you are given several options on how this information can be used).
The unusual aspect of this form was an option to include a password hint
with the following explanation:

  "If you forget your password, Netcenter will present you this
   password hint to help jog your memory. Example hint:
   'same password as my bank acct.' "

The risks are clear: potential access to your name, address, and bank PIN!

Alexander V. Konstantinou

  [Various comments received on this one.  TNX.  PGN]


Risks in incorrect warnings and alerts

Flint Pellett <flint@kai.com>
Wed, 16 Dec 1998 18:02:34 -0600
It seems to me that more often than not, the reason a Risk rears its ugly
head is that somebody wrote a piece of software without thinking about what
possible states exist for various variables, or (a pet peeve of mine)
somebody issued an incomplete error message, or both.  [My definition of a
"complete" error message is one that tells me not only what is wrong, in a
way I can be expected to understand, but also gives me some clue what to do
to fix it or where to go to get help.]

While the example below is a trivial Risk (I hope), it illustrates
both of the above quite well.

I just installed my copy of the 1999 version of a widely used personal
finances software package, and migrated data from a much older version
without incident.  The new version includes a new feature (new to me anyway)
to show alerts if my charge card balances are "near my credit limit."  I was
very surprised, on my first entry, to see an alert telling me that one of my
cards was near my credit limit.  I was even more confused when I examined
the account: my balance owed was $0!

I eventually figured out their default algorithm for those alerts, by
examining several other accounts.  They would take the account's credit
limit, subtract $3,000, and set the resulting value as the default upper
limit on the balance owed before an alert would be issued.  That made the
default on my gasoline company card with its $700 credit limit be negative
$2,300.  [Feel free to draw your own conclusions about the person who came
up with _that_ algorithm.]
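
For concreteness, here is the default as reverse-engineered above, next to
the sort of guard it was missing (the $3,000 cushion and the function names
are just the values inferred in this posting):

  def default_alert_threshold(credit_limit: float) -> float:
      # What the package appears to do: alert when balance exceeds limit - 3000.
      return credit_limit - 3000.0

  def saner_alert_threshold(credit_limit: float) -> float:
      # One obvious repair: never let the threshold go negative
      # (a percentage of the limit would be better still).
      return max(0.0, credit_limit - 3000.0)

  print(default_alert_threshold(700.0))   # -2300.0 -> a $0 balance "exceeds" it
  print(saner_alert_threshold(700.0))     #     0.0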

When I found the place where you can change those defaults, the offending
account wasn't listed, so I can't fix the -2,300 value.  I suspect that
perhaps someone used a negative number in that field to decide that this
account isn't a credit card account and shouldn't be in the list.  I appear
to be stuck with this incorrect warning message until such time as the gas
company owes me more than $2,300 on the account.

As for that (scary) error message, "Account XXX is near your credit limit",
it could have almost trivially been presented to me as the following: "You
owe $0 on Account XXX, which has a $700 limit.  You will see this alert
whenever your balance exceeds $-2,300."  The first message wasted a
considerable amount of my time in having to figure out what was going on.
The second would have made it a lot easier, if not obvious, and would
certainly have generated less stress.  Millions of people use this program —
millions of minutes can get wasted when people can't figure out what is
triggering a warning.  Is that a trivial Risk?  Maybe not.

Can I conclude anything else about this incident?  Yes.  The QA people who
tested this, who apparently don't have any charge accounts with credit
limits under $3,000, are quite likely overpaid. :-)

--Flint Pellett


CFP: 1999 National Information Systems Security Conference

"Ed Borodkin" <borodkin@constitution.ncsc.mil>
Fri, 18 Dec 1998 14:43:26 -0500
The National Information Systems Security Conference (NISSC) welcomes
papers, panels, and tutorials on all topics related to information
systems security.  Our audience represents a broad range of information
security interests spanning government, industry, commercial, and
academic communities.  Papers and panel discussions typically cover:

 * research and development for secure products and systems, presenting
   the latest thinking and directions;
 * electronic commerce;
 * legal issues such as privacy, ethics, investigations, and enforcement;
 * practical solutions for government, business and industry
   information security concerns;
 * network security issues and solutions;
 * management activities to promote security in IT systems including
   security planning, risk management, and awareness and training;
 * implementation, accreditation, and operation of secure systems in a
   government, business, or industry environment;
 * international harmonization of security criteria and evaluation;
 * evaluation of products, systems and solutions against trust criteria;
 * tutorials on security basics and advanced issues;
 * security issues dealing with rapidly changing information technologies;
 * highlights from other security forums; and
 * implementing policy direction.

For more details see http://csrc.nist.gov/nissc/call.htm.

  [The most important detail: the deadline
  for submissions is 15 Feb 1999.  PGN]
