The RISKS Digest
Volume 18 Issue 74

Tuesday, 7th January 1997

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

U.S. Air Force webpage hacked
PGN
Grammy web page leaks nominees
B.J. Herbison
The Sky Is Falling
Jim Horning
Computer safety 25 years ago
Wayne Hayes
Leap-Year software bug gives "Million-dollar glitch"
Jim Towler
VISA fines banks with Y2K problems
Lloyd Wood
Y2K: Blessing in Disguise
Mark Brader
Another privacy bug in Netscape
Kevin McCurley
When connectors shouldn't meet
Lauren Weinstein
Dan Farmer releases real-time security survey
Betty G. O'Hearn
Let UPS publish your signature on the Net
Hall
Easy answers...
Steve Hand
April 1 considered harmful
William J. Evans
Re: Do Not Attempt to use Airplane as Submarine?
Sam Lepore
'Ghost Trains' evidence of safe design
Andrew Waugh
2nd FMICS Int. Workshop, Call for Papers
Diego Latella
Info on RISKS (comp.risks)

U.S. Air Force webpage hacked

"Peter G. Neumann" <neumann@csl.sri.com>
Mon, 30 Dec 1996 18:10 PDT
The U.S. Air Force has now joined the club along with the Department of Justice,
CIA, and NASA, whose webpages had previously been altered by intruders
(RISKS-18.35 and 49).  On Sunday morning, 29 Dec 1996, the main webpage of
the Air Force's website http://www.af.mil at Fort Belvoir, Virginia, was
altered to include ``Welcome to the Truth,'' above dripping blood and a pair of red
eyeballs, and ``You can learn all about gov't corruption here.  Learn the
secrets that they don't want you to know.''  Also included were links to
non-government webpages and an X-rated picture with the caption, ``This is
what your gov't is doing to you every day.''  A person identified as a
23-year-old small-business man claiming to have been involved with the two
intruders said that they actually had access to the entire AF e-mail system
(including classified documents), and were trying to show how ``pathetic''
the security was.  An AF spokesman suggested that they had access only to
one PC.  (Fort Belvoir also houses the websites for the Army and Marines.)
[Source: Hackers Disrupt Air Force Web Page, article by Seth Schiesel, *The
New York Times*, 30 Dec 1996.]

  [Yes, the primary purpose of your webpage is usually to make information
  openly available, not to protect secrecy.  But also don't forget that
  integrity and preventing denials of service are fundamental parts of your
  web and net security problems.

      This case was also noted by Christopher Klaus <cklaus@iss.net>, who
      added ``With so many government web sites getting hacked, you think
      they would spend a little more time securing them.  The tools exist to
      secure the web sites.''  Well, more or less.  PGN]


Grammy web page leaks nominees

"B.J. Herbison" <bherbison@HighGround.com>
Tue, 7 Jan 1997 10:42:06 -0500
On the CNN Interactive web site (cnn.com) today there is an article about
the Grammy awards (www.grammy.com).  The Grammy award nominees were supposed
to be officially announced at 8:40 this morning, but the announcement was on
the web site overnight and was removed early this morning.

When the Grammy site reposted the list, the names matched the information
from CNN.  The CNN article is at
  http://cnn.com/SHOWBIZ/9701/07/grammy/index.html

This is nothing new, just the standard risk of not integrating the standards
for new technologies with those for the existing ones.

B.J. Herbison, HighGround Systems, Inc., 1300 Massachusetts Avenue
Boxborough, MA 01719-2203  bherbison@HighGround.com  +1 508 263-5588 x126


The Sky Is Falling

Jim Horning <horning@intertrust.com>
Fri, 03 Jan 1997 10:46:00 P
The "Week in Review" in *The New York Times*, 29 Dec 1996, led off with a
column by Michael Wines entitled "The Sky Is Falling: Three Cheers for
Chicken Little."  It will probably be of interest to most readers of RISKS.
I don't agree with all of it, but it is definitely thought-provoking.
I just excerpt the main points:

``'Time is running out,' ... Mr. Peterson is writing about the imminent
bankruptcy of Medicare.  No matter. It easily could have been global
warming, overpopulation, designer genes, nuclear terrorism, AIDS, the
national debt, Apple Computer, Democrats, Republicans, the two-party system,
wetlands, desertification, moral decline, moral zealots, killer asteroids
or, for that matter, software that does not recognize the year 2000...

``Alarmism is a national obsession... Are Americans getting too much of a bad
thing?  Of course they are.  But that begs the real question: why doesn't
the sky fall?  And that question is a tougher one.  Sure, plenty of alarms
are raised over threats that are, to put it kindly, overstated...
Maybe the sky doesn't fall because alarms work as the alarmists intended.
Maybe the apocalypse never fully arrives because the alarmists' wretched
excess frightens people into taking action...

``There's no way to know for sure.  The only evidence that an alarm works is
in its being proved false — which may also be evidence that the alarm was
unnecessary to begin with..."

``'It would be intellectually satisfying to say the real impact is through
reasoned discourse,' [Paul Ehrlich] said.  'But in my view the real impact
isn't in reasoned discourse.  Media attention, press coverage and, if
necessary, alarmism at least set an agenda.  And that way you can have a
debate.'" ...

``Alarmists can be self-promoting pains, and they are often wrong.  So ignore
them.  How bad could life be on an Earth slow-cooking in a sauce of melted
icebergs, populated by genetic experiments gone wild and fated to an
eternity of Windows 95?"

Jim Horning

  [Perhaps Cassandra was a little chicken rather than a Chicken Little?  PGN]


Computer safety 25 years ago

Wayne Hayes <wayne@cs.toronto.edu>
Tue, 7 Jan 1997 14:50:36 -0500
I was rummaging around my old books recently and came upon the following
gem.  It is interesting to note that everything he says is still true today,
down to the things he suggests we need to do to make software safer.  And
this was 26 years ago.  When will we ever learn?

From _The Computer Revolution_, by Nigel Hawkes.  Copyright 1971 Thames and
Hudson, London.  Copied without permission.

    HOW SAFE ARE COMPUTERS?  (pages 193-196)

    Less often discussed but potentially more serious is the danger of
    what I call computer-aided disasters.  Many computer experts,
    particularly those in the software field, believe that these
    represent a very serious danger.  Alex d'Agapeyeff, President of the
    British Computer Society, has argued that the existing computer
    systems have a reliability rather worse than that of the British
    telephone service.  Other computer professionals have claimed that
    up to 70% of the computer systems in operation in Britain are
    unsuccessful.  [He doesn't define what he means here.-WH]  Although
    this may be an exaggeration, we would be wise to recognize the
    widening gap between what the computer salesmen claim and what the
    systems engineers actually provide.  Large computer systems remain
    something of an unknown quantity, and are quite capable of going
    very seriously wrong.

    Most of the time this does not matter, except to those directly
    concerned.  It is naturally irritating when a computerized system
    sends one an absurd bill, but it can usually be corrected without
    too much trouble.  [And now, in 1996, even that is sometimes not
    possible.-WH]  When computers are used to control jumbo jets,
    chemical plants or nuclear power stations, the effects of a similar
    failure might be catastrophic.  More dangerous still, some people
    believe, are elaborate military systems, where relatively untried
    weapons may be sent on their way to the target by relatively
    untried software.  The kind of dangers that can arise even in civil
    air-traffic control are epitomized by a story about Britain's new
    computerized system, called Mediator.  As a first stage of
    implementation, a computer was introduced which could not deal
    easily with letters of the alphabet; instead it used flight numbers
    to identify aircraft.  Soon after its introduction, three aircraft
    were circling over Watford waiting for clearance to land at
    Heathrow, and all three had the same flight number --- BEA flight
    701, Air India flight 701 and Iberia flight 701.  Fortunately, a
    human air-traffic controller noticed in time, before the computer
    had time to give all three aircraft clearance to land at once.

    What security can one have against computerized disasters?
    Errors very rarely arise in the hardware of a modern computer.
    Almost all are introduced by human error --- by carelessly written
    programs, inaccurate input data or badly conceived systems design.
    Greater emphasis on software development and improved input methods
    would help, but alone are never likely to be enough.  The dangers we
    really need to worry about are exactly the ones we cannot predict,
    and therefore cannot easily guard against.  As a minimum, it seems
    to me, we should insist on all major computer installations being
    designed to `fail softly' by falling back to a degraded state of
    operation rather than collapsing catastrophically.  In the case of
    chemical plants, nuclear power stations, or medical intensive care
    units, we should insist that the control function is so designed
    that it can if necessary be taken over by a human operator in the
    event of a computer breakdown.  Failing that, a completely
    independent `stand-by' system, with its own power supply, should be
    installed.  These may seem expensive safeguards, but they are cheap
    compared with the possible costs of ignoring them.

    Unfortunately, there are at the moment no agreed standards to
    which computer systems are expected to conform.  The industry is
    new, and growing fast, and nobody has yet taken the time to set up
    proper disciplinary agencies.  Even when these are established, they
    are unlikely to be wholly effective, if the experience of other
    industries is any guide.  Most probably, a better safeguard would be
    to `professionalize' the computer programmers by establishing
    learned institutions like those set up in the nineteenth century by
    the engineers.  The British Computer Society is, indeed, trying to
    turn itself into such a body, although it still has a long way to go.

       [And these topics have been discussed in the ACM Software Engineering
       Notes now in volume 22, and RISKS for 11.5 years, ever since.
       However, I do not recall this book being mentioned before.  PGN]


Leap-Year software bug gives "Million-dollar glitch"

Jim Towler <jtowler@csi.co.nz>
Wed, 8 Jan 1997 09:47:40 +1300
Too many people are suggesting that new programs are all OK and it's only the "old
mainframe stuff" that will have problems with "Year 2000".  Well, people are
still writing code with buggy date logic.  Jim Towler, Wellington, New
Zealand

  Million-dollar glitch ("The Dominion" — Wellington, New Zealand,
  8 Jan 1997) via NZPA [New Zealand Press Assoc.]

A computer glitch at the Tiwai Pt [place in South Island of New Zealand]
aluminium smelter at midnight on New Year's Eve has left a repair bill of
more than $1 million [New Zealand Dollars].  Production in all the smelting
potlines ground to a halt at the stroke of midnight when the computers shut
down simultaneously and without warning.  New Zealand Aluminium Smelters'
general manager David Brewer said the failure was traced to a faulty
computer software programme, which failed to account for 1996 being a leap
year.  The computer was not programmed to handle the 366th day of the year,
he said.  "Each of the 660 process control computers hung up simultaneously
at midnight," Mr Brewer said.

The same problem occurred two hours later at Comalco's Bell Bay smelter, in
Tasmania [Australia].  New Zealand is two hours ahead of Tasmania.  Both
smelters use the same programme, which was written by Comalco computer
staff.

Mr Brewer said the cause was difficult to trace and it was not till a
telephone call in the morning from Bell Bay that the leap year link was
made.  "It was a complicated problem and it took quite some time to find out
just what caused it."

Tiwai staff rallied through the night to operate the potlines manually and
try to find the cause.  The glitch was fixed and normal production restored
by midafternoon.  However, by then, the damage had been done.  Without the
computers to regulate temperatures inside the pot cells, five cells
over-heated and were damaged beyond repair.  Mr Brewer said they would have
to be replaced at a cost of more than $1 million.
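
The article gives no details of the Comalco code itself, so purely as an
illustration, here is a minimal sketch in Python (with hypothetical names) of
one common way a per-day table sized for 365 entries can fail at the stroke of
midnight going into the 366th day of a leap year:

  import datetime

  # Hypothetical per-day schedule table, wrongly sized for a non-leap year.
  daily_setpoints = [0.0] * 365                  # valid indices 0..364 only

  def setpoint_for(date):
      day_index = date.timetuple().tm_yday - 1   # 0-based day of the year
      return daily_setpoints[day_index]          # fails on day 366 (index 365)

  # 31 Dec 1996 is day 366 of a leap year; the lookup raises IndexError,
  # and a control loop with no handler for that error simply stops.
  try:
      setpoint_for(datetime.date(1996, 12, 31))
  except IndexError:
      print("day-366 lookup failed at midnight, 31 Dec 1996")

The same table works on every day of 1995 and 1997, which is exactly why such
bugs lie dormant until the calendar trips them.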

  [Reminder: I generally do *not* use a ``spellchecker'' for Britishisms
  (including Kiwi-isms) such as aluminium vs aluminum, honour, etc.
  But here we clearly needed a ``smeltchecker'' for our Smeltschmerz.  PGN]


VISA fines banks with Y2K problems

Lloyd Wood <L.Wood@surrey.ac.uk>
Tue, 7 Jan 1997 11:46:34 +0000 (GMT)
The article below appeared in The Times (London, UK), Mon 6 Jan 1997,
Business News p41, first column.  The article is probably available on The
Times' website at <URL:http://www.the-times.co.uk/> (registration is
required), but a quick hunt didn't turn it up.

BANKS FACE FINES OVER VISA CARD PROBLEMS, Fraser Nelson

Visa, the world's largest credit card company, is preparing to impose a fine
of up to UKP100,000 per month on some of its member banks in a last-ditch
attempt to ensure that they will accept credit cards with expiry dates
extending into the new millennium.

The company, itself a consortium of 20,000 banks, is launching the penalty
system a year after its first deadline for Year 2000 compliance.  It
estimated that 1.3 million outlets worldwide are still unable to deal with
cards with expiry dates reading '00'. Britain is believed to account for
only 40,000 of the faulty terminals.

After April, banks that have problems processing Visa's cards will be
charged between UKP600 and UKP100,000 per month, depending on volume, until
they correct the bug.

Visa says that 90 per cent of terminals accept the new cards but an
unacceptably high number still throw up an error when told a card was issued
in '97 and expires in '00. Jim Dickie, vice president of Visa's operations
and services in Europe, said the move was the next logical step to safeguard
the credit card's brand name.

Year 2000 compliance is the first of three upheavals Visa faces over
the next three years. The cards are also to have built-in microchips,
and European monetary union will require further upgrades.
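
Visa's terminal code is of course not public, but as a sketch only, the
classic mistake is a bare two-digit comparison, under which a card issued in
'97 and expiring in '00 appears to have expired decades ago; a sliding
"window" is one common repair:

  def expired_naive(today_yy, today_mm, exp_yy, exp_mm):
      # Two-digit years compared directly: '00' sorts before '97'.
      return (exp_yy, exp_mm) < (today_yy, today_mm)

  def expired_windowed(today_yy, today_mm, exp_yy, exp_mm):
      # Map 00-49 to 2000-2049 and 50-99 to 1950-1999 before comparing.
      def full(yy):
          return 2000 + yy if yy < 50 else 1900 + yy
      return (full(exp_yy), exp_mm) < (full(today_yy), today_mm)

  # A card presented in January 1997 with expiry 03/00:
  print(expired_naive(97, 1, 0, 3))      # True: wrongly rejected as expired
  print(expired_windowed(97, 1, 0, 3))   # False: accepted

The window boundaries above are an assumption; the point is only that the
comparison must be made on full years.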

  [My Visa card expires in March. I wonder if I'll encounter any of those
  'only 40,000' terminals this year? Convincing shop assistants that my card
  isn't stolen could be an interesting experience.  L.
  <URL:http://www.sat-net.com/L.Wood/><L.Wood@ieee.org> +44-1483-300800x3435]

    [And don't forget the scam this engendered, noted in RISKS-18.68.  PGN]


Y2K: Blessing in Disguise

Mark Brader <msb@sq.com>
Wed, 1 Jan 1997 18:55:50 GMT
I was in the Los Angeles area on October 20 and clipped this from the
*Times* [presumably LA, rather than NY, but I don't know Mark's habits.
PGN] that day, but forgot to send it along until now.  Perhaps it's more
appropriate on New Year's Day anyway:

|  Your Oct. 16 editorial ("The IRS Wages War on the Millennium Bug")
|  misses the point.  The "Year 2000" problem is a blessing in disguise.
|  Thanks to our computers we can relive the 20th century.  Think of
|  the mistakes we can correct this time around.  We can prevent two
|  world wars, find Amelia Earhart, vote for Hubert Humphrey in 1968,
|  catch the Unabomber before he strikes and tell young Jeffrey Maier
|  to stay in his seat!
|    Bill Smart, Santa Barbara, letter to the editor

  [Disguise (DisGuy's?) the Limit.
  And remember, tomorrow today will be yesterday --
  at least until 31 Dec 99.  PGN]


Another privacy bug in Netscape

Kevin McCurley <mccurley@swcp.com>
Sun, 5 Jan 1997 13:45:19 -0700
Version 2.0 of Netscape Navigator had a bug that allowed web sites to
"steal" your e-mail address when you visited the page (see
http://itu.rdg.ac.uk/misc/Mailing_Lists/cpd/00000002.htm).  That bug was
fixed in Version 2.02 by trying to require that the user approve any mail
that is sent out from their machine.  Unfortunately, a new bug has been
discovered in Netscape 3.0, 3.01, and 4.0b1 that once again allows a web
site to steal addresses from browsers without the consent of the user.  A
satirical demonstration of this is located at
http://www.digicrime.com/noprivacy.html.

Such bugs continue to undermine the public's trust in the Internet.  The
bug itself is simply a programmer error rather than a malicious act.  Far
more insidious is the exploitation of such mechanisms to
steal e-mail addresses for unknown purposes.  In fact the old 2.0 bug is
still in use at a US government site:

  http://www.hr.doe.gov/ucsp/doeucsp.htm

Kevin McCurley


When connectors shouldn't meet

Lauren Weinstein <lauren@vortex.com>
Fri, 3 Jan 1997 15:53 PST
The latest (Jan 1997) issue of "Mix" magazine (a pro-audio trade
publication) points out a potential risk in some new connectors recently
introduced for the audio industry.  Essentially, these are a set of adapters
that convert common audio connectors to the standard three-prong A.C. power
configuration.  The idea is to allow ordinary power extension cords to be
used to carry audio in "emergency" situations when wiring performance venues,
if the "correct" cable isn't available.

Mix properly notes serious concern over the obvious risks of such use,
especially in the hectic wiring situations where mislabeled cables are
common, and a technician thinking he or she was handling an audio cable
might find themselves plugged into A.C. mains current instead.

Lauren, Moderator, PRIVACY Forum  www.vortex.com


Dan Farmer releases real-time security survey

"Betty G. O'Hearn" <betty@infowar.com>
Mon, 06 Jan 1997 00:27:15 -0500
Dan Farmer, security researcher and creator of SATAN, just released the
results of a real-time security survey on the Internet.  It included various
classes of attacks against government sites, 660 banks worldwide, credit
unions, on-line sex businesses, and media organizations.

WWW.InfoWar.Com has posted the complete survey. There is no charge for
complete access to this invaluable, unauthorized survey.

On the home page of www.infowar.com, click the icon "New on Infowar.Com" and
scroll down.  The survey is listed under 2 Jan 1997.

  [SATANic curses?  PGN]


Let UPS publish your signature on the Net

<hall@alvoid.research.att.com>
Mon, 6 Jan 1997 14:31:48 -0800 (PST)
I noticed a UPS (United Parcel Service) TV commercial this weekend that
advertises the capability for one to download an image of the signature of
the person who signed for a package.  It seems like a matter of a few
minutes of hacking for a reasonably clever forger to gain the ability to
sign checks and credit card slips (particularly those you sign and mail in)
on behalf of anyone who signs for a UPS package.  Even if there is tight
network security so that only the package sender can see the signature,
nothing stops a would-be forger from UPS-ing a bogus package to the person
whose credit card or checkbook s/he stole.

To the extent that handwritten signatures are used as a security measure any
more (which is debatable, of course), publishing them in a computer-readable
medium seems to me like an unnecessary risk with little or no compensating
advantage.

-- Bob

  [Ah, yes, we have been around on this one before.
  See RISKS-11.42, 11.71, and 15.29.  PGN]


Easy answers... (PGN, RISKS-18.72)

Steve Hand <sassth@unx.sas.com>
Mon, 6 Jan 1997 14:16:03 -0500 (EST)
> The moral lesson is a familiar one in RISKS: Easy answers are risky.
> Complex solutions are also risky.  On the other hand, even moderate
> solutions are risky, as your immoderate moderator repeatedly points out.

And, as newspaper editor H.L. Mencken said, "For every complex question
there is a simple answer, and it is wrong."

Steven Hand   sassth@unx.sas.com
(919) 677-8000 ext. 6936 (work)   847-9354 (home)


April 1 considered harmful

William J. Evans <wje@netcom.com>
Mon, 30 Dec 1996 13:29:58 -0800
Our esteemed PGN commented with respect to the alleged PENPAL virus:
:  Too bad we were not approaching 1 April instead of 1 Jan.

We need to address the risks involved in even _having_ a 1 April in the
calendar.  What if a powerful newbie takes a 1 April prank seriously, and
dives in to "fix" something?  What are the risks there?

Alternatively, what are the risks of expanding March to 32 days and starting
April with 2 April?

Would that be any worse than the upcoming Y2K?

Sorry, but I couldn't wait three months to submit this.  Too bad we're
not approaching 1 April instead of 1 Jan.

Bill Evans/Box 4829/Irvine, California 92716/(714)551-2766  wje@acm.org


Re: Do Not Attempt to use Airplane as Submarine? (Brader, RISKS-18.72)

<Sam.Lepore@ncal.kaiperm.org>
Tue, 31 Dec 1996 12:08 PST
> The airport is on an island just barely above sea level; presumably, then,
> when a high-pressure cell is in the area, its "pressure altitude" is below
> sea level.

As amusing as it may seem, this shows the excellent design of being prepared
for an unlikely event - like flying below sea level. Unlikely, but not
impossible.

This reminds me of an unconfirmed report from many years ago about the first
military aircraft from China Lake/Edwards AFB that flew with an
altimeter-linked autopilot and tried to buzz the deck in Death Valley (280
feet below sea level).  I've heard various stories about what it did: 1) it
tried to land in mid-air; 2) it believed it was about to crash and attempted
to eject the pilot; 3) it nearly flew into the Panamint mountains trying to
avoid crashing at 'sea level'.

  [Mark Brader received an out-of-band message from
     Richard.Black@cl.cam.ac.uk
  reminding him that one of the busiest airports in the world (Amsterdam
  Schiphol) is (like much of the Netherlands) below sea level.  PGN]


'Ghost Trains' evidence of safe design (PGN, RISKS-18.70)

<A.Waugh@mel.dit.csiro.au>
Thu, 2 Jan 1997 15:11:28 EST
Without further details, it is almost certain that the 'ghost train' is a
track circuit failure.

Track circuits are the fundamental building block of railway signaling
systems. Their purpose is to detect the presence of a train in a particular
section of track. Failure to detect a train is certain to eventually cause a
serious accident; the Clapham Junction accident in the UK, for example, was
caused by a wiring error which omitted a track circuit from the controls of
a signal.

All technology fails, of course, so track circuits are designed to indicate
the presence of a train when they fail. This is 'safe' as it holds signals
at danger and prevents any points in the track section from moving.

Totally locking up the system, however, would close the line until the track
circuit was repaired. So there is always a method of granting authority for
trains to pass a signal at danger and *slowly* pass over a track when a
track circuit fails — the 'manual operation' mentioned above. The railways
have traded off a small amount of safety (a small number of collisions have
occurred when Drivers travel too fast under these circumstances) to prevent
total closure of the line.
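
As a purely illustrative sketch (real interlockings use vital relay or
processor logic, not Python), the fail-safe convention described above
amounts to treating anything other than a positively proved clear circuit as
an occupied section:

  def section_occupied(relay_energized):
      # Current fed through the running rails keeps a track relay energized,
      # proving the section clear.  A train's axles short out the current and
      # drop the relay: occupied.  A broken rail bond, lost feed, or failed
      # relay drops it too, so every failure looks like a train, i.e. the
      # 'ghost train' that holds the protecting signals at danger.
      return not relay_energized

  print(section_occupied(relay_energized=False))   # True: train *or* failure

A circuit biased the other way would fail by silently reporting 'clear',
which is precisely the failure mode the design exists to avoid.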

The 'ghost train' in the tale is the result of a deliberate decision as to a
technology failure mode, careful design to bias potential failures into this
safe failure mode, and a careful trade off between safety and closure of the
line. Given the very low accident rate on railways equipped with track
circuits, the 'ghost train' is, in fact, evidence of a Risk success, not a
failure!

andrew waugh

  [We received a ton of e-mail on ghosts, 911, etc., lots of minor
  variants on earlier themes, far too many to include here.  TNX.  PGN]


2nd FMICS Int. Workshop, Call for Papers

Diego Latella <latella@sting.cnuce.cnr.it>
Tue, 7 Jan 1997 11:38:10 +0100
Second International Workshop on Formal Methods for Industrial Critical Systems
  ERCIM - FMICS    CESENA (Italy)    4-5 July 1997     [truncated for RISKS]

The Second International Workshop on Formal Methods for Industrial Critical
Systems will take place in Cesena, close to Bologna (Italy) as a Satellite
Workshop to the 24th International Colloquium on Automata, Languages, and
Programming, ICALP '97.  Workshop page URL:
  http://fdt.cnuce.cnr.it/~latella/FMICS/WS/Cesena97/workshop.html

The aim of these workshops is to provide a forum mainly for, but not limited
to, researchers of ERCIM Sites, interested in the development and use of
Formal Methods in the Industry.  In particular, these workshops should bring
together scientists active in the area of formal methods and willing to
exchange their experience in the industrial usage of these methods. They
also aim at promoting research and development for the improvement of formal
methods and tools with respect to their usage in/interest of industry.
Please notice that the workshop will be held in conjunction with the Second
International Workshop on Advanced Intelligent Networks (AIN'97).

SUBMISSIONS: Authors are invited to send five copies of a full paper (in
English, up to 25 pages) to the Programme Chair: S. Gnesi, CNR -
Ist. Elaborazione dell'Informazione Via S. Maria 46, I56126 Pisa - ITALY by
31 JAN 1997. An electronic version of the paper in .ps format plus an
abstract should also be sent to: fmics@iei.pi.cnr.it
