The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 25 Issue 47

Monday 8 December 2008


Chatsworth Wreck May Be a Safety Failure
Chuck Weinstock
Caltrain computer outage causes extensive schedule disruption
Water pumps failed in Yorba Linda fire
Jim Geissman
Dangerous Precedent Set - Federal Criminal Charges for Violation of Commercial Online ToS?
Stephen via Dave Farber
A cyber-attack alarms the Pentagon
Jerry and Virgil Gligor via David Farber
A secure version of reality
Andy Piper
The recovery features of botnets
Peter Houppermans
Fingerprints in South Africa
Heinz M. Kabutz
Facebook and tracking people
David Magda
How *not* to improve data quality
Richard O'Keefe
Israeli Labor primaries postponed: electronic systems fail
Amos Shapir
Re: Risks of assuming constant hours in a day
Curt Sampson
Workshop on GENI and Security: Call for Participation
Matt Bishop
MiniMetricon call for participation
Fred Cohen
REVIEW: "The History of Information Security", de Leeuw/Bergstra
Rob Slade
Info on RISKS (comp.risks)

Chatsworth Wreck May Be a Safety Failure

<Chuck Weinstock <>>
Mon, 8 Dec 2008 17:59:40 -0500

Off the Newswire at
[Three different spellings of the conductor's name unified.  PGN]

According to the *Los Angeles Times*, conductor Robert Heldenbrand, the lone
surviving crew member of the Sept. 12 Metrolink wreck that killed 25 people,
has told investigators the signal he passed prior to the crash was green.

Trackside signals are programmed to detect the presence of a train or other
equipment within a block and turn red, telling approaching movements to stop
prior to entering. If the signal truly displayed a green aspect when in fact
a Union Pacific freight occupied the block, it would mean a signal
failure. Such failures do occasionally occur, and are known as false clears.
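
The block-signal rule just described can be stated in a few lines. This is a toy sketch of the logic in Python, not any real interlocking code:

```python
# Toy model of the trackside rule described above: an occupied block must
# show red to approaching movements.  A "false clear" is the failure mode
# in which the signal shows green despite the block being occupied.

def signal_aspect(block_occupied: bool) -> str:
    """What the signal SHOULD display for a given block state."""
    return "red" if block_occupied else "green"

def is_false_clear(displayed: str, block_occupied: bool) -> bool:
    """True when the displayed aspect contradicts the occupancy rule."""
    return block_occupied and displayed == "green"

# The condition alleged at Chatsworth: green displayed, block occupied.
print(is_false_clear("green", True))   # True
```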

Heldenbrand's account backs up that of three witnesses on the platform of the
Chatsworth depot, who also say the light was green at the time of the
crash. Still, the National Transportation Safety Board is standing by its
account of the crash, saying the train's engineer, Robert Sanchez, passed a
red signal before slamming head-on into the UP freight. It has, however,
questioned how brightly lit the signal was.

Investigators had earlier asked why Sanchez hadn't called out the signal's
indication on the radio, which Heldenbrand would have been required to repeat
back if the signal wasn't green. But a green signal would have meant the two
would not be required to call the signal.

Caltrain computer outage causes extensive schedule disruption

<"Peter G. Neumann" <>>
Fri, 5 Dec 2008 14:15:50 PST

On 4 Dec 2008, Caltrain (the commuter line between San Francisco and San
Jose/Gilroy) experienced significant delays due to the computer system that
controls the signal system.  The problem began at the end of the morning
rush hour, and was still ongoing during the evening rush hour.  Trains had
to be controlled manually, including bullet trains overtaking local trains.
Delays were further complicated by construction at Palo Alto's California
Avenue station -- which requires only one train at a time in the station
(even though the long-standing problem of having northbound passengers
cross-over the southbound tracks has finally been resolved in the past week,
with the completion of a northbound platform).

Water pumps failed in Yorba Linda fire

<Jim Geissman <>>
Thu, 27 Nov 2008 13:31:15 -0800

The equipment failures caused fire hydrants to run dry in the highest
neighborhood in the city and may explain why firefighters were unable to
protect homes.

Water officials said [on 25 Nov 2008] that pumps designed to push water to
the upper reaches of a hillside Yorba Linda neighborhood failed during a
15 Nov firestorm [so that] firefighters were forced to abandon the area and
let homes burn....

Water officials believe the electric pumps shut off because the fire had
burned through their electrical communication systems. The backup, they
said, failed because of the heat.

Officials said that the reservoir has been on the to-do list since at least
2001, when developer Shapell Industries first submitted plans to build homes
around Hidden Hills. The agency has recognized the need for more reservoirs
in the eastern side of Yorba Linda for 30 years but depends on developers'
building plans to determine where to locate the infrastructure.

[Interesting that residential development occurs before development of the
infrastructure required to support it.]

Dangerous Precedent Set - Federal Criminal Charges for Violation of Commercial Online ToS?

<Stephen <>>
November 27, 2008 12:24:49 PM EST

  [From Dave Farber's IP list.  PGN]

A very dangerous legal precedent was set today.

In the case of the 13-year-old who committed suicide supposedly over a
MySpace hoax, the mother involved was found guilty on three federal counts.
What of? Not of a serious criminal act.

She was found guilty on three criminal counts (misdemeanors), in a federal
court, of violating the Terms of Service agreement. So now you can be
accused, tried, and found guilty of a federal criminal offense not for
breaking a Federal or even a state law, but rather for violating the Terms
of Service of a click-through agreement of a commercial site you go to on
the Web.

"Prosecutors alleged that Drew and her employee violated MySpace's "terms of
service," which prohibit using fraudulent registration information,
obtaining personal information about juvenile members, and using the service
to harass, abuse or harm others...

The verdict underscores the complexities of the case. Some legal experts and
civil liberties groups said a felony conviction would mean that millions of
people who violate the terms of service of the Web sites they visit could
become criminally liable. Experts also said that if violating such terms is
a crime, then the sites that write the agreements essentially could function
as lawmakers or prosecutors." - from *The Washington Post*

"The case was prosecuted under the federal Computer Fraud and Abuse Act,
originally intended to prosecute hackers. Did Lori Drew effectively hack
MySpace for nefarious purposes? Some people think it's quite a stretch.

``This was a very aggressive, if not misguided, theory,'' said Matt Levine,
a New York-based defense attorney and former federal prosecutor.
``Unfortunately, there's not a law that covers every bad thing in the
world. It's a bad idea to use laws that have a very different purpose.''  -
from ZDNet @


[IP] Re: A cyber-attack alarms the Pentagon

<David Farber <>>
Sun, 7 Dec 2008 09:34:29 -0500

From: No-Name <>

I've heard a rumor about the way the worm made its way into the Pentagon
computer network. If true, it was a simple but brilliantly effective method.
Someone infected thumb drives with the worm, then dropped them around the
Pentagon parking lot. Employees picked them up, took them into their
offices, and plugged them into their office computers to determine the owner
of the drive.  Jerry

Date: December 4, 2008 8:36:03 PM EST
From: Virgil Gligor <>
Subject: For IP: A cyber-attack alarms the Pentagon

Cyberwar: The worm turns, *The Economist*, 4 Dec 2008

A cyber-attack alarms the Pentagon

Battlefield bandwidth is low at best, making networks sticky and e-mails
tricky. American soldiers often rely on memory sticks to cart vital data
between computers. Off-duty, they use the same devices to move around music
and photos. The dangers of that have just become apparent with the news that
the Pentagon has banned the use of all portable memory devices because of
the spread of a bit of malicious software called agent.btz.

This is a "worm", meaning that it replicates itself. If you have it on, say,
the memory card of a digital camera it will infect any computer to which you
upload photos. It will then infect any other portable memory plugged into
that computer (the cyber-equivalent, one might say, of a sexually
transmitted disease). On any computer hooked up to the Internet, this
variant tries to download more nasty stuff: in this case two programs that
access the hard-drive. Was it a humdrum crime of trying to steal banking
details? Or something more serious?  The trail has gone cold.

In any case, the malicious software (malware in the jargon) penetrated at
least one classified computer network. The problem was severe enough for
Admiral Michael Mullen, the chairman of the joint chiefs of staff, to brief
George Bush on it. Officials are saying little more than that.

Kimberly Zenz, an expert on cyberwarfare at VeriSign iDefense, a computer
security company that is investigating the attack, notes that it is not
clear that agent.btz was designed specifically to target military networks,
or indeed that it comes from either Russia or China (two countries known to
have state-sponsored cyberwarfare programmes that regularly target American
government computer networks).

Indeed, she says, by the standards of cyberwarfare, agent.btz is pretty
basic; it is a variant of a well-known bit of malware called the SillyFDC
worm, which has been around for at least three years. By contrast, a
government commission warned Congress last month that "since China's current
cyber operations capability is so advanced, it can engage in forms of
cyberwarfare so sophisticated that the United States may be unable to
counteract or even detect the efforts."

The most remarkable feature of the episode may not be the breach of
security, but the cost of dealing with it. In the civilian world, at least
one bank has dealt with agent.btz by blocking all its computers' USB ports
with glue. Every bit of portable memory in the sprawling American military
establishment now needs to be scrubbed clean before it can be used again. In
the meantime, soldiers will find it hard or outright impossible to share,
say, vital digital maps, let alone synch their iPods or exchange pictures
with their families.

A secure version of reality

<Andy Piper <>>
Wed, 03 Dec 2008 12:46:39 +0000

A variation on a common RISKS theme, but one I have come across increasingly
often. More and more websites are enforcing "secure" passwords and, it
seems, security questions. For instance two websites I used recently, which
front real services I use (credit card and pet insurance) enforce passwords
that *must* contain at least one upper-case character, one digit and not
bear any resemblance to your user id. So far, so secure. Unfortunately they
also want an answer to a security question as a reminder. Mother's maiden
name is usually the one I pick, but my mother has a double-barrelled maiden
name and this time I get:

"Answer to security question is invalid, please correct"

Sorry? So the self-evidently correct answer is not, in fact, "correct"
according to the website mafia. OK, move along to the next one, place of
birth. My place of birth is triple-barrelled (one of those quaint English
towns where the name is related to the river that runs through it).

"Answer to place of birth question is invalid, please correct"

Gah! I'm assuming that what the site doesn't like is the hyphens in both
names (or perhaps that they have two names, although for place names that
seems entirely dubious). The problem is that I am beginning to have to enter
information that is not only wrong, but which I have no hope of remembering
correctly 6 months later. Increasingly the same is true for passwords. I use
systems where the above rules are enforced *and* password aging takes place
*and* the use of an old password is prohibited. So now I have to remember
not only information that is incorrect but also a system for encoding
passwords in my brain for these sites. Increasingly that system fails and I
need to recover the password, but the recovery system is also failing me.

Where will this madness end?
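
The failure mode is easy to reproduce. Here is a hypothetical sketch (not any actual site's validation code) of an over-strict answer rule alongside a more forgiving one; the town name below is an illustrative example, not the author's birthplace:

```python
import re

# Over-strict rule many sites appear to use: letters only, so hyphenated
# or multi-word answers are rejected outright.
STRICT = re.compile(r"[A-Za-z]+")

# A more forgiving rule that still excludes digits and control characters
# but admits the hyphens, apostrophes, and spaces real names contain.
PERMISSIVE = re.compile(r"[A-Za-z][A-Za-z' -]*")

def accepted(answer: str, rule: re.Pattern) -> bool:
    return rule.fullmatch(answer) is not None

print(accepted("Smith-Jones", STRICT))         # False: a valid name, rejected
print(accepted("Smith-Jones", PERMISSIVE))     # True
print(accepted("Stoke-on-Trent", PERMISSIVE))  # True: a triple-barrelled town
```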

The recovery features of botnets

<Peter Houppermans <>>
Fri, 28 Nov 2008 11:36:17 +0100

An interesting read; this was a submission by Thomas Lakofski to a mailing list.

Short summary: botnet code contains a semi-random domain name generator,
and the bot contacts the domains it generates for new instructions.  This
makes it impossible to shut down the controlling domains, as the code will
simply switch to a new name - which differs every single day.  Let me put it
another way: a shutdown is virtually impossible without catching the actual
perpetrators.
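
A minimal sketch of such a scheme, assuming a shared secret between bot and botmaster -- this is illustrative only, not the actual algorithm of any real botnet:

```python
import hashlib
from datetime import date

TLDS = [".com", ".net", ".biz", ".info"]

def rendezvous_domains(seed: bytes, day: date, count: int = 4) -> list[str]:
    """Derive the day's candidate command-and-control domains from a
    shared secret and the current date, as a domain-generation bot might.
    Bot and botmaster compute the same list; defenders would have to
    pre-register or block every candidate, every day."""
    domains = []
    for i in range(count):
        material = seed + day.isoformat().encode() + bytes([i])
        label = hashlib.sha256(material).hexdigest()[:12]
        domains.append(label + TLDS[i % len(TLDS)])
    return domains

# The bot tries each candidate in turn; the controller need only have
# registered one of them.  Tomorrow the list is entirely different.
print(rendezvous_domains(b"example-secret", date(2008, 11, 28)))
```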

Fingerprints in South Africa

<"Dr Heinz M. Kabutz" <>>
Fri, 05 Dec 2008 09:57:33 +0200

In South Africa, whenever you apply for an ID book, passport or driver's
license, your fingerprints are taken.  In the past, these were used to
prevent identity theft.  Of course, even that is flawed.  I know someone
whose skin flakes off his fingertips when he is under stress.  He had a lot
of trouble renewing his passport, because his fingerprints did not match
what was on file.

It would be fair to say that in South Africa, the Department of Home Affairs
has a database of almost all the fingerprints in the country.  You cannot do
anything without an ID book.

Now someone has had the "brilliant" idea to open up this database to the
police service, who are notorious for their low conviction rate.  It
horrifies me to think how many false positives there will be!  Imagine being
dragged out of bed at 3am by police special forces for murder and rape,
because your fingerprint biometrically matched a serial killer's.  The level
of corruption at the South African Police Service is well known, so this is
probably some of the worst news I have heard in years.
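
The scale of the false-positive problem follows from simple base-rate arithmetic. The numbers below are assumptions for illustration, not South African statistics:

```python
# Back-of-the-envelope arithmetic with assumed figures: a national
# database of 40 million prints, searched with a matcher whose false-match
# rate is one in 100,000 per comparison (generous for smudged latent
# prints lifted from a crime scene).
database_size = 40_000_000
comparisons_per_false_match = 100_000

expected_false_matches = database_size // comparisons_per_false_match
print(expected_false_matches)  # 400 innocent candidate "hits" per search
```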

I'm sure fellow RISKS readers will agree ...

Dr Heinz M. Kabutz (PhD CompSci), Sun Java Champion,
Author of "The Java(tm) Specialists' Newsletter"

Facebook and tracking people

<"David Magda" <>>
Thu, 27 Nov 2008 16:00:45 -0500 (EST)

Nowadays, once you have someone's name, it can be quite easy to track
them down:

> Leary was left with an unpaid bill for about $520, and little hope of
> recovering his money.
> "It was then I remembered that when the group arrived, one of them had
> asked about one of our waitresses who was not working that night," Leary
> said yesterday.
> The waitress gave him a name and then he thought of Facebook.
> "I searched the name and there he was, large as life," he said. "And he
> was pictured with his girlfriend--the only girl who had been in the
> group.
> "The site also gave me his place of employment, which was handy."

It turns out the individual in question worked at another restaurant, from
which he's now been fired.

How *not* to improve data quality

<"Richard O'Keefe" <>>
Thu, 4 Dec 2008 16:52:21 +1300

A few days ago my wife and I bought a 2nd-hand station wagon to replace a
vehicle that needed repairs for which parts are now hard to obtain.  Part of
the process is to change the registration for the new vehicle.  As we were
dutifully filling in all the details on the form, we came to a box for our
phone number, and being unable to think of any reason why we wouldn't want
the licensing authority to be able to reach us, were about to fill it in.
At that point the dealer stopped us and advised us not to.  The reason?  If
your phone number changes (e.g., you drop your mobile in a toilet and have
to get a new one -- happened to me once), you are legally obliged to tell
the LTSA, but only if you told them your number in the first place.  And
most people never remember to do this.

So a rule "You must tell us if your phone number changes" that
was presumably intended to ensure that the data would be more
accurate has had the effect of making the data mostly unavailable.

Israeli Labor primaries postponed: electronic systems fail

<Amos Shapir <>>
Mon, 8 Dec 2008 17:12:14 +0200

This is primary elections season in Israel, when political parties vote for
their nominees for the upcoming general elections in February.  The Labor
party tried a fully electronic voting scheme which failed miserably, forcing
them to delay the election by several days and go back to the old fashioned
paper ballot method.

Full story at:,7340,L-3631836,00.html

Despite that, today the Likud right-wing party also held its primary
elections electronically, and also ran into some trouble, though nothing as
severe.
Full story at:,7340,L-3635022,00.html

Unfortunately, there aren't many technical details available about the
systems involved.

Re: Risks of assuming constant hours in a day (Toby, RISKS-25.45)

<Curt Sampson <>>
Sat, 6 Dec 2008 16:49:22 +0900

> I made the (altogether reasonable, I thought) assumption that if you take a
> timestamp and add 24 hours, it becomes the same time on the following day.
> Well, not always.  Such as when clocks change for Daylight Savings Time.

And that's only the start. As we're all well aware, you can't even add sixty
seconds to a time and assume it's then in the next minute.

We normally don't think much about this, but lately I've been doing a lot of
programming in Haskell, and the folks who built the libraries, being much too
smart for their own (or, anyway, my) good, make this clear, not to mention
push it a bit in your face.

Actually, though, as well as being more than a little pedantic, this is I
think a fairly brilliant example of good risk management: by forcing
programmers to stop and make a choice between a DiffTime (which includes
leap seconds) and a NominalDiffTime (which more or less pretends they don't
exist), it also forces them to think for a moment about what exactly they're
doing.
So kudos to Ashley Yakeley, the rather smart author of this library.
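
The daylight-saving pitfall in the quoted report is easy to reproduce. Here is a sketch in Python rather than Haskell, for brevity; it assumes the zoneinfo time-zone database is available, and any zone that observes DST would do:

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo  # tz database; Python 3.9+

tz = ZoneInfo("America/Los_Angeles")

# Noon on the day before US clocks fell back in autumn 2008.
start = datetime(2008, 11, 1, 12, 0, tzinfo=tz)
later = start + timedelta(hours=24)

# Wall-clock arithmetic reads "the same time on the following day" ...
print(later)  # 2008-11-02 12:00:00-08:00

# ... yet 25 real hours have elapsed, because DST ended overnight.
elapsed = later.astimezone(timezone.utc) - start.astimezone(timezone.utc)
print(elapsed == timedelta(hours=25))  # True
```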

Curt Sampson <> +81 90 7737 2974  Functional
programming in all senses of the word:

Workshop on GENI and Security: Call for Participation

<Matt Bishop <>>
Thu, 4 Dec 2008 13:36:48 -0800

                       Workshop on GENI and Security
                             January 22-23, 2009
                            Davis, California, USA

The Global Environment for Network Innovations (GENI) is a suite of network
research infrastructures now in its design and prototyping phase. It is
sponsored by the National Science Foundation to support experimental
research in network science and engineering. The goal of this workshop is to
engage the security community in GENI's design and prototyping, to ensure
that security issues are properly considered during its development.

First, what classes of security experiments should GENI support? What
capabilities will GENI require to allow the conduct of these experiments?
Second, how can GENI itself be adequately secured and protected from attack?
What forms of authentication, authorization, and accountability would be
most appropriate?

As the GENI Project Office expects to issue its 2nd solicitation for GENI
analysis and prototyping subcontracts in the middle of December, with
proposals due in mid-February, it is anticipated that topics discussed at
the workshop will lead to proposals from the security community.

Participation. We invite short (1 paragraph preferably; at most 1 page)
statements of ideas addressing these two issues. Submit your statement to by December 18. Please use either PDF or text.

For more information:

Matt Bishop, UC Davis

MiniMetricon call for participation

<Fred Cohen <>>
Wed, 3 Dec 2008 16:21:03 -0800

For those interested in security metrics, here is the link to the Call for
Participation for MiniMetricon 3.5.  [20 April 2009, Moscone Center SF]

Fred Cohen & Associates 572 Leona Drive Livermore, CA 94550 1-925-454-0171

  [Frequent RISKS contributors Fred and Jeremy Epstein are on the PC.  PGN]

REVIEW: "The History of Information Security", de Leeuw/Bergstra

<Rob Slade <>>
Thu, 4 Dec 2008 11:03:07 -0800

BKHISCCH.RVW   20081020

"The History of Information Security", Karl de Leeuw/Jan Bergstra,
2007, 978-0-444-51608-4
%E   Karl de Leeuw
%E   Jan Bergstra
%C   256 Banbury Road, Oxford, OX2 7DH
%D   2007
%G   978-0-444-51608-4
%I   Elsevier Advanced Technology
%O   +44 865 512242 Fax: +44 865 310981
%O   Audience i Tech 1 Writing 2 (see revfaq.htm for explanation)
%P   887 p.
%T   "The History of Information Security: A Comprehensive Handbook"

Chapter one, which stands in for an introduction to the papers in this
volume, already notes that the title is inaccurate.  The editor admits
that this work is not a history, as such, but an overview from the
perspective of different disciplines related to information security,
taking a historical approach in examining the socio-political shaping
of infosec.  The authors ask whether technology influenced public
policy and politics, and look for information security strategies (or
the lack thereof) in politics.  I found the selection of references
disquieting, noting that the editor responsible for the choice of
papers complained that there was no historical material addressing
industrial espionage, administrative practices, disruption of
communications with criminal intent, or other areas.  No mention is
made, in the references, to the works of Stamp (cf. BKINSCPP.RVW),
Winkler (cf. BKCRPESP.RVW, BKSPAMUS.RVW), or Denning (cf.
BKDENING.RVW) to name just a few.

I can agree with the emphasis on social aspects of security: security
is, and always has been, a people problem.  Information security,
however, necessarily involves technology, and the authors of most of
the papers included in this collection have concentrated so much on
history (mostly in the form of dates and political rivalries) that the
questions of influence of technology on politics, or politics on
technology, can't really be analyzed.  Additionally, enormous topical
areas relevant to information security (such as risk management,
intrusion detection, cryptographic infrastructure (PKI), physical
security, computer architecture, application development, and malware)
are notable by their absence.

Part one addresses intellectual property.  Essay subjects include
various forms of censorship and self-censorship (with no mention of
the "full disclosure" debate), the German patent system, copyright,
and the application of copyright and patent to software.

Part two looks at items related to identity management, with a highly
abstract and impractical philosophy of identity, notes on document
security, a review of identity cards, and a recent history of

Although entitled "Communications Security," part three is about
cryptography.  The papers on Renaissance (1400-1650) and Dutch (up to
1800) cryptography, British postal interception up until the 1700s,
the KGB crypto office, and the NSA (US National Security Agency) are
of primarily political interest.  The articles on rotor cryptography,
Colossus, and the Hagelin machines have points of curiosity, but are
still very thin on technical details.  A final essay attempts a very
terse overview of modern cryptographic concepts.

Computer security is in part four.  Early US military evaluation
standards, some of the basic formal information security models, an
academic look at application security and auditing, a rough division
of recent information technology into decade "periods," an equally
unpolished history of Internet security, and a scattered review of
computer crime make up this section.

For some reason questions of privacy and regulations governing the
export of cryptography are seen to fit together in part five.  Three
papers present US cryptographic export restrictions, a random and not
completely successful attempt to define privacy, and various US
undertakings at regulating the use of encryption.

Part five can't have been lumped together simply due to a lack of
articles, since part six is a single piece providing a limited and
incomplete overview of information warfare.

As a book this volume is disappointing.  It is not "a history," merely
a collection of papers, with little structure or linkage.  The topics
relate to security, but a work on infosec should have more technical
content and understanding.  It is certainly not comprehensive.  And,
at several kilograms in weight, it bears little resemblance to a handbook.

That said, a number of the essays do provide interesting historical
points, anecdotes, and references.  Therefore, those with the stamina
to work through the material may be rewarded with historical nuggets,
and pointers to further sources of information.

copyright Robert M. Slade, 2008   BKHISCCH.RVW   20081020
