The RISKS Digest
Volume 24 Issue 61

Saturday, 31st March 2007

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator



Risks of Virtual Professionalism, Jim Horning
Quantum Security
Rob Slade
Time-handling bug leads to lost time machine
Alaska Government worker formats wrong disks, backups unreadable
Latent software risk in aircraft control systems
Mike Martin
Brazil software ATC failure
More railroad-related unintended risks
Satellite Navigation may be Hazardous to your Life Of Crime
Mark Brader
NEDAP, the Dutch chess-playing voting machine
Mark E. Smith
Typing saves your skin
Peter B. Ladkin
Proving NON copyright infringement
Joseph A. Dellinger
A parable about the state of the Web
Andrew Koenig
Hotel door locks that are too secure
Kevin Fu
Intuit's Amazing Web Pricing Roulette
Lauren Weinstein
Re: When security software goes bad...
Rick Damiani
Two-step authentication
Marc Auslander
Info on RISKS (comp.risks)

Risks of Virtual Professionalism, Jim Horning

<"Peter G. Neumann" <>>
Sun, 1 Apr 2007 00:00:03 GMT

Long-time RISKS contributor Jim Horning has written an outstanding Inside
Risks column for the *Communications of the ACM, 50,* 4, April 2007 on
*Risks of Virtual Professionalism*.  It raises issues of licensing
software engineers and of legal jurisdiction on the Internet.  Jim is a
member of the ACM Committee on Computers and Public Policy (CCPP, the
sponsor of the ACM Risks Digest) and has written several previous columns
for Inside Risks.

Although I cannot run his article here without violating CACM Copyright,
you can find it online for your own personal interest:
or with accelerated access:

Jim's article is the 202nd in the remarkable ongoing monthly
series.  Previous columns are also online:
including Jim's April 2004 column:

Quantum Security

<Rob Slade <>>
Sun, 1 Apr 2007 00:14:10 -0000

Quantum computing is a field of research based upon the notion of quantum
entities known as qubits.  Unlike the classical computer bit, which can
exist in either a one or zero state, qubits can exist in a superposition of
both states simultaneously, and possibly more.  This may (or may not) enable
us to create new computer architectures which can (or can't) provide new
computing capabilities.
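
For readers who want to play with the idea, a toy state-vector simulation
(nothing here is real quantum hardware, just complex-number bookkeeping) shows
how a qubit holds amplitudes for both classical states at once until measured:

```python
import math
import random

# A qubit as a pair of complex amplitudes (a, b) for |0> and |1>,
# with |a|^2 + |b|^2 = 1.  A Hadamard gate puts |0> into an equal
# superposition; measurement collapses it to a classical bit.

def hadamard(state):
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure(state):
    a, b = state
    p0 = abs(a) ** 2              # probability of reading a 0
    return 0 if random.random() < p0 else 1

zero = (1 + 0j, 0 + 0j)           # the classical |0> state
plus = hadamard(zero)             # "both at once": amplitudes ~(0.707, 0.707)

counts = [0, 0]
for _ in range(10000):
    counts[measure(plus)] += 1
# counts comes out roughly [5000, 5000]: each readout is a definite 0 or 1,
# even though the state itself was neither.
```
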

The ability for a qubit to hold both one and zero states simultaneously
implies that quantum computer architectures will be able to compute all
possible|each possible|every possible|all feasible|each feasible|every
feasible|all viable|each viable|every viable|all conceivable|each
conceivable|every conceivable|all imaginable|each imaginable|every
imaginable value for a given problem at once (or not).

Given this new and powerful computer architecture, we may (or may not) be
able to perform computations of NP-complete, non-convergent, or least path
problems in less than exponential time.  This has significant implications
for risk analysis and management.  Possibly the greatest risk is in pursuing
a technology which may never produce a real effect.  However, on February
13th of this year, a Canadian company demonstrated a device which is the
largest quantum computer built to date (or not).

The superposition factor of computing all possible values holds promise in
terms of encryption, but the relation to encryption does not end there.
Using the quantum phenomenon of entanglement, the sender can determine
whether or not a third party is reading transmissions.  (I wonder if anyone
is reading this?)  Unfortunately, the concepts of quantum encryption, and
quantum computing, although they use different technologies (or not), are
entangled in the public mind.

I have, as it happens, been working on a paper (for the next ISMH) on the
security implications of quantum computing.  At the moment, the paper is in
a superposition state of being written and not written.  (Until an observer
looks at it, have I really written the paper?)

Returning to the topic of risk management, quantum devices may be able to
compute, via an assessment of the lowest energy state, the optimum configu

Oh, I'm too tired to finish this off ...

Time-handling bug leads to lost time machine

<David <>>
Sun, 1 Apr 2007 00:34:47 -0000

(Inspired by the recent F22 Raptor computer crash crossing the
180-degree longitude...)

I built a working time machine and sent it back to the year 1500.  It was
supposed to take some pictures.  I was planning on retrieving it from 2007
immediately afterwards.  Unfortunately, I messed up the code for the
Gregorian calendar year adjustment and now I've lost it!  Help!
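
For anyone sending hardware back before 1582: the secular difference between
the Julian and proleptic Gregorian calendars follows a standard formula (valid
from 1 March of each century year onward), sketched here for illustration:

```python
# Days by which the Julian calendar trails the (proleptic) Gregorian
# calendar: the Julian adds a leap day every 4 years, while the
# Gregorian skips 3 of every 4 century years.

def julian_gregorian_offset(year):
    c = year // 100
    return c - c // 4 - 2

# A time machine aimed at a 1500 date without this adjustment would
# arrive 10 days off target -- more than enough to lose the pictures.
print(julian_gregorian_offset(1500))  # 10
print(julian_gregorian_offset(1582))  # 10: the days dropped in the reform
print(julian_gregorian_offset(1900))  # 13
```
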

Alaska Government worker formats wrong disks, backups unreadable

<"Peter G. Neumann" <>>
Tue, 20 Mar 2007 14:49:52 PDT

A computer technician accidentally wiped out Alaska's huge data file (and
the backup disk) containing nine months worth of information on the annual
payout from the state fund (reportedly worth \$38 billion) that pays
dividends to Alaskans out of the oil revenues.  Seventy people had to work
overtime for six weeks to re-enter the lost data from 300 boxes of paper.
The error cost the state \$220,000 in overtime and consultants.  [Source:
CNN, 20 Mar 2007; PGN-ed, with thanks to Lauren Weinstein.]

[F. John Reinke also spotted this one (
and commented:
  Gooferment IT at its best. Great design and architecture. How come the
  only two copies of the data were in the same time zone? Where was security
  that one "custodian" could access both copies? Where was IT Leadership
  that had processes and procedures that could fail so miserably?  An
  interesting object lesson. In business, there would be terminations for
  all involved.  FJR]

Latent software risk in aircraft control systems

<"mike martin" <>>
Fri, 16 Mar 2007 11:57:14 +1100

On 1 August 2005, shortly after departing from Perth, Australia, bound for
Kuala Lumpur, Malaysia, a Boeing B777-200 passenger aircraft suffered a
flight upset while climbing through 38,000 feet. It began when the aircraft
spontaneously pitched sharply upward, reaching 41,000 feet and activating
stall warnings. After pilots regained control they returned to Perth.

The incident was triggered by a second accelerometer failure in the
aircraft's air data inertial reference unit (ADIRU). This unit is designed
to be highly redundant and fault-tolerant but the first failed
accelerometer's failure mode was not one that had been anticipated during
unit design and development. (It had been assumed that a failure would
always result in zero voltage output, but this failed device was producing a
high output value.) The twin failures exposed a latent software fault, which
resulted in the unit feeding incorrect aircraft acceleration data to other
flight control systems.

Boeing B777-200 aircraft first entered service in 1995 and this is the first
reported instance of the particular software fault, which was apparently
present in the unit's original design, affecting operation of an aircraft.
The incident highlights the fact that software testing can never eliminate
all risk.

The Australian Transport Safety Bureau's investigation report is at

Brazil software ATC failure

<"Peter G. Neumann" <>>
Tue, 20 Mar 2007 14:49:52 PDT

The Brasilia air-traffic control center suffered a communications failure
(apparently due to software), and a subsequent power failure at the airport,
combined with unusually heavy rains.  Flights were disrupted over the
weekend and on into Tuesday.  Earlier outages occurred during the Christmas
holidays.  (The worst Brazilian air disaster occurred on 29 Sep 2006 when
a midair collision killed 154.)  [Source: AP item, 19 Mar 2007; PGN-ed]

More railroad-related unintended risks

<"Peter G. Neumann" <>>
Thu, 22 Mar 2007 14:11:17 PDT

Here is something sort of similar to last fall's item on flat wheels and
slippery rails (RISKS-24.47,51-53).  In this case, locomotives have
difficulties reinitializing themselves after the computer system controlling
brakes, signals and throttle occasionally lost power.  Apparently control
signals were arriving out of order.  NJ Transit officials attributed the
failures to breaking in new engines (PL42s).  Reported NJ Transit late train
data looks like this:

  Year Delays
  ---- -----
  2002  400
  2003  680
  2004  653
  2005  732
  2006  830

[Source: David A. Michaels, Computer glitches causing delays for
NJ Transit, *The Record*, 27 Feb 2007; PGN-ed]

An added irony to this is that these trains run to Hoboken, NJ,
where Col. Stevens was a pioneer in the development of the steamboat, and by
1825 he had designed the first American-built steam locomotive (on the site
of Stevens Institute).

Satellite Navigation may be Hazardous to your Life Of Crime

< (Mark Brader)>
Tue, 20 Mar 2007 16:09:30 -0400 (EDT)

According to police, a man and a woman stole a Toyota Highlander SUV in
the Toronto suburb of Newmarket, planning to drive it to Alberta, but
relied on the GPS dashboard device for directions for the trip.  It duly
gave them the shortest route to Alberta — one passing south of Lake
Huron.  So when the license number was routinely checked at the US border,
the couple were arrested.  Allegedly they did not realize where they were
until they reached the approach to the international bridge at Sarnia, by
which point it was too late to turn back.  A total of four people are now
charged in a series of 70
vehicle break-ins in Newmarket in February and March.

(From today's Toronto Star:
Mark Brader, Toronto,

NEDAP, the Dutch chess-playing voting machine (Re: RISKS-24.60)

<"Mark E. Smith" <>>
Fri, 16 Mar 2007 17:59:23 -0700

Erling Kristiansen's submission brought to mind my post entitled, "Wish I'd
been wrong department," on Thursday, March 15th, to peoplecount, a
hand-counted paper ballots advocates' mailing list:

On 2/24/07, I wrote:

> When you feed ballots into a machine, neither you nor I nor anyone knows
> whether the machine is counting the votes or whether it is playing chess with
> the guy who will feed it the results he wants as soon as all the ballots
> have passed through it.

On page 24 of the April 2007 issue of Harper's magazine, which arrived in my
mailbox today, is a little article entitled "Rooked."  It says that a Dutch
organization called We Do Not Trust Voting Computers bought two voting
machines to test and found that they were very insecure.  They put out a
statement saying that one machine was so insecure that it "could just as
easily be programmed to play chess as to lie about election results." The
machine manufacturer, Nedap, challenged their claim, so the group actually
programmed the voting machine to play chess.

Typing saves your skin

<"Peter B. Ladkin" <>>
Thu, 29 Mar 2007 10:54:39 +0200

According to a news item from the U.K. Institution of Engineering and
Technology, a team organised by the SANS Institute analysed 7000 detected
security vulnerabilities from 1996 (the item says "the 7000" but does not
explain further how they were identified), and found that 85% of them were caused by
three phenomena:

  * Failure to check user input
  * Allowing buffer overflows (that is, failing to hinder them)
  * Handling integer type checks or overflows incorrectly
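
The three classes are easy to make concrete.  A minimal sketch (the names and
patterns below are invented for illustration, not taken from the SANS
material) shows each failure alongside its checked counterpart:

```python
import re

# 1. Failure to check user input: validate against an explicit pattern
# instead of passing raw strings onward.
def safe_username(raw):
    if not re.fullmatch(r"[A-Za-z0-9_]{1,32}", raw):
        raise ValueError("rejected: bad username")
    return raw

# 2. Buffer overflow: the discipline is an explicit bounds check before
# every write (Python lists raise on their own; C buffers do not).
def store(buf, index, value):
    if not 0 <= index < len(buf):
        raise IndexError("rejected: write outside buffer")
    buf[index] = value

# 3. Integer overflow: Python ints do not wrap, so a 16-bit counter is
# emulated to show the silent C-style failure, then the checked version.
def add_u16_unchecked(a, b):
    return (a + b) & 0xFFFF           # silently wraps, like C

def add_u16_checked(a, b):
    total = a + b
    if total > 0xFFFF:
        raise OverflowError("rejected: 16-bit overflow")
    return total

print(add_u16_unchecked(65535, 1))    # 0 -- the classic wraparound
```
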

SANS spotted an opportunity and put together a course and practical exam
about secure programming, leading to a certificate.

A few observations.

1. Security is not taken as seriously as safety, even though computer
security problems probably cause more total resource damage than
accidents. I have long believed, with others, that the phenomena in both
areas are similar and thus that similar techniques may be used to assure
systems vulnerable to these sorts of phenomena. Devising a threat model is
very similar to hazard identification, but whereas hazard identification is
partly internationally normed, I suspect that people programming software on
networks, especially WWW-based SW, rarely have anything like a professional
engineering qualification or status and maybe do not feel as bound to
discover and adhere to norms that cover their tasks.

It might help to revise international standards on safety to use the word
"dependability" instead of safety, and to use the "specified loss"
formulation of the notion of accident rather than the "physical injury or
death" formulation, and then security vulnerabilities would be covered. Then
again, rather than leading to a higher standard of programming, this might
instead just serve to lower the standard of argument for dependability to be
found in the required documentation.

2. Working in a strongly-typed programming language would have avoided 85%
of the security vulnerabilities discovered (according to some unspecified
criteria) in 1996.
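
The point can be made concrete.  In a strongly typed language, the operations
behind all three vulnerability classes fail loudly instead of silently
corrupting memory or reinterpreting bits, as this small demonstration shows:

```python
# Strong typing in action: out-of-bounds writes, mismatched operand
# types, and out-of-range values each raise a checked exception rather
# than proceeding with garbage.

buf = bytearray(8)

def strongly_typed_demo():
    caught = []
    try:
        buf[99] = 0x41            # out-of-bounds write: checked, raises
    except IndexError:
        caught.append("bounds")
    try:
        "length: " + 42           # type confusion: checked, raises
    except TypeError:
        caught.append("types")
    try:
        buf[0] = 300              # value outside byte range: checked
    except ValueError:
        caught.append("range")
    return caught

print(strongly_typed_demo())      # ['bounds', 'types', 'range']
```

The analogous C code would compile, run, and scribble on memory; here every
one of the three mistakes is stopped at the point of the error.
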

It is astonishing to me that 47 years after strong typing was invented and
recognised, and after the Turing Award has been presented to such proponents
as Dijkstra, Hoare, Wirth, Dahl, Nygaard and Naur, professionals not using
this technology caused 85% of significant errors in a specific area of
computing. I think it is disgraceful.

One could always hope that things have changed in the last 10 years. But
obviously the SANS Institute doesn't think so.

3. The social phenomena in program construction are overwhelmingly more
influential than technical progress. Nothing else could account for
phenomenon 2.

Peter B. Ladkin, Professor of Computer Netw & Distributed Systems, Univ. of
Bielefeld, 33594 Bielefeld, Germany +49(0)521 880 7319

  [The SANS of time are measured with geological-scale egg-timers.  PGN]

Proving NON copyright infringement

<"Joseph A. Dellinger" <>>
Sun, 18 Mar 2007 00:35:39 -0600

Some years ago I created some web pages on geological hazards, being careful
to populate them only with photos I actually took myself.  No risk of a
copyright infringement lawsuit there, right? Wrong!

Over the years, I have several times been asked for permission to use
some of my images.  I built the web pages for fun, so I always said "sure,
just credit me as the photographer, please". Unfortunately, more often than
not, when I've later happened to run across one of my photographs (for
example, on a geological kiosk on the San Francisco waterfront), I have
found that they either failed to credit me, or worse, credited someone else.

Which leads to the risk. A few weeks ago I was told my web pages had been
removed at the request of an author of a textbook on geological hazards. An
image on my website had clearly been illegally scanned from his
textbook. The images were clearly identical. Open and shut case.

Fortunately, the server showed that the offending image
pre-dated the first publication of the textbook. Also fortunately, on the
basis of that piece of evidence the author was willing to hear me out, even
though he swore he remembered personally taking that photo. He tracked down
his original. Turns out it was a "stock" photo provided by his textbook
publisher, and they had gotten their copy from... me. Sloppiness in keeping
track of who owned what combined with the normal failings of human memory
and the passage of time did the rest.

I was lucky in that I had physical evidence to prove I was the original
photographer, and my web host and the source of the complaint were both
willing to listen to my "implausible" story. And now I will make sure that I
keep original unedited source material for anything I make available online,
just in case I need to later prove that my site is the source, not the
infringer. Non-infringing users of sites like "YouTube" are well advised to
do the same!

A parable about the state of the Web

<"Andrew Koenig" <>>
Mon, 26 Mar 2007 08:10:26 -0400

I was browsing through Yahoo! Finance today and encountered an article with
a significant factual error.  How significant?  Judge for yourself.  The
article recommended two mutual funds, claiming that one of them "gives you
exposure to both large cap and small cap companies."  That claim is not
true: The fund in question, VEXMX, covers the entire domestic stock market
EXCEPT FOR the S&P 500, so it has no large-cap coverage at all.  So
investors in this fund would get a significantly different risk profile than
the article would lead one to believe.

I wanted to send them a correction.  So I looked for an e-mail address to use
for that purpose.  Nothing.  But they do let you post comments about the
article--at least I can do that.

Not so fast: To post a comment, you must sign in.  To sign in, you must have
a Yahoo! account.  And to get one of those, you have to agree to this:

	You are responsible for maintaining the confidentiality
	of the password and account and are fully responsible
	for all activities that occur under your password or account.

And this:

	You agree to indemnify and hold Yahoo! and its subsidiaries,
	affiliates, officers, agents, employees, partners and licensors
	harmless from any claim or demand, including reasonable
	attorneys' fees, made by any third party due to or arising
	out of Content you submit, post, transmit or otherwise
	make available through the Service, your use of the Service,
	your connection to the Service, your violation of the TOS,
	or your violation of any rights of another.

And, finally, this:

	You and Yahoo! agree to submit to the personal and exclusive
	jurisdiction of the courts located within the county of
	Santa Clara, California.

In other words: In order to comment on factually incorrect financial advice
given on this website, I have to agree that if anyone steals my password
from their service and uses it to do something they shouldn't, and someone
sues Yahoo! as a result, then I have to pay both my legal expenses and
theirs, and pay any judgment against them if they lose, AND go to California
to defend the suit.

Some situations speak for themselves.

Hotel door locks that are too secure

<Kevin Fu <>>
Sun, 18 Mar 2007 07:14:43 -0400

During a recent stay at the Best Western in Rockville, MD, a long line
formed at the check-in counter.  The desk attendant told me that the new
OPERA property management system was just installed earlier in the week, and
several problems prevented guests from entering their rooms.  Namely:

1. New hotel swipe cards could not be created.
2. The master key was missing.

Problem (1) was apparently on-going throughout the week.  To work around the
problem, the desk attendant used his master key to let each guest into their
room.  That is, each guest was escorted to a room and warned that re-entry
would require the desk attendant's help.  Unfortunately, problem (2) caused
complete chaos because now the desk attendant could not open rooms either.

The master key was missing because, after a shift change, an employee had
accidentally taken it home.

The desk attendant tried multiple master keys to no avail.  The desk
attendant tried frantically to call the Emergency Services number for the
hotel chain, but he only reached voicemail boxes.  There were
representatives from the company that sold the property management software
on site because of the new rollout and employee training, but they were
equally helpless without the master key.

A group of roomless customers (including me) gathered in the bar for free
drinks.  We learned from the bartender that this problem had been going on
all week.  Whenever the Internet goes down, no one can get room keys or
check out (according to the desk clerk).  The system is so secure, that the
rooms are reserved and yet empty.

One customer in the bar had already checked into his room, but was no longer
able to enter because he stepped out for an errand.  His medication was
locked in his room for several hours, but fortunately lack of medication
allowed him to drink beer without complications.

Eventually, the embarrassed desk attendant returned with the master key.
The hotel escorted each customer to their room with the master key, but was
still unable to create room keys for guests.  We were advised to keep one
person in the room at all times to ensure re-entry.

As I sit here nursing a tasteless beer, I wonder about the principles for
designing a safe and secure property management system.  Fail-safe defaults
come to mind, as does fault tolerance.  A simple DoS attack that disturbs
the network would prevent swipe cards from being created.  Being secure is
nice, but backdoors have their applications.
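
One fail-safe design would keep a local cache of room credentials at the front
desk, so a network outage degrades key creation rather than halting it.  A
hypothetical sketch (the PMS interface and names are invented, not the real
OPERA API):

```python
# Hypothetical front-desk key encoder that falls back to the last known
# credential for a room when the central property management system (PMS)
# is unreachable.

class KeyEncoder:
    def __init__(self, pms_fetch):
        self.pms_fetch = pms_fetch    # callable: room -> credential
        self.cache = {}               # last known credential per room

    def make_key(self, room):
        try:
            cred = self.pms_fetch(room)
            self.cache[room] = cred   # refresh the cache while online
        except ConnectionError:
            if room not in self.cache:
                raise RuntimeError("no cached credential; escort guest")
            cred = self.cache[room]   # degraded but functional
        return ("keycard", room, cred)

# Simulate: one fetch while the PMS is up, one during an outage.
state = {"up": True}

def flaky_pms(room):
    if not state["up"]:
        raise ConnectionError("PMS unreachable")
    return "cred-for-" + room

enc = KeyEncoder(flaky_pms)
online_key = enc.make_key("101")      # fetched live from the PMS
state["up"] = False                   # the network outage begins
cached_key = enc.make_key("101")      # served from the local cache
```

A room never seen before the outage still fails, which is exactly the residual
risk the bar full of roomless guests experienced.
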

[This is a write-up of an a-dorable hotel property management system that I
encountered last October.  -KF]

Kevin Fu, Assistant Professor, Computer Science Department, University of
Massachusetts Amherst Ph: 413-545-4006

Intuit's Amazing Web Pricing Roulette

<Lauren Weinstein <>>
Fri, 23 Mar 2007 08:32:32 -0700

                    "Intuit's Amazing Web Pricing Roulette"
                ( )

Greetings.  Earlier this year, on Dave Farber's IP list, I noted my disgust
with Intuit's upgrade pricing policy and related customer service
discussions — what I called "Intuit's 'Bait & Switch'"
( ) — which amounted to no discount
at all if you only wanted the basic Quicken upgrade.

Now it's time for a much more bizarre installment — "Intuit's Amazing Web
Pricing Roulette" ... and if this ends up looking confusing, that's because
it is.

At the present time, depending on exactly how you hit the Intuit Quicken Web
site ( ), you may be presented with different
prices for the same product (in my test cases, Quicken Basic).

In tests so far, I've been offered three different prices:

 — $29.99 (regular retail — typical store price and what I was
            originally told was the only available online price
            whether upgrading or not).

 — $24.89 (with free shipping — worthless if you download the package
            — this one may be difficult to find, so here's proof:

 — $19.99 (the lowest price)

Which of these prices you will see on their Web site appears to depend on a
mix of factors.  Whether or not you say you are upgrading does not seem to
have an effect.

A key issue appears to be your cookie settings.

If your cookies are off, you are likely to see $29.99.  If your cookies are
on, you will most likely be offered $19.99.

In at least some cases, if you try to order at $29.99 with cookies off,
you'll be told to turn cookies on, then you'll see $19.99 after you've done
so.  In other cases, you may find $29.99 (or $24.89) carried down all the
way through the purchase process (here's an example of the high price being
used: ).

I am seeing different results depending on the exact sequencing of pages,
cookies, and Web browser in use (e.g. Firefox vs. IE).
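
The behavior is simple to probe mechanically: fetch the same product page with
and without cookies and compare the quoted prices.  The sketch below uses a
canned fetcher because the real site's URL and markup are not specified here;
the pattern, fetcher, and prices are placeholders reproducing the behavior
described above:

```python
import re

PRICE_RE = re.compile(r"\$(\d+\.\d{2})")

def quoted_price(html):
    m = PRICE_RE.search(html)
    return float(m.group(1)) if m else None

def probe(fetch, url):
    # fetch(url, cookies) -> page HTML; cookies is a dict (enabled) or None
    with_cookies = quoted_price(fetch(url, cookies={}))
    without = quoted_price(fetch(url, cookies=None))
    return with_cookies, without, with_cookies == without

# Canned fetcher mimicking the cookie-dependent pricing described above:
def fake_intuit(url, cookies):
    return ("Quicken Basic: $19.99" if cookies is not None
            else "Quicken Basic: $29.99")

print(probe(fake_intuit, "https://example.invalid/quicken"))
# (19.99, 29.99, False) -- cookie state alone changes the quoted price
```
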

I have not attempted to delineate all possible permutations or the
underlying "rationale" for this behavior, but I would obviously urge extreme
caution in dealing with this site.  +1 (818) 225-2800
Lauren's Blog: DayThink:

Re: When security software goes bad...

<"Rick Damiani" <>>
Sat, 17 Mar 2007 16:04:44 -0700

This is actually the resurfacing of a well-known problem with e-mail
databases (see Microsoft KB253111, KB262374, KB822158, KB893083, etc.).  I
had a similar problem with an older version of Symantec AV running on an
Exchange 5.5 server.  Most of the time it would catch viruses when they
showed up in the 'inbound' folder, preventing Exchange from doing any
processing on the infected e-mail.  One update added a definition for a virus
that had made it through that process and was in the .edb file (Exchange's
database file), so Symantec AV quarantined it.  That crashed the Exchange
server, with predictable results.

The fix (from Microsoft and Symantec) was to replace the .edb file and
exclude that folder from processing.  Later versions of Symantec added the
exclusion on their own. I would say that the root problem is the old Not
Invented Here syndrome leading to a failure to learn from history, but MS
purchased a (very small) AV company rather than develop an AV tool of their
own from scratch. I guess that would be NIH at one remove.

Rick Damiani, Applications Engineer, The Paton Group
California: (310)429-7095  Hawaii: (808)284-3033

Two-step authentication

<"Marc Auslander" <>>
Fri, 16 Mar 2007 20:42:23 -0400

A silly law has forced many financial institutions to implement two-step
authentication.  You know how it works.  You choose a picture and/or phrase.
When you log in, you present your user id, they present the picture/phrase
warning you to check it, and you then provide a password. Of course, you
have to remember a different challenge for each site you use, and remember
which ones use this scheme and which don't.

This is not only useless, it's downright dangerous.

It's useless because the average user who's susceptible to phishing is
unlikely to notice a missing challenge.  Even a sophisticated user is
unlikely to notice, IMHO.  The naive phishing site isn't going to put up a
random picture and tell you to check it, after all.  They'll just skip the
whole thing and hope you don't notice!

But it's worse than that.  A sophisticated phishing site could implement a
simple man-in-the-middle system.  You provide your id, they send it off to
your bank, get back the challenge, and show it to you.  Now you are really
ready to believe you are safe!
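
The relay is mechanical.  A toy model (the bank, proxy, and credentials are
all invented for illustration) shows why the challenge image proves nothing:

```python
# Toy man-in-the-middle relay: the phishing site forwards the victim's
# user id to the real bank, echoes back the genuine challenge image, and
# then harvests the password the reassured victim types in.

class Bank:
    def __init__(self):
        self.challenges = {"alice": "picture-of-a-sailboat"}
        self.passwords = {"alice": "hunter2"}

    def challenge_for(self, user_id):
        return self.challenges[user_id]

    def login(self, user_id, password):
        return self.passwords.get(user_id) == password

class PhishingProxy:
    def __init__(self, real_bank):
        self.bank = real_bank
        self.harvested = []

    def challenge_for(self, user_id):
        # Relay step: the victim sees their *real* challenge image.
        return self.bank.challenge_for(user_id)

    def login(self, user_id, password):
        self.harvested.append((user_id, password))   # credentials stolen
        return self.bank.login(user_id, password)    # victim none the wiser

bank = Bank()
proxy = PhishingProxy(bank)
# The victim checks the picture, sees the right one, and proceeds:
assert proxy.challenge_for("alice") == bank.challenge_for("alice")
proxy.login("alice", "hunter2")
print(proxy.harvested)   # [('alice', 'hunter2')]
```

The challenge authenticates nothing about *who is relaying it*, which is the
whole problem.
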

Whatever the solution to phishing is, it isn't expecting end users to
remember a complicated protocol and notice when it's not quite right.
