The RISKS Digest
Volume 24 Issue 14

Wednesday, 4th January 2006

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

United airlines computer out/r/age
Mark Seiden
Cat dials 911, saves owner
Amos Shapir
System fakes prisoner releases
Peter Scott
Marriott customer data for 200,000 missing
Monty Solomon
Another calendar error
Bruce Stein
Greenpeace donation transfers accidentally multiplied by 100
Nick Rothwell
PDF documents can leak image data
Geoff Kuenning
Re: The drunks may save our election system
Tanner Andrews
Re: Kansas Lottery Picks Same Number Three Nights in a Row
Aaron Emigh
Re: Double compiling for debugging
Ken Knowlton
Never write checks on your birthday
Bob Mehlman
Re: Sat nav systems
Graham Reed
Expedia doesn't understand phishing
Art
False positive on check
F John Reinke
REVIEW: "CyberTerror", R.J. Pineiro
Rob Slade
Info on RISKS (comp.risks)

United airlines computer out/r/age (From Dave Farber's IP)

<mis@seiden.com>
January 4, 2006 3:55:16 PM EST

What, in this day and age, would cause a complete more-than-5-hour outage of
a mission-critical system for an airline?

According to AP and Reuters:

  "Computer Glitch Delays United Air Flights In US, 3 Jan 2006
  United Airlines' domestic flights were delayed up to 90 minutes Tuesday
  night because of an outage in the computer system controlling United's
  check-ins and reservations, which went down for about five hours around 5
  p.m. CST Tuesday.  Passengers were checked manually, and flights were
  delayed up to 90 minutes."  [PGN-ed]

But according to me, who was at LAX yesterday trying to get to Oakland at
5pm on their one-and-only flight, the outage was complete and system-wide.

* No self-check-in kiosks working, reservationists answering the phone with
"our computers are still down", which meant every queue had more than 500
people in it, spilling out on the sidewalk outside the terminal, and they
were using "the manual procedure".  the people close to the head of the
queue had been waiting for more than two hours, they said, and they
dispensed with the special queues for premier or 1k, just to spread the pain
equally.

* They weren't calling out specific flights to try to fill them.

* They had most of the check-in desks empty.  Obviously they don't have
enough people trained in the manual procedure to alleviate the bottleneck.

* The woman working the lines (with a megaphone) was apologetic, but
wouldn't answer questions, not even frequently asked questions that did not
have to do with individual problems, such as "if I miss my last flight will
you provide a hotel?" or "is my ticket now refundable if I fly another
carrier?"

* Some reports are that they were flying planes half-empty because people
couldn't get to the gates.  Of course, they weren't announcing how long they
were holding flights to try to board them.

* TSA, not known for their flexibility, was not allowing people to go to the
gates directly with a boarding pass.  Even an e-ticket receipt with a seat
assignment wouldn't get you there.

United stock is down 2% today, trading at around a buck a share.  Their
earnings are -$43 per share at the moment.  I'll bet this was an expensive
failure.

(As for me, I scooted right over to Southwest, and got out only 1.5 hours
later, but buying a one-way last-minute ticket guarantees you'll get the
dreaded "SSSS" special-screening mark on your boarding pass.)

[IP Archives: http://www.interesting-people.org/archives/interesting-people/]


Cat dials 911, saves owner

<"Amos Shapir" <amos083@hotmail.com>>
Wed, 04 Jan 2006 16:23:00 +0200

See details in http://www.msnbc.msn.com/id/10663270/?GT1=7538 .  (I think
there was a similar report on RISKS a few years back, that time about a
dog).

  [Yes.  For example, The risks of Canadian Poodles using 911, RISKS-15.70.
  PGN]<corrected in archive>


System fakes prisoner releases

<Peter Scott <risks@psdt.com>>
Sat, 31 Dec 2005 17:07:05 -0800

The RISKS archives include several cases of prisoners being erroneously
released by errant computer systems.  This might be the first case of a
system that only pretended to release them.  CNN reports at
  http://us.cnn.com/2005/LAW/12/31/inmate.scare.ap/index.html
that an automated notification system at the Ohio Department of
Rehabilitation and Correction telephoned about 3,000 people the day before
New Year's Eve to inform them of the recent release of a prisoner who had
victimized them or a family member.  Unfortunately - or fortunately,
depending on how charitable you are - that wasn't the truth.  The prisoners
had not been released but were listed in a file accidentally sent to the
contractor that handled notifications.  No word on whether the size of that
file was unusually large.


Marriott customer data for 200,000 missing

<Monty Solomon <monty@roscom.com>>
Wed, 28 Dec 2005 23:10:33 -0500

The timeshare unit of Marriott International Inc. is notifying more than
200,000 people that their personal data are missing after backup computer
tapes disappeared from a Florida office.  The data relate to 206,000
employees, timeshare owners and timeshare customers of Marriott Vacation
Club International, the company said in a statement Tuesday. The computer
tapes were stored in Orlando, where the unit is based.

The company did not say when the tapes disappeared. They contained Social
Security numbers, bank and credit card numbers, according to letters the
company began sending customers on Saturday. ...  [*The Boston Globe*, 28
Dec 2005]

http://www.boston.com/business/articles/2005/12/28/marriott_customer_data_for_200000_missing/


Another calendar error

<Bruce Stein <bruce42@pacbell.net>>
Thu, 29 Dec 2005 18:59:22 -0800 (PST)

Go to http://www.protopage.com .  This is a free site where you can design a
home page for yourself.  There is a calendar in the upper right hand corner.
Hover your cursor on it and it will change to a full calendar for the
current month.  Use the left arrow on this calendar to go back one month.
Continue doing this until you get to January, 2001.  Then go back one more
time.  You are now in December 3900. (!)
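
One classic bug produces exactly this number, so purely as a guess at the
mechanism: code written for the old getYear() convention (years since 1900)
being handed a full four-digit year.  Stepping back from January 2001 gives
December 2000, and 1900 + 2000 = 3900.  A minimal, hypothetical sketch:

  def display_year(year_field):
      # Correct only if year_field is "years since 1900" (e.g. 100 for
      # 2000); wrong if a full four-digit year is passed in.
      return 1900 + year_field

  print(display_year(100))    # 2000 -- what the convention expects
  print(display_year(2000))   # 3900 -- a full year passed in by mistake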


Greenpeace donation transfers accidentally multiplied by 100

<Nick Rothwell <nick@cassiel.com>>
Thu, 29 Dec 2005 23:11:50 +0000

Approximately 10,000 UK supporters of Greenpeace who make regular donations
by direct debit have accidentally had their bank accounts debited by a
hundred times their usual amount, the charity's software having added two
noughts to the latest batch of direct debit demands.
  http://news.bbc.co.uk/1/hi/uk/4567944.stm

I would hazard a guess that some manual intervention was made, perhaps to
update the records for a new calendar year, leading to a mistake by a real
human being rather than "the computer."
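
Whatever the trigger, "adding two noughts" is exactly what falls out when
amounts held internally in minor units (pence) get written to the output
file as if they were already pounds.  A hypothetical sketch only, not the
actual Greenpeace or bank code:

  def build_debit_amount(amount_pence):
      # Correct: convert minor units to pounds before formatting.
      return f"{amount_pence / 100:.2f}"

  def build_debit_amount_buggy(amount_pence):
      # Bug: the pence value is formatted as pounds, so every demand
      # comes out exactly one hundred times too large.
      return f"{amount_pence:.2f}"

  print(build_debit_amount(500))        # "5.00"   -- a 5-pound donation
  print(build_debit_amount_buggy(500))  # "500.00" -- what got debited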

nick rothwell  http://www.cassiel.com

  [A different kind of environmental hazard,
  the Greenpeace dreadnought strikes again.  PGN]


PDF documents can leak image data

<Geoff Kuenning <geoff@cs.hmc.edu>>
04 Jan 2006 02:09:06 -0800

A colleague recently provided me with a PDF of a presentation he created
using Keynote on a Macintosh.  I needed to use some photographs from that
document in a presentation of my own, so I used pdfimages, a public-domain
tool, to extract them.  Imagine my surprise when I discovered several images
that were not apparent in the original, including logos for Yahoo and MSN, a
snapshot of a commercial Web page, and a photograph of some former students.
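
For anyone who wants to run the same check on their own documents, a
minimal sketch follows.  It assumes the pdfimages tool (from xpdf/poppler)
is on the PATH; "slides.pdf" is just a stand-in file name:

  import subprocess
  import tempfile
  from pathlib import Path

  def dump_embedded_images(pdf_path):
      """Extract every embedded image so it can be compared against what
      the pages visibly show."""
      outdir = Path(tempfile.mkdtemp(prefix="pdfimg-"))
      # pdfimages writes the images as outdir/img-000.ppm, img-001.ppm, ...
      subprocess.run(["pdfimages", pdf_path, str(outdir / "img")], check=True)
      return sorted(outdir.glob("img-*"))

  images = dump_embedded_images("slides.pdf")
  print(len(images), "embedded images extracted; compare them against the")
  print("images actually visible when the document is displayed")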

I have not experimented with random files from the Web, so I don't know what
tool is responsible for inserting the inadvertent images in the file,
although it seems to be a classic case of using an existing document as a
template for a new one.  Clearly, however, PDF documents are capable of
carrying images that are not visible to the casual user, and thus risk
leaking information in the same way as Microsoft Word and PowerPoint.

Geoff Kuenning   geoff@cs.hmc.edu   http://www.cs.hmc.edu/~geoff/

  [For example, see RISKS-23.86-88 for the discussion on using PDF to
  redact classified documents.  PGN]


Re: The drunks may save our election system (RISKS-24.14)

<tanner andrews <tanner@payer.org>>
Thu, 29 Dec 2005 09:43:06 -0500 (EST)

db-) [if drunk drivers can see code, why can't voters?]

  ** First, let me be clear that I am not a lawyer.  This
  ** is a political opinion piece, not legal advice.

Distinguish the drunks, who are entitled by law to ``full information'',
State v. Muldowny and Pitts, 871 So.2d 911 (Fla. 5DCA 2004) (discussing
Fla. Stat. 316.1932(1)(f)(4)), from the voters who have no obvious similar
entitlement.

1. Muldowny and Pitts prevailed under a theory that they had
a right to discovery in their respective criminal cases.
The court agreed, criticizing the box as ``a mystical
machine'' in the absence of source: it simply inhaled
breath samples and spat out a report of guilt.

The burden in a criminal case is on the state to show that the machine was
certified.  Because the firmware is an essential component of the machine
(perhaps the single most important, and easiest to change), they were
entitled to see the code and verify that it was as certified.  Failing that,
of course, you can have a ``Wizard of Oz'' effect, where the man behind the
curtain presses a secret button and the machine says ``drunk''.

2. Voter cases are different.  They obviously cannot rely on a discovery
theory as in _Muldowny_ because the ptfs would not be charged with any
crime.  Standing can probably be had by having an affected voter file a
protest; a losing candidate would be the obvious ptf.  However, the barrier
is that the ptf must have knowledge of actual fraud, and must swear to it.

This gives rise to a chicken-and-egg problem.  How is the voter to know of
the fraud without inspecting the machine?  And how is the voter to gain
access to inspect the machine, absent knowledge of fraud?

The _Muldowny_ defs attacked the certification of the machine, in part.  The
statute required that the machine be certified, _Muldowny_ at 913
(discussing Fla. Stat.  316.1932(1)(a)), and material changes would require
new certification.  The defs wanted to show that the machine as used was not
the same as was certified.

The voter ptf will have to show that the use of uncertified equipment
affected the outcome.  Courts are reluctant to overturn elections.
Beckstrom v. Canvassing Board, 707 So.2d 720 (Fla. 1998) (gross negligence,
but no fraud, so affirming result preserving election); Boardman v. Esteva,
323 So.2d 259 (Fla. 1975).

Following _Beckstrom_, the ptf will have to show actual fraud in the
handling of the votes in order to prevail.  This will be a higher hurdle
than it might appear.  In _Beckstrom_, the supervisor of elections allowed
Vogel supporters to ``correct'' ballots that were incorrectly marked for
Beckstrom.  This was held to be gross negligence but not fraud.

I would expect that a pre-load, as was demonstrated in Leon, might qualify
as actual fraud.  A pre-load is where one sets the number of votes for one
candidate to +N and for the other to -N, such that the total is still zero.
The negative count rolls over, of course, during the course of the day.
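
A toy rendition of that pre-load, with made-up numbers and a pretend 16-bit
roll-over counter (the actual Leon demonstration was on real voting
equipment, not this sketch):

  MOD = 2**16                          # pretend the counters are 16-bit
  preload = 20                         # the "N" in the text

  count_a = preload % MOD              # +N for the favoured candidate
  count_b = (-preload) % MOD           # -N, stored as 65516

  # A zero-total check before the polls open sees nothing wrong:
  assert (count_a + count_b) % MOD == 0

  # 100 honest votes for each candidate during the day:
  count_a = (count_a + 100) % MOD
  count_b = (count_b + 100) % MOD      # the negative count rolls over

  print(count_a, count_b)              # 120 vs 80: a 2N swing out of thin air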

3. An alternative theory is to attack under Fla. Stat. 119.07 (Public
Records law).  Ballots are inspectable as public records, though the
conditions of inspection are onerous.  It could be argued, though likely
without success, that the machines' guts are public records as well.

A public record is (1) a record (2) made or received (3) during the course
of official business.  Adv. Op, David Wagner re: Legal Bills,
Fla. AGO-2000-7; Shevin v. Bryan, 379 So.2d 633, 640 (Fla.  1980).
Certainly the ballots qualify on all elements.

It seems likely that the machines are made or received during the course of
official business.  But do they qualify as records?

The supervisor of elections never receives the source code, and I do not
believe that the Department of Elections does either.  It is hard to see it
as a public record on that basis.

Could we at least see the machine code?  I don't think this theory works,
either: if it did, we could all have a copy of Windows for the cost of
reproduction, assuming they use the same at City Hall.

If that theory works, how about embedded devices?  Could we require the road
department to open up and let us dump the code out of computer-based
surveying equipment?

The essential quality of being ``a record'' is missing in these cases.  The
machine code in the voting machine, or in the desktop computer, or in the
surveying equipment, is not a record: it is not the preservation and
transfer of knowledge.  It is more analogous to the power steering arm of a
car: it is there to perform a function, not to convey knowledge; the
engineering knowledge embedded in it is there only for the purpose of
accomplishing the function.

Accordingly, I would not expect a Public Records attack to open up the
source for the machines.

4. The analysis changes if the device uses any GPL code.  In such a case,
delivery of the device necessarily implies delivery of the object code, and
the licensing terms require that copies of the source be made available to
anyone to whom the object is given.

The Supervisor of elections would be entitled, under the GPL, to the source
code of a machine using GPL code in its deliverables.

An entity cannot defeat public records inquiry by reposing custody in a
third party.  Times v. St Pete, 558 So.2d 487 (Fla.  1990).  The interested
person may go to the Supervisor's office and require that a record of that
office be produced.  Such an attack seems likely to prevail, though the
litigation may be expensive and time-consuming.

5. It seems unlikely that a voter could use _Muldowny_ to open up the code
to black box voting machines.  Nor is a general public record challenge
likely to work, unless the machine uses GPL code.


Re: Kansas Lottery Picks Same Number Three Nights in a Row (R-24.13)

<"Aaron Emigh" <aaron-risks@radixlabs.com>>
Wed, 28 Dec 2005 19:44:08 -0800

The article in RISKS-24.13 states that "The odds of winning the lottery are
one in 1,000. The probability that the numbers will be the same three nights
in a row are a staggering one in a billion."  This is off by three orders of
magnitude.

Of course, the odds of drawing the digits 5-0-9, or any other specific
combination, three nights in a row are one in a billion with an honest
random number generator.  But we don't care what number is drawn the first
night.  For a three-peat, we require only that that first night's number,
whatever it is, be drawn again twice.  The odds are one in a million, not
one in a billion.  The observed sequence is a curious fluke, but not
entirely implausible for a properly functioning random number generator.
Many improbable properties can be found in nearly any large dataset...
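
The arithmetic is short enough to check directly:

  from fractions import Fraction

  p_single = Fraction(1, 1000)   # chance of any one specific 3-digit draw

  # A *specific* sequence of three draws (e.g. 5-0-9 all three nights):
  print(p_single ** 3)           # 1/1000000000 -- one in a billion

  # A three-peat: night one can be anything; nights two and three must match:
  print(p_single ** 2)           # 1/1000000 -- one in a million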

  [Also noted by George Kaplan.  PGN]


Re: Double compiling for debugging (Wheeler, RISKS 24.13)

<Ken Knowlton <KCKnowlton@aol.com>>
Thu, 29 Dec 2005 10:34:29 EST

David Wheeler's comments on double compiling (RISKS-24.13) bring to mind a
paper of mine, "A Combination Hardware-Software Debugging System," *IEEE
Trans. Computers*, C-17, 1, Jan 1968, pp 84-86. Briefly:

  Two versions of a program, logically identical, have sections of program
  and data mapped differently into memory; storage is initialized with the
  same sequences of "random" numbers.  The programs are run synchronously.
  The hardware knows which parts of instructions and data — including data
  to be overwritten — should match, and complains when they don't.  Several
  kinds of error are thus detected close on the heels of misbehavior.
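
A toy, software-only rendition of the idea (purely illustrative; the
paper's comparison was done in hardware, in lockstep) maps the same logical
cells to different physical addresses in two copies and then compares them:

  MEM_SIZE = 16

  # Two logically identical versions lay memory out differently
  # (here simply: identity vs. reversed order).
  layout_a = list(range(MEM_SIZE))                 # logical -> physical
  layout_b = list(reversed(range(MEM_SIZE)))

  def init_memory(layout, values):
      # Every *logical* cell gets the same starting value in both copies,
      # even though it lands at a different physical address.
      mem = [0] * MEM_SIZE
      for logical, v in enumerate(values):
          mem[layout[logical]] = v
      return mem

  def run(program, layout, mem):
      # Execute (op, destination, value) steps against one layout.
      for op, dst, val in program:
          if op == "store":          # well-behaved: goes through the mapping
              mem[layout[dst]] = val
          elif op == "rogue_store":  # the bug: writes a raw physical address
              mem[dst] = val

  values = [100 + i for i in range(MEM_SIZE)]      # same fill for both copies
  program = [("store", 3, 111),
             ("rogue_store", 5, 999),              # the misbehaving write
             ("store", 7, 222)]

  mem_a = init_memory(layout_a, values)
  mem_b = init_memory(layout_b, values)
  run(program, layout_a, mem_a)
  run(program, layout_b, mem_b)

  # The "hardware" check: every logical cell should hold the same value in
  # both copies; the rogue store corrupts different logical cells in each,
  # so the mismatch surfaces close on the heels of the misbehavior.
  for logical in range(MEM_SIZE):
      if mem_a[layout_a[logical]] != mem_b[layout_b[logical]]:
          print("mismatch at logical cell", logical)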

    [There is nothing knew under the son of the farther...  PGN]


Never write checks on your birthday

<rmehlman@jumpy.igpp.ucla.edu>
Fri, 30 Dec 2005 19:56:43 -0800 (PST)

You'll get the year wrong...

...and may not even notice, since you've written your date of birth
so many times.  (Well, it's a risk of the human computer.)


Re: Sat nav systems (Dunn, RISKS-24.13)

<Graham Reed <greed@pobox.com>>
Tue, 03 Jan 2006 15:18:10 -0500

"Sean Dunn" <sad14159@hotmail.com> writes:
> GPS systems can hardly be programmed to avoid seedy neighborhoods without
> political uproar. On the other hand, there are roads that shouldn't be
> traveled at some times of the day...

However, the newer generation of aftermarket units, at least those from
Garmin, can be provided with both rectangular and road-based "avoidances"
loaded at the user's request.  In Garmin's case, the avoidances can be used
for on-computer route planning with older units, but not for route planning
or re-calculating on the unit itself.

So, although it would be politically wrong for the GPS makers to pre-load
such data, user groups could collude to fill in the gap and provide
downloadable files that can be used to set up the programmed avoidances on
the GPS units.  At least, on the ones that can be programmed from your PC in
the first place.

Mind you, this raises a new RISK of people seeding the database with bad
data for other reasons: keeping folks away from competing businesses, for
example.  But that's not really new: downloading untrusted data from the
Internet is a RISK as old as the 'net itself.

GPS is a case of a technology that works so well in general that it is very
easy to forget its limitations.  Right up until the time
you're looking at a muddy gravel road on your heavy sport-touring motorcycle
because the road was supposed to have been paved, but the budget was cut so
the work was never done....

  [Of course, map makers always seed their maps with a few intentional
  errors to be able to spot ripoffs.  PGN]


Expedia doesn't understand phishing

<art-risks@dontsharemyemail.com>
Thu, 29 Dec 2005 09:36:57 -0500

I use a unique email address for things I sign up for online so that I can
track email leakages.

The other day I received an email at my Expedia address from
usmail@expediamail.com - a domain that pops up a blank page in my
browser.  It touted some wonderful offer if I just clicked on an
encoded link that went to expediamail.com.

Q: In this day and age of phishing, how retarded does a company have to be
to use a domain that is similar, but different, from its own domain to send
out "wonderful offers" from?

A: As retarded as only Microsoft can be, apparently.  I wrote to Expedia and
they confirmed that they use that address to send out promotional
offers. They told me how to stop receiving them, but when I went to set my
preferences to not get them, they were already set to not get them. So
apparently Expedia doesn't even adhere to their own members' preferences.

When I asked about that, they said "yes, you aren't signed up to receive the
offers, maybe someone else did it (after having confirmed that they did it),
here's how you turn off receiving offers..."

The risk: losing potential customers by sending out emails that look like
phishing expeditions.


False positive on check

<"F John Reinke fjr@anywhere" <reinkefj@yahoo.com>>
Sun, 1 Jan 2006 12:35:42 -0500

  The person operating the cash register told [Dan] Ring his account had
  been flagged for some reason, and he might want to contact his bank.
  [Excerpt from Bruce Mohl, *The Boston Globe*, 1 Jan 2006]

Here's an example of a Type I error - rejecting a good check, thereby losing
the retailer a sale.  Of equal interest should be the approval of a bum
check.  It appears that the reporter really didn't dig.  I wonder where the
bodies are buried; they're usually found by following the money trail.  Since
the retailer doesn't know the customer, they probably don't value the sale
properly.  I know from the "publisher's free offers" that the repeat business
from a satisfied customer is worth a premium.  In this case, if the retailer
loses the sale and the chance for repeat business, then that indeed is an
expensive rejection.  Hmmm?

http://www.boston.com/business/globe/articles/2006/01/01/check_verification_system_is_vulnerable_to_mistakes/?rss_id=Boston.com+%2F+Business+%2F+Personal+Finance+-+Money+Management+-+Financial+Management+-+Boston.com

  [The article points out that less than half a percent of the $790 billion
  in point-of-sale checks is erroneously rejected by a system that decides
  in about a third of a second whether a check might be bogus.  PGN]


REVIEW: "CyberTerror", R.J. Pineiro

<Rob Slade <rMslade@shaw.ca>>
Tue, 27 Dec 2005 20:44:23 -0800

BKCBRTER.RVW   20050929

"CyberTerror", R. J. Pineiro, 2003, 0-765-34304-5
%A   R. J. Pineiro author@rjpineiro.com
%C   175 Fifth Avenue, New York, NY  10010
%D   2003
%G   0-765-34304-5
%I   Tor Books/Tom Doherty Assoc.
%O   pnh@tor.com www.tor.com
%O  http://www.amazon.com/exec/obidos/ASIN/0765343045/robsladesinterne
    http://www.amazon.co.uk/exec/obidos/ASIN/0765343045/robsladesinte-21
%O   http://www.amazon.ca/exec/obidos/ASIN/0765343045/robsladesin03-20
%O   Audience n- Tech 0 Writing 1 (see revfaq.htm for explanation)
%P   493 p.
%T   "CyberTerror"

Now, those who follow this series will know that, in my opinion, most of the
hype over cyberterrorism is a) overblown, and b) looking at the wrong things
anyway.  However, this book goes beyond the norm.  It reminds me of that old
joke about the difference between a used car salesman and a computer
salesman being that the used car salesman knows when he is lying to you.

All right, let's look at what he got right.  Yes, computers do control a lot
of "infrastructure."  Yes, the worst disasters are when there are multiple
(and usually cascading) failures in both control and safety systems.  Yes,
developers, maintainers, and even service people do leave trapdoors in
systems.  And, yes again, if you were going to perform terrorist acts, it
would be best to target a number of interrelated systems.

Now, before we look at the technical problems, a few practical ones.  The
advantage of cyberterrorism is said to be that you can, from the comfort of
your own (remote and safe) hacienda, blow up your enemy's city with a few
keystrokes.  The terrorists in this book must be pretty unskilled, because
they seem to need money, traitors, advance information, bomb materials--in
short, everything that any other terrorists need when they are doing
noncyberterrorism.  (The characters aren't terribly consistent: for example,
we have one Middle Eastern terrorist who reverts to Hispanic at moments of
stress.)

As for the technology, it isn't good.  We have the usual movie-script-
oriented virtual reality interface, completely ignoring the realities of
internal computer operations, and the fact that providing complicated
forensic information via a simple graphical interface would be a very
difficult task indeed.  (Oh, and we also have the famous, mythical
"digital-pulse-bomb-that-gets-from-the-computer-into-
your-head-and-gives-you-a-stroke" program.)  Pineiro contradicts himself,
telling us that there is a virus, then that there is no evidence of a virus
(the mythical "undetectable" virus: a virus *always* changes *something*),
and then that there is a virus.  (The author never defines what a virus is,
which, given how much else he gets wrong, is probably a good thing.
Supposedly a virus can be used as traceroute, a RAT, a trojan, or anything
you want.)  While it was a big deal fifteen years ago, a T1 carrier is
hardly high-speed anymore, particularly between related companies.  As a
devotee of software forensics, I approve of the fact that characteristics of
a computer system can be used to gain information about the user, but I
hardly think it boils down to a choice of pink defensive software for girls
and blue for boys.

Pineiro does not seem to know the difference between computer hardware
and computer software.  (We have, of course, already seen that
computer software can generate sufficient power to fry circuitry, and
even people.)  Programs (some of which can be as small as two bytes
long) communicate via certain frequencies, like radio signals.  When
you stop the system clock, somehow memory locations begin to lose
charge.  (No, I don't think he is referring to the fact that DRAM
needs to refresh every millisecond or so.)  The author also doesn't
seem to realize that, regardless of what language was used to write
the original program, most software in production systems tends to be
object code.  (He also seems to think that you can stop the system
clock and thus halt programs originally written in Ada, but leave
programs originally written in C still running.)

With their magical virtual reality interface, the blackhats never seem to
need to know what system they are attacking.  It's got some UNIX-like
characteristics, but that blue screen just has to be Windows.  Which is too
bad, given that most embedded systems tend to be specialized hardware, and
not subject to any off-the-shelf malware.  (As of the mid-90s, most nuclear
power plants still used PDPs, keeping at least one manufacturing plant busy
turning out replacement parts for them.)

Pineiro also displays his ignorance of artificial intelligence.  Despite his
"neural-like" type of expert system program that amalgamates all known AI
techniques, a neural net is one approach to AI, while an expert system is
quite a different one.  Not all AI systems are capable of learning: in fact,
it's quite a feat to put learning capability into a package.  (And I love
the "Turing Society": I'm sure that those in Turing's home country of
Britain would be thrilled to have the US defence department deciding who
can, and can't, mess around with their AI programs.  The implication of the
Society is rather Frankensteinish, although Hans Moravec, in "Robot: Mere
Machine to Transcendent Mind" [cf.BKRBTMMT.RVW], would probably agree with
the possibility of AI taking over, if not the necessity of inhibiting it.)

Cyberterrorism is certainly possible, and a lot of systems should be
protected more rigorously than they are at present.  However, this book
provides no feeling for the realities of cyberterrorism--or anything else,
for that matter.

copyright Robert M. Slade, 2005   BKCBRTER.RVW   20050929
rslade@vcn.bc.ca      slade@victoria.tc.ca      rslade@sun.soci.niu.edu
http://victoria.tc.ca/techrev    or    http://sun.soci.niu.edu/~rslade
