The RISKS Digest
Volume 24 Issue 36

Tuesday, 8th August 2006

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator



Electrical Fires in Queens
R. Mercuri
AOL releases 500K users' search queries — The Last Straw
Lauren Weinstein with included analysis by Seth Finkelstein
Digital retouching of photos to make a propaganda point
Jeremy Epstein
Voting machines in Ireland and The Netherlands
Erling Kristiansen
Dutch energy company Eneco sends huge bill
Leon Kuunders
Robot car park holds cars hostage
Steve Klein
German road pricing system should help fighting crime, politicians say
Harald Vogt
Unexpected consequences of airport random-screening glitch
Steve Summit
RFID Clonable
Brad Malin via Dave Farber's IP
Re: The Risks of retro computing?
Tom Watson
IEEE e-mail alias service with Comcast
Christopher Stacy
REVIEW: "Symbian OS Platform Security", Craig Heath
Rob Slade
Info on RISKS (comp.risks)

Electrical Fires in Queens

<"R. Mercuri" <>>
Mon, 24 Jul 2006 11:05:43 -0400

I don't know why the national news, or RISKS, isn't covering the electrical
fires in Queens. There's been a series of power outages, exacerbated by
fires that set off further fires. The news footage on NY TV stations is
astonishing — burning wires in the air, and explosions in manholes. The NY
Times had an article at:
and there's a community paper with a scary photo at:
but otherwise there hasn't been much or consistent coverage by the major US
media (see Google News: electrical fires astoria), even though thousands
have been without power for a week now. Residents, business owners, and
local leaders are becoming peeved at being ignored (including by Governor
Pataki, who has been refusing to declare the area a disaster).

  [The outage lasted for about a week, but the aging underground
  infrastructure is most likely still fragile and vulnerable to more
  such outages.  Some of the wiring apparently dated to the beginning
  of the previous century.  PGN]

AOL releases 500K users' search queries — The Last Straw

<Lauren Weinstein <>>
Mon, 7 Aug 2006 09:55:15 -0700 (PDT)

Greetings.  I've written and spoken many times about the sensitivity of
search engine query data.  We all know about Google's stance in DOJ vs.
Google early this year, where Google wisely attempted (for several reasons)
to prevent release of such data to a government fishing expedition related
to "child protection" legislation.  We also know that Gonzales, et al. are
merrily pushing mandated data retention laws — again mainly in the name of
child protection — that would leave Internet users vulnerable to all manner
of unreasonable surveillance of their Internet activities.  All of this is
already enough to be sounding alarm bells regarding the lack of reasonable
legislated protections for such data.

The AOL action in releasing the search records of a reported 500K AOL users
-- assuming it took place as outlined below — is probably the most
egregious violation of users' search privacy in the history of the Internet,
despite the half-hearted attempt at crude anonymization.  The unbelievable
lack of responsibility or good judgment shown by AOL in this case should be
enough to cause any remaining AOL subscribers (or users of their free
services) to strongly consider ceasing any further contact with AOL.

Furthermore, we need to accept the fact that search query data is incredibly
sensitive and often contains extremely personal data that does not lose its
potential for abuse via simplistic forms of anonymization.  Nor can we
necessarily depend indefinitely on some individual search engines' honest
and praiseworthy desires to protect such data (e.g. Google) in the face of
intense competition and intrusive government actions.

Search query data can contain the sum total of our work, interests,
associations, desires, dreams, fantasies, and even darkest fears.

We must demand that this data be protected.



I have removed URL reference (3) from the forwarded message below.  Anyone
who tried to forward that original message to an AOL user may have been in
for a surprise.

At least in my experiments just now, AOL rejects that message since URL
reference (3) contained a numeric IP address rather than a domain address.

Ironic, isn't it?  AOL "protects" users by blocking messages with IP
addresses in URLs (can such addresses be suspect?  Yeah, but they can easily
be legit, too) — yet they happily release the most private aspects of
users' search activities.

It's like a Fellini movie over there, but much less amusing.

Lauren Weinstein +1 (818) 225-2800 or Lauren's Blog:

From: Seth Finkelstein <>
Date: August 7, 2006 1:05:50 AM EDT
To: Dave Farber <>
Subject: AOL Releases Search Logs from 500,000 Users [From IP]

AOL Releases Search Logs from 500,000 Users
[1] Adam D'Angelo - 8/5/2006

AOL just released the logs of all searches done by 500,000 of their users
over the course of three months earlier this year. That means that if you
happened to be randomly chosen as one of these users, everything you
searched for from March to May (2006) is now public information on the
Internet.
This was not a leak - it was intentional. In their desperation to gain
recognition from the research community, AOL decided they would compromise
their integrity to provide a data set that might become often-cited in
research papers: "Please reference the following publication when using this
collection..." is the message before the download.

This is a blatant violation of users' privacy. The data is "anonymized",
which to AOL means that each screenname was replaced with a unique
number. "It is still a research question how much information needs to be
anonymized to protect users," [9] says Abdur from AOL. Here are some examples
of what you can find in the data:

User 491577 searches for "florida cna pca lakeland tampa", "emt school
training florida", "low calorie meals", "infant seat", and "fisher price
roller blades". Among user 39509's hundreds of searches are: "ford 352",
"oklahoma disciplined pastors", "oklahoma disciplined doctors", "home
loans", and some other personally identifying and illegal stuff I'm going to
leave out of here. Among user 545605's searches are "shore hills park mays
landing nj", "frank william sindoni md", "ceramic ashtrays", "transfer money
to china", and "capital gains on sale of house". Compared to some of the
data, these examples are on the safe side. I'm leaving out the worst of it -
searches for names of specific people, addresses, telephone numbers, illegal
drugs, and more. There is no question that law enforcement, employers, or
friends could figure out who some of these people are.
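The weakness of this pseudonymization scheme is easy to demonstrate. A
minimal Python sketch (with invented log entries echoing the examples
above, not actual AOL data): replacing each screen name with a persistent
number leaves every user's complete search history trivially reassemblable,
and a single identifying query then unmasks the whole profile.

```python
from collections import defaultdict

# Pseudonymized log: each screen name replaced by a persistent number,
# exactly as in the AOL release. The queries themselves are untouched.
log = [
    (545605, "shore hills park mays landing nj"),
    (545605, "ceramic ashtrays"),
    (545605, "capital gains on sale of house"),
    (491577, "emt school training florida"),
    (491577, "low calorie meals"),
]

# Grouping by the pseudonym reassembles each user's full search history;
# any one identifying query then exposes everything else in the profile.
profiles = defaultdict(list)
for user, query in log:
    profiles[user].append(query)

for user, queries in sorted(profiles.items()):
    print(user, queries)
```

Genuine anonymization of search logs is a hard, open research problem, as
the AOL researcher quoted above concedes.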

I hope others can find more examples in the data, which is up for [10]
download over here. The data set is very large when uncompressed, which
makes it pretty hard to work with, but someone should set up a web
interface so people can browse it (or even 10% of it) without having to
download the 400MB file. If you make a mirror or better interface to the
data, or find other examples, let me know and I'll put a link up here.

This is the same data that the DOJ wanted from Google back in March.
[11] This ruling allowed Google to keep all query logs secret. Now any
government can just go download the data from AOL.

It's unclear if this is the type of data AOL released to the government
[12] back when Google refused to comply. If nothing else, this should be a
good example of why search history needs strong privacy protection.

Thanks to Greg Linden for pointing this out [13] here.

Update 2: The md5 of the file AOL posted (and now removed) is
31cd27ce12c3a3f2df62a38050ce4c0a. I'm posting it so you can make sure you
have a valid copy, but so far none of the copies I've seen are fake.
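For anyone checking a mirrored copy against that published MD5, a minimal
Python sketch (the filename is illustrative only):

```python
import hashlib

def md5_of_file(path, chunk_size=1 << 20):
    """Compute the MD5 hex digest of a file, reading it in chunks so
    that even a multi-hundred-megabyte file fits in memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# A copy matches the published value if:
#   md5_of_file("aol-data.tar.gz") == "31cd27ce12c3a3f2df62a38050ce4c0a"
```

Note that MD5 was already known to be collision-prone by 2006, so a
matching digest is good evidence of an intact copy, not cryptographic
proof of authenticity.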

Update: Seems like AOL took it down. There are some mirrors of the data in
the comments of the digg story, linked below. I estimate about 1000 people
have the file, so it's definitely going to be circulated around. The [2]
main AOL research page is still up, with some other data collections. The
[3] google cache of the download page is still up, but you can't get the
data. Here's discussion at other sites:

   * [4] siliconbeat
   * [5] techcrunch
   * [6] digg
   * [7] reddit
   * [8] zoli's blog


3. [ removed to avoid AOL block ]

Seth Finkelstein  Consulting Programmer
Infothought blog -

Digital retouching of photos to make a propaganda point

<Jeremy Epstein <>>
Mon, 7 Aug 2006 11:50:15 -0700

Digital retouching of photos isn't unusual, whether to make a teenager's
acne go away or to make a joke.  It gets serious, though, when it's used
for propaganda.
Reuters yesterday withdrew photos that purported to show smoke rising from
buildings in Beirut after an Israeli bombing.  After bloggers showed obvious
evidence of tampering (such as buildings repeated through a picture, smoke
that's duplicated), Reuters investigated and admitted that "photo editing
software was improperly used on this image".  They have now suspended the
photographer.  There are allegations by bloggers of other image tampering by
the same photographer, Adnan Hajj.,7340,L-3286966,00.html describes the
Reuters action. is one of the blogs
that first reported the Photoshopping.

Certainly this is not the first time there's been distortion in time of war;
the ease of manipulation, combined with the power of bloggers to reveal
what's going on, may be part of a balance of power.

Voting machines in Ireland and The Netherlands

<Erling Kristiansen <>>
Fri, 21 Jul 2006 22:20:30 +0200

According to EDRIGRAM, the on-line newsletter of "European Digital Rights",
number 4.14:

On 4 July 2006, the Irish Commission on Electronic Voting released its
second report on the secrecy and accuracy of the e-voting system purchased
by the Irish Government.

The summary remarks at the beginning of the 200 page report say: "The
Commission concludes that it can recommend the voting and counting equipment
of the chosen system for use at elections in Ireland, subject to further
work it has also recommended, but that it is unable to recommend the
election management software for such use."

The "further work" includes, among others:
1) add a voter verified audit trail;
2) replace the election management software (which prepares election
   data, reads votes from "ballot modules", and calculates results) with a
   version that is developed to mission critical standards;
3) modify the embedded software within the voting machines to bring it
   up to mission critical standard;
4) make certain modifications to the machines themselves;
5) test all components to mission critical standard;
6) modify the specification for the PC that is to be used for vote management;
7) test the system as a whole (including end-to-end testing) to mission
   critical standard;
8) rectify the security vulnerabilities identified in the way data is
   transferred within the system.

This is quite a mouthful. In particular, the "mission critical standards"
may be quite difficult to achieve as a retrofit.  The article speculates
that the responsible minister, who declares his intention to continue the
project, "may not realize the extent of the changes required".  [Or is it a
polite way of saying "No thank you"? -EK]

Full article at
The article includes several links, including a link to the full report.

As far as I can make out from various sources, the voting machines in
question are essentially the same as the Nedap machines used in The
Netherlands for years.  Little public criticism of these machines appears in
the general press.

But they do, indeed, have problems: According to the "Bits of Freedom"
newsletter:

In a local election, one candidate got 1, 3, 7, and 181 votes, respectively,
in the 4 polling stations where he was a candidate.  The candidate was not
only an election official in the high-vote station, he operated the machine!

Peter Knoppers, described in the article as an expert on voting machines, is
quoted as saying that manipulation of the machine by a voting official is "a
piece of cake". For example, if a key is turned at the exact moment of the
vote being acknowledged by the voter, the vote will not be counted. The
missed votes can then be added manually at a later time, for any candidate
of your choice.

Full story (in Dutch) at
This article also has several links, all in Dutch.

Dutch energy company Eneco sends huge bill

<Leon Kuunders <>>
Tue, 25 Jul 2006 13:16:47 +0200

Apparently the Dutch energy supplier Eneco sent an invoice for euro
2.144.607 and 90 cents [which in the US would be written as 2,144,607.90] to
a man for his two months' energy usage.  The man would have used 20.000.000
kWh of electricity and 102.284 m3 of gas.

When he called the energy supplier, the call-center operator replied:
"It can be that this invoice isn't correct, sir. We can sort it out, but you
will have to pay the bill first. We can set up a payment arrangement if you
wish."
The cause of the error remains unknown.
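The bracketed translation above points at a classic numeric risk: the Dutch
and US conventions swap the roles of "." and ",", so software (or a reader)
applying the wrong convention silently misreads amounts. A small Python
sketch, purely illustrative and not a claim about Eneco's actual billing
code:

```python
def parse_dutch(s: str) -> float:
    """Dutch convention: '.' groups thousands, ',' marks decimals."""
    return float(s.replace(".", "").replace(",", "."))

def parse_us(s: str) -> float:
    """US convention: ',' groups thousands, '.' marks decimals."""
    return float(s.replace(",", ""))

gas = "102.284"          # cubic metres, Dutch formatting, from the item above
print(parse_dutch(gas))  # 102284.0 -- the intended reading
print(parse_us(gas))     # 102.284  -- a thousandfold error
```

Whether a convention mix-up was actually behind this bill is unknown, but a
thousandfold misreading is exactly the shape of error such slips produce.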

Robot car park holds cars hostage

<Steve Klein <>>
Tue, 8 Aug 2006 12:01:16 -0400

The city of Hoboken, New Jersey owns a parking garage with an automated
car-parking system.  The software that runs the hardware is licensed from
Robotic Parking of Clearwater, Florida.

Following a recent contract dispute, the software license was allowed to
expire and hundreds of cars were trapped in the garage for several days.

More details here:,71554-0.html?tw=wn_index_1

Steve Klein, Your Mac Expert Phone: (248) YOUR-MAC or (248) 968-7622

German road pricing system should help fighting crime, politicians say

<Harald Vogt <>>
4 Aug 2006 06:55:03 -0700

When the German Toll Collect system was put into operation in January 2005,
it was accompanied by a law regulating the use of the data that is collected
for billing. In particular, the system takes photographs of vehicles passing
the scaffolds on which cameras and reading devices are mounted. According to
this law, passing that information on to government agencies other than
customs and the "Federal Agency for Commercial Transport" is illegal. This
applies — for now — especially to the police, who cannot use the data for
criminal prosecution. This is in accordance with German privacy regulations,
which demand that data collection serve a well-defined, well-stated purpose
and not be done for future, as-yet-unknown needs, such as crimes not yet
committed.

It seems that a current case might turn that law upside down rather sooner
than later. After an 18-year-old female student was found dead in an
Autobahn parking lot, it became clear through DNA testing that the murderer
is actually a serial killer who is wanted for a murder in 2003 and an
attempted murder in 2004. As he is suspected of being a truck driver, the
Toll Collect recordings may give a clue to his identity.  However, the
police are not allowed to get their hands on that data.

This case has sparked off a lively discussion in the political scene.  Some
(mostly right-wing politicians and police officials) say the Toll Collect
data should be freely available to police for investigating crimes.  Others,
such as data protection officers, reject that on grounds of data protection
and privacy. As an obvious risk, they see the freedom of communication
(which includes anonymous mobility) endangered by tendencies to promote
surveillance. There are also some moderate voices who do not rule out the
passing of data fundamentally, but require high legal standards for doing
so. Members of the left-right coalition currently running the German federal
government have announced that they will pursue a change in the
aforementioned law that would allow the data to be used for prosecution.

Apparently, police are currently trying to exploit a loophole in the
system. While they are not allowed to use the Toll Collect data directly,
police have asked shipping companies to look through their own files and
report truck drivers that may have passed the parking lot during the night
of the murder. If they comply, this would yield basically the same
information to the police.

Interestingly, some politicians seem not so much concerned about the privacy
of citizens, but rather about whether there will be enough money left for
building new roads after the Toll Collect infrastructure is extended into a
surveillance tool. Today, the system is already very costly — a quarter of
the collected money is spent on its operation — and it would be even more
costly if continuous surveillance were in operation.  This money would then
be lacking for the maintenance and building of roads. It seems the risk here
is that a road surveillance tool could be created for which there exist no
roads to watch.
Unexpected consequences of airport random-screening glitch

<Steve Summit <>>
Fri, 21 Jul 2006 17:52:45 -0400

This doesn't seem to have been covered in the media yet and I don't have
full details, but according to an acquaintance who just traveled through
there, a computer or computers unknown at Newark airport this morning
(2006-07-21) mysteriously started selecting 20% of passengers for the random
intensive security screening, instead of the normal 2%.  No one felt
authorized to countermand the computer's selections, so screeners were
compelled to carry out all the excessive screenings, resulting in huge
delays and many missed flights.  A small glitch in a random selection
process can have large and unexpected consequences.
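One plausible, purely speculative mechanism for such a glitch is a
percent-versus-fraction slip in the selection threshold (0.2 where 0.02 was
intended). The knock-on effect on the checkpoint is easy to estimate with a
short simulation (Python, invented numbers):

```python
import random

def selected_count(passengers, rate, seed=0):
    """Simulate independent random selection of passengers at a given rate."""
    rng = random.Random(seed)
    return sum(1 for _ in range(passengers) if rng.random() < rate)

passengers = 10_000
normal = selected_count(passengers, 0.02)   # the usual 2%
glitch = selected_count(passengers, 0.20)   # the reported 20%

# The extra workload is roughly tenfold; if each intensive screening ties
# up screener time for several minutes, a checkpoint is quickly swamped.
print("normal:", normal, "glitch:", glitch, "extra:", glitch - normal)
```

A one-character change in a configuration value is all it takes; nothing in
the selection process itself looks wrong from the screener's side, which may
be why no one felt authorized to override it.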

RFID Clonable [From Dave Farber's IP]

<Brad Malin <>>
July 25, 2006 10:10:47 AM EDT

Hi Dave, remember a couple of years ago when you said you wanted to clone
and replay RFIDs?  Apparently someone has built the system to do it.


VeriChip's human-implatable RFID chips clonable, sez hackers
Posted Jul 24th 2006 4:14PM by Donald Melanson
Filed under: Misc. Gadgets, Wireless

  [Note: "implatable" is "mispelt" in both the URL and the title of the
  article, but the URL works as of when I am putting out this issue, and
  "implantable" is used in the text of the article!  PGN]

In case anyone needed more proof that we're all living in a Philip K. Dick
novel, a pair of hackers have recently demonstrated how human-implantable
RFID chips from VeriChip can be easily cloned, effectively stealing the
person's identity.  Annalee Newitz and Jonathan Westhues showed off their
handiwork at the HOPE Number Six conference in New York City this weekend,
with Newitz herself playing the role of guinea pig, implanting a VeriChip
RFID chip in her right arm.  To clone the chip, Westhues first read Newitz's
arm with a standard RFID reader, then scanned it again with a homebrew
antenna connected to his laptop, which recorded the signal off the chip.  He
then used the same RFID reader to read the signal from his laptop, which
promptly spit out Newitz's supposedly unique ID.  For its part, VeriChip has
only said they haven't yet had a chance to review the evidence but still
insist that "it's very difficult to steal a VeriChip."
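The attack works because, as described, the chip simply emits a fixed
identifier with no challenge-response: anything a reader can hear, an
attacker can replay. A toy Python model of the distinction (hypothetical
classes, not the actual VeriChip protocol):

```python
import hmac, hashlib, os

class StaticTag:
    """Emits the same ID every time; whatever a reader hears can be replayed."""
    def __init__(self, uid: bytes):
        self.uid = uid
    def respond(self, _challenge: bytes = b"") -> bytes:
        return self.uid

class ChallengeResponseTag:
    """Answers a fresh challenge with a keyed MAC; a recorded answer is
    useless against the next (different) challenge."""
    def __init__(self, key: bytes):
        self.key = key
    def respond(self, challenge: bytes) -> bytes:
        return hmac.new(self.key, challenge, hashlib.sha256).digest()

# An eavesdropper records one emission from the static tag...
victim = StaticTag(b"unique-id-12345")
recorded = victim.respond()
# ...and replays it: the reader cannot distinguish replay from original.
assert recorded == victim.respond()

# Against a challenge-response tag, an honest reader issues a fresh random
# challenge each time, so the recorded answer no longer verifies.
tag = ChallengeResponseTag(key=os.urandom(16))
assert tag.respond(os.urandom(16)) != tag.respond(os.urandom(16))
```

The cost, of course, is that a challenge-response tag needs a secret key and
computation on the tag itself, which the cheapest passive chips lack.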

IP Archives at:

Re: The Risks of retro computing? (RISKS-24.35)

<"Watson, Tom" <>>
Mon, 24 Jul 2006 11:34:55 -0700

The story mentions that the IBM 1401 computer "Can't Add Doesn't Even Try".
This is the wrong computer.  The IBM 1620 is the computer that "Can't
Add...".  The computer museum has a working example.  Later the IBM 1620,
Model 2 could add, but still needed a table to multiply (but knew that
multiplying by zero was a simpler operation).  I can't vouch for the
operations in the IBM 1401, as I haven't used the machine, but I am very
familiar with the IBM 1620.  Both machines hit the streets in the 1959-1960
time frame.

Not very Risks oriented, but a bit of history.
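For readers unfamiliar with the nickname: CADET stood for "Can't Add,
Doesn't Even Try", because the 1620 performed digit arithmetic by looking up
tables held in core memory rather than with adder circuitry. A rough Python
sketch of the idea (much simplified; the real machine operated on decimal
digits stored in core):

```python
# A 10x10 addition table for single decimal digits, standing in for the
# table the IBM 1620 kept in core memory instead of adder hardware.
# Each entry maps a digit pair to (carry, sum digit).
ADD_TABLE = {(a, b): divmod(a + b, 10) for a in range(10) for b in range(10)}

def table_add(x: int, y: int) -> int:
    """Add two non-negative integers digit by digit via table lookup only."""
    xd = [int(d) for d in str(x)][::-1]   # least significant digit first
    yd = [int(d) for d in str(y)][::-1]
    result, carry = [], 0
    for i in range(max(len(xd), len(yd))):
        a = xd[i] if i < len(xd) else 0
        b = yd[i] if i < len(yd) else 0
        carry2, s = ADD_TABLE[(a, b)]
        carry3, s = ADD_TABLE[(s, carry)]  # fold in the incoming carry
        carry = carry2 + carry3            # at most one of these is 1
        result.append(s)
    if carry:
        result.append(carry)
    return int("".join(map(str, result[::-1])))

print(table_add(1959, 1401))  # → 3360
```

The 1620 used a similar (larger) core-resident table for multiplication,
which is the other half of the joke in the contribution above.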

IEEE e-mail alias service with Comcast

<Christopher Stacy <>>
Thu, 20 Jul 2006 19:36:02 -0400

Pete Klammer writes about losing e-mail to spam blacklisting caused by the
IEEE forwarding service's use of BrightMail.  His analysis is that
BrightMail is an "unaccountable third party" because IEEE could supposedly
not obtain confirmation, logs, or rule sets describing the lossage of his
messages.
But the other involved mail carriers, such as his ISP, IEEE, or even his own
desktop software are all potentially filtering by using blacklist databases
and rules which may be inscrutable.  The problem is the unavailability of an
audit trail.

The victimhood tone of the story, and the suggestion that society is placing
undue trust in third parties, fails to identify a more straightforward
accountability problem.  His forwarding service (IEEE) has contracted with
BrightMail for filtering services, but is then dismissing him when there's a
problem with the forwarding service.  This is simply poor customer service,
and tantamount to "blaming the computer", as was very common in the 1970s.
In the unlikely scenario that BrightMail customers cannot get the necessary
information, then what's happened is that IEEE made a poor contract with
that vendor.  But I am pretty sure that BrightMail is not such a black box,
and does indeed have the necessary audit trails.  So the most likely
explanation is that the IEEE folks were just too inept, lazy, or otherwise
uninterested to bother accessing those resources when asked to investigate
the problem.

This does not call for the elimination of a system in which third parties
can be contracted for valuable email processing services.  BrightMail is no
more a "third party" in this scenario than IEEE.  The risk is that if a
computer is involved, people will accept lame blame-passing excuses from
their various service providers.

REVIEW: "Symbian OS Platform Security", Craig Heath

<Rob Slade <>>
Thu, 03 Aug 2006 10:44:52 -0800

BKSYOSPS.RVW   20060615

"Symbian OS Platform Security", Craig Heath, 2006, 0-470-01882-8, U$70.00/C$90.99
%A   Craig Heath
%C   5353 Dundas Street West, 4th Floor, Etobicoke, ON   M9B 6H8
%D   2006
%G   0-470-01882-8
%I   John Wiley & Sons, Inc.
%O   U$70.00/C$90.99 416-236-4433 fax: 416-236-4448
%O   Audience a Tech 2 Writing 2 (see revfaq.htm for explanation)
%P   249 p.
%T   "Symbian OS Platform Security"

Part one is an introduction to the Symbian mobile (cellular) phone operating
system, and particularly its security provisions.  Chapter one examines the
reasons for the emphasis on security in a mobile phone: the users'
perception of it as a more personal (and therefore more trusted) device and
the acceptability of remote network installations and administration.
Therefore, the developers of Symbian were faced with the challenge of
creating an "open" development platform, while implementing security
constraints.  "Platform Security Concepts," in chapter two, presents an
interesting basic catalogue, but concentrates on capability lists.  (In
this, the term may not be used in a standard manner: the capabilities appear
to be preset, rather than being taken from the calling capability.)
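The parenthetical point can be illustrated with a toy model: under the
scheme the book describes, a process's capability set is fixed at
install/signing time, and a server simply tests membership, rather than
accepting transferable capability tokens as in classical capability systems.
The names and API below are invented for illustration, not Symbian's actual
interfaces:

```python
# Toy model of preset capabilities: each installed binary carries a fixed
# set, assigned when it was signed/installed, and a server checks membership.
INSTALLED_CAPS = {
    "dialer.app": {"NetworkServices", "ReadUserData"},
    "game.app":   {"NetworkServices"},
}

def check_capability(process: str, required: str) -> bool:
    """Server-side policy check: does the caller's preset capability set
    include what this API requires? Unknown callers have no capabilities."""
    return required in INSTALLED_CAPS.get(process, set())

print(check_capability("dialer.app", "ReadUserData"))  # allowed
print(check_capability("game.app", "ReadUserData"))    # denied
```

The contrast with classical capabilities is that here nothing can be
delegated at run time: a process either shipped with the capability or it
did not.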

Part two looks at application development for platform security.  Chapter
three describes the basic functions of the Symbian security environment.  A
decent, basic list of suggestions for writing secure applications is in
chapter four, but there are few details.  How to write secure servers
(common processes), in chapter five, provides only generic advice, and has
oddly little information that is distinctive to Symbian.  Chapter six, on
the development of plug-ins, is more code and architecture specific.  The
safe sharing of data, in chapter seven, is addressed with a useful list of
threats and countermeasures, and an outline of various security related
components and provisions.

Part three deals with the management of platform security attributes.
Chapter eight examines the native software installer, concentrating on
encryption key certificates.  How developers obtain and use these
certificates is reviewed in chapter nine.  Some of the public key
infrastructure behind the system can be inferred from the description (by
those familiar with the concepts) but little detail is provided.

Part four, on the future of mobile device security, consists of chapter
fourteen, which mentions a variety of potential functions for mobile phones.

For those wanting an introduction to the security provisions of the Symbian
operating system, this work provides a useful starting guide.  Developers,
however, may need a bit more.  For example, the statement is made that the
platform is "less prone" to buffer overflows, but there is no discussion of
why this is so, how it is achieved, or to what extent a developer can rely
upon the operating system to protect against the problem of buffer overflows
(or other types of malformed data).  Given that most Symbian security is
based on capability tables and certificates (and particularly with a
somewhat non-standard definition of capabilities) these concepts, and their
limits, should probably be explained more fully.

copyright Robert M. Slade, 2006   BKSYOSPS.RVW   20060615
