The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 23 Issue 78

Thursday 10 March 2005


Security? Nuclear plants don't need no stinkin' security!
Jim Horning
Drug-error risk at hospitals tied to computers
Scott Allen via Monty Solomon
Hospital computers make things worse
Richard Akerman
Richard Clarke: Real ID's, Real Dangers
John F. McMullen
MIT says it won't admit hackers
Robert Weisman via Monty Solomon
Website hijackings, 302 redirects, and security issues
Tim Chmielewski
Credit Information Stolen From DSW Stores
AP via Monty Solomon
Garbage Out, Garbage In?
Adam Shostack
More BofA problems
Tom Watson
Re: More uses of satnav/GPS
Michael Bacon
REVIEW: "Windows Forensics and Incident Recovery", Harlan Carvey
Rob Slade
Info on RISKS (comp.risks)

Security? Nuclear plants don't need no stinkin' security!

<"Jim Horning" <>>
Fri, 4 Mar 2005 17:36:41 -0800

A SecurityFocus post
discusses the nuclear industry's reaction to a proposed voluntary standard
for security of digital systems controlling nuclear power plants.

"Two companies that make digital systems for nuclear power plants have come
out against a government proposal that would attach cyber security standards
to plant safety systems. The 15-page proposal, introduced last December by
the U.S. Nuclear Regulatory Commission (NRC), would rewrite the commission's
'Criteria for Use of Computers in Safety Systems of Nuclear Power Plants.'
The current version, written in 1996, is three pages long and makes no
mention of security. The plan expands existing reliability requirements for
digital safety systems, and infuses security standards into every stage of a
system's lifecycle, from drawing board to retirement.  Last month the NRC
extended a public comment period on the proposal until March 14th to give
plant operators and vendors more time to respond. So far, industry reaction
has been less than glowing."

"The NRC tries to promote the use of digital technology in the nuclear power
industry on the one hand, but then over-prescribes what is needed when a
digital safety system is proposed," wrote one company president.

"The entire cyber security section should be deleted and only a passing
reference to the subject retained," another company wrote.

More information at

Drug-error risk at hospitals tied to computers (Scott Allen)

<Monty Solomon <>>
Thu, 10 Mar 2005 07:28:47 -0500

Hospital computer systems widely touted as the best way to eliminate
dangerous medication mix-ups can actually introduce many errors, according
to the most comprehensive study of hazards of the new technology. The
researchers, who shadowed doctors and nurses in the University of
Pennsylvania hospital for four months, found that some patients were put at
risk of getting double doses of their medicine while others got none at all.
22 types of mistakes were identified, such as failing to stop old
medications when adding new ones or forgetting that the computer
automatically suspended medications after surgery.  The findings underscore
the complexity of improving safety in US hospitals, where the Institute of
Medicine estimates that errors of all kinds kill 44,000 to 98,000 patients a
year.  [PGN-ed]

Scott Allen, *The Boston Globe*,  9 Mar 2005

Hospital computers make things worse

<Richard Akerman <>>
Thu, 10 Mar 2005 08:36:05 -0400 (AST)

Reports over the past few years of increasing numbers of patient injuries
and deaths due to medical errors sent hospital administrators scrambling for
computerized solutions.  But two new studies suggest that, in many cases,
these high-tech systems have left doctors and nurses increasingly frustrated
while providing little evidence of real benefit to patients. In fact, one
widely used system actually helped foster medication errors, researchers
found.  See the 9 Mar 2005 issue of the *Journal of the American Medical
Association*.

Sympatico News, Hospital Computers Fail to Deliver: study finds they
facilitated errors

Richard Akerman

Richard Clarke: Real ID's, Real Dangers

<"John F. McMullen" <>>
Mon, 7 Mar 2005 17:24:40 -0500 (EST)

Richard A. Clarke, The Security Adviser, Real ID's, Real Dangers,
*The New York Times*, 6 Mar 2005

Have you ever wondered what good it does when they look at your driver's
license at the airport? Let me assure you, as a former bureaucrat partly
responsible for the 1996 decision to create a photo-ID requirement, it no
longer does any good whatsoever. The ID check is not done by federal
officers but by the same kind of minimum-wage rent-a-cops who were doing the
inspection of carry-on luggage before 9/11. They do nothing to verify that
your license is real. For $48 you can buy a phony license on the Internet
(ask any 18-year-old) and fool most airport ID checkers. Airport personnel
could be equipped with scanners to look for the hidden security features
incorporated into most states' driver's licenses, but although some bars use
this technology to spot under-age drinkers, airports do not. The photo-ID
requirement provides only a false sense of security.  [Excellent article
abstracted for RISKS.  PGN]

MIT says it won't admit hackers (Robert Weisman)

<Monty Solomon <>>
Wed, 9 Mar 2005 16:27:41 -0500

MIT's Sloan School of Management has joined Carnegie-Mellon and Harvard in
rejecting applications from prospective students who hacked into a website
to learn whether they had been admitted before they were formally notified.
32 MIT applicants reportedly took a peek, along with 1 at CMU, 119 at
Harvard, and 41 at Stanford.  The Web site is run by ApplyYourself, and also
used by other business schools.  Its access was compromised by a posting on
a BusinessWeek Online forum.  [PGN-ed from Robert Weisman, *The Boston
Globe*, 8 and 9 Mar 2005]

  [Dave Farber's IP list had several responses.  Rejected applicants
  considered their treatment excessive.  One candidate saw only a
  blank page at ApplyYourself, but was rejected for having accessed
  the site.  Dave Lesher wrote
    What's the B-schools' culpability in contracting out a process to a
    company with inadequate security?  [Presumably] the schools demanded
    SSN's and other financial data from the applicants. Was there informed
    consent by the applicants to have their data shared with, in effect, a
    data broker?  Could they apply WITHOUT so agreeing?
  Joe Hall wrote
    What strikes me is how constructing a URL that is available to students
    without any further authentication or protection is considered
    "hacking".  That's inevitably diluting any geek cred. held by any of us
    who are even crappy hackers!
  Joe also noted Ed Felten's post on this subject at
  PGN wonders what if a competing candidate had masqueraded as other
    candidates to see if others had been accepted, and thereby wound up
    getting them all rejected!  Could that be a suitable defense for the
    rejected students?  PGN]

Website hijackings, 302 redirects, and security issues

<"Tim Chmielewski" <>>
Thu, 10 Mar 2005 15:36:34 +1100

I have been reading about this problem on the WebmasterWorld forum and
decided to try it myself.

Basically, what happens is that if you type in any site with the format:
you will get redirected to another site (actually a proxy server in China)
that looks exactly like your site, but none of your pages that use scripting
will work.

Using the same technique, other sites could hijack banking or online
shopping sites and redirect input so they collect your credit card and other
personal information.

While this has been a popular topic of discussion in the webmaster forums,
Google itself is silent on the issue.
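
The mechanics are worth spelling out.  A 302 ("Found") response marks a
redirect as *temporary*, so a naive crawler may keep indexing the fetched
content under the redirecting URL rather than the target.  A minimal sketch
of that logic (the crawler behaviour and URLs here are hypothetical
illustrations, not Google's actual implementation):

```python
# Sketch of how a naive crawler can mis-attribute content after a
# 302 redirect.  'web' is a toy model: url -> (status, payload).

def fetch(url, web):
    """Follow redirects; return (content, url_to_index).

    Because a 302 is 'temporary', this naive crawler indexes the final
    content under the ORIGINAL url -- the root of the hijack."""
    seen = set()
    original = url
    while True:
        if url in seen:
            raise RuntimeError("redirect loop")
        seen.add(url)
        status, payload = web[url]
        if status == 302:
            url = payload          # payload stands in for the Location: header
        else:
            return payload, original

# A hijacker's page 302-redirects to the victim's site:
web = {
    "http://hijacker.example/page": (302, "http://victim.example/"),
    "http://victim.example/":       (200, "Victim's real content"),
}

content, indexed_under = fetch("http://hijacker.example/page", web)
print(indexed_under)   # the hijacker's URL now "owns" the victim's content
print(content)
```

Once the hijacker's URL is indexed with the victim's content, searchers can
be routed through the hijacker's proxy instead of the real site.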

Tim Chmielewski, Webmaster, Human Edge Software

Credit Information Stolen From DSW Stores

<Monty Solomon <>>
Tue, 8 Mar 2005 22:13:23 -0500

AP, 8 Mar 2005

Credit card information from customers of more than 100 DSW Shoe Warehouse
stores was stolen from a company computer's database over the last three
months, a lawyer for the national chain said Tuesday.  The company
discovered the theft of credit card and personal shopping information on
Friday and reported it to federal authorities, said Julie Davis, general
counsel for the chain's parent, Retail Ventures Inc. The Secret Service is
investigating, she said.  DSW was alerted by a credit card company that
noticed suspicious activity, she said.

Garbage Out, Garbage In?

<Adam Shostack <>>
Thu, 10 Mar 2005 13:45:50 -0500

An article in *The Guardian* (,,3605,1410921,00.html) discusses a plan to
implant chips in garbage bins, and covers some of the risks:

  "If, for example, computer hackers broke in to the system, they could see
  sudden reductions in waste in specific households, suggesting the owners
  were on holiday and the house vacant."
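
It takes remarkably little analysis to make that inference.  A toy sketch
(the telemetry, window, and threshold are invented for illustration):

```python
# Sketch: inferring vacancy from bin-weight telemetry.  The data and
# threshold here are hypothetical -- the point is how crude a detector
# suffices once the data exists.

def likely_vacant(weekly_kg, window=2, threshold=0.25):
    """Flag households whose recent waste drops below a fraction of
    their own historical average -- a crude 'nobody home' detector."""
    if len(weekly_kg) <= window:
        return False
    baseline = sum(weekly_kg[:-window]) / (len(weekly_kg) - window)
    recent = sum(weekly_kg[-window:]) / window
    return baseline > 0 and recent < threshold * baseline

print(likely_vacant([12, 11, 13, 12, 1, 0]))    # True: sudden drop
print(likely_vacant([12, 11, 13, 12, 11, 12]))  # False: steady output
```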

But the tendency to believe anything written on a computer screen continues
unabated: "He said the microchips would help the council fend off
unwarranted criticism.  'We will have a confident response to customers who
claim their bin may not have been emptied,' he added."

More BofA problems

<Tom Watson <>>
Mon, 28 Feb 2005 20:03:51 -0800

My recent encounters with BofA include attempting to set up an "out of
branch" transfer.  The thought seems wonderful, and then I try to do it.
Navigating the web site (https, thankfully) gets me to a page that asks me
to enter a "confirmation number" that was sent to my e-mail address.
Unfortunately, something doesn't complete the sending of this message, and
I never get it.  Of course no error message appears and I'm left without the
ability to transfer.  I call the bank (or send internal [secure] messages)
and the response is "get another e-mail address" or some such.  The risks:
I'm told some tale that it is really "my problem" (it isn't!), and the
bank's web service is sending messages into weird places.  I guess I really
can't trust them until they "get it right".  On the other hand, my other
bank is perfectly able to send e-mail to the address in question.  You'd
think they would read their logs and wonder (do they?).

Re: More uses of satnav/GPS

<"Michael \(Streaky\) Bacon" <>>
Wed, 9 Mar 2005 15:25:30 -0000

In RISKS-23.71, David Magda pointed to the suggested use of GPS on trains in
the UK.  In RISKS 23.52 (Shutting the train door before the commuter has
bolted?) I drew attention to existing problems with one such system already
operational on some UK trains.

DM highlighted the potential issue of the GPS system being '"shut off" by
the US government during emergencies and suggests Galileo (or inertial
guidance) as a back-up.

The threat of the existing GPS system being 'switched off' has been touted
on many occasions, but, given its incorporation into so many systems (public
use as well as military) in the USA and world-wide, it is doubtful that
anything more extreme than a 'detuning' of the system would ever be
contemplated.  This could well be effected via an existing option which
would allow the military to continue using GPS at a necessary high degree of
accuracy while still providing private users with a service - albeit with
lesser accuracy.  Whilst the effects of such a lesser accuracy are
'undefined' and could lead to severe consequences - including death -
GPS-based systems would continue to operate.

Further, using Galileo as a back-up may be contentious for the reason that
the US government has previously expressed grave concerns about its
up-coming loss of exclusivity over the provision of such positioning
technology and about the nationality of some of the participants in the
Galileo scheme.  This gives rise to a dilemma.  Either the existence of
Galileo provides a reason for not 'turning off' (or 'detuning') the existing
GPS system; or Galileo itself must be similarly 'turned off' (or 'detuned')
in parallel.  If we assume the latter, this would have to be effected by one
of three means: US governmental pressure on the Galileo operators, external
(and unauthorised?) override of the satellites' software, or physical
interference with the satellites (up to and including destruction).  Any
ground-based 'satellites' would (probably) have to be similarly affected.

Thus, if we assume that GPS might be 'turned off' (or 'detuned'), we must
assume the same of Galileo.  This would leave inertial guidance as the only
viable back up system - were such accuracy actually required.

But, let us examine what benefits are sought from using a precise
positioning technology.  The articles quoted skillfully (and typically)
interweave the 'safety' issue - trading on terms such as "A number of
devastating crashes over the last 10 years have pushed rail safety to the
top of the national agenda." - with that of operational convenience.  Trains
are currently located somewhere in (say) one kilometre 'blocks' protected to
the rear by a red signal.  On the UK's railways there have been a number of
recent incidents of SPAD - Signal Passed At Danger - some of which have led
to fatalities.  However, other recent fatalities have had causes that would
not be well mitigated by exact positioning technology
(e.g. derailment caused by a broken rail and collision with a car at a
crossing).  The lumping together of accidents - regardless of cause - to
justify spending on a system which might address only one of them is all too
common.
Whilst it is possible that a GPS-type system could enable signallers to
locate trains more accurately and to know their speed, and that this could
enhance safety, I fear that the prime driver for the installation
of positioning technology is actually intended to enable an increase in
train density on the existing rail infrastructure.  Such a move would likely
increase risk rather than reduce it.  The rationale for my fear is that the
pressures to increase density exist, are increasing and will continue to
increase.  Building additional track capacity is not a viable option and
lengthening trains themselves is highly problematic - because of
restrictions on platform length at termini and intermediate stations.  The
only option is to increase track usage (more trains per mile of track) - and
this cannot be done with the existing signalling systems and block working.

But, is the technology up to it?  Railways in the UK provide special
challenges for the designers of equipment.  The recent history of equipment
on new trains such as remotely released doors, automatic announcements,
new-style pantograph and third-rail pickups, tilting carriages, etc. has
been fraught.  Seemingly, designers had no real concept of the challenges
provided by the environment.  Reliability in an environment that is subject
to continual and continually variable vibration and oscillation;
intermittent but frequent harsh shocks of variable intensity; rough
handling; infrequent, sometimes poor and occasionally non-existent
maintenance, and the necessary 'building down to a price' demanded by a
cash-strapped industry is demonstrably hard to achieve.

Therefore a primary question must be: "What happens when an on-board
positioning system fails (either the computer or the transmitter)?"  The
train might simply vanish off the display and possibly out of the
system. Signallers and drivers of following trains would not know whether
there was a train ahead (or behind), nor exactly where that train was.  So,
if train separations were reduced through the introduction of such a system,
catastrophe could easily and quickly result.  Even if the system
'fails-safe', problems would still result from any reduction in train
separation.

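The fail-safe principle at stake is the same one embodied in classical block
signalling: when information is missing, assume the worst.  A minimal sketch
of that rule (a hypothetical block model, not any real signalling system):

```python
# Fail-safe block occupancy: a block whose position report is missing
# or stale is treated as OCCUPIED, never as clear.  Hypothetical model.

import time

STALE_AFTER = 5.0  # seconds without a position report

def block_is_clear(last_report, now=None):
    """last_report: timestamp of the most recent 'block clear' report,
    or None if the train's transmitter has never reported.
    Returns True only on fresh positive evidence that the block is clear."""
    if last_report is None:
        return False                      # no data => assume occupied
    now = time.time() if now is None else now
    if now - last_report > STALE_AFTER:
        return False                      # stale data => assume occupied
    return True

# A display built on this rule degrades safely: a failed on-board
# transmitter makes its block show occupied, halting following trains
# rather than letting them close up blindly.
print(block_is_clear(None))            # False: never reported
print(block_is_clear(time.time()))     # True: fresh report
```

The design choice is that silence is never read as safety; the cost, as the
text notes, is that a single transmitter failure can bring a dense timetable
to a halt.
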
Then there is the human factor.  Common issues here include the operator's
implicit trust in the automated system, their inability to recognise,
assimilate and react to emergency situations, poor MMI, and clashes between
the system's and the user's actions (viz. the Australian naval ship referred
to in RISKS 23.71).

We have existing examples in air traffic control and aircraft flight
systems.  These are built on far longer-established protocols; but we still
have (brand new) systems that fail, either inherently or at the man-machine
interface, and clashes between pilots and 'George'.  When it comes to
implementing remote or automated control in mass-transport systems, the
record is not good.

When we couple commercial greed - especially when argued on the grounds of
'safety' - with the continuing failure of designers to construct reliable
and reliably fail-safe systems the RISKS are manifest and manifold.

REVIEW: "Windows Forensics and Incident Recovery", Harlan Carvey

<Rob Slade <>>
Mon, 7 Mar 2005 08:28:59 -0800

BKWNFOIR.RVW   20041224

"Windows Forensics and Incident Recovery", Harlan Carvey, 2005,
0-321-20098-5, U$49.99/C$71.99
%A   Harlan Carvey
%C   P.O. Box 520, 26 Prince Andrew Place, Don Mills, Ontario  M3C 2T8
%D   2005
%G   0-321-20098-5
%I   Addison-Wesley Publishing Co.
%O   U$49.99/C$71.99 416-447-5101 fax: 416-443-0948
%O   tl a rl 1 tc 2 ta 2 tv 1 wq 2
%P   460 p. + CD-ROM
%T   "Windows Forensics and Incident Recovery"

Chapter one is an introduction, both to the book and to the ideas behind it.
For once, the author does, indeed, try to define what an incident is.  The
definition is broad, but so are the possibilities.  The intended audience is
stated to be anyone interested in the security of Microsoft Windows, but it
is instructive that, in listing specific groups, forensic specialists and
security professionals are *not* mentioned.  Carvey notes that a great many
people would like to know the information that Windows forensics can
provide, since the platform is nearly ubiquitous, but few have the knowledge
of system internals that is necessary to find the relevant bits.  Based on
the definition of an incident as an event that violates security policy,
chapter two demonstrates some of the ways that policy failures, and
therefore attacks, can occur.  (The rationale behind the inclusion of eleven
pages of Perl source for a program to detect null sessions escapes me.)

Chapter three reviews a number of places to hide data, but all of these are
at the user interface level, such as setting hidden file attributes, placing
data in unused keys in the Registry, NTFS (NT File System) alternate data
streams (ADS), and the extra information stored in data files by
applications like Microsoft Word.  There is no mention of the lower level
caches: slack space (whether in terms of zero padding, extra space in
sectors, or the timing margins on hard disks) or page files.  In addition,
for those locations that are mentioned, specific programs for extracting
particular data are listed, but no details of structural internals (for
example formats for NTFS, OLE/COM, or Word) are provided for analysis with
more general utilities.  This is not to say that Carvey does not do a good
job of explaining what he does cover: the tutorial on NTFS ADS is clear and
complete.  The material in chapter four addresses the issue of preparation
by suggesting various means of hardening systems and networks against
attack.  The content is unusual, and deals with functions and activities
that are frequently left out of security texts.  At the same time, it does
not touch on some common suggestions for system security: this should be
seen as a complement to, rather than a replacement for, other Windows
security works.  A wealth of utilities for deriving all manner of
information from Windows systems are listed and described in chapter five.

Chapter six presents suggestions for the methods and procedures to be used
in responding to a potential incident, but it does so in the form of a
number of fictional examples.  The stories can be instructive, but it does
take a long time to sort through the material to find the relevant points to
use.  Various indications that can be evidence of the existence of malware
(particularly network-based remote access trojans) are examined in chapter
seven.  The author's Forensic Server Project, a tool for managing forensic
data collection, is presented in chapter eight.  Chapter nine describes an
assortment of network scanning and data capture tools.

Although a number of areas are addressed, the text will be of greatest use
to those who are concerned about network malware, especially of the remote
access type.  The intended audience, of experienced but non-specialist
Windows administrators and law enforcement professionals with some technical
background, will find a number of valuable indicators that will point out
whether a system will reward further scrutiny.  The professional, and
particularly one with experience in forensic analysis, will find some very
useful information on newer operations of Windows, but may be frustrated at
the lack of detail.  (I'm still not sure who is going to get a lot out of
all the Perl source code ...)

copyright Robert M. Slade, 2004   BKWNFOIR.RVW   20041224
