The RISKS Digest
Volume 19 Issue 51

Saturday, 20th December 1997

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Brief KC power outage triggers national air-traffic snarl
PGN
Public-key crypto history vs cryptohistory
Steve Bellovin
Chinook helicopter engine software
Mike Ellims
AltaVista calls Esperanto communist
Philip Brewer
Privacy problems with patient data in hospitals, by Simson Garfinkel
via Fr. Stevan Bauman
Risk of seizure-inducing video
Bruce Martin
Re: Potential software nightmare for ISS
Bruce Stephens
name withheld
Satanic Risks?
Lindsay F. Marshall
"Concurrent Programming" by Fred B. Schneider
PGN
Info on RISKS (comp.risks)

Brief KC power outage triggers national air-traffic snarl

"Peter G. Neumann" <neumann@csl.sri.com>
Fri, 19 Dec 97 9:16:46 PST
Power went out at Kansas City's Olathe Air Route Traffic Control Center at
9:03a.m. CST on 18 Dec 1997, resulting in a ``brief and supposedly
impossible power failure'' [1].  A technician routed power through half of
the redundant ``uninterruptible'' power system, preparatory to performing
annual preventive maintenance on the other half.  Unfortunately, he
apparently pulled the wrong circuit board, and took down the remaining half
as well.  The maintenance procedure also bypassed the standby generators and
emergency batteries.  The resulting outage took out radio communications
with aircraft, radar information, and phone lines to other control centers.
Power was out for only 4 minutes, communications were restored shortly
thereafter, and backup radar was working by 9:20a.m.  However, at least 300
planes were in the Olathe-controlled airspace at the time, and the effects
piled up nationwide.  Hundreds of flights were cancelled, diverted, or
delayed.  There were delays of up to 2 hours, and delays continued into the
evening.  [Sources: 1. Matthew L. Wald, *The New York Times*, 19 Dec 1997;
2. *Kansas City Star*, 19 Dec 1997.]

The *Times* article noted that this is the latest in an improbable series of
problems.  The NY Terminal Radar Approach Control (TRACON) was shut down
almost completely on 15 Oct 1997, because of dust from ceiling tiles, and a
similar situation occurred at the Jacksonville center.  The TRACONs at
Dulles and O'Hare were closed when fumes invaded the ventilation systems.


Public-key crypto history vs cryptohistory

Steve Bellovin <smb@research.att.com>
Fri, 19 Dec 1997 13:44:12 -0500
A paper just released by CESG (essentially the British equivalent of NSA)
claims that they invented public-key cryptography in 1970
(http://www.cesg.gov.uk/ellisint.htm).  It's fascinating reading, and there
is no reason to doubt the claim.  Indeed, rumors about this have apparently
circulated for some time.

Another interesting possibility is that NSA knew of the technique even
earlier.  Bobby Inman, when director of NSA, claimed that they had it circa
the mid-to-late 1960's.  There's some evidence for this, too.  At an
informal session of a conference a few years ago, I heard a retired NSA
cryptographer state that National Security Action Memorandum 160, signed by
President Kennedy, was the basis for the invention.  The context was the
command and control of nuclear weapons.  Gus Simmons — who worked in this
field, and who had stated a few minutes earlier that he didn't learn about
public-key cryptography until the famous Martin Gardner column in Scientific
American — agreed that this memo (written 14 years earlier) was indeed the
key.

Assuming that all of the above is accurate, there are some interesting
questions about closed research.  NSA may have had a technique that would
have helped other parts of the government — but they didn't share the
information.  And ironically enough, it was this exact same problem that led
NSA to the development of public-key crypto.

I've scanned in NSAM-160 and the declassified portion of the accompanying
memo from Jerome Wiesner, who was Kennedy's science advisor at the time
(http://www.research.att.com/~smb/nsam-160).  It does not appear to state
any requirements that could only be met by using public-key cryptography.
But two of the paragraphs that were deleted appear just before a statement
of the complexity of managing so many code numbers — and that, of course,
is the sort of thing that public-key crypto is best at.


Chinook helicopter engine software

Mike Ellims <mike.ellims@pigroup.co.uk>
Fri, 19 Dec 1997 17:54:13 -0000
In 1994 a Chinook helicopter crashed into a hillside on the Mull of Kintyre,
off the west coast of Scotland, killing 29 people.  At the time the engine
control software was absolved of blame, although problems with it were known
to exist.  The Minister of Defence was quoted as saying of the software that
"485 observations were made, but none were considered safety-critical."

In recent weeks Channel 4 in Britain raised the question of whether or not
there were actually serious problems with the software, via a leaked report
from EDS-Scicon.  This report listed 56 category-1 errors (the most serious
category), each of which indicates either a coding error or non-compliance
with documentation.
A further 193 errors were listed as category-2 errors, which relate to the
quality of the code.

It was further alleged on Channel 4 that the RAF test pilots who develop
operational procedures etc. for new aircraft refused to fly the
helicopter. The aircraft was introduced into operational service, but with
restrictions on load that do not apply to the Mark-1 version.

The official line is that there is no shred of evidence to suggest that
anything other than pilot negligence caused the crash.  However, there is
some possibility that another investigation into the crash may occur.

Further information can be found at http://www.computerweekly.co.uk/ .

Mike Ellims


AltaVista calls Esperanto communist

Philip Brewer <pbrewer@urbana.css.mot.com>
Mon, 15 Dec 1997 11:44:53 -0600
In 1996 a gathering of Esperantists from all over the world published the
Prague Manifesto calling for democracy and minority language rights.  There
happened to be a reference to the manifesto in a German-language document I
was using to test the AltaVista auto-translation software.

The overall quality of the translation was no worse than I had expected, but
it introduced a really egregious error: it called the Prague Manifesto the
"Prague communist manifesto".

On a statistical basis, I suppose the odds might be with you if you just
assume that any manifesto must be the communist one, but not all manifestos
are communist ones, and non-communists are liable to take offense when you
call them communists.

I reported the error to the "feedback" address provided on the translation
web page.  If I get an interesting response, I'll forward it on to RISKS.


Privacy problems with patient data in hospitals, by Simson Garfinkel

Fr. Stevan Bauman <csb@indy.net>
Wed, 17 Dec 1997 01:16:39 -0500 (EST)
  [This item originally appeared in *The Boston Globe*, 5 Jun 1997.  It was
  later reprinted in the American Reporter, a daily electronic newspaper
  (http://www.american-reporter.com for initial free access).
  Contact Joe Shea at joeshea@netcom.com for comments and subscriptions.
  This article is reproduced here with permission of the author.  PGN]

Simson Garfinkel
American Reporter Correspondent
Martha's Vineyard, Mass.
October 2, 1997

COMPUTERS COMPROMISE PRIVACY, CUT COST OF CARE
by Simson Garfinkel
American Reporter Correspondent

MARTHA'S VINEYARD, Mass. — A few months ago, a patient at the University of
Washington Medical Center made what sounded like a reasonable request.
Worried about his medical privacy, the patient asked that the hospital's
computers be set up so that his medical record could not be displayed on a
computer terminal.

Today the UW Medical Center is still considering the request, but the
doctors involved aren't quite sure how to proceed. University of Washington
has been a leader in bringing computers to medicine, and there are few parts
of the hospital that still rely on paper.

Various computer systems at the hospital keep track of each patient's
appointments, record the procedures done by a physician, record the
laboratory work requested and performed, send the results electronically
back to the physician, remind the patient when it is necessary to schedule a
follow-up, and most importantly, send out bills to insurance companies and
the patients themselves.

Precisely which computer does the patient not wish his information to be
displayed upon?

"We're trying to figure that out right now," says one of the physicians on
the hospital's medical informatics review panel. So far, there is no good
answer.

For thousands of years, it's been the obligation of physicians to protect
the privacy of their patients. But many physicians are increasingly worried
that this age-old commitment is being jeopardized as hospitals adopt
increasingly advanced medical information systems.

Earlier this year, the National Research Council issued a report on the
issues surrounding electronic health information. Called "For the Record,"
the report identified five "threat levels" for information stored in health
care computers:

 * Threat 1: Insiders who make "innocent" mistakes and cause accidental
disclosures of confidential information. This could be as simple as a lab
sending a fax to a wrong phone number, or a nurse pulling up one
patient's medical records instead of another's.

 * Threat 2: Insiders who abuse their record-access privileges.
Browsing seems to be a problem with many electronic record systems. The
Internal Revenue Service, for example, has had persistent problems with
curious employees looking through the tax records to which they have access.
It's unreasonable to think that hospitals will somehow avoid this scourge.

 * Threat 3: Insiders who knowingly access information for spite or for
profit.  During the 1992 Democratic primaries, a pathologist at Beth Israel
Hospital here in Boston was contacted by a member of the press who wanted
access to former U.S. Sen. Paul Tsongas' medical records. The reporter
offered good money, and a less ethical pathologist could easily have
retrieved the file — probably without having that information traced back
to him.

 * Threat 4: An unauthorized physical intruder gains access to information.
Many hospitals rely on physical security to protect information stored
inside a computer: the terminals are put in a special room or behind a desk
to which only authorized personnel are supposed to have access.
Unfortunately, hospitals are not as secure as hospital administrators would
like to believe. If that journalist had simply put on a white lab coat and
gotten a fake badge, that person might have been able to retrieve Tsongas'
medical records unassisted.

 * Threat 5: Vengeful employees and outsiders, such as vindictive patients
or intruders, who mount attacks to access unauthorized information, damage
systems, and disrupt operations. A doctor who practices at an HMO recently
told me of a problem that her group has been having: an employee — they
think they know who — has been accessing the HMO's scheduling computer and
deleting patient appointments. The scheduling desk then thinks the
appointment slot is free, and two or three patients show up at the same
time.

The increased reliance on Social Security numbers is further compromising
patient confidentiality. These days it is relatively easy to find out
somebody's Social Security number, and if you have that magic number, you
can call up a hospital or doctor's office and impersonate that person,
hunting out embarrassing or valuable pieces of information.

What makes this scam possible is the fact that many hospitals use Social
Security numbers as a kind of secret password for patients to prove
their identity. Hospitals don't seem to realize that even if Social Security
numbers were once relatively secret, that day is long past.

Disturbingly, use of Social Security numbers by health care organizations is
about to expand dramatically. Section 1173 of the Kennedy-Kassebaum health
care portability legislation passed last year defines a set of
"administrative simplification procedures" which require the establishment
of universal health identification numbers.

The identifier will make it easier for different organizations to combine
data, both to improve patient care and to make it easier to perform
large-scale epidemiological studies. Right now, it looks as if Congress or
Health and Human Services will adopt the Social Security number as that
universal identifier.

Some computer professionals have suggested that the way to solve the health
care privacy issue is to encrypt all of a patient's files with a secret key,
so that a patient's files can't be decrypted without their permission. The
problem with this sort of approach is that it makes it difficult for doctors
to access critical information in times of urgent need.  Hospitals, and other
institutions, are loath to deploy systems that restrict anybody's access to
information.
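
As a purely illustrative sketch of that proposal (not any hospital's actual
system), the Python fragment below encrypts a record under a per-patient key
using the third-party "cryptography" package; the patient data and field
names are invented.  Without the key the hospital simply cannot read the
record, which is exactly the emergency-access problem just described.

  # Illustrative sketch only; requires "pip install cryptography".
  from cryptography.fernet import Fernet

  patient_key = Fernet.generate_key()          # held by the patient, not the hospital
  cipher = Fernet(patient_key)

  record = b"1997-06-05: penicillin allergy; hypertension"
  stored_ciphertext = cipher.encrypt(record)   # all the hospital's database keeps

  # Routine access: the patient supplies the key, the record decrypts.
  print(Fernet(patient_key).decrypt(stored_ciphertext))

  # Emergency access: an unconscious patient cannot supply patient_key,
  # and the ciphertext alone is useless to the emergency-room staff.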

Instead, many hospitals seem to prefer systems that allow relatively open
access, but record every file that's viewed or modified by every health care
worker. That record is called an audit trail. The information can be used to
find and punish employees who violate patient confidentiality, and having it
works as a deterrent for others.  But even audit trails break down in an
emergency room, where forcing people to type a username and password before
ordering a test could mean the difference between life and death. Are you
willing to die for your right to privacy?
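
To picture what such an audit trail amounts to, here is a minimal sketch in
Python (function and field names are invented, not any vendor's product):
every read of a chart is appended, with who, what, and when, to a log that is
reviewed after the fact rather than blocking access up front.

  # Minimal audit-trail sketch; names are invented for illustration.
  import json, time

  AUDIT_LOG = "chart_access.log"         # in practice: append-only, tamper-evident

  def fetch_chart(user_id, patient_id, charts):
      """Return a patient's chart, but record the access first."""
      entry = {"when": time.time(), "who": user_id, "patient": patient_id}
      with open(AUDIT_LOG, "a") as log:  # access is allowed; it is merely recorded
          log.write(json.dumps(entry) + "\n")
      return charts[patient_id]

  # Usage: open access now, accountability later.
  charts = {"p-123": "demo chart text"}
  print(fetch_chart("nurse-42", "p-123", charts))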

With all of these problems, sometimes it is easy to forget that the reason
hospitals are turning to computers is to lower costs and improve
patient care. Unfortunately, ensuring patient privacy can be expensive and
can prevent doctors and public health officials from considering all of the
pertinent data for a given problem.  It's doubtful that we will be able to
resolve the fundamental tension between the need to know and the need not to
know.


Risk of seizure-inducing video

<Bruce_Martin@manulife.com>
Wed, 17 Dec 1997 12:04:17 -0500
On December 17th the Canadian News Online Enterprise, a.k.a. CNews, reported
that about 650 Japanese viewers ages 3 to 20 were hospitalized following
epilepsy-like seizures apparently induced by what they saw on TV.
(http://www.canoe.ca/TopStories/dec17_cartoon.html)

To paraphrase, roughly 600 young viewers were rushed to hospital after they
were "felled by fits of spasms and nausea" occurring about 20 minutes into
Tuesday night's episode of TV Tokyo's hit cartoon "Pokemon" (meaning "pocket
monsters"), based on characters in a game produced by Nintendo.  Another 50
or so who saw part of the show repeated on a news program also fell
victim.  Total viewership of either the cartoon or the news segment was not
reported.

Those afflicted reported experiencing headaches, "flashing lights" in their
field of vision, and nausea akin to car sickness.  A scene in the cartoon
featuring a "vividly colored explosion mixed with the strobe-light flashing
of a character's eyes" seemed to trigger the illness.  About 150 viewers
remained hospitalized Wednesday.

According to Kyodo News, Toshio Yamauchi, an epilepsy expert at Saitama
University of Medicine outside Tokyo, said the symptoms suggest a one-time
attack
triggered by optical stimulus, which is different from epilepsy.  "There
have been many similar cartoon programs in the past, and I don't understand
why the program this time caused so many attacks," Yamauchi was quoted as
saying.  "It's a sign that Japan will also have to set up guidelines for TV
program production."

TV Tokyo said Wednesday that it is canceling the segment on 30 other
stations scheduled to show it.  Spokesman Hiroshi Uramoto told reporters, "We
are shocked to hear many children were taken to hospitals" and promised that
the station would investigate.

The Computer-Related Risk?  While isolated incidents like this have been
reported in the past (RISKS 14.63 etc.), this particular cartoon segment has
furnished us with a model for seizure-inducing video that apparently affects
a significant portion of the population.  If low-definition broadcast video
can mass-induce spasms and nausea, higher-definition computer video almost
certainly can.  And unlike TV, the choice of images displayed on your
computer screen is in the hands of several hundred thousand faceless
programmers.  If even one of those programmers has an axe to grind, well,
imagine a new class of computer virus that crashes your hard disk *and* your
cerebral cortex!

  [Also reported by Dan Vogel and Chiaki Ishikawa.  PGN]


Re: Potential software nightmare for ISS (RISKS-19.50)

Bruce Stephens <B.Stephens@isode.com>
Tue, 16 Dec 1997 14:16:45 +0000
Edward R. Tufte's lovely ``Visual Explanations: Images and Quantities,
Evidence and Narrative'' has a chapter on the O-ring problem, and comments
(perhaps unsurprisingly!) that all of the information was there before the
incident; it just needed much better graphical presentation.

Indeed, judging by the graphs that he gives that were used in pre-launch
discussions, it seems that perhaps technology might be partly to blame: when
you can so easily present the information as pictures of space shuttles with
cross-hatching indicating O-ring damage, perhaps other (far more effective)
techniques like simple line-graphs become unacceptably low-tech.


Re: Potential SW nightmare for ISS (Name withheld, RISKS-19.50)

<>
Thu, 18 Dec 1997
  [Note: This person is different from the RISKS-19.50 contributor.  PGN]

I agree with your anonymous correspondent about the potential for an ISS
software nightmare, but I believe that the potential is very much
greater than your correspondent appreciates.  Over the past two years I have
seen a requirements scrub, a delivery split, more requirements scrubbed, new
requirements added (change traffic is still high), mission creep (PCS most
notably), continually slipping schedules, and fear of Jay Greene, ISS
Associate Director of Engineering and Chairman of the Design Control Board,
elevated to a valid engineering principle.  The schedule slips were
sometimes difficult to discern because, until several months ago, there was
no integrated schedule.  Management has pursued all the standard remedies:
cut back on testing; more overtime; cut back on time available for testing;
more overtime; combine test phases downstream (the reason that Multiple
Element Integrated Test — MEIT — is important); and more overtime.  It is
a litany you are familiar with, and the results are about what you would
expect.

The desire to maintain the launch date is leading to poor decisions such as
fixing the hardware with the software (Lab smoke detectors); not fixing the
SW (accepting loss of synch between C&C and SM because it doesn't happen
that often and we can't meet the SM's timing requirement); and changing the
requirements to accommodate the system (replacing the required CRC with a
16-bit XOR checksum because of CPU use).
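
To see why that last change is worrying, consider the small Python sketch
below (the data is invented, and CRC-32 from the standard library stands in
for whatever polynomial the requirement actually specified): a 16-bit XOR
checksum is insensitive to word order, so an error that merely swaps two
words of a frame slips straight through, while the CRC catches it.

  # Sketch of why a 16-bit XOR checksum is weaker than a CRC; data is invented.
  import binascii

  def xor16(data):
      """XOR successive 16-bit words together (zero-pad odd-length data)."""
      if len(data) % 2:
          data += b"\x00"
      result = 0
      for i in range(0, len(data), 2):
          result ^= int.from_bytes(data[i:i + 2], "big")
      return result

  original  = b"\x01\x02\x03\x04 telemetry frame"
  reordered = b"\x03\x04\x01\x02 telemetry frame"   # first two 16-bit words swapped

  assert xor16(original) == xor16(reordered)        # XOR cannot tell them apart
  assert binascii.crc32(original) != binascii.crc32(reordered)  # CRC detects it
  print("the XOR checksum missed a corruption that the CRC caught")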

Some other causes for concern:

1) The baseline requirements were derived from Freedom.  Freedom's
requirement set was ``fully validated,'' but there appears to be no evidence
that the subset selected for ISS was revalidated.  It is as if the program
assumes a subset of a fully validated requirements set is itself fully
validated;

2) The C&DH (command & data handling) hardware is underpowered.  It looks
as if it will go I/O bound and run out of telemetry bandwidth just after 6A
and then run out of CPU a little later;

3) Because of 2), I think PCS will take on command and control (C&C)
functions.  This is not good because PCS is Crit3 hardware (their
requirements allow them to crash anytime) and C&C (hazardous commands)
require Crit1.

All in all it is not a story with a happy ending.

Your correspondent in RISKS-19.50 contrasted the ISS situation with that of
the shuttle.  It is not quite appropriate to compare a twenty-year-old
program to one much younger.  There is no error count from a comparable age
in the shuttle program (prior to release 19.6) on which to base the
comparison.  But the shuttle software is not perfectly healthy either.

On STS-79, a software error manifested itself during entry and aborted the
DTO (detailed test objective) in progress.  The quick analysis showed the
problem was caused by a
missing initialization (missed in the implementation of the CR (change
request)); the DR (discrepancy report) was assigned severity 3.  Post-flight
analysis showed there were some other anomalies and they turned out to be
caused by a missing parameter in an IF check in code from the same CR.  An
audit was then performed and 5 more discrepancies were found in the code
from the CR (the CR was not a particularly difficult one).  The entire
incident was characterized, by the contractor, as "a result of a process
breakdown which was addressed and corrected."

Not quite a year later, another serious but unrelated DR appeared: DR
108075, "Illegal Entry on all Item Inputs Attempted on the OPS 202 Display."
This problem was introduced on OI-25 with the correction of DR 108680 and
found in the SMS.  The problem was assigned a severity of "1N".  The "N" was
assigned because there are crew procedures in place to preclude the
condition from happening.  The "1" was assigned because there are
contingency scenarios that have been identified where the inability to make
entries on the OPS Page would result in loss of crew/vehicle.

It makes one wonder if the "process breakdown" was "addressed and corrected."


Satanic Risks?

"Lindsay F. Marshall" <Lindsay.Marshall@newcastle.ac.uk>
Mon, 15 Dec 1997 11:33:31 +0000 (GMT)
In the *Letters* page of this month's *Fortean Times* (FT106, January 1998)
there is a letter entitled Brotherly Communications, raising the privacy
risks of mandating GPS in every mobile phone — which it claims will be the
case in the USA in 1999.  However, the letter then goes on to say the
following:

> Much of the data concerning mobile phone paranoia (or the enhanced 911
> service) comes from the publications of Lucent — also known as Bell
> Laboratories — AT&T and Sandia National Laboratories.

> Lucent seems an odd sort of name — Luc(iferic) Ent(erprises) as people on
> a witch hunt might suggest — but when it comes to software they have a
> real-time operating system called Inferno, written in a language called
> Limbo, with a communications protocol called Styx.  Reading the product
> literature is less like engineering and more like indoctrination.  The head
> offices are at 666 5th Avenue in New York.  The company motif is a fiery
> red circle that might represent a bull's eye, the star Aldebaran in the
> constellation Taurus — also associated with the Egyptian god Set ...

> Lucent has been doing a lot of recruiting recently — their headline
> product is something called Airloop(tm) which looks like a cellular phone
> microcell incorporating voice and data.  It is controlled by a little box
> that I expect we'll be seeing everywhere, called the BSD2000 (Lucent seem
> to have a millennial flavour in their product numbers).

Lucent is, of course, at http://www.lucent.com, and the *Fortean Times* is at
http://www.forteantimes.com.

Lindsay  <http://catless.ncl.ac.uk/Lindsay>


"Concurrent Programming" by Fred B. Schneider

"Peter G. Neumann" <neumann@csl.sri.com>
Tue, 16 Dec 97 16:08:33 PST
  Fred B. Schneider
  On Concurrent Programming
  Springer-Verlag, New York, 1997

One of the most insidious sources of programming problems in the RISKS
archives involves concurrent programming.  Synchronization, locking, message
passing, and other tight-coupling mechanisms are extremely difficult to do
properly.  Programming languages and operating systems are not necessarily
much help by themselves.

Fred Schneider has put together a wonderful book on how to do concurrent
programming correctly.  Whereas the book is ideal for a one-semester course
(and more), it is also very valuable as a reference work.  It should be read
by everyone deeply involved in writing critical programs.  Its focus is
strongly on formal methods, and I have long claimed that formal methods
can be enormously helpful if you are really concerned about correctness in
concurrency, for which most unproved algorithms tend to have flaws (and a
few ``proved'' ones may also).  Furthermore, the implementations of such
algorithms are always in question, and formal methods can help significantly
there as well.
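
As a reminder of how easy it is to get this wrong, here is a minimal Python
sketch (not taken from the book): two threads increment a shared counter, and
without a lock some updates can silently vanish because the read-modify-write
is not atomic, whereas the same loop under a lock is always correct.
Schneider's point, roughly, is that the locked version can be proved correct
rather than merely observed to work.

  # Minimal sketch of a classic concurrency flaw; not from the book.
  import threading

  ITERS = 200_000
  counter = 0
  lock = threading.Lock()

  def unsafe():
      global counter
      for _ in range(ITERS):
          counter += 1           # read-modify-write; threads can interleave here

  def safe():
      global counter
      for _ in range(ITERS):
          with lock:             # mutual exclusion makes the update atomic
              counter += 1

  for worker, label in ((unsafe, "without lock"), (safe, "with lock")):
      counter = 0
      threads = [threading.Thread(target=worker) for _ in range(2)]
      for t in threads:
          t.start()
      for t in threads:
          t.join()
      # The unlocked run may (or may not, on a given run) lose increments;
      # the locked run always ends at 2 * ITERS.
      print(label, counter, "expected", 2 * ITERS)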
