The RISKS Digest
Volume 10 Issue 54

Thursday, 18th October 1990

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Flawed computer chip sold for years
Al Stangenberger
The slippery slope of personal identification and tracking
Jerry Leichter
Technology Meets Dog; Dog Wins
Sanford Sherizen
Pilot error and human factors
ark
Re: Airliner story
Bob Sutterfield
Pilot Error, Human Factors, and Common Sense
Irving Chidsey
Re: Closed Captioning at Educom
Gary Coffman
Lauren Weinstein
Info on RISKS (comp.risks)

Flawed computer chip sold for years

<forags@violet.Berkeley.Edu>
Thu, 18 Oct 90 10:41:23 PDT
Al Stangenberger, Dept. of Forestry & Resource Mgt., Univ. of Calif.
Berkeley, CA  94720  uucp: ucbvax!ucbviolet!forags BITNET: FORAGS AT UCBVIOLE

From KRTN News Wire, reported in Marin Independent-Journal (San Rafael, CA)
3 Oct 90 page B4:

SUNNYVALE - A strangely flawed computer chip was sold by the millions by
National Semiconductor Corp. here between 1987 and last spring, with a
potential for causing bizarre failures in computer systems.  The chip's
potential for mischief is significant because it was used by major computer
makers for more than two years, and some may not be aware of the potential for
problems.

National first learned the chip had a design flaw in 1987, but it wasn't until
January that the company stopped shipping it, according to a lawsuit filed in
June by a former employee.  The firm had a "large inventory" of the chips it
didn't want to "dispose of as non-functional," claims the suit by Michael
Parsin of Sunnyvale, a former managing engineer in the department product group
responsible for the chip.  "We did identify some isolated problems with that
part among some customers," said Mary Coady, spokeswoman for National
Semiconductor.  "The company took steps to address the problem — a bunch of
steps.  Of hundreds of customers for millions of parts that were shipped, I am
told we have relatively few ... complaints", Coady said.

The chip tracks the time and date in computers and other electronic systems.
In certain applications, it has had a tendency to skip forward one day, with
unexpected results:

> The United Nations International Atomic Energy Agency used the chip in a
new television security system for guarding nuclear fuel in atomic power
plants worldwide, according to agency official Klaus Gaertner.  Problems
cropped up in one monitoring system, and design changes had to be made to
protect the chip from electronic noise, Gaertner said.  The chip is the
suspected cause of the problem, but more testing may be needed to know for
sure, said another engineer familiar with the system, who added that it
would be very costly to replace them at this point.

> A Canadian company had difficulties with the chip on a military system.
"It was a real problem," according to George Bleier, a project engineer
with Marconi Canada who said it had problems in a system for a foreign
military customer.  "We were just flabbergasted."  He said he complained by
letter in 1988 and National fixed the problem, but only this year did the
firm finally apologize.

> A financial program for a company was set up to print paychecks on
Fridays, but the chip caused the computer -- made by a major manufacturer --
to skip from Thursday to Saturday, leaving employees with no paychecks,
according to an engineer familiar with the computer.

> A computerized trip recorder for long-distance trucks printed reports
that made the truck look as though it were traveling at impossible speeds
and "doing impossible things."  The system frequently shut down, said an
engineer at Rockwell International Corp. who worked on the system.  The
chip had to be replaced with another version.  "It was a fiasco," said the
engineer, who asked not to be identified.  "If I go in there and say
'National time chip,' (my boss) goes through the roof."

National said "some isolated problems" have been reported in the chip --
problems that seem to occur more frequently in even-numbered years.
Exposure to electronic "noise" triggers a tendency to flip from 24- to 12-
hour time with unfortunate results, said some engineers who have used it.
A new version does not have the problem.
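
For readers maintaining systems built around such parts, the failure modes
described -- a one-day jump and a spurious 24-hour/12-hour flip -- are the
sort of thing a software plausibility check can catch.  The sketch below is
purely illustrative (the interface is invented and has nothing to do with
National's actual part or errata); it cross-checks the clock chip against an
independent elapsed-time source:

  from datetime import datetime, timedelta

  def rtc_reading_plausible(prev, curr, elapsed_seconds):
      """Accept a new clock-chip reading only if it roughly agrees with an
      independent elapsed-time estimate (e.g., a CPU tick counter).
      Illustrative only; not tied to any real chip's registers."""
      expected = prev + timedelta(seconds=elapsed_seconds)
      drift = abs((curr - expected).total_seconds())
      return drift <= 3600       # off by more than an hour suggests a day
                                 # skip, a 12/24-hour flip, or another glitch

  # A Thursday-to-Saturday jump like the payroll incident above:
  prev = datetime(1990, 10, 18, 23, 59)          # Thursday night
  curr = datetime(1990, 10, 20, 0, 1)            # "Saturday", per the chip
  print(rtc_reading_plausible(prev, curr, 120))  # -> False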


The slippery slope of personal identification and tracking

Jerry Leichter <leichter@lrw.com>
Thu, 18 Oct 90 08:51:51 EDT
It was reported in last Sunday's New York Times that Princeton University has
installed a new security system at one of its colleges (groups of related
dorms and such).  The doors to the college, heretofore always open, are now
locked.  Residents of the college have "proximity" access cards which unlock
the doors.  Such cards can be sensed from a reasonable distance (e.g., if you
carry your card in your wallet, the door will unlock as you approach it).
Others at the university can use their magnetically-encoded cards in a "swipe"
reader to unlock the doors.  Non-university people are supposed to be greeted
at the entrance.

Princeton intends to install the same system at all its colleges over a period
of time.  The system is described by the university as "monitored 24 hours a
day from a central location" (not an exact quote); precisely what this means
and why anyone should care isn't clear, but apparently the university
considers this a good thing which should instill confidence in the system.

The Times reports that some students at the college are complaining about the
inconvenience caused by the system.  The university justifies the system as
necessary for safety - there have been several "incidents" on campus of late -
and calls the inconvenience minor.

What no one mentions, at least in the article, is the potential such a system
has for invading privacy.  A card reader of this sort has the ability to track
who goes where and when on a campus.  Systems of this sort that I've seen log
every use of a card.  That log is subject to misuse.  Suppose some government
agency decides that student X is a dangerous radical, fomenting revolution.
What a simple matter to track him and look for people who go to the same
places he does - just check the logs.  There's a long history of exactly this
kind of investigatory technique - taking photographs of cars parked near
demonstrations and checking for license plates that show up more than once,
for example.  It's also clear that, historically, most institutions have not
resisted government attempts to gain access to such information; and that even
when they do, the government can usually get a subpoena.  Note that failing to
collect information that a system can easily collect doesn't help - the
government could easily demand that a logging system be turned on, just as it can
require the telephone company, under appropriate court order, to track usage
of a phone.
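
To make concrete just how little effort such an "investigation" takes, here is
a toy sketch.  The log format is invented and this is not any real vendor's
software; the point is that once per-swipe logs exist, correlating a target's
movements with everyone else's is a few lines of code:

  from collections import Counter
  from datetime import timedelta

  WINDOW = timedelta(minutes=5)

  def co_travelers(log, target_id):
      """log: iterable of (timestamp, door, card_id) tuples -- a hypothetical
      format.  Returns card_ids seen at the same door within WINDOW of the
      target, most frequent first."""
      target_events = [(t, door) for t, door, cid in log if cid == target_id]
      counts = Counter()
      for t, door, cid in log:
          if cid == target_id:
              continue
          if any(d == door and abs(t - tt) <= WINDOW for tt, d in target_events):
              counts[cid] += 1
      return counts.most_common()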

The fact that proximity cards are used makes the system all the more
dangerous.  First of all, you can be tracked without taking any specific action -
which means you'll have a hard time knowing when you might be tracked, and
won't be able to avoid it.  (Leaving the card home may not be a solution -
usually, it's an id card that you MUST have to accomplish almost anything).
Second, it makes the system virtually invisible, so people don't think about
the implications as much.

Now, I don't want to over-emphasize the dangers, such as they are, of the
particular system at Princeton.  The data will likely be fairly "coarse" -
the cards give you access to a college, which is home to hundreds of people,
not to individual rooms - and it PROBABLY won't be abused.  But there's an
underlying issue here which has received too little discussion:  One side-
effect of many recent technologies has been to make tracking of individuals
a quick and painless matter.  Every time you use your bank card, you are
providing a central system with a real-time trace of where you are.  These
days, every time you use a credit card, it's checked with a central system -
again providing a trace.  How many people know that their cellular telephones
can be made to report their presence, with no indication that they've been
polled?  This ability is an inherent part of the implementation of cellular
systems, and even at its most limited allows the phone to be located to the
nearest cell.
In practice, with some effort one can usually locate the phone much more
precisely, since some directional information is available and there is also
usually signal strength information for several cells.  The only way to keep
the phone from responding is to turn it off.
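
As a back-of-the-envelope illustration of why per-cell signal strength helps
(this is not how any particular carrier's equipment works), treating each
reporting cell site as a vote weighted by received strength already narrows
the estimate well below a whole cell:

  def weighted_centroid(sites):
      """sites: list of (x_km, y_km, signal_strength), strength > 0.
      Stronger signal is treated as closer, so it gets more weight."""
      total = sum(s for _, _, s in sites)
      x = sum(xs * s for xs, _, s in sites) / total
      y = sum(ys * s for _, ys, s in sites) / total
      return x, y

  # Three cells hear the phone; the strongest site is the nearest.
  print(weighted_centroid([(0.0, 0.0, 8.0), (5.0, 0.0, 2.0), (0.0, 5.0, 2.0)]))
  # -> roughly (0.83, 0.83), deep inside the strongest cell's coverage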

Some losses of privacy are obvious; others are insidious, occurring as
unintended side-effects of otherwise benign and even very useful technologies.
The cumulative results can be the same, however.

In one of his science fiction books, Fred Hoyle speculates on how a universal
"person tracking" system might come to be imposed.  Initially, the system is
created as a means to keep a small elite continuously accessible and safe.
High government officials today accept a constant surrounding of protective
forces and communication agents; making the tracking more automatic would only
improve their situation.  In Hoyle's argument, over time, more and more people
are considered to be important enough to warrant the privilege of being part
of the system; it's considered an honor.  Eventually, though, EVERYONE becomes
part of the system.

Constant accessibility, first with pagers, now with cellular phones, has
indeed developed more or less along these lines.  Constant position location,
at least ACKNOWLEDGED constant position location, has not so far.  Instead,
it's creeping in even more insidiously, piggy-backing along with apparently
unrelated systems.
                            — Jerry


Technology Meets Dog; Dog Wins

Sanford Sherizen <0003965782@mcimail.com>
Thu, 18 Oct 90 08:35 EST
Amid all of the problems posted here, a dog-bites-phone risk is worth noting.

NETWORK WORLD, October 15, 1990 had an article on AT&T Tariff 12 deals.  In the
article, the following appeared.

"On a lighter note, it seems a new type of long-distance fraud is making the
rounds, as Tom and Bonnie Robb of Aliso Viejo, Calif., can attest.

When their telephone bill arrived recently, they had a difficult time figuring
out who had made $28 worth of toll calls to Sports Pick and the Adult Date line,
according to a recent story in the HARTFORD COURANT.

But it turned out to be their cocker spaniel, who was using a large-faced
push-button telephone.  The Robbs had attempted to teach the dog to dial 911 by
smearing peanut butter on the corresponding buttons of the keypad.

The dog had apparently taken to knocking the handset off the receiver and
dialing telephone numbers, inadvertently dialing the 900 numbers."

W-H-Y were the Robbs attempting to teach their dog that trick?  Are peanut
butter manufacturers accessories to a crime?  Did the dog enjoy the Adult Date
Line?  Are we sure that the dog *inadvertently* dialed those numbers?  What
animal species will next turn to crime?
                                                  Sandy

                  [Next the dog will learn how to imitate the touch tones,
                  and its bark would be much worse than a byte.  PGN]


Pilot error and human factors

<ark@research.att.com>
Thu, 18 Oct 90 08:56:04 EDT
A few days ago I saw a comment on rec.aviation about `pilot error' from a
flight instructor who had just come back from an AOPA recertification clinic.
Among the notes from that clinic were that 75% of the pilots involved in
accidents where the cause had been established as `pilot error' were at the
time going through a marriage, divorce, or career change.


Airliner story (RISKS-10.49)

Bob Sutterfield <bob@morningstar.com>
Thu, 18 Oct 90 14:24:14 GMT
Gene Spafford quotes Rich Epstein <@VM.CC.PURDUE.EDU:REPSTEIN@GWUVM>:
   Heavy rains leaked into the plane and knocked out the transponders
   and the auto-pilot computer.  About 15 minutes into the flight the
   pilot announced that we had to return to O'Hare because the air
   traffic controllers couldn't "pick us up".  In other words, we were
   invisible, in the clouds, at O'Hare... the pilot meant this
   literally.  Radar picks up aircraft by means of the signal sent out
   by the transponders.

Lack of a transponder return isn't really an immediate, major safety problem.
You weren't about to get bumped into.  Your flight was operating under
instrument flight rules (IFR), which means there was a very detailed flight
plan and clearance in effect.  Even if all two-way communications had been
rendered inoperative along with the transponder at the moment of takeoff, a
block of airspace would have been reserved for you as you moved along your
route.  Lost-comm procedures are a fundamental part of IFR flying, and provide
a nearly algorithmic "way out" of every situation.  Lacking a transponder but
maintaining communications, the crew would simply have been required to provide
regular verbal position reports, just like in the olden days (not so very long
ago) when ATC radar coverage wasn't so pervasive as it is now.  So being
invisible in the clouds isn't that big a deal, safety-wise.

I suspect that the loss of the autopilot was a more severe problem,
since it would drastically increase crew workload in every phase of
flight and would render some maneuvers (e.g. a Category III instrument
approach in the event of very bad (nearly zero/zero) weather at your
destination) impossible.  The airline's operations manual may list the
autopilot as a go/no-go or continue/abort item.  It may also list the
transponder as such, but it's not such a big operational safety issue.

   The pilot ... said that this was a good plane because it had
   "stainless steel aeronautical control cables", a reference to the
   fact that an Airbus would probably have been disabled completely in
   a similar circumstance.  I have no doubt that the pilot was
   referring to the Airbus when he made this remark.

Or maybe he was just reassuring you that the control systems weren't going to
rust and jam, and your personal worries about the Airbus (fueled by Gene's
stories from RISKS) filled in the "A320" between the lines.  Either way, this
is a more interesting issue, and possibly the main RISKS-related story to be
told about the incident.


Pilot Error, Human Factors, and Common Sense (Spelt, RISKS-10.5x)

Irving Chidsey (INF) <chidsey@BRL.MIL>
Thu, 18 Oct 90 11:27:52 EDT
    Some years back I read a story in which an engineer was reprimanded
because he had designed something without using "common sense".  His defense
went approximately:

        `` `Common Sense' is a very rare commodity.  I am only an engineer
with a technical education, and must design as I was taught.''

                            Irv

              [Or, put another way, common sense is not very common,
              in both senses of the word.  PGN]


Re: Closed Captioning at Educom (RISKS-10.51)

Gary Coffman <gary@ke4zv.UUCP>
18 Oct 90 13:16:54 GMT
As a Gannett employee working at WXIA-TV in Atlanta (11 Alive) I can tell you
that voice recognition equipment is not used in our captioning system.  The
system is Atari 800 (!!!) based with a court stenographer's keyboard grafted on
to the computer. Real live human operators man the steno keys.  The errors
reported can be attributed to the fact that even Southerners can't always
understand Jimmy Carter and to the fact that our stenos can't spell, nor do they
know geography or geopolitics.  You should see some of the things we routinely
put on the air.
                                        Gary


open captioning at conference (was: "Technophobia...")

Lauren Weinstein <lauren@vortex.com>
Tue, 16 Oct 90 23:28:18 PDT
Without a doubt, the open captioning of Pres. Carter's speech was *not* being
done by an automated speech-to-text voice recognition system.  Continuous
speech voice recognition systems are still at a comparatively primitive level,
even when specifically trained for a particular speaker.  Recognition systems
for - dealing - with - separated - speech - are much more advanced, but still
normally need per-user training except for limited vocabularies, and wouldn't
be applicable in such a situation anyway.

What was almost certainly happening was that the conference was using a closed
captioning realtime speech transcription system to provide open captions in
this case.  The fact that the captions were being provided by a local
television station lends even more weight to this.  All of the commercial
television networks, and an increasing number of major metro area local
stations, are providing closed captioning for many of their major news-oriented
programs.

Unlike most non-news, non-sports programming, where shows for closed captioning
are sent off to the National Captioning Institute (NCI) for "offline"
captioning, news and sports programs are captioned using a realtime system
developed by NCI.

The transcription operator uses a special phonetic keyboard, much like that (in
concept anyway) of the court reporter.  They enter the speech they hear in
realtime, and a computer does its best to translate the phonetic entries into
words and sentences based on various complex algorithms/dictionaries.
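
As a toy example of that dictionary step (the strokes and entries below are
invented and bear no relation to NCI's actual software or any real steno
theory), each phonetic stroke is looked up in the operator's dictionary, and
anything not found falls through more or less raw -- which is exactly the kind
of garble that shows up on screen when a proper noun isn't in the dictionary:

  # Hypothetical stroke-to-word dictionary, for illustration only.
  STENO_DICT = {
      "PREZ": "president",
      "KART/ER": "Carter",
  }

  def translate(strokes):
      # Unknown strokes are passed through in brackets instead of as words.
      return " ".join(STENO_DICT.get(s, "[" + s + "]") for s in strokes)

  print(translate(["PREZ", "KART/ER"]))   # -> president Carter
  print(translate(["PREZ", "TPHORBG"]))   # -> president [TPHORBG]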

Such a system is of course dependent upon the accuracy of the
algorithms/dictionaries, the quality of the implementation, and the skill of
the operator.  The fact that the sorts of errors noted at the conference would
occur in such a system is not at all surprising.  These systems are still in
the relatively early phases of development, and considering the rate at which
the operators have to enter the phonetic information they really work amazingly
well and provide a very valuable service for the hearing impaired.

--Lauren--
