The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 12 Issue 20

Friday 30 August 1991

Contents

o "Thieves Hit Social Security Numbers"
Yasmin Anwar via PGN
o Jetliners in near-miss over Cleveland
PGN
o TCAS false alarms
Martyn Thomas
o More T/CAS
Robert Dorsett
o Overseeing dementia patients by computer
Urban Fredriksson
o Heisenberg effect for credit data?
Peter G. Capek
o The story of O [and Ng]
Jerry Leichter
Stuart I Feldman
o A number is no name
Clifford Johnson
o The need for utilities to deal with non-standard situations
Tom Lincoln
o Uncle Sam Can't Keep Track of his Trillions
Bob Frankston
o Info on RISKS (comp.risks)

"Thieves Hit Social Security Numbers"

"Peter G. Neumann" <neumann@csl.sri.com>
Fri, 30 Aug 91 10:00:36 PDT
One of the better newsmedia items on the misuse of Social Security Numbers is
an article by Yasmin Anwar, Chronicle Staff Writer, in today's San Francisco
Chronicle.  Front Page.  She is a Chron Intern, and I think she is to be
commended for a superb job of incisive reportage.  The following item is
included in RISKS in its entirety because of its keen relevance to our ongoing
discussions on this subject in the RISKS FORUM, and the increasingly serious
problems that it poses.


Copyright San Francisco Chronicle, 30 August 1991.  Reproduced in RISKS with
explicit permission of the Chronicle for use by the RISKS Forum.  Any further
distribution or publication requires explicit permission of the Chronicle.

                   Thieves Hit Social Security Numbers
                     Fouled-Up Benefits and Credits
                 By Yasmin Anwar, Chronicle Staff Writer

  Debbie Biner knew something was wrong when the Internal Revenue Service
demanded back taxes for a job she never had.  Then her boss accused her of
falsely claiming unemployment.  Bewildered, the 36-year-old Moraga woman set
out on a two-year investigative trail that led her to a bizarre discovery --
12 people from as far away as Virginia had been using her Social Security
number.  "I've never been late on a payment in my life," she said.  "Who knows
what people are doing with my number?"
  Biner is among a growing number of victims stung by Social Security number
theft, a crime that can take years to detect. Most often, the felony reveals
itself in fouled-up tax records or muddled credit reports.  Occasionally, major
embezzlement is involved.  "Someone can take your number, get a credit card,
charge it to the limit and vanish," said Steven Gruel, an assistant U.S.
attorney and former immigration fraud prosecutor.

  Although many consumers go to great lengths to safeguard their credit-card
numbers -- cutting up expired plastic and tearing up carbon receipts -- few
realize the dangers of a Social Security number in the wrong hands.  In this
computer age, where extensive records on a person's background are just a
keystroke away, the importance of protecting Social Security numbers is
magnified.  "If a private eye wants to find somebody, a Social Security number
is all he needs," said attorney Fred Gross.

  So far this year, 550 people have been convicted of felonies for stealing,
selling or using bogus Social Security numbers -- compared with 468 convictions
for all of 1989, and 390 in 1988.  And federal authorities figure the
convictions reflect just a fraction of the problem.
  "It's rampant. But the (Social Security) system isn't set up to detect fraud,"
Gruel said. "You don't know people are using your number unless you try to take
out a home loan and your credit file is flagged."

  Examples of fraud:

* Joelle Robert, a waitress at San Francisco's Meridien Hotel, could not figure
out how someone opened 16 credit cards in her name -- then ran up $10,000 in
charges.  Eventually, Robert learned that someone she considered a friend had
been using her Social Security number.
  "I don't understand why credit companies don't ask for more IDs when they
give people cards," Robert said.

* A Martinez woman trying to claim unemployment last year was told by the state
Employment Development Department that five people using her number had already
beaten her to it. The woman, who asked not to be identified, said she gave up
trying to claim benefits.

* Lizabeth Stephens, a.k.a.  Elizabeth Ann Borruso, used eight Social Security
numbers and six names last year to open accounts throughout Northern California
at Citibank, Security Pacific and Great Western Savings. She obtained an Army
civilian identification card under a false number and name. Currently in jail
awaiting sentencing, she faces a maximum sentence of five years in prison and a
$250,000 fine.

  Experts attribute the increasing abuse of Social Security numbers to two main
factors: undocumented immigrants seeking work in the United States and the
business world's increasing use of the number as a universal ID.
  The 1986 Immigration Reform and Control Act -- designed to control immigration
and tighten restrictions on illegal workers -- ended up fueling a black market
in phony IDs, which contain Social Security numbers.
  Illegal immigrants, who now need to present more IDs when they apply for a
job, can buy fake green cards and numbers from street corners and stores for as
little as $30.  "We created an industry," said Philip Waters, deputy district
director for the Immigration and Naturalization Service, who estimates there
are 50,000 identification counterfeiters operating in the United States.
  Immigration fraud investigators say they seldom pursue workers who have used
bogus identifications.  "We get the manufacturers and vendors. The bigger the
better," Waters said.

Uses of the Number

  Social Security numbers can be misused in many ways. The computer age has
allowed businesses and government agencies to compile extensive and centralized
records on Americans. And a Social Security number unlocks that information.
  By tapping into computer systems, enterprises as diverse as insurance
companies, police departments, hospitals, grocery stores and colleges can dig
up details on individuals ranging from unpaid medical bills to cocaine
convictions.
  "The number is information fly paper. It's basically one step short of putting
a bar code on everyone's forehead," said attorney Mark Rotenberg, a former
adviser to the Senate Judiciary Committee.

Shaky Legal Ground

  By law, the only agencies that can demand a Social Security number are the
Social Security Administration, the IRS, employers, banks and the military.
Other agencies such as credit bureaus, insurance companies, police departments
and hospitals have no legal authority to request it.  Yet businesses routinely
obtain customers' Social Security numbers because people give them out on
applications.
  "I personally protect my number like it's gold. I keep it locked up in a safe
deposit box," said IRS spokesman Larry Wright. "If they choose to deny me the
credit card, I don't care. I'll go somewhere else."
  The 1974 Privacy Act prohibits government agencies from giving out
information from individuals' files. Citing the act, Peter Zilahy Ingerman, a
New Jersey computer scientist, sued the IRS for displaying taxpayer Social
Security numbers on income tax form envelopes.  The case is pending in U.S.
District Court in New Jersey.

Seeking the Culprits

  As Debbie Biner's case illustrates, the search for a number thief can be
time-consuming and complex.  Biner said government agencies offered no help.
"It was so frustrating. Everyone kept telling me my case was out of their
jurisdiction," she said.
  So Biner asked Transunion, the nation's largest credit bureau, to run her
number through a tracing system to determine who was using her nine-digit
identifier.

Slowing the Number Flow

  In Washington, D.C., privacy rights advocates and watchdog groups such as
Computer Professionals for Social Responsibility are lobbying Congress to write
stricter Social Security laws.
  They are pushing for a legal guarantee that would state, "No person shall be
denied credit, employment or the opportunity to engage in a commercial
transaction for failure to provide his or her Social Security number."

  Meanwhile, Biner sits in her Moraga apartment, as her 6- and 7-year-old
children play, and writes letters. Fraud investigators have advised her to
contact the IRS, the Employment Development Department, various collection
agencies, banks, department stores and furniture stores where her number mates
are doing business.  "My name is Debbie Biner," she writes. "I am the original
owner of the following Social Security number.  Please remove the following 12
names and their attached transaction records from my files. "


Jetliners in near-miss over Cleveland

"Peter G. Neumann" <neumann@csl.sri.com>
Fri, 30 Aug 91 9:27:31 PDT
The PGN digesting service notes a NYTimes-originated article in this morning's
San Francisco Chronicle (p.A3) regarding the near-miss last Saturday (24Aug91)
over a radio navigation marker at 35,000 feet, 20 miles southwest of downtown
Cleveland, which somehow did not get into the Cleveland Plain Dealer until
yesterday, Thursday (29Aug91).  A British Airways DC-10 from London to Atlanta
(routed over Toronto and Cleveland) came within 100 feet vertical and half a
mile horizontal (only a few seconds separation on closure!) of a Midway
Airlines DC-9 from LaGuardia to Chicago.  Apparently a controller had
accidentally assigned the DC-10 the wrong frequency, the crew had not realized
it, and the air traffic controllers observing the two planes on an apparent
collision course were unable to contact the DC-10 crew -- which never saw the
DC-9.  But it was certainly disturbing that the DC-10 crew had not contacted
the controllers since passing Toronto.  Indeed, NEITHER plane was in contact
with the proper controller.  Incidentally, neither plane had the collision
avoidance system that will be mandatory by the end of 1993.  At the last minute
(literally) one of the DC-9 pilots spotted the other plane and took evasive
action.


TCAS false alarms

Martyn Thomas <mct@praxis.co.uk>
Thu, 29 Aug 91 11:08:30 BST
Further to the recent IEEE Spectrum report on problems with the Collins TCAS
systems.

Flight International (28/8/91) reports that the US National Air Traffic
Controllers Association (NATCA) is claiming that at least half of altitude
deviations following TCAS "resolution advisories" are due to malfunctions.
NATCA says its estimate is conservative (the FAA disagrees).

The US Air Line Pilots Association also says that the problem is not as large
as NATCA says.

Problems have mostly been fixed by software changes. They include:

 * Intruders with a high vertical speed but about to level off
 * Intruders with adequate vertical separation
 * Transponder/encoder errors
 * Intruders on parallel approaches causing unnecessary go-arounds
 * TCAS detecting its host aircraft transponder [!!!]
 * "Descend" advisories issued when the aircraft is only 500ft AGL
 * High-wing military aircraft with belly-mounted transponders not triggering
   TCAS.

NATCA say there were 325 TCAS-generated incident reports between 5 May and 12
August. 200 involved altitude changes. Of these 69% reported a deviation of
500ft or greater, 23% reported 1000ft or greater. Pilots are normally advised
that a TCAS resolution advisory should result in a deviation of 300ft to 500ft.

I believe that these incidents all involve TCAS II, which only gives vertical
advisories. TCAS III will also give lateral advisories - this is
computationally harder: whether it will result in more or fewer incidents
remains to be seen.

I fear that we are creating an arbitrarily-complex network of systems
interacting in real-time, with feedback. I doubt that our technology is
capable of assessing the failure probabilities of such a system. Does anyone
on the net have a copy of the safety-case justifying the mandatory
introduction of TCAS?

Martyn Thomas, Praxis plc, 20 Manvers Street, Bath BA1 1PX UK.
Tel:    +44-225-444700.   Email:   mct@praxis.co.uk


More T/CAS

Robert Dorsett <rdd@cactus.org>
Fri, 30 Aug 91 01:17:45 CDT
Lars-Henrik Eriksson wrote:
>>Not quite.  TCAS is a backup system. It's a redundant backup.  Primary
>>responsibility for "see and avoid" is with the pilot (FAR part 91).
>
>The "see and avoid" responsibility is only applicable in visual flight
>conditions.

...and in IFR conditions, the primary separation responsibility is with the
air traffic control system.  T/CAS is a warning/alert system, and is not
designed for tactical situational awareness.


>>... TCAS and air traffic control are at crossed purposes.  TCAS gives
>>authority to the pilot, and ATC takes it away.
>
>ATC authorities (both FAA and those of other countries) have the legitimate
>concern that pilots will react unnecessarily to TCAS alerts and cause other
>incidents by doing unauthorised deviations.

Currently, it is FAA's policy that it will NOT pursue enforcement of any
clearance violations by pilots who deviate due to T/CAS alerts.  False alarms
happen often enough, and cause enough pilot concern, that there is now almost
a monthly reminder in the Air Line Pilots Association newsletter that it is
still being tolerated.

The controllers resent their organized chaos being disorganized; pilots want
to stay alive.  If I were a passenger on an airliner, I think I'd want my
captain to err on the side of caution.

Interestingly, many controllers are not completely aware of the FAA's
lenience on the issue, and continue to write up pilots.  ATC currently
recommends that pilots inform them whether the aircraft has TCAS, so they
can plan for more lenient separation.


>I understand that the TCAS technology and the procedures being applied
>when a TCAS alert occurs have developed to a point when this risk is
>at an acceptable level.

I've seen nothing to suggest that the number of false alarms is falling.  In
fact, as more T/CAS-equipped aircraft come online, the number seems to be
increasing.  In the terminal environment, I'm hearing more discussions between
pilots and controllers, as pilots attempt to reconcile their T/CAS warnings with
ATC radar.  I have no data to support this, but it's my perception that T/CAS
warnings are starting to be taken with a grain of salt.  This regarding a
system whose warnings are designed to be obeyed in a time-critical framework!

The design of the T/CAS interface is also fluid.  Some systems present a
"plan" overview on the weather radar screen; another has T/CAS warnings
built into a funky (and slightly odious) vertical-speed indicator.  I had
hoped that the FAA would standardize display formats, but perhaps not.

To bring this back to RISKS: T/CAS is turning into a classic case of what
happens when technology is developed and implemented under hysterical political
pressure, without a concrete grasp of the consequences.  T/CAS has been under
development for years; but it was pushed into service as a result of a mid-air
collision in the early 1980's.

It did not address the safety issue of the *much* higher rate of mid-air
collisions BETWEEN general-aviation aircraft, it does not address Mr.
Eriksson's observation of ATC/pilot disagreements, and it does not address
potential improvements to the ATC system.  Ultimately, I suspect there will be
more and more restrictions on T/CAS (all it takes is ONE change of the FAA
Administrator)--and, at some point, we will find ourselves wondering why we
bothered with this very expensive system.  The images of drug-crazed pilots and
mid-air airliner collisions are quite useful to politicians to rally support
around, despite the improbability of either occurring.  Few of these
politicians attempted to address the real cause of the degradation of air
safety in the 1980's: deregulation, and the shortage of experienced controllers
caused by Reagan's mass sacking in 1981.

Robert Dorsett Internet: rdd@cactus.org UUCP: ...cs.utexas.edu!cactus.org!rdd


Overseeing dementia patients by computer

<Fredriksson_Urban_NOK@kista.relay.nokia.fi>
Fri, 30 Aug 91 11:31:41 +0300
        "Gunnar, it is night. Don't go out, go to bed!"
        "Gunnar, shut off the water tap."

These voice messages are controlled by a computer watching over Gunnar, aged 77
and suffering from senile dementia. They were taken from a Swedish radio
program mainly dealing with how to take care of old people while preserving
their integrity.

Gunnar doesn't want to stay in a home, so he has food delivered to his
apartment and help from time to time.

The computer is part of a highly modified burglar alarm system, which is still
in the trial stage. The designer thinks it is better than video surveillance,
since now Gunnar isn't watched over, but can get help when he needs it, for
example if he is lying on the floor. Or when he doesn't need it, if the new
help makes his bed so the sensor gets unplugged, which has happened.

Gunnar doesn't understand that there is a computer; any positive reaction is
probably because he thinks he's got visitors. His life isn't risk free: He
smokes, but isn't very good at putting the cigarettes out, so one message is:

        "Gunnar, there is a fire! You must go out
        in the street immediately!!"

But he also runs the risk of being a VERY involuntary beta tester. It took a
long time before it was discovered what would have happened if there was a fire
in the night.

        "Gunnar, it is night. Don't go out, go to bed!"

Urban Fredriksson, Stockholm, Sweden


Heisenberg effect for credit data?

"Peter G. Capek" <capek@watson.ibm.com>
Thu, 29 Aug 91 10:03:14 EDT
BankCard Holders of America reports that too many inquiries (e.g., to
a credit bureau) for an individual's credit report can harm that report by
making it appear that credit is being applied for from too many sources and
that the individual may be over-extended.  An example given is that of
a person shopping around for a new car:  Every dealer visited may use
information extracted from the driver's license (for a road test) to obtain a
credit report, to determine if the prospect is worth pursuing.  Similar effects
may occur when shopping around for a bank loan.  Bottom Line/Personal, where
this was reported, suggests not providing sellers the information they need to
make a credit check unless you're serious about buying.
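The feedback loop can be sketched in a few lines of Python. Everything here (the threshold, the rule, the function names) is invented purely to illustrate the effect; it is not any bureau's actual scoring logic:

```python
# Toy model of the inquiry effect described above.  The rule and the
# threshold are entirely hypothetical: each credit-report pull is
# logged, and "too many" recent inquiries makes the person look
# over-extended, regardless of their actual debts.
inquiries = []

def record_inquiry(source):
    """A dealer or bank pulls the credit report; the pull itself is recorded."""
    inquiries.append(source)

def looks_overextended(max_recent=3):
    """Hypothetical scoring rule: more than max_recent pulls looks bad."""
    return len(inquiries) > max_recent

# Shopping around for a car: four dealers each run a credit check.
for dealer in ["Dealer A", "Dealer B", "Dealer C", "Dealer D"]:
    record_inquiry(dealer)

print(looks_overextended())  # True -- merely looking has harmed the report
```

The act of observing the credit file changes the credit file, which is the "Heisenberg effect" of the title.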


The story of O

Jerry Leichter <leichter@lrw.com>
Thu, 29 Aug 91 09:00:58 EDT
A recent RISKS mentions the problems of one Stephen O in getting computers to
accept his single-letter last name.

This is an OLD problem.  "Ng" is a moderately common Chinese name (well, to be
more accurate, it's a moderately common rendering of an underlying Chinese
name probably more often written as Eng or Ing, and undoubtedly pronounced
using a phoneme not present in English).  I recall at least one report,
probably in Datamation, many years ago - probably early '70's - of the trials
and travails of a programmer whose last name was Ng.  It seems the payroll
computer just would not accept that as a valid name.  As I recall, his
paychecks were eventually made out to one Damn U Acceptit.

The underlying issue here - and one we haven't gotten any better in dealing
with in 20 or more years of trying - is that of "unreasonable" data.  A common
complaint is that computers accept everything literally; with no knowledge of
real-world reasonableness, they are perfectly happy to accept that a homeowner
used a million kilowatt-hours in a month (because of a small error in
transcription), or what have you.  The usual prescription is "Check for
reasonableness".

Unfortunately, the world is sometimes "unreasonable"!  The "robust" software
that avoids accepting random junk produced by line noise for names has
problems with Ng and O.  The range-checking software that discards "impossible"
values suppresses all data about the ozone hole over the Antarctic.

As Mr. O's story illustrates, it's not just computers that run into this
problem.  A "dumb" program, with no recourse to "common sense", would accept
the name with no problems.  A "smarter" program, embodying the programmer's
model of what names look like, rejects it just as Mr. O's teachers did.  The
only difference is that, with the teachers, he could convince them that O it
was.  The program has no escape hatch.
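The dumb-versus-smart trade-off can be sketched in a few lines of Python. The validation rules here are hypothetical, invented purely for illustration of the point:

```python
import re

# A "dumb" check: accept any non-empty string.  It happily takes
# line noise -- but it also correctly accepts "O" and "Ng".
def dumb_accepts(name):
    return len(name) > 0

# A "smarter" check embodying one programmer's model of what names
# look like (hypothetical rules: two or more letters, with a vowel).
# It filters out junk -- and rejects the real names "O" and "Ng" too.
def smart_accepts(name):
    return (re.fullmatch(r"[A-Za-z]{2,}", name) is not None
            and any(v in name.lower() for v in "aeiou"))

for name in ["Smith", "O", "Ng", "#$%!"]:
    print(f"{name!r}: dumb={dumb_accepts(name)} smart={smart_accepts(name)}")
```

Every reasonableness rule draws a line somewhere, and the real world supplies cases on the wrong side of any line you draw.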

However, people sometimes have no escape hatch either.  Everyone has had to
deal with bureaucrats who just would not bend "procedure", even when it was
clear that "procedure" just was not working.  Everyone has also run into at
least one pig-headed individual, operating entirely without the excuse of
organizational inertia, who would not bend from his belief in some particular
way of doing things, evidence to the contrary notwithstanding.

Probably the most significant effects of this phenomenon are in the many
examples of intelligence organizations which ignore what in retrospect are
"clear warnings" of problems because the evidence is "unreasonable" in terms
of their theory of the world.  Or consider the Challenger disaster, and the
effects of deliberate blindness to evidence.
                            -- Jerry


The Story of O

Stuart I Feldman <sif@lachesis.bellcore.com>
Wed, 28 Aug 91 15:04:47 -0400
If I remember an NPR item on the problems of Stephen O, he has particular
difficulties because programs that launder names to fix up entry errors assume
that a single O is part of an Irish name (as in PGN's O'O).  An example of the
risks either of ethnocentric (Eurocentric?) computer programming or of
excessive cleverness.
                                           stu feldman
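The laundering rule described above might look something like the following sketch (the function and its behavior are hypothetical, reconstructed only from the NPR description):

```python
# Hypothetical entry-error "fixer" of the kind described: it assumes
# a lone "O" must be a truncated Irish "O'..." name and "repairs" it.
def launder_surname(surname):
    if surname.strip().upper() == "O":
        # Excessive cleverness: mangles Mr. O's perfectly valid name.
        return "O'[missing?]"
    return surname

print(launder_surname("Feldman"))  # passes through unchanged
print(launder_surname("O"))        # "corrected" into nonsense
```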


A number is no name

"Clifford Johnson" <GA.CJJ@Forsythe.Stanford.EDU>
Wed, 28 Aug 91 16:44:19 PDT
In addition to the story about the computer-related inconvenience of a person
having the name "O", it is worth mentioning a California judge's ruling (Marin
county, 1984) refusing to permit the name "3", or even its romanized form
"III".  The person in question had been called "3" since his childhood, being
the third child, but the judge ruled that a number cannot be a legal name.
Only the spelling "Three" was permissible.  The Social Security Administration
fought the name change, arguing that the case presented an exception that
would cost them too
much to program for.

      [Having just seen on PBS a rerun of the old Victor Borge equivalent of
      the young people's guide to pronunciation, one would assume that if they
      permit "O" and "3" that someone might try for "!" (Jack Splat?) or "#"
      (they make calculators!) or "&" (Georges Amper Sand?) or even "~"
      (ma hatma tilde?).  An opportunity to circumflex your imagination!  PGN]


The need for utilities to deal with non-standard situations

Tom Lincoln <lincoln%iris@rand.org>
Thu, 29 Aug 91 17:41:45 PDT
Koenig in RISKS-12.18 states: "It's practically impossible to keep two
separate databases in step for any length of time.  That's true even when one
of the `databases' is reality itself."

It is **particularly** true when reality is to match some formal data structure
because reality is full of all sorts of non-standard situations.

The story of (Stephen) O the following day illustrates how pervasive the
problem is.  See Spafford's contribution to RISKS-12.19, where numerous systems
could not accept a letter as a last name. What if he had to be admitted to a
hospital with an automated registration and admission system?

The real problem does not lie in the particular cases... those already
submitted to the RISKS FORUM are too numerous to count... but rather in the
general lack of utilities and procedures to manage non-standard situations
wherever they arise in on-line computing. The data model will never be
completely correct, and the real world is a moving target.

Very commonly, the person at the terminal can see the absurdity, but has no
override to do something about it.

Take the case of a nearby hardware store: They have tried to order some power
tools from Black & Decker. However, the order has been rejected because there
is a non-zero balance over 60 days old. In this case, however, it is not a
debit, but an $8.49 credit! B&D does not send out checks to adjust a credit
balance, but rather applies the credit to the next order... But in this case...
And there is no override...

Of course this is a bug. The test should be for a balance greater than zero
(money actually owed), not merely non-zero. There should be an exception
sequence managed on paper by a supervisor.... but there
isn't. Clearly, exceptions have not been anticipated. But there are always
exceptions. These must be resolved by the direct user (often a clerk) where the
transactions are made. At the very least the user must be able to put
non-standard material in an exception queue to be resolved by higher authority.
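The bug and its one-character fix can be sketched as follows, under the usual accounts-receivable convention that a positive balance is money the customer owes and a negative balance is a credit. The names and logic are illustrative only, not B&D's actual code:

```python
def buggy_order_hold(aged_balance):
    # Rejects any non-zero balance over 60 days old --
    # including an $8.49 credit owed TO the customer.
    return aged_balance != 0

def fixed_order_hold(aged_balance):
    # Only money the customer actually owes should block an order.
    return aged_balance > 0

print(buggy_order_hold(-8.49))  # True: order wrongly rejected
print(fixed_order_hold(-8.49))  # False: a credit is no reason to hold
```

Even with the test corrected, the larger point stands: when the check fires wrongly, the clerk needs an override or an exception queue, not a dead end.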

Take the case of a physician submitting a missing (?lost) prescription for
Medicare patient reimbursement. The instructions are to back date it to the
original date. However, the physician, wishing to be accurate, puts down both
the original date and the date that the prescription was rewritten, noting that
this is a resubmission for a lost document. It is rejected. There is no way to
submit a non-standard document.... The only way is to pretend that it is an
original. Clearly, the problem is with procedures first, and only subsequently
with the computer implementation.

Managing non-standard situations needs to be an integral part of all
software that must deal with unstructured aspects of the real world. The idea
of managing non-standard situations should be incorporated in the operating
system and in the structure of commercial data bases. When this advanced day
arrives, life will be much easier, and there will be fewer funny examples in
the RISKS FORUM.
                                        TOM LINCOLN  lincoln@rand.org


Uncle Sam Can't Keep Track of his Trillions

<frankston!Bob_Frankston@world.std.com>
29 Aug 1991 20:25 -0400
So goes the title of a Business Week article (September 2, 1991).  I see it is
a counterpoint to the Stories of O and Ng.  The problem is not so much the
risks of technology as the risks of underutilizing technology.

The problem is that there are just too many SMOPs ("simple matters of
programming") to deal with.  Back in the 1950's, the banks were saved by
computers, which made it possible to deal with
the huge volumes of checks they had to process.  Throwing more programmers at
the problems is no better than attempting to hire the entire US population as
check sorters or phone operators. (Though throwing programmers at checks did
work).

Getting back to the article: the government has underinvested in accounting
infrastructure, which is no great surprise.  More surprising was the
comment that until 1989 the Treasury couldn't report on which checks were
actually presented for payment.  (This is the same problem I had with Citibank
ebanking, which would post a check when issued instead of when presented.)

The term "reengineering" is currently in vogue.  Change of paradigm is another
take on this.

An overriding issue is the question of how to compose large systems out of
smaller ones without explicitly building large systems.  We can build
individual solutions to separate technical problems but how do these interact?
These could be the systems within an airplane (and, simultaneously, among
planes) or the data exchanges between government departments.  If the solution
involves project management of large scale software projects, we're doomed. [To
head off responses, perhaps there are very large systems, but even they need to
cooperate with other VLSs to compose Hyperlarge Systems].

Until we can do this fully, what are the modest standards to adopt so we can
exchange data in the interim (i.e., this reality)?  (An interesting aside is
the competition between SGML & RTF, X.500 & Domains, TCP/IP & OSI -- is a
premature major standard better than a quick & dirty interim solution?) Some of
this data will be smart (such as objects with methods -- a Macintosh disk with
an INIT operation is a current example).  Simple examples involve delivering
financial and other data in machine readable form (via email).  How does ISDN
allow me to interact with the communications infrastructure?

Much of the change simply involves awareness.  In accounting we have
double-entry bookkeeping; in engineering, closed-loop systems are a similar
concept.  Many problematic systems are open-loop and don't allow for reality
checking.
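The double-entry idea as a closed-loop reality check can be sketched with a toy ledger (all account names and amounts are invented for illustration):

```python
ledger = []  # (account, debit, credit) postings

def post(entries):
    """Record a transaction, refusing it unless debits equal credits."""
    debits = round(sum(d for _, d, _ in entries), 2)
    credits = round(sum(c for _, _, c in entries), 2)
    if debits != credits:
        raise ValueError("unbalanced transaction")
    ledger.extend(entries)

# A check is issued: expense up, cash down -- two entries, one event.
post([("expense", 100.00, 0.00), ("cash", 0.00, 100.00)])

# The closed loop: at any moment total debits must equal total credits,
# so a lost or double-posted entry becomes detectable.
assert sum(d for _, d, _ in ledger) == sum(c for _, _, c in ledger)
```

An open-loop system, by contrast, records each entry once with nothing to reconcile it against, so an error simply propagates.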

One term I like to use is "federation".  Back in the mid-70's I reacted to
distributed databases by proposing federated databases as a better model, in
which each database would be autonomous but cooperative (though not entirely
trustworthy).

I'll stop here without going into the many risks we'll encounter as we learn
about these systems and without going into how the individual deals with this
infrastructure.
