The Risks Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 8 Issue 27

Thursday 16 February 1989

Contents

o FBI NCIC Data Bank
Bob Morris
o Internet mail forgery
Walter Roberson
o Re: Dead code maintenance
Clifford Johnson
o Probabilities and Engines
Steve Philipson
Robert Dorsett
Daniel A. Graifer
o Info on RISKS (comp.risks)

FBI NCIC Data Bank

<RMorris@DOCKMASTER.ARPA>
Thu, 16 Feb 89 09:41 EST
  Proposed FBI Crime Computer System Raises Questions on Accuracy, Privacy --
     Report Warns of Potential Risk Data Bank Poses to Civil Liberties
                 Washington Post, February 13, 1989
                         by Evelyn Richards

   On a Saturday afternoon just before Christmas last year, U.S. Customs
officials at Los Angeles International Airport scored a "hit."
   Running the typical computer checks of passengers debarking a Trans World
Airlines flight from London, they discovered Richard Lawrence Sklar, a fugitive
wanted for his part in an Arizona real estate scam.
   As their guidelines require, Customs confirmed all the particulars about
Sklar with officials in Arizona - his birth date, height, weight, eye and hair
color matched those of the wanted man.
   Sklar's capture exemplified perfectly the power of computerized crime
fighting.  Authorities thousands of miles away from a crime scene can almost
instantly identify and nab a wanted person.
   There was only one problem with the Sklar case:  He was the wrong man.  The
58-year-old passenger - who spent the next two days being strip-searched,
herded from one holding pen to another and handcuffed to gang members and other
violent offenders - was a political science professor at the University of
California at Los Angeles.
   After being fingered three times in the past dozen years for the financial
trickeries of an impostor, Sklar is demanding that the FBI, whose computer
scored the latest hit, set its electronic records straight.  "Until this person
is caught, I am likely to be victimized by another warrant," Sklar said.
   Nowhere are the benefits and drawbacks of computerization more apparent than
at the FBI, which is concluding a six-year study on how to improve its National
Crime Information Center, a vast computer network that already links 64,000 law
enforcement agencies with data banks of 19 million crime-related records.
   Although top FBI officials have not signed off on the proposal, the current
version would let authorities transmit more detailed information and draw on a
vastly expanded array of criminal records.  It would enable, for example,
storage and electronic transmission of fingerprints, photos, tattoos and other
physical attributes that might prevent a mistaken arrest.  Though the idea is
controversial, FBI officials have recommended that the system include a data
bank containing names of suspects who have not been charged with a crime.
   The proposed system, however, already has enraged computer scientists and
privacy experts who warn in a report to be released today that the system would
pose a "potentially serious risk to privacy and civil liberties." The report,
prepared for the House subcommittee on civil and constitutional rights, also
contends that the proposed $40 million overhaul would not correct accuracy
problems or assure that records are secure.
   Mostly because of such criticism, the FBI's revamped proposal for a new
system, known as the NCIC 2000 plan, is a skeleton of the capabilities first
suggested by law enforcement officials.  Many of their ideas have been pared
back, either for reasons of practicality or privacy.
   "Technical possibility should not be the same thing as permissible policy,"
said Marc Rotenberg, an editor of the report and Washington liaison for
Computer Professionals for Social Responsibility, a California organization.
The need to make that tradeoff - to weigh the benefits of technological
advances against the less obvious drawbacks - is becoming more apparent as
nationwide computer links become the blood vessels of a high-tech society.
   Keeping technology under control requires users to double-check the accuracy
of the stored data and sometimes resort to old-fashioned paper records or
face-to-face contact for confirmation.  Errors have plagued the NCIC for many
years, but an extensive effort to improve record-keeping has significantly
reduced the problem, the FBI said.
   Tapped by federal, state and local agencies, the existing FBI system juggles
about 10 inquiries a second from people seeking records on wanted persons,
stolen vehicles and property, and criminal histories, among other things.
Using the current system, for example, a police officer making a traffic stop
can find out within seconds whether the individual is wanted anywhere else in
the United States, or an investigator culling through a list of suspects can
peruse past records.
   At one point, the FBI computer of the future was envisioned as having links
to a raft of other data bases, including credit records and those kept by the
Immigration and Naturalization Service, the Internal Revenue Service, the
Social Security Administration and the Securities and Exchange Commission.
   One by one, review panels have scaled back that plan.
   "There's a lot of sensitive information in those data bases," said Lt.
Stanley Michaleski, head of records for the Montgomery County [Maryland]
police.  "I'm not going to tell you that cops aren't going to misuse the
information."
   The most controversial portion of the planned system would be a major
expansion to include information on criminal suspects - whose guilt has not yet
been established.
   The proposed system would include names of persons under investigation in
murder, kidnapping or narcotics cases.  It would include a so-called "silent
hit" feature: An officer in Texas, for instance, would not know that the
individual he stopped for speeding was a suspect for murder in Virginia.  But
when the Virginia investigators flipped on their computer the next morning, it
would notify them of the Texas stop.  To Michaleski, the proposal sounded like
"a great idea.  Information is the name of the game."
   But the "tracking" ability has angered critics.
   "That [data base] could be enlarged into all sorts of threats - suspected
communists, suspected associates of homosexuals.  There is no end once you
start," said Rep. Don Edwards (D-Calif.), whose subcommittee called for the
report on the FBI's system.
   The FBI's chief of technical services, William Bayse, defends the proposed
files, saying they would help catch criminals while containing only carefully
screened names.  "The rationale is these guys are subjects of investigations,
and they met a certain guideline," he said.
   So controversial is the suspect file that FBI Director William Sessions
reportedly may not include it when he publicly presents his plan for a new
system.

    [A case similar to Sklar's was reported previously in RISKS -- that of
    Terry Dean Rogan, who was arrested five times because of outstanding
    warrants caused by someone else masquerading as him.  He finally settled
    for $50,000 in damages.  PGN]


Internet mail forgery

<Walter_Roberson@CARLETON.CA>
Wed, 15 Feb 89 22:14:45 EST
A few days ago, someone forged a message to rec.music.misc. The "From:" address
corresponded to that of a gateway for the Apollo mailing list.  A couple of
people, not recognizing that the fake name corresponded to a mailing list, sent
their replies in `privately' instead of to rec.music.misc, with the result that
their replies were broadcast wherever the Apollo list and comp.sys.apollo
reach. They were, it seems, subsequently `flamed' for their unintentional
mis-mailing.

A subsequent note from someone, exposing the message as a forgery, states

> With SMTP and/or NNTP, the forgery could come from anywhere, not
> necessarily berkeley or ucsf.

Perhaps someone else can comment on this: can we trust -any- of our
(non-encrypted) network mail to be from the claimed author? How about the
other way around: how much danger is there that someone can spoof mail in
order to receive messages destined for someone else?

   Walter Roberson  <Walter_Roberson@Carleton.CA>  (if you can believe that...)

References: <11366@cgl.ucsf.EDU> (the forged message)
            <8039@netnews.upenn.edu> (the exposure, written by
 george%hyper.lap.upenn.edu%netnews.upenn.edu%eecae%mailrus.uucp@
 ames.arc.nasa.gov  (George "Sir Lleb" Zipperlen) )

    [The simple answer is "no".  Furthermore, encrypting the networks does not
    help very much if the operating systems are vulnerable to attack.  
    Previous spoofs include "Chernenko at Moskvax" (see ACM SIGSOFT Software
    Engineering Notes 9 4, July 1984) and last year's "Spafford" hoax.  PGN]
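To make the mechanics concrete: an RFC 822 "From:" line is ordinary text
supplied by the sender, and nothing in SMTP or NNTP checks it against the
connecting host.  A minimal sketch of building such a message (all addresses
hypothetical; Python's standard email library stands in for a mailer):

```python
# Minimal illustration: the "From:" header is sender-supplied text.
# SMTP simply relays the bytes after DATA; it does not authenticate them.
# All addresses below are hypothetical.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "apollo-gateway@somewhere.example"   # any string the forger likes
msg["To"] = "rec-music-misc@gateway.example"
msg["Subject"] = "not actually from the Apollo list"
msg.set_content("Replies to this will go wherever the forged From: points.")

# The wire format carries the forged header verbatim:
print(msg.as_string())
```

Recipients who reply "privately" trust that header, which is how the replies
in the incident above ended up broadcast to the Apollo list.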


Re: Dead code maintenance

"Clifford Johnson" <GA.CJJ@Forsythe.Stanford.EDU>
Wed, 15 Feb 89 18:04:41 PST
My useless code maintenance story concerns a job I applied for once, as a
contracted programmer, specifically to maintain one Cobol program.  It was
billed as a 20-hour per week job, and its maintenance had kept a Stanford
Ph.D. programmer/statistician busy for that amount of time, for some months.
(None of this relates to work done at or for Stanford.)

The job was to run the program against fresh data a couple of times a week, and
keep the record formats (which rarely changed) up to date.  As soon as I
reviewed the program, having taken the well-paid job, I discovered that all it
did was read-in a file a record at a time, and rewrite some fields from each
record it read-in, without any data change or sorting whatsoever.  The
second set of records was then read by a statistical program -- which could
have read-in the original records directly, simply ignoring the un-needed
fields! I contemplated how easy the job was, but recommended scrapping the
Cobol program, which not even said Ph.D. had realized was utterly purposeless.
This was done - and so I put myself out of work.
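For concreteness, the whole program's logic amounted to something like this
(a Python sketch; the field positions are invented, since the original COBOL
record layout isn't given):

```python
# The entire 20-hour-a-week job, reduced to its logic: copy a subset of
# fields from each fixed-format input record to an output file, with no
# transformation and no sorting.  The downstream statistical program could
# simply have read the original file and ignored the unwanted columns.
def copy_wanted_fields(infile, outfile, keep=(slice(0, 10), slice(30, 40))):
    # `keep` holds hypothetical column ranges of the "wanted" fields.
    with open(infile) as src, open(outfile, "w") as dst:
        for record in src:
            dst.write("".join(record[s] for s in keep) + "\n")
```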

I wonder how many programmers do similarly unproductive work because their
managers don't realize what is and isn't really being accomplished?  Sometimes
immediate management knows the score, but instructs one to take more time than
needed to make a change.  In one employment I was told to take at least two
weeks to change the title line in a report, to impress upon senior management
how tricky it was and how overloaded we all were.


Probabilities and Engines

Steve Philipson <steve@aurora.arc.nasa.gov>
Wed, 15 Feb 89 17:50:26 PST
In RISKS 8.26, Dr. Thomas P. Blinn (blinn%dr.DEC@decwrl.dec.com) writes:

>In RISKS-FORUM Digest Volume 8 : Issue 24, it is asserted by Barry Redmond that

<>If someone makes a mistake on one engine at any of these times, there is a
<>high probability that they will make the same mistake on the other engine(s).

>That may be true, but it may not be true, because the same person may not
>be working on all the engines.  I would agree that an incompetent mechanic
>working on all the engines is likely to make the same mistakes on all of
>them, but the reality of aircraft engine repair is different.

   There are several well known cases where all engines on a multi-engine
aircraft failed (or were in the process of failing) due to the same 
maintenance error.  Sometimes it's the result of a single person's work,
and in other cases it's the result of systematic error by a group.  The 
reality is that such errors and failures do occur.

>The probabilities of failure are not independent because if one engine fails it
>immediately increases the probability of another failing.

<>This is a very interesting assertion.  It seems to be saying that there is a
<>causal relationship between a first engine failure and the likelihood of a
<>second.  ...  , but this doesn't mean that the probability of failure of the 
<>other engines has changed in any way. ...

   There is such a causal relationship.  When one engine fails, the remaining 
engine(s) may have to be operated at increased power levels and shoulder
additional tasks.  This raises the stress on them, and gives us a priori 
knowledge of an increased probability of failure.  

   It may be necessary to have a thorough grounding in probability theory
and statistics, but it is also necessary to have knowledge of the specifics
of real world operations.  Either without the other sets us up to allow 
problems to escape detection and analysis.


Probabilities and Engines

Robert Dorsett <mentat@louie.cc.utexas.edu>
Thu, 16 Feb 89 14:56:20 CST
NOTE: a longer version of this tirade will soon appear on USENET's
rec.aviation...  I'll mail a copy to anyone without usenet access.

Jordan Brown wrote:
>727 engines (3/airplane) are wimpy compared to DC-9 engines (2/airplane).
>BAe-146 engines (4/airplane) are *really* wimpy.  (This assumes that
>727s are approximately the same size as DC-9s.  Bae-146's are smaller.)

The 727 and DC-9 engines are the same, derivatives based on the Pratt &
Whitney JT8D.  What is significant, in the context of this discussion, is
how the engine thrust relates to the airplane weight.  Here are a few
weight-to-thrust ratios (pounds of airplane per pound of available thrust;
lower is better), assuming various weights and engines-remaining situations:

    Engines   727-200, 200,000 lbs (max)   727-200, 140,000 lbs   DC-9-30, 140,000 lbs (max)
       3                 4:1                      3:1
       2                 6:1                      5:1                    4:1
       1                13:1                      9:1                    9:1

One can change the performance of an airplane by losing weight; to lose
weight, fuel is dumped.  The 727 can lose some 60,000 lbs of fuel.
The one-engine case with that fuel dumped is a performance increase of some
30% over the same thrust at max weight.  
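The arithmetic behind that 30% figure can be sketched directly (Python; the
per-engine thrust of roughly 15,500 lbf for a JT8D is an assumed figure for
illustration, and it cancels out of the percentage in any case):

```python
# Pounds of airplane per pound of available thrust; lower is better.
# The per-engine thrust is an assumed JT8D figure, used only to illustrate
# the ratios quoted in the table above.
JT8D_THRUST_LBF = 15_500

def lbs_per_lb_thrust(weight_lbs, engines_running, per_engine=JT8D_THRUST_LBF):
    return weight_lbs / (engines_running * per_engine)

heavy = lbs_per_lb_thrust(200_000, 1)   # one engine at max weight: ~13:1
light = lbs_per_lb_thrust(140_000, 1)   # after dumping 60,000 lbs of fuel: ~9:1
print(f"improvement from dumping fuel: {(heavy - light) / heavy:.0%}")
```

Since the thrust term cancels, the improvement is just 60,000/200,000 = 30%,
whatever the engine's actual rating.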

To give an example of the significance of all this, recently a fully-loaded 
Continental 747, enroute to New York, attempted to take off from Gatwick, 
heading north.  It had an engine fire and shutdown at takeoff.  The pilot was 
just barely able to hold altitude after takeoff (with three engines), at 200-
300 feet, with the stick shaker and stall warnings active.  The airplane went 
behind some trees; the controller called a crash after losing radar and visual 
contact.  The plane dumped a massive amount of fuel, managed to gain altitude 
(after several minutes), and returned to the field.  

The moral here, again, is that from the mundane perspective of keeping
the airplane in the air, it's not how many engines you have, but how
much you weigh.  What is far more important for trans-oceanic operations
is how likely it is to lose some or all of your engines, and how likely it 
would be to get to an airfield once you do.  Considering the frequency
with which total (or near-total) freakish engine failures have occurred in
the last few years (even though the engines themselves are more reliable 
than in the past), this isn't really as trivial or as "safe" as the numbers
might have us believe.  


>Airplanes are required to be
>able to maintain such-and-such a level of performance with one engine out.

The most important situation normally considered is takeoff.  I doubt 
a 727 could take off with two engines out; it wouldn't have the time necessary 
to dump, or the thrust necessary to maintain airspeed.  As the above 747 
example shows, even the worst-case performance figures can be misleading.
At the end of the flight, that same 747 would be able to perform with only
one engine operating (as a recent United Airlines emergency landing 
on a flight to Tokyo showed). 


>I don't believe a 727 can fly on one engine.  It must have two.

It can fly on one engine.  And even if it couldn't fly on one engine, as 
another poster pointed out, having *any* thrust means the difference between 
a steep glide and a long glide.  According to the 727 patterns manual, a one-
engine ILS approach is made by assuming a decision height of some 600 feet, 
with an airspeed in the 160-170 kt range.  Best climb speed, for the go-
around, is 200 kts (190 knots with two degrees of flaps).  There's an 
implicit assumption in the training manual that between 600 feet and ground 
level, they will be able to accelerate--and hold--200 kts.


>A three-engine airplane has a higher probability of having a failure in
>the first place, and when it does have a failure it then has two points
>of failure, EITHER of which will cause an accident.

The 727 has three engines because, more than any other factor, Boeing 
perceived a need to trade off airline requirements at the time the plane was 
constructed.  United Airlines wanted a four-engine airplane, Eastern wanted 
two.  So they compromised, and agreed on three.  I suspect a similar history
with the Tristar and DC-10: four is too many, two is too few.  Three is nice
and "safe." 


>Going from one engine to two adds redundancy.  Going from two to three,
>with two required, REDUCES redundancy.

Perhaps we should look up the meaning of "redundancy."  Three engines provide
three thrust sources, three generators, three pneumatics sources, and (on the
727) two hydraulic sources.  I can't imagine how that is "bad," since (apart
from fuel starvation, mismanagement, and particle ingestion) they really don't
have a common failure mode.  There are more parts to fail, but the issue here
is whether more engines will make it more likely for everything to go wrong in
a catastrophic manner, which years of experience have shown to be fallacious.

Robert Dorsett


Re: Aircraft failures (RISKS DIGEST 8.26)

<dag@fciva.UUCP>
Thu, 16 Feb 89 16:52:04 -0500
First, there seems to be some disagreement on the subject.  Does anyone have
any information on the capability of currently popular 3-engine commercial
aircraft (DC10, L1011, B727) to maintain level flight with only one functioning
engine?

Second, to expand upon the comments of Dr. Blinn, my recollection of statistics
is as follows:

Each engine has its own (unknown) probability of failure during any time
interval.  This probability is a function of many known and unknown factors
(history, current aircraft state, fuel, maintenance, etc.).  Initially, we
have an ESTIMATE of this probability which is the same for all engines: some
sort of historical average or other statistic.  The failure of one engine on an
aircraft gives additional information regarding those factors which are common,
and thus allows us to revise our estimate of the probabilities of another
failure on the same aircraft in the near future.

Normally, a statistician would say that the probability of failure hasn't
changed, just our estimate.  There is an exception to this statement.  It is
possible that the failure of the first engine is itself a factor in the failure
of the second, for example, by increasing the load that engine is run under,
or stress on the aircraft from unequal thrust.
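The effect of a shared factor can be seen numerically in a small simulation
(Python; the failure probabilities and the fraction of poorly maintained
aircraft are invented purely for illustration): when engines share a latent
common factor, observing one failure raises the conditional probability of
the other.

```python
# Toy model of a latent common factor: a small fraction of aircraft carry a
# "bad maintenance" condition that raises both engines' per-flight failure
# probability.  All numbers below are assumptions for illustration only.
import random

random.seed(0)
P_GOOD, P_BAD = 0.01, 0.20   # per-engine, per-flight failure probability
BAD_FRACTION = 0.05          # fraction of aircraft with the bad factor
TRIALS = 200_000

e1_fail = e2_fail = both = 0
for _ in range(TRIALS):
    p = P_BAD if random.random() < BAD_FRACTION else P_GOOD
    f1 = random.random() < p   # engine 1 fails this flight?
    f2 = random.random() < p   # engine 2 fails this flight?
    e1_fail += f1
    e2_fail += f2
    both += f1 and f2

print(f"P(engine 2 fails)            ~ {e2_fail / TRIALS:.3f}")   # ~0.02
print(f"P(engine 2 fails | 1 failed) ~ {both / e1_fail:.3f}")     # several times higher
```

Note that within each simulated flight the engines' true probabilities never
change; what the first failure changes is our estimate, conditioned on the
evidence.  That is exactly the distinction drawn above, with the physical
load-increase exception deliberately left out of the model.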

I think the most misunderstood aspect of statistics is that probability
distributions for real world phenomena are rarely known, only estimated.  The
march of time gives us new information to refine our estimates.
                                              Dan Graifer
The usual disclaimers....
Daniel A. Graifer, Franklin Capital Investments,
7900 Westpark Drive, Suite A130,    McLean, VA  22102     (703)821-3244
