The RISKS Digest
Volume 6 Issue 36

Thursday, 3rd March 1988

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

$9.5 million computer-based check fraud
Donn Parker
Captain Zap Zaps Hackers
Donn Parker
Police computer problem
Michael J. Wallach
On the topic of correlating databases...
Matt Fichtenbaum
RISKs of computer swapping
Dave Horsfall
Bank ATMs and checking your statements
David Andrew Segal
Airbus Safety; Database Accuracy
Mike Olson
Slippery slopes & relative risk
Stephen Schaefer
Re: Disappearing Skills
Ronald J Bottomly
Invalid dates
Ross Patterson
Lee Ridgway
Neural networks and P1
Dave Pare
Ada-caused bugs?
Jerry Harper
Aerospace Computer Security Applications Conference
Marshall D. Abrams
Info on RISKS (comp.risks)

$9.5 million computer-based check fraud

Peter G. Neumann <NEUMANN@csl.sri.com>
Thu 3 Mar 88 11:22:37-PST
Four employees of the DCASR (Defense Contract Administration Services Region)
office in El Segundo CA are accused of having "prepared some false documents
and tricked some coworkers" to rig the DCASR computer to issue a check for
$9.5 million to one of them individually as payment for a legitimate invoice
from a legitimate contractor.  A bank officer apparently became suspicious
when the person trying to deposit the check wanted $600,000 in cash on the
spot, and called in the law.  One of the defense lawyers blamed the events on
OTHER DCASR employees.  "Because of incompetence, lack of control and
violation of regulations, it's impossible to know exactly what happened in
this case, who did what and when they did it."

[Source: Evening Outlook, Santa Monica CA, 4 February 1988, courtesy of 
Donn B. Parker]


Captain Zap Zaps Hackers

Peter G. Neumann <NEUMANN@csl.sri.com>
Thu 3 Mar 88 11:29:01-PST
"Ian A. Murphy, a.k.a. Captain Zap, is selling his underworld expertise to USA
corporations that want to keep hackers from busting into their computer
systems.  Night after night, [...] the cherubic Captain sits at his dusty
computer in a cluttered, run-down townhouse here, scanning electronic bulletin
boards — where tips and gossip are traded by computer.  Someone may drop a
hint about breaking into one of his clients' computers.

Murphy is one of a handful of convicted computer felons who make decent
livings ($200,000 last year, he says) using the skills that helped land him
in trouble in the first place.  His slogan: ``Everybody's into computers.
Who's into yours?''

[...]  Murphy claims to employ seven to 10 of the USA's top hackers to break
into computers — legally, that is."

[USA TODAY cover story by Mark Lewyn, no date available, courtesy of
Donn B. Parker]

     [$200,000 sounds INDECENT to me.  Nice time to plant Trojan horses?  PGN]


Police computer problem — lighting up license-plate matches

Peter G. Neumann <NEUMANN@csl.sri.com>
Thu 3 Mar 88 11:04:13-PST
John Stapleton, 35, a computer consultant from Yonkers NY, was stopped while
driving in The Bronx and was frisked because a random check of automobile
licenses in the police computer system erroneously turned up his car as that
of someone who had killed a state trooper.  Strangely the database record did
not include the make of the car, which might have been a tip-off that the
actual license of the killer had been entered inaccurately.

Stapleton said the cops admitted the car computer system has its faults.
``They told me it tilts on them all the time.''  In this case they let him go
after deleting the incorrect entry.  Officers of the Bronx's 50th Precinct
claimed to have no record of the incident, but that is not surprising because
no arrest was made.

[Source: An article by Joy Cook and Linda Stevens in the New York Daily News,
no date available, contributed by Michael J. Wallach, Innovative Computer
Solutions, 31 Tulip Circle, Staten Island NY 10312.]

    [The subject of accepting partial matches is a very thorny one,
    especially in the presence of inaccurate data.  One approach is that
    much greater effort is needed in training personnel who interpret
    partial matches.  Another is that systems that try to do partial
    matching should REJECT unconfirmed input data and should continually
    warn the users...  I already suggested adding a pervasive measure of
    data trustworthiness — see my endnote on the message from James H.
    Coombs in RISKS-3.32.  PGN]
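
A minimal sketch of those two suggestions, written in Python purely for
illustration (the record fields, the "verified" flag, and the partial-match
rule are all assumptions, not any real police system's design): count the
matches, and flag unconfirmed entries before anyone acts on them.

  # Illustrative only: not any real system's schema or matching rule.
  def plate_lookup(query, records):
      exact = [r for r in records if r["plate"] == query]
      hits = exact or [r for r in records if query in r["plate"]]
      unconfirmed = [r for r in hits if not r["verified"]]
      print(f"{len(hits)} match(es), {len(unconfirmed)} unconfirmed")
      for r in hits:
          note = "" if r["verified"] else "  ** UNCONFIRMED -- verify first **"
          print(f'{r["plate"]}  make: {r.get("make") or "MISSING"}{note}')

  plate_lookup("ABC123", [{"plate": "ABC123", "make": None, "verified": False}])
  # 1 match(es), 1 unconfirmed -- and the missing make is made visible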


On the topic of correlating databases...

Matt Fichtenbaum <mlf@genrad.com>
Wed, 2 Mar 88 09:15:08 est
This RISKS digest mentioned the Post Office matching its list of employees
against a list of debtors (ah, the wonders of computer technology).  Some
20 or so years ago, the State of New York did a match of driver's license
holders against recipients of state aid to the blind.  This operation
found, I think I remember, a few hundred people who were on both lists.

But then, anyone who's driven in New York City could have guessed that.

    [I had a NY driver's license from 1948, and was able to renew it with
    no effort even though no longer residing in NY — until in the late 60s
    they decided to request an eye reexamination!  So those who became blind
    also had no trouble until then.  PGN]


RISKs of computer swapping

Dave Horsfall <munnari!stcns3.stc.oz.au!dave@uunet.UU.NET>
Thu, 3 Mar 88 15:28:30 est
Sometimes, the RISK in computers is in trying to dispose of them, as the
following story shows.

From "Computing Australia", Feb 29th:

``Cream of Canberra wades through rulebook for simple solution.

  When the Department of Science was dissolved into the Dept. of Industry,
  Technology and Commerce last year, officials discovered the two departments
  had non-compatible computing equipment.  Ditac [Dept of I T and C] used
  IBM pcs, while Science had always favoured Convergent Technology.  It was
  decided the CT system would be abandoned and put into storage.

  At the same time, Ditac began to suffer a shortage of computing equipment.
  Some bright spark suggested if another department could be found to use
  the CT gear, it might be swapped for IBM-compatible pcs.

  Then the real snag struck.  The Department of Finance stepped in to
  question the mechanics of the proposal.  Was the arrangement legal?
  It had not been done before.  The regulations made no mention of swaps.
  Maybe the rules would have to be re-drafted.  Interdepartmental meetings
  were held.  Possibilities canvassed.  Eventually a circuit-breaker [?]
  was called for: an outside legal opinion.

  Finally, after weeks of effort and argument, 67 networked microcomputers
  and a minicomputer have been taken from the stores and exchanged for 48
  pc clones.  Everyone's a winner and bureaucracy triumphs.

Dave Horsfall, dave@stcns3.stc.OZ.AU, ...munnari!stcns3.stc.OZ.AU!dave


Bank ATMs and checking your statements

David Andrew Segal <dasegal@brokaw.LCS.MIT.EDU>
Wed, 2 Mar 88 21:55:26 EST
RISKS readers are well aware of the need to check on technology.  I learned
this the other week when, after allowing four months of bank statements to
pile up, I decided to catch up and reconcile them all.

In early December I deposited a check in the bank's ATM and as I always do
saved my receipt and then later entered the transaction in my check book.
Upon reconciling my statement, I noticed that the deposit had never been
credited to my account.  I found the receipt and noticed that the transaction
was noted as "Deposit not completed."  I knew that since I saved the receipt I
must have deposited the check.  I contacted the individual who gave me the
check and found that it had indeed been debited from their account 9 days
after I had deposited it.  I contacted my bank and was informed that since the
transaction code stated I never completed the deposit I must be mistaken.
After getting a copy of the check (which had my account number in the
endorsement in addition to all the usual bank's endorsements), the bank
finally credited my account for the missing amount.

I wonder what the bank did in its reconciliation.  When they checked the
machine, the fact that they had an extra envelope and deposit didn't bother
them, nor did they find it necessary to credit any account but their own.

This certainly shows the need for good record keeping as well as continuing to
check on technology.

David Andrew Segal          

      [When a supposedly indivisible transaction fails to complete properly,
      this is known as an atomic bomb.  If the kernel of the operating system
      is at fault, it is known as nuclear con-fusion.  Consistency may be 
      seen as the hobgoblin of little minds in life, but in computer
      programming we mind more than a little when the system fails with a
      gob of hobblin' code.   PGN]
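
Unpacking the pun: the deposit should have been atomic, meaning all of its
steps happen or none do.  A toy sketch, purely illustrative and certainly not
the bank's actual system, in which every completed step registers a
compensating action that is run if any later step fails:

  # Toy sketch: names, steps, and error handling are all assumptions.
  def atomic_deposit(ledger, account, amount, accept_envelope, return_envelope):
      undo = []
      try:
          accept_envelope()                  # physical step
          undo.append(return_envelope)       # ...and how to compensate for it
          ledger[account] = ledger.get(account, 0) + amount
          undo.append(lambda: ledger.update({account: ledger[account] - amount}))
          return "deposit complete"          # commit point
      except Exception:
          for step in reversed(undo):        # unwind every completed step, so
              step()                         # the machine never keeps the money
          return "deposit not completed"     # while the ledger shows nothing

  ledger = {}
  def jam(): raise IOError("envelope jam")
  print(atomic_deposit(ledger, "DAS", 100, jam, lambda: None), ledger)
  # deposit not completed {}   -- consistent: no money kept, none recorded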


Airbus Safety [RISKS-3.32]; Database Accuracy [old topic]

Mike Olson <blia.UUCP!blipyramid!mao@cgl.ucsf.edu>
Wed, 2 Mar 88 08:58:15 PST
1. Airbus Safety

In RISKS-3.32, Nancy Leveson writes (from the London Sunday Times, 13 Dec.):
>  "Airbus yesterday rejected the charges, and said the 320 would be the safest
>  passenger aircraft ever.  `We believe that the safety requirement of a total
>  breakdown occurring only once every billion hours is achievable,' a 
>  spokesman said.  Airbus dismissed Hennell's fears as extravagant and 
>  `wildly off target,' but admitted the computer had failed during test 
>  flying.  The breakdowns were caused by teething problems and the aircraft 
>  had landed safely, it said."

Airbus' statement is less than comforting.  Will only a "total" breakdown
cause the pilot to lose control of the plane?  How badly does some component
of the system need to fail before the plane crashes?

The quote about "teething problems" is also alarming.  Since this is the first
civilian aircraft with fly-by-wire technology, I assume that that technology
is still relatively new.  Does the certification board, or Airbus, or anyone
else, have sufficient expertise to guarantee that the system's teeth are all
in yet?

Particularly in a system like this, where human lives are on the line,
we need to be very careful about deployment.  Testing components and letting
a couple of Navy pilots take the plane up isn't sufficient.  Large systems
tend to fail because of unexpected interaction among their components.  I'd
be *very* interested in examining Airbus' test suite.


2. Database Accuracy

In an earlier RISKS digest, Amos Shapir writes of problems in reliably
identifying people from a database with no reliable primary key.  James
Coombs comments:

>               A naive operator may well not be aware that more than one
> record has been retrieved (yes, there may still be some irresponsibility
> here).  Whether or not the incident followed this scenario, we should keep
> the possibility in mind and consider displaying the number of records
> retrieved before displaying any records.

This theme is an old one in RISKS, and other contributors have addressed the
issue at length.  From personal experience, though, I add the following:

The "clerks" responsible for entering and retrieving data are often both
undertrained and underpaid.  It's hard to convince someone who's making
minimum wage to care much about accuracy; they want to do their jobs with
no fuss or bother, and forget about them at the end of the day.  Given a
database for (for example) the registration of all citizens, their addresses
and credit histories, bank balances and criminal records, misuse (whether or
not it's inadvertent) is virtually guaranteed.

I used to work for a hospital billing agency; the data entry people there were
mostly high-school dropouts living at just about the poverty line, and we had
problems like this all the time.  Once, for example, two patients with the
same name were admitted to the hospital on the same day, went into surgery
on the same day, and were released on the same day.  One was an eighty-year-
old man in for a hip replacement; the other was a young woman in for a
Caesarean section.  Our database wasn't well-constructed; the eighty-year-old
man was billed for both procedures.  (To be fair, if he *had* been pregnant,
he certainly would have required a C-sec...).  Medicare objected to the
bill, of course, which was how we found out about it.
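
The underlying flaw is easy to reproduce.  A toy illustration (invented data;
the agency's actual schema is unknown): key billing on attributes two people
can share, and their records silently merge, while a system-assigned patient
ID keeps them apart.

  # Invented records, purely to show the keying problem.
  patients = [(1001, "J. Smith", "1988-03-01", "hip replacement"),
              (1002, "J. Smith", "1988-03-01", "Caesarean section")]

  by_name_day, by_id = {}, {}
  for pid, name, admitted, procedure in patients:
      by_name_day.setdefault((name, admitted), []).append(procedure)  # collides
      by_id.setdefault(pid, []).append(procedure)                     # does not

  print(by_name_day)  # one "patient" billed for both procedures
  print(by_id)        # two patients, two separate bills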

The risk here is two-fold.  We were using an old system that had been poorly
designed from the start.  It's true that the software that handles the billing
should be smarter, but like a lot of businesses, we couldn't afford to re-write
it (ever try to scratch by on Medicare payments?).  And the people who used
the software were either unable or unwilling to understand its limitations.

Hackers love to talk about the twenty billion lines of Jurassic COBOL that
run the world.  As time goes by, and networks and databases put more
information on-line, the flaws of old code are going to become more apparent.

Mike Olson, Britton Lee, Inc.
(...!ucbvax!mtxinu!blia!mao)    (olson@ucbvax.berkeley.edu)


Slippery slopes & relative risk

Stephen Schaefer <sps@mcnc.org>
Thu, 3 Mar 88 17:46:26 EST
My view of this debate is that there are two different objects being pursued,
and perhaps mistaken for one another.  The slippery slope is one paradigm with
which to anticipate possible risks.  What David Thomasson would like to do is
go beyond the identification of possibilities to a ranking of risks, that is, a
MEASUREMENT of benefits and pitfalls, from which a rational judgement can be
attained.  The piteous condition of the real world is that the cost of
measuring risks often outweighs the possible benefit of a rational choice.  The
confusion can become even more vicious when the cost of measurement is itself
highly uncertain.  Darkness heaps upon darkness.

So how do we cope?  Badly, of course.  People die in accidents caused by
unexpected features, and people die in accidents easily preventable by the
appropriate widget.  Different cultures and different individuals adopt
different attitudes toward experimentation in different domains, choosing high
risk/high payoff or low risk/sure payoff.  One technique associated with
western culture is to let individuals choose their risks, and then, after some
data come in (some die, some get rich), observers adopt the beneficial and
reject the detrimental.  The whole affair is a chaotic mess, with no end of
decisions based on insufficient data and irrational likes and dislikes.  The
approach is obviously inappropriate to instances where replication is
impractical - nuclear war/nuclear defense immediately leaps to mind.  But we
profit so well from such ``scientific method'' that no other approach seems to
satisfy the void when it is unavailable.

Societies less opulent than ours have a propensity toward tradition and moral
dicta that may reflect their smaller margin for error — or the causality may
lie in the opposite direction, with our larger margin for error being the
result of more risk taking.  For the moment, life in the fast change lane is
serving us well.  To continue to be successful, we must develop methods
applicable beyond the scope of practical experience; we must know when to apply
them; and we must have the will to apply them.  These topics are the concerns
of metaphysics, epistemology, and ethics, and we must have high hopes for Mr.
Thomasson's philosophy.  It is that which is not subject to engineering
solutions which most threatens our society.  Most readers on this list are
engineers, however, and we work from the opposite direction.  Our duty is to
measure wherever we can, and, failing that, to present as comprehensive a
description of the possibilities as we can.


Re: Disappearing Skills [RISKS 6.35]

Ronald J Bottomly <Bottomly@DOCKMASTER.ARPA>
Thu, 3 Mar 88 09:31 EST
  <>   What do you think?  Is technology weakening us by causing
  <>   important skills to atrophy?  Or is our educational system
  <>   "irrelevant"?  Where does one draw the line?

It is not so much the SKILL (ability to multiply) that will atrophy; it is
the ability to think that will atrophy.

You were not taught inscription of cuneiform or how to trim a quill pen when
learning to write, because of the advent of improved MEANS of writing (e.g.,
the pencil).  However, there was still the necessity of learning the skill
of writing.

I never learned how to multiply by using a slide rule (what with the advent
of calculators).  And I will use a calculator without hesitation if one is
immediately available.  But if one is not available, I can just as readily
multiply by hand.  The only cost to me is time.

I am not condoning technological stagnation, but I am condemning absolute
technological reliance.  The need for multiplication will probably exist as
long as mankind exists; but it seems dangerous (RISKy?) to come to rely upon
calculators (or whatever will succeed them) to perform this multiplication.

Technological advances should save us time; they should not "save" us the
"bother" of being able to think.
                                              Ron Bottomly


Invalid dates

Ross Patterson <A024012%RUTVM1.BITNET@CUNYVM.CUNY.EDU>
Thu, 03 Mar 88 09:31:30 EST
    February 31, 1988 is at least partially understandable, given the
atrocious algorithms sometimes used for date manipulation.  However, on
February 29, 1980, IBM's VS/APL system reported the date as March 0, 1980!
The user who called us to report it asked if we'd changed the default for
the )ORIGIN.  I guess it made sense, given an APL mindset.

Ross Patterson, Rutgers University
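
One speculative reconstruction, not from the original report, of how a date
routine can announce "March 0": the month search uses the non-leap cumulative
day table while the day-of-month arithmetic uses the leap-year one, so day 60
of 1980 (really February 29) lands in month 3 with day 60 - 60 = 0.

  # Speculative bug reconstruction; VS/APL's actual code is unknown.
  CUM_PLAIN = [0, 31, 59, 90, 120, 151, 181, 212, 243, 273, 304, 334, 365]
  CUM_LEAP  = [0, 31, 60, 91, 121, 152, 182, 213, 244, 274, 305, 335, 366]
  MONTHS = ["January", "February", "March", "April", "May", "June", "July",
            "August", "September", "October", "November", "December"]

  def buggy_date(day_of_year, leap):
      month = 1
      while CUM_PLAIN[month] < day_of_year:   # BUG: forgets the leap day...
          month += 1
      cum = CUM_LEAP if leap else CUM_PLAIN
      day = day_of_year - cum[month - 1]      # ...but this table remembers it
      return MONTHS[month - 1], day

  print(buggy_date(60, leap=True))   # ('March', 0) -- i.e., March 0, 1980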


Invalid dates

Lee Ridgway <RIDGWAY@MITVMA.MIT.EDU>
Thu, 03 Mar 88 10:13:50 EST
I just noticed that the "due date" on the computer-generated slip for a bank
loan of mine says "2/30/88".

On another note, a lawyer-friend of mine says his office sends out a warning to
its staff every leap year, on 2/1, to check all legal documents that may be
completed on 2/29.  Seems they did get caught several years ago with a
mega-buck financial contract that expired on the 20th anniversary from the date
of signing, which was---- 2/29.  Let's see, 80 years of interest on $5 million,
at 12%...
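
For the curious: since a contract signed on 2/29 literally sees its 20th
anniversary only 80 years later, the arithmetic works out roughly as follows
(annual compounding assumed; the actual terms of the contract are unknown):

  principal, rate, years = 5_000_000, 0.12, 80
  print(f"${principal * (1 + rate) ** years:,.0f}")   # roughly $43.3 billion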


Neural networks and P1

Dave Pare <mr-frog@amos.ling.ucsd.edu>
Wed, 2 Mar 88 16:08:14 PST
At the current state of technology, neural networks are nothing to be
feared!  The idea that "some neural network" could take over large
sections of the ARPAnet seems ludicrous; anyone who has ever implemented
a neural network can tell you that it is painful enough trying to teach
the network how to "learn" an XOR operation.

What people mean when they say neural networks "learn" is that the
network has the ability to configure itself so it recognizes patterns.
Typically, the experimenter takes many kinds of examples of input
(bit patterns, samples of human speech, etc) and runs them through the
network.  The network is told the right answer for each input, and the
idea is that from some subset of input, the network can generalize
and apply its pattern recognizing capability to provide the correct answer
for input that wasn't explicitly presented.
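
As a concrete illustration of that loop (not from the original message; the
layer sizes, learning rate, and iteration count are arbitrary choices), here
is a tiny two-layer network taught XOR by backpropagation.  With settings
like these, the outputs typically approach the targets after a few thousand
presentations:

  # Minimal supervised-learning sketch; all parameters are illustrative.
  import numpy as np

  rng = np.random.default_rng(0)
  X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # the four inputs
  y = np.array([[0.], [1.], [1.], [0.]])                  # the "right answers"

  W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input  -> hidden
  W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output

  def sigmoid(z):
      return 1.0 / (1.0 + np.exp(-z))

  lr = 1.0
  for _ in range(10000):             # thousands of presentations
      h = sigmoid(X @ W1 + b1)       # forward pass
      out = sigmoid(h @ W2 + b2)
      err = out - y                  # compare with the right answer
      d_out = err * out * (1 - out)  # backward pass: propagate the error
      d_h = (d_out @ W2.T) * h * (1 - h)
      W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
      W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

  print(out.round(2))  # should be close to [[0], [1], [1], [0]]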

Depending on the complexity of the pattern, this process can take hundreds
or thousands of presentations, eating up huge amounts of CPU time.  The
person I work for managed to use UCSD's entire allocation of CRAY-XMP time
for a quarter by running his neural network simulator for 24 hours.
That's the closest to a takeover that I've heard of!

It is true that the learning approach does seem to better reflect the
way people actually learn, but the technology is still quite new and
mostly unexplored.

Dave Pare, Center for Research in Language, UCSD


Ada-caused bugs? [Another old topic; new question]

Jerry Harper <mcvax!euroies!jharper@uunet.UU.NET>
Thu, 3 Mar 88 10:52:41 GMT
1.  Am I correct in thinking that several (two?) missiles were recently
    destroyed on launch, each of which had its guidance system
    coded in Ada?  Were the problems which forced the destruction of
    the missiles the result of bad software design or some inherent
    ambiguity in Ada syntax?

2.  I spotted but unfortunately left unlogged a report somewhere which
    gave an account of a talk by a leading scientist (name?) in the
    military technology area who expressed grave reservations about the
    design of Ada.  I *think* the report mentioned that the person
    expressed little confidence in guidance systems coded in Ada.

3.  Is the Pentagon insisting on Ada being the standard for all military
    software projects?  

Jerry Harper, Merrion Gates Software (Logic Programming)
Merrion House, Merrion Road, Dublin 4, IRELAND.  netwise: jharper@euroies.uucp

   [Ada is by no means a panacea.  It has some benefits — type-checking,
   import/export controls, etc. — that can contribute to safer programming.
   But its complexity makes it ripe for misuse.  It is nominally mandated for
   all military embedded systems, except that various limitations have 
   resulted in its being eschewed in some security-community applications.
   Can anyone provide a definitive answer to Question 1?  I don't recall
   anything that might have implicated Ada!  PGN]


Aerospace Computer Security Applications Conf. - Call for Papers

Marshall D. Abrams <abrams@mitre.arpa>
Tue, 01 Mar 88 10:52:53 EST
Call for Papers, Fourth Aerospace Computer Security Applications Conference
December 12-16, 1988, Sheraton World Hotel, Orlando, Florida

Operational requirements for civil and military systems under development
increasingly stress the necessity for information to be readily accessible
to users and operators.  This produces an apparent conflict with policies
and directives which require total protection of system data from
compromises of privacy, confidentiality, and integrity.  Accomplishing both
of these sets of requirements requires the application of the maturing
technology of computer security to new systems throughout their development
cycle.  In addition, operational approaches to satisfy system requirements
and accommodate the implementation of engineering technology require
intensified research and development.

This conference will explore technology applications in two complementary
aspects:  first, the policy issues and operational requirements for both
civil and military systems; and second, the hardware and software tools and
techniques being developed to satisfy system requirements.  Special emphasis
will be placed on specific examples of systems applications.

A three-day technical conference exploring the application of computer
security technology will be preceded by two days of tutorials dealing with
policy matters, technology applications, and other areas.  Introductory and
advanced surveys will be offered as well as advanced courses exploring
specialized technological areas.

Areas of Interest Include: Trusted DBMSs, Operating System, and Network
Security, Current and Future Trusted System Technology, Space Station
Requirements, Certification, Evaluation and Accreditation, Policy and Management
Issues, Advanced Architectures, C3I Systems, Risk/Threat Assessments

Unclassified papers or unclassified abstracts of classified papers must be
mailed before 20 May, 1988, to Dr. William T. Bisignani, Technical Program
Chairman, Booz-Allen & Hamilton Inc., 4330 East-West Highway, Bethesda, MD
20814

Tutorial Proposals, including a detailed outline and a resume of presenter(s),
must be mailed before 20 May, 1988 to Dr. Dixie B. Baker, Tutorial Program 
Chairwoman, The Aerospace Corporation, P.O. Box 92957, 2350 East El Segundo
Blvd, El Segundo, CA 90245-4691.

For more information or to receive future mailings, please contact the
conference chairman, Dr. Marshall D. Abrams, phone: (703) 883-6938, The MITRE
Corporation, 7525 Colshire Drive, Mail Stop Z670, McLean, VA 22102, E-mail
address: abrams@mitre.arpa
