The RISKS Digest
Volume 4 Issue 30

Tuesday, 16th December 1986

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Arpanet outage
Andrew Malis
Dynamic Signature Verification
Robert Stroud [and Brian Randell]
Wobbly skyscrapers and passive vs. active controls
Niall Mansfield
Re: The Audi 5000 problems
Matt Smiley
Modifying bank cards
Rodney Hoffman
Credit card mag strips
Ted Marshall
Fast-Food Computing
Edward Vielmetti
"bugs"
Doug McIlroy
Jonathan Clark
Bob Estell
Info on RISKS (comp.risks)

Arpanet outage

Andrew Malis <malis@ccs.bbn.com>
Mon, 15 Dec 86 10:46:48 EST
[An earlier message asked, "Why did the Northeast corridor disappear from
the Arpanet last weekend? The Network Operations Center said one trunk had
been broken, and they were cut off from most everyone, too. I thought there
was enough redundancy in the Arpanet to prevent a single trunk from causing
such an extensive outage...":]

At 1:11 AM EST on Friday, AT&T suffered a fiber optics cable break between
Newark NJ and White Plains NY.  They happen to have routed seven [different]
ARPANET trunks [all] through that one fiber optics cable.  When the cable
was cut, all seven trunks were lost, and the PSNs in the northeast were cut
off from the rest of the network.  Service was restored by AT&T at 12:12.

The MILNET also suffered some trunk outages, but has more redundancy, so it
was not partitioned.
                                           Andy Malis

   [Robert W. Baldwin <BALDWIN@XX.LCS.MIT.EDU> noted:  This is a classic
   example of redundancy at one level of abstraction that turns out to be
   non-redundant at a lower level of abstraction.]
      [Redundancy works sometimes: I received several copies of Andy's
      note.  Yes, this is a lovely example.  By the way, AT&T is laying a
      fiber-optic cable under the Atlantic.  That will provide LOTS of
      opportunities for virtually distinct paths to co-occupy the same
      physical channels.  PGN]
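
A minimal sketch, in Python and with hypothetical trunk and conduit names, of
the cross-layer check this episode suggests: map each logical trunk to the
physical conduit it actually rides on, and flag any conduit that carries more
than one supposedly redundant trunk.

    from collections import defaultdict

    # logical trunk -> physical conduit it is actually routed over (hypothetical data)
    routing = {
        "trunk-1": "fiber-newark-whiteplains",
        "trunk-2": "fiber-newark-whiteplains",
        "trunk-3": "microwave-boston-ny",
    }

    by_conduit = defaultdict(list)
    for trunk, conduit in routing.items():
        by_conduit[conduit].append(trunk)

    for conduit, trunks in by_conduit.items():
        if len(trunks) > 1:
            print(f"shared physical path: {conduit} carries {trunks}")

Logical redundancy only helps if the mapping from trunks to physical paths is
known and audited; Baldwin's point is that here it evidently was not.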


Dynamic Signature Verification

Robert Stroud <robert%cheviot.newcastle.ac.uk@Cs.Ucl.AC.UK>
Tue, 16 Dec 86 12:15:20 GMT
There was an article in The Independent recently (2nd Dec 1986) about
dynamic signature verification and "the arrival of biometrics as a practical
security technology". A company called AI Transaction Security from
Cambridge have produced a gadget called Securisign, two of which are being
used to control access to "a very secure area" at the EEC's headquarters.
[EEC is the European Economic Community]

The article concluded as follows:

  "Dynamic signature verification has turned up one disappointment.
  Researchers originally hoped that signature pads could test the sobriety of
  people such as nuclear plant operators when they signed on for a shift.
  However, research shows that most people can sign their names convincingly
  even when hopelessly drunk".  [Copyright (c) 1986 Newspaper Publishing PLC]

I found this last comment interesting because the last time this topic came
up on RISKS, I recall that the consensus was that the technology did not
work because you had to sign your name very carefully, i.e. not when you
were "tired and emotional". However, when I showed the article to Brian
Randell, he told me the following anecdote:

     Some years ago, I was involved (in an official capacity) in reviewing a
  research project, at a Laboratory which I would prefer not to identify, on
  dynamic signature verification. I was given a demonstration of the system,
  which involved my being asked to sign my name five times, and then being
  asked to sign again to confirm that the system had now "learnt" and could
  recognise my signature. Much to the consternation of the demonstrator, my
  entirely unpremeditated reaction was to turn to a colleague, and ask him to
  sign my name. Without any prior warning or practice, he roughly imitated
  what he could recall of my hand movements, without attempting to reproduce
  the written appearance of my signature. The machine accepted his efforts as
  my signature.  I was then informed, in tones of considerable embarrassment,
  that in an effort to speed up the demonstration, the thresholds had been set
  low, and that all would be well if they were reset and I gave an adequate
  number of signatures.  So, they were reset, and I gave (more than) the
  requested number of signatures.  To my surprise, the demonstrator expressed
  surprise when I indicated that I felt it appropriate to repeat my
  experiment, and again challenged my colleague to repeat his "feat" -
  something he did immediately and effortlessly!

     The point of this story is that this struck me as an elementary check to
  make on dynamic signature verification systems - yet I do not recall ever
  seeing claims, in any of the (admittedly popular) articles I have read on the
  topic, regarding the ability of the system to defeat attacks based on seeing
  how a person signed his/her name.   [End of Brian's story]
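
A minimal sketch, with invented feature values and an invented threshold, of
why a lenient acceptance setting lets a rough imitation of the signing motion
through: the verifier compares the dynamics of the attempt against an enrolled
template and accepts anything whose similarity clears the threshold.

    def similarity(attempt, template):
        # naive similarity between two sequences of pen-speed samples (0..1)
        return 1.0 - sum(abs(a - t) for a, t in zip(attempt, template)) / len(template)

    template  = [0.8, 0.6, 0.9, 0.7]   # enrolled signing dynamics (hypothetical)
    imitation = [0.7, 0.5, 0.8, 0.9]   # colleague mimicking the hand movements

    THRESHOLD = 0.5                    # "set low to speed up the demonstration"
    verdict = "accepted" if similarity(imitation, template) >= THRESHOLD else "rejected"
    print(verdict)                     # accepted

Brian's second trial suggests the problem is not only the threshold: if the
features capture little more than gross hand movement, a motion mimic can
score about as well as the owner even at the proper setting.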


Wobbly skyscrapers and passive vs. active controls

Niall Mansfield <MANSFIEL%EMBL.BITNET@WISCVM.WISC.EDU>
Mon 15 Dec 86 17:28:20 N
PGN in RISKS-4.26
> This raises interesting questions about the relative precision, accuracy, 
> and soundness of "metal algorithms" and comparable analog devices in general.

If you change the scene a bit and take a mildly absurd example, you could
have the same sort of considerations in a desk lamp - either use a normal
passive Anglepoise type, or a hi-tech computer-controlled active
servo-positioned type lamp.  I'd reckon that the old-fashioned lamp would
behave itself in power cuts (although not very brightly), electrical storms,
glitchy mains periods, the last day of February of the year 2000, etc.,
whereas I wouldn't be at all surprised if the robot lamp went berserk
sometime and brained me or smashed my teeth in because the chap next door
started radio broadcasting.

For whatever reason - perhaps that we have had such or similar artifacts for
centuries - we are confident and "know" that passive devices made of metal
tubes and weights and springs are not sensitive to various outside effects
which DO affect computers and consequently computer controlled devices, and
if only because they behave reasonably (i.e., as we expect them to), such
passive devices have a great safety advantage.


Re: The Audi 5000 problems

Matt Smiley <crash!pnet01!msmiley@nosc.ARPA>
Tue, 16 Dec 86 00:52:37 PST
Audi did more damage with the '...there isn't anything wrong.' statement than
could be done by simply saying they don't know what it is. Statistically, the
rate of such accidents with the Audi should be comparable to the rate of
such accidents with other vehicles. It obviously is not, leading me to think
there's some defect in the engineering of the vehicle. I had a similar problem
with an old Ford truck of mine, and it took months for me to figure out that
it was due to a defective motor mount. The torque of the engine would lift it
off the mount and subsequently pull the accelerator linkage to the floor. A
similar oddity could be plaguing the Audis.

...nosc!crash!pnet01!msmiley@NOSC <Matt Smiley>

    [The summary list of RISKS-4.1 notes that an Audi investigation was 
     reported earlier in Software Engineering Notes, but I just noticed that
     the reference was wrong: it should have been SEN 11 2 (April 1986). PGN]
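
For concreteness, a worked version of the statistical point, with entirely
made-up figures: if there were no defect, sudden-acceleration reports should
scale with the number of vehicles on the road, so the per-vehicle rates should
come out roughly equal.

    # All numbers below are invented placeholders, not actual accident data.
    audi_reports,  audi_vehicles  = 250,   250_000
    other_reports, other_vehicles = 1_000, 10_000_000

    audi_rate  = audi_reports  / audi_vehicles  * 100_000   # per 100,000 vehicles
    other_rate = other_reports / other_vehicles * 100_000
    print(f"Audi: {audi_rate:.0f} per 100,000   others: {other_rate:.0f} per 100,000")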


Modifying bank cards

Hoffman.es@Xerox.COM <Rodney Hoffman>
16 Dec 86 08:11:51 PST (Tuesday)
From the Los Angeles Times, Dec. 15, 1986 (Reuters):

    COMPUTER 'HACKERS' HELD IN W. GERMANY 

WIESBADEN, West Germany — Police have arrested four computer "hackers" said
to have robbed banks in the Frankfurt area of more than $50,000 by
manipulating cash dispenser cards with a home computer.  Hesse State police
said the four, one woman and three men, had been roaming Frankfurt and
surrounding towns since May with a computer plugged into the battery of
their Mercedes limousine.  They were arrested at the end of November.

The four hackers bought bank cash cards for $1,500 apiece from their family
and friends, who then notified their bank that the cards had been stolen.
The four then used their computer to change the codes on the cards' magnetic
strips so that they could withdraw more money from automatic tellers than the
limit set by the cards, or tap other accounts.  Under a law on computer
crime passed last August, the four face jail terms of up to five years if
charged and found guilty.
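
The scheme only works if an automatic teller trusts a limit encoded on the
card itself.  A minimal sketch of the defensive alternative, with hypothetical
field names: the host ignores whatever limit the strip claims and enforces the
one in the bank's own records.

    def authorize(card_track_limit, host_limit, amount):
        # card_track_limit comes from the magnetic strip and can be rewritten
        # by whoever holds the card; host_limit comes from the bank's records.
        effective_limit = host_limit      # never trust the card-resident value
        return amount <= effective_limit

    print(authorize(card_track_limit=5000, host_limit=400, amount=1500))   # False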


Credit card mag strips

Ted Marshall <blia.UUCP!ted@cgl.ucsf.edu>
Mon, 15 Dec 86 11:45:59 PST
I have noticed a new trend in the way stores imprint credit card slips.
In the olden days, the embossed numbers and letters on the card were
mechanically transferred to the slip. The only use of the magnetic strip on
the back was for verification of the credit limit.

I have now seen two stores (including the local Radio Shack) where the
mag strip reader feeds data to an electronic cash register which not only
dials up the bank to verify credit but also prints out the slip for the
customer to sign. Unless the clerk checks the printed information on the
slip against the embossed card, there is no verification of the information.

Credit card companies are making it harder to counterfeit the embossed
information on the cards. But a hardware hack can still build a gizmo for
$20 that will copy the magnetic information from a "borrowed" card to his.
He then makes sure the other card gets returned so that the bank isn't
notified. The hack walks into the Radio Shack, buys $1000 worth of stuff
with "his" card, and it gets charged to his friend's account. The only
thing to trace him with is the signature on the slip, and it's easy to
sign your name so that it's close enough for the clerk but no one will
ever trace it to you.
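
A minimal sketch of the check Ted says is missing, with a hypothetical track
layout: have the register compare the account number it reads from the
magnetic strip against digits keyed in from the embossed card (or against the
mechanical imprint) before it prints the slip.  A cloned strip carries the
victim's number; the embossed card in the thief's hand does not.

    def strip_matches_embossed(track_pan: str, keyed_last4: str) -> bool:
        # track_pan:   account number decoded from the magnetic strip
        # keyed_last4: last four embossed digits typed in by the clerk
        return track_pan[-4:] == keyed_last4

    print(strip_matches_embossed("4556123498761234", keyed_last4="0042"))  # False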


Fast-Food Computing

<Edward_Vielmetti@um.cc.umich.edu>
Tue, 16 Dec 86 16:15:04 EST
I must have been early in the cycle for McDonald's fast-food intelligent
man-machine systems, according to Guthery's law:
   >     In an evolving man-machine system, the man will get
   >     dumber faster than the machine gets smarter.

McDonald's fast food computers (i.e., cash registers) collect all sorts of
data on the individual employee at the counter and on all counter sales as a
whole.  They also do not have a <no sale> key that opens up the cash
register, probably to prevent theft.  That made it real hard to fix a
mistake without calling a manager to get a key to open the drawer.

Solution?  Well, the people I worked with at McD's had been around the
system long enough to figure out how to get around it.  Without getting into
too many details of why things were as they were, the easiest way to open
the drawer without a manager was to ring up a sale that gave away a tub of
barbecue sauce for McNuggets and nothing else.  
   (Hit <promo> <barbecue> <promo> <total>.)
Of course, that messed up the daily statistics some.

Edward Vielmetti, Ex-McDonalds employee, Computing Center Microgroup, U. Mich.
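
A toy model of why the trick works, with invented key names: the register's
rule is that only a totalled sale opens the drawer, and nothing in that rule
prevents the sale from totalling to zero.

    def ring_sale(items):
        # items: list of (description, price) pairs keyed in at the register
        total = sum(price for _, price in items)
        drawer_opens = True               # any completed <total> opens the drawer
        return total, drawer_opens

    # the "free barbecue sauce" promo: a legitimate sale with a zero total
    print(ring_sale([("promo barbecue sauce", 0.00)]))   # (0.0, True)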


"bugs"

<doug%btl.csnet@relay.cs.NET>
Sun 14 Dec EST 1986 21:39
plugs   Unwanted trash that contaminates output.  The classic example is a 
        cheery advertising blurb like "Welcome to MUCUP Version 2.7," 
        which cripples the next program down the pipe.

drugs   Unwanted features that contaminate specs; something the cat drug in.
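
A small illustration of a "plug", with the upstream output invented: a
downstream filter expecting one number per line chokes on the cheery banner
that the upstream program helpfully prepends.

    upstream_output = "Welcome to MUCUP Version 2.7\n3.14\n2.71\n"   # hypothetical

    total = 0.0
    for line in upstream_output.splitlines():
        try:
            total += float(line)
        except ValueError:
            print(f"choked on banner line: {line!r}")
    print(f"sum of data lines: {total:.2f}")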


"bugs"

Jonathan Clark <jhc%mtune.UUCP@harvard.HARVARD.EDU>
Mon, 15 Dec 86 11:45:51 EST
At a recent course I heard Jim Gray of Tandem (seriously) describe two
more bug types:

Heisenbugs: generally transient failure conditions that exist inside
systems.  ('I can't let you have this resource now because it has been
locked'.)  Typically, when the operation is retried on another processor, it
succeeds because the backup processor is in a different internal state.

Bohrbugs: repeatable failures even when retried on another processor.
Typically these are 'hard errors'.
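
A minimal sketch of the distinction, with an invented resource and retry
policy: the Heisenbug-style failure ('this resource is locked right now')
disappears when the operation is retried on the backup processor, because the
backup is in a different internal state; a Bohrbug would fail the same way on
both.

    locked_on = {"cpu-a"}                 # the lock happens to be held here right now

    def acquire(resource, processor):
        if processor in locked_on:
            raise RuntimeError(f"{resource} is locked on {processor}")
        return f"{resource} acquired on {processor}"

    for processor in ("cpu-a", "cpu-b"):  # retry on the backup processor
        try:
            print(acquire("queue-head", processor))
            break
        except RuntimeError as err:
            print("transient failure:", err)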


"bugs"

"ESTELL ROBERT G" <estell@nwc-143b.ARPA>
16 Dec 86 10:05:00 PST
"augs" - induced while augmenting a system.
"dugs" - added while fixing other bugs, digging the hole deeper.
"jugs" - portable bugs, bottled and bonded.
"lugs" - which slow down the system [e.g., security features].
"nugs" - little "nuggets" of gold, which didn't pan out.
"qugs" - errors in queues that make batch jobs miss deadlines,
         and print files twice, or not at all.
"rugs" - evenly distributed throughout the code, and pervasive.
"tugs" - little interfaces which keep big systems in tow.
"xugs" - alien bugs [like E-Mail penetrations of UNIX systems]
