The RISKS Digest
Volume 5 Issue 62

Friday, 20th November 1987

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

A Two-Digit Stock Ticker in a Three-Digit World
Chuck Weinstock
Stark - warning depends on operator action, intelligence data quality
Jonathan Jacky
Task Force Slams DoD for Bungling Military Software
Jonathan Jacky
Addressable CATV
Jerome H. Saltzer
Human automata and inhuman automata
Chris Rusbridge
Re: CB frequencies and power
Dan Franklin
John McLeod
Wm Brown III
"UNIX setuid stupidity"
David Phillip Oster
Stephen Russell
Software Safety Specification
Mike Brown
Call for Papers, COMPASS '88
Frank Houston
"Normal Accidents" revisited
David Chase
Space Shuttle Whistle-Blowers Sound Alarm Again
rdicamil
Info on RISKS (comp.risks)

A Two-Digit Stock Ticker in a Three-Digit World

Chuck Weinstock <weinstoc@SEI.CMU.EDU>
Fri, 20 Nov 87 17:06:42 EST
From Business Week, November 30, 1987, Page 108I
(Industrial/Technology Edition):

When the Dow Jones industrial average plunged 508 points on Oct. 19, the
message didn't get through to subscribers of Lotus Development Corp.'s
Signal service, which provides up-to-the-minute market data.  Because Signal
can record only a two-digit net change for each day, it reduced the market
crash to eight points.  "When the product was originally designed, they
didn't take into account these sorts of fluctuations," concedes Lotus
spokesman James P. O'Donnell.

Signal, introduced two years ago, captures stock prices and indexes from an
FM radio wave, then displays them in a Lotus 1-2-3 spreadsheet format.
Lotus says it won't fix the problem until the next release of Signal, and it
won't say when that will be.  Until then, Signal subscribers, who pay at
least $100 a month for transmissions, will have to check that third digit --
or fourth -- in the next morning's paper.

      [Sounds like Signal has been Short-Sheeted.  As the Father of the
      spreadsheet, maybe with Signal Lotus Leaves something to be de-Sired?
      (I suspect they do not anticipate another 100-point swing soon.  Dow-dy?)
      Which digit is the Index finger?  1st in US, 2nd in Europe, 4th if you
      turn the right hand over.  But having to check the fourth digit in
      the Dow-swing sounds really ominous!  PGN]
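
For illustration, here is a minimal sketch (in C) of how a field sized for two
decimal digits silently turns a 508-point move into an 8-point one.  This is
not Lotus's actual code, whose internals were never published; only the
arithmetic is the point.

  /* Hypothetical sketch: Signal's real data format is not public.  The
   * point is only that a field wide enough for two decimal digits keeps
   * just the last two digits of a larger number.
   */
  #include <stdio.h>

  int main(void)
  {
      int net_change = -508;            /* the Dow's actual move on Oct. 19 */
      int magnitude  = -net_change;     /* 508                              */
      int two_digit  = magnitude % 100; /* 508 -> 8: the reported "crash"   */

      printf("real change: -%d   two-digit field: -%d\n", magnitude, two_digit);
      return 0;
  }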


Stark - warning depends on operator action, intelligence data quality

Jonathan Jacky <jon@june.cs.washington.edu>
Thu, 19 Nov 87 09:10:41 PST
The STARK's captain told the press some time ago that he thought the ship's
warning system had not worked.  It turns out the warning system is dependent 
on the quality of data loaded by operators.  This information comes from
John A. Adam, "USS STARK: what really happened?", IEEE SPECTRUM, Sept. 1987,
26 - 29.  Some excerpts:

  Brindel, who had been the Stark's captain, said during a telephone interview
  on Aug. 5: "If the sensors that we had would have divulged the things they
  should have, then I'm sure my TAO (referring to tactical action officer 
  Moncrief) would have taken additional measures." Brindel said neither the
  radars nor early warning receiver performed according to specifications.  But
  he declined to elaborate, saying, "I think all the problems are being 
  addressed by the Navy."

  Sources familiar with the Naval Board of Inquiry investigation have confirmed
  a statement by Brindel to SPECTRUM: the SLQ-32 radar receiver did detect the
  missiles coming but it failed to properly identify them. This is reminiscent 
  of the (attack on the Sheffield in the 1982 Falklands War...) ...

  The SLQ-32 computer compares the observed signal characteristics with the 
  parameters of possible radar emitters stored in the computer's library of
  friendly and hostile targets.  When identification is complete, the computer
  sends it to the display to alert the operator.

  ...Identification as friend or foe can be "heavily influenced by the SLQ-32
  operator" using software libraries.  Some libraries, such as those for Soviet
  equipment, are fixed and cannot be tampered with; others, such as those for
  weaponry of US forces or Third World nations, are variable, allowing inputs
  from the operator when the ship is deployed in a given territory.  
  One source familiar with the operations said the US military uses an
  "electronic order of battle" which lists all anticipated friendly,
  neutral, and hostile emitters.  This, he said, would be the basis for
  SLQ-32 entries for a specific operation or region.

  Thomas F. Curry, associate deputy assistant secretary for the Navy until
  1983, said "it's probably more difficult to get intelligence information
  on friendly neutrals (like France or Iraq) than on hostile countries." ...
  Friendly neutral countries do not freely give information on their weapons
  to allies, since it would hurt military export sales.  The US Navy had
  (conducted tests on some Exocets but) the Exocet's Paris-based manufacturer,
  Aerospatiale, refused to say whether it had recently changed the
  characteristics of the missile's seeker.
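
The library-matching step described above amounts to comparing observed signal
parameters against a table of known emitters, part of it fixed and part loaded
by operators for a given deployment.  The following is a conceptual sketch
only; the SLQ-32's actual data structures, parameters, and tolerances are not
public, and every name, number, and threshold below is invented:

  /* Conceptual sketch only: not the real SLQ-32 library format or
   * matching logic.  It illustrates the idea in the article: an observed
   * signal is classified by comparing its parameters against stored
   * emitter entries, some in a fixed library and some entered by the
   * operator for a given deployment.
   */
  #include <math.h>
  #include <stddef.h>
  #include <stdio.h>

  struct emitter {
      const char *name;   /* invented label                          */
      double freq_ghz;    /* nominal emitter frequency               */
      double prf_hz;      /* pulse repetition frequency              */
      int    hostile;     /* 1 = hostile, 0 = friendly or neutral    */
      int    fixed_lib;   /* 1 = fixed library, 0 = operator-entered */
  };

  static struct emitter library[] = {
      { "hostile emitter A (fixed library)",     9.30, 1000.0, 1, 1 },
      { "friendly/neutral emitter B (operator)", 9.35, 1200.0, 0, 0 },
  };

  /* Return the first entry whose parameters fall within tolerance of the
   * observed signal, or NULL if nothing in the library matches.
   */
  static const struct emitter *classify(double freq_ghz, double prf_hz)
  {
      size_t i;
      for (i = 0; i < sizeof library / sizeof library[0]; i++)
          if (fabs(library[i].freq_ghz - freq_ghz) < 0.05 &&
              fabs(library[i].prf_hz  - prf_hz)   < 100.0)
              return &library[i];
      return NULL;
  }

  int main(void)
  {
      /* A signal close to entry B: a missing or stale operator entry would
       * leave it unidentified, or identified as something else entirely.
       */
      const struct emitter *e = classify(9.34, 1150.0);
      printf("%s\n", e ? e->name : "unidentified emitter");
      return 0;
  }

The risk the article describes lives entirely in the contents of the table: if
the operator-loaded entries are missing or stale, code like this runs exactly
as designed and still fails to flag the threat.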


"Task Force Slams DoD for Bungling Military Software" (SDI, Ada, ...)

Jonathan Jacky <jon@june.cs.washington.edu>
Wed, 18 Nov 87 21:30:43 PST
Task Force Slams DoD for Bungling Military Software Efforts
ELECTRONICS, Nov. 12, 1987, p. 121.

The Defense Department's efforts in software development are disjointed,
uncoordinated, and lack support, charges the Defense Science Board's Task
Force on Military Software.  The task force reports "it is convinced that
today's major problems with military software are not technical, but
management problems."  It lambastes the DOD for having "not provided the
vital leadership needed" in Stars, the Software Technology for Adaptable,
Reliable Systems.  It complains that Ada, the high-level programming
language the DOD is pushing to make a standard for all military systems, "has
been overpromised."  It warns that "the Strategic Defense Initiative has a
monumental software problem" and that "no program to address the software
problem is evident."  To solve the management problem, the task force urges
the DOD to bring together Stars, Ada, and the Software Engineering Institute
under the Air Force Electronic Systems Division.  It also wants
representatives from the three programs and from the Defense Advanced
Research Projects Agency's Strategic Computing Initiative to produce a
"one-time jont plan to demonstrate a coordinated DOD Software Technology
Program."  What does the DOD have to say?  Not much just yet.  Officials at
the Ada Program office did not respond to calls, and a DARPA spokesman would
say only that the agency "has no plans to implement any of the changes the
report recommends" at this point but will take them under consideration.


Addressable CATV (RISKS-5.61)

Jerome H. Saltzer <Saltzer@ATHENA.MIT.EDU>
Fri, 20 Nov 87 02:10:17 EST
Allan Pratt raises several good questions about privacy vis-a-vis
addressable CATV technology, with the following starting point:

> The cable company serving the community where I live is upgrading
> their service to new, addressable converter boxes.

It turns out that the term "addressable" in the CATV industry almost
always means "one-way addressable".

One-way addressable is moderately harmless to privacy--it really means only
that the converter box responds to signals specifically addressed to it
(such as "allow this customer to watch Showtime").  In a one-way addressable
system there isn't a return path from the converter back to the head-end, so
the company has no way of knowing whether or not the box responded, how many
movies you have watched on the Playboy channel, or whether or not you have
used the "mute" feature.  The only things the company knows are what
channels it tried to authorize, how promptly you pay your bills, and how
often you complain about the service.  There is still a good privacy issue,
because their computer can easily generate for sale a mailing list, e.g., of
everyone who subscribes to one of the premium sports channels.  In our
community, the CATV operator sends a yearly notice to each subscriber
promising not to use CATV billing information for anything other than billing.
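
Put differently, a one-way addressable converter is just a receiver that
filters a broadcast authorization stream for its own serial number and never
transmits anything back.  A rough sketch under that assumption follows; the
message format and field names are invented for illustration:

  /* Rough sketch of "one-way addressable": every converter box hears every
   * downstream authorization message, acts only on those addressed to it,
   * and has no return path to report anything.
   */
  #include <stdint.h>
  #include <stdio.h>

  #define MY_BOX_ID  0x00123456u     /* this converter's serial number */

  struct auth_msg {
      uint32_t box_id;       /* which converter this message is for  */
      uint8_t  channel;      /* channel number to enable or disable  */
      uint8_t  enable;       /* 1 = authorize, 0 = de-authorize      */
  };

  static uint8_t authorized[256];    /* per-channel authorization bits */

  static void handle_downstream(const struct auth_msg *m)
  {
      if (m->box_id != MY_BOX_ID)
          return;                    /* addressed to some other box    */
      authorized[m->channel] = m->enable;
      /* Note what is *not* here: no acknowledgement, no viewing report,
       * no upstream path at all.  The head-end knows only what it sent.
       */
  }

  int main(void)
  {
      struct auth_msg m = { MY_BOX_ID, 42, 1 };  /* "allow channel 42" */
      handle_downstream(&m);
      printf("channel 42 %s\n", authorized[42] ? "enabled" : "blocked");
      return 0;
  }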

Although the FCC has for several years been requiring that new CATV systems
be two-way capable, almost none of them actually implement the capability,
because the technology is some distance over the edge of what the average
CATV operator can keep running.

An operator probably wouldn't go to the trouble of shaping up his or her
plant for two-way unless a big revenue stream is in prospect.  The only
revenue stream currently in prospect is pay-per-view.  Two-way pay-per-view
systems can be easily identified: if you can agree to pay for a pay-per-view
event simply by pushing buttons on the converter box, it is two-way, and you
should worry a lot about the problems that Pratt raises.  If you have to
call the cable company on the phone to "order" the pay-per-view event, they
are using a one-way system, and you have some control over what they can know.

At last count I heard, there were perhaps twenty cable companies in the
country running two-way, and the number was declining.  Warner in Columbus,
Ohio (under the name QUBE) is the best known.
                                            Jerry


Human automata and inhuman automata

Chris Rusbridge <munnari!max.sait.oz.au!cccar@uunet.UU.NET>
Wed, 18 Nov 87 10:00:14 cst
There was a radio news item this morning about a Canberra company that
received a Telecom bill over 2,000 pages long.  Most of the pages
contained zero items, so I don't think they were complaining about the
bottom line.  However, the bill was a stack of paper over 450 mm high!

Of course, changes to Telecom's billing computer system were blamed for the
debacle, which I think reached Parliament. The RISK illustrated here, above
the common failures to test changes in production systems properly, is the
human automaton who wrapped that stack of paper up in a parcel and sent it
off, apparently without checking or reporting it. What kind of *human*
system have they designed in there?

Chris Rusbridge, Academic Computing Service Manager
S. A. Institute of Technology
ACSnet: cccar@max.sait.oz 
Phone:  +61 8 343 3098  Fax +61 8 349 6939  Telex: AA82565
Post:   The Levels, SA 5095 Australia


Re: CB frequencies and power

<dan@WILMA.BBN.COM>
Thu, 19 Nov 87 10:24:52 -0500
> CB's run at 4 Watts.  Their wavelength is 436 inches.  (~11m).
> JOHN MCLEOD    Georgia Insitute of Technology, Atlanta Georgia

4 watts is the legal limit (power to antenna, as I recall), but when
I was an amateur radio operator, it was common knowledge that CB'ers
often flouted the law.  They'd buy a ham amplifier for 10 meters and
use it for 11 meters.  They'd get 100-1000 watts that way (1kw being
the legal ham limit).  In cars, even.

So despite the low legal power limit on CB transmissions, it would not
surprise me to hear of computer (and other) problems being caused by CB
radios at a considerable distance.
                                    Dan Franklin
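
As a quick check (my own arithmetic, not from the original postings), the
quoted figures for the 27 MHz CB band are mutually consistent:

  #include <stdio.h>

  int main(void)
  {
      double c  = 2.998e8;          /* speed of light, m/s              */
      double f  = 27.0e6;           /* nominal CB band, 27 MHz          */
      double m  = c / f;            /* about 11.1 m                     */
      double in = m / 0.0254;       /* about 437 in, i.e. roughly the   */
                                    /* "436 inches" quoted above        */
      printf("wavelength: %.1f m (%.0f inches)\n", m, in);
      return 0;
  }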


Re: CB frequencies and power

John McLeod <jm7@pyr.gatech.edu>
Thu, 19 Nov 87 15:28:59 EST
This is true; however, after a few people got caught and had to pay
fines of $10,000 and up for running large amplifiers, a large number of
those breaking the law stopped.  More recently, however, there has been
no enforcement at all on CB channels.


CB frequencies and power (RISKS-5.60)

Wm Brown III <Brown@GODZILLA.SCH.Symbolics.COM>
Thu, 19 Nov 87 14:19 PST
    CB's run at 4 Watts.  Their wavelength is 436 inches.  (~11m).

That's the theory.  In practice, however, *MANY* CB operators run bootleg
power amplifiers which put out tens, hundreds, or sometimes thousands of
watts.  When computer-controlled engines started showing up on our roads, 
some truckers actually made a game of stalling them out by keying up CB
transmitters nearby.  Points were scored according to the performance/price
class of the vehicles disabled and the degree of panic induced by engine
failure at 65 MPH.  VW drivers who wouldn't let an 18-wheeler pass them
were targets for more serious electronic warfare — it wasn't the trucker's
fault that the VW stalled two car lengths in front of him....

The FCC has very few enforcement agents, and has pretty much given up any 
pretense of controlling the 27 MHz band.  In some areas, owners of very
high-powered transmitters establish 'ownership' of a CB channel by simply
blasting everyone else off the air.  An account of one Blackhawk crash
laid the blame on a nearby 'bootleg CB transmitter,' which I interpreted
to mean one of these power mongers.  The term "Alligator" has been coined
to describe radios with "Big mouths and tiny ears."


Re: "UNIX setuid stupidity"

David Phillip Oster <oster%SOE.Berkeley.EDU@jade.berkeley.edu>
Wed, 18 Nov 87 08:14:33 PST
Thank you for the correction. The setgid solution is much better than the
one I proposed.  I didn't hit upon it because your solution does not solve
the problem _as stated_ (a student creates a copy in the teacher's directory
that no other student can read), although it solves the more general
problem.  A still simpler solution: mail the assignment to the teacher.

--- David Phillip Oster            --A Sun 3/60 makes a poor Macintosh II.
Arpa: oster@dewey.soe.berkeley.edu --A Macintosh II makes a poor Sun 3/60.
Uucp: {uwvax,decvax,ihnp4}!ucbvax!oster%dewey.soe.berkeley.edu
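
As background, here is a minimal sketch of the kind of setgid arrangement
being discussed.  It is not the original program from RISKS 5.57; the
directory name, group name, and file modes are assumptions.  The idea is to
install a small "submit" program setgid to a dedicated group (say, "handin"),
make the drop directory owned by the teacher with group "handin" and mode
0730, and have the teacher belong to that group, so no root privilege is
involved and other students can neither list nor read submissions:

  /* Sketch only: not the program discussed in RISKS 5.57.  Assumes a drop
   * directory owned by the teacher, group "handin", mode 0730, and that
   * this program is installed setgid "handin" so it can create files
   * there while ordinary users cannot even list the directory.
   */
  #include <stdio.h>
  #include <string.h>
  #include <fcntl.h>
  #include <unistd.h>
  #include <pwd.h>
  #include <sys/types.h>

  #define DROPDIR "/usr/class/handin"   /* hypothetical drop directory */

  int main(int argc, char **argv)
  {
      char dest[1024], buf[8192];
      int in, out;
      ssize_t n;
      struct passwd *pw = getpwuid(getuid());   /* the real submitter */

      if (argc != 2 || pw == NULL) {
          fprintf(stderr, "usage: submit assignment-file\n");
          return 1;
      }
      if (strchr(argv[1], '/') != NULL) {       /* keep names simple  */
          fprintf(stderr, "submit: plain file names only\n");
          return 1;
      }

      /* Name the copy after the submitting student.  Mode 0640 lets the
       * teacher (via group "handin") read it; other students, who are not
       * in the group, cannot.
       */
      snprintf(dest, sizeof dest, "%s/%s.%s", DROPDIR, pw->pw_name, argv[1]);

      if ((in = open(argv[1], O_RDONLY)) < 0) {
          perror(argv[1]);
          return 1;
      }
      if ((out = open(dest, O_WRONLY | O_CREAT | O_EXCL, 0640)) < 0) {
          perror(dest);
          return 1;
      }
      while ((n = read(in, buf, sizeof buf)) > 0)
          if (write(out, buf, (size_t)n) != n) {
              perror("submit: write");
              return 1;
          }
      close(in);
      close(out);
      return 0;
  }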


"UNIX setuid stupidity" (RISKS-5.57)

Stephen Russell <munnari!basser.cs.su.oz.au!steve@uunet.UU.NET>
19 Nov 87 10:54:10 GMT
In RISKS 5.57, David Oster makes the mistake of assuming that, because I
reported the error concerning inappropriate use of a setuid root program, I
was responsible for it.  In fact, the error was made by a member of the teaching
staff (since left here) several years ago. While the member of staff
probably should have known better, his attempts at making the program secure
were naive. What is more worrying is that the program must have been
installed by a system administrator, who certainly should have known better.
This raises the interesting question of who is responsible for this security
bug - the person who wrote the buggy program, or the programmer who
installed it without vetting it?

(As an aside, Mr Oster's defamatory statements, accusing me of stupidity,
illustrate how the immediacy of electronic mail tempts us to make comments
that we would think twice about if we were actually face to face.  Another
computer risk?)

   [I have omitted some of David's message, and ten or so additional
   messages on setuid/getuid.  Most of them focused rather too narrowly
   on specific partial implementations, but missed the bigger picture.
   By the way, I do not normally run messages with personal implications, 
   but this one seemed appropriate to clear the air.  I imagine in this
   case one party may have been less than objective and the other may
   have been overly defensive.  At any rate, the debate is deemed ended.  PGN]


Software Safety Specification

<mlbrown@nswc-wo.ARPA>
Mon, 16 Nov 87 16:19:26 est
I am in the process of writing a draft MIL-SPEC for Software Systems Safety
(MIL-SPEC-SWS).  It is based on MIL-STD-SNS (draft) and will address the
300 series tasks from MIL-STD-882B and parallel the DoD-STD-2167 software
development process.  I am looking for suggestions for materials or subjects
that should be included in the SPEC: analysis techniques, specific topical
areas that should be covered, and so on.  I would welcome any material that the
RISKS readers would care to contribute.  MIL-SPEC-SWS will be more of a 
how-to and guidelines document than a policy and what-to document.  You
can e-mail me directly at mlbrown@nswc-wo.arpa or write by conventional mail to
        Commander, Naval Surface Warfare Center
        Code: H12 (M. Brown)
        Dahlgren, VA  22448-5000
                                    Thanks,     Mike Brown


Call for Papers, COMPASS '88

Frank Houston <houston@nrl-csr.arpa>
Thu, 19 Nov 87 09:22:48 est
                               CALL FOR PAPERS

                                 COMPASS '88
                 (Third Annual Computer Assurance Conference)

 *   A man with cancer is killed because a computer tells a radiation
     therapy machine to administer a lethal dose.

 *   A rocket on the way to Mars has to be destroyed because a crucial
     line is left out of the computer software controlling it.

 *   A bank is forced to borrow $23.6 billion overnight because of a
     computer error, and the government-securities market narrowly escapes
     disaster.

 *   Workers are killed by computer controlled industrial robots.

All of these disasters have happened, and their numbers are increasing year
by year.  COMPASS is an organization dedicated to finding ways of
combatting this problem, and increasing Computer Assurance.  The name
"COMPASS" combines abbreviations of "COMPuter" and "ASSurance".

What do we mean by computer assurance?

One might define the term analytically by including process security,
systems safety, software safety, reliability, quality control, testing,
verification and validation, mathematics, physics, and various engineering
disciplines.  It is not, however, a simple combination of these.  On the
one hand, a system may be totally unreliable yet perfectly safe; on the
other hand, a safe system may sometimes not be an appropriate goal.
Furthermore, there are deep, unresolved philosophical questions about both
immediate design goals of autonomous systems and more universal meta-goals
apropos of dealing with the unexpected.  What should these goals be and how
do the design goals and the meta-goals interact?

Help us explore computer assurance, define its boundaries, identify its
issues, and realize its objectives.  Submit an article or abstract for the
1988 conference.

Abstracts of any length will be considered; complete papers are preferred.
All submissions should be typed, double-spaced, and single-sided (draft
form).  Upon acceptance, IEEE kits for preparing camera-ready copy will be
sent.

   ====================================================================
                                     *                                     
   Dates:  27 June — 1 July '88     *  Mail manuscripts, abstracts and    
                                     *  requests for information to:      
    Location:  Washington, D.C.      *
         ---------------             *
General Chair: CDR Micheal Gehl, ONR *           COMPASS '88
                                     *          P.O. Box 5314
  Program Chair: Janet Dunham, RTI   *       Rockville, MD 20851
         ---------------             *
   Submissions due:  30 Jan 1988     *  ( ) Submission  ( ) Information
                                     *
   ====================================================================

Submit abstracts electronically to Janet Dunham (jrd@rti.rti.org).
For more information contact Frank Houston (houston@nrl-csr.arpa).
Be sure to include your mailing address.


"Normal Accidents" revisited

David Chase <acornrc!rbbb@ames.arpa>
Thu, 19 Nov 87 8:37 PST
I just finished "Normal Accidents" by Charles Perrow (Basic Books, 198?).  I
recommend it for your class and I recommend it for RISKS aficionados.
Perrow's thesis is that "complex, tightly-coupled systems" will have
accidents; the accidents are "normal".  He also writes about risk-inducing
organizations and the way that they "analyze" accidents ("blame the operator"
is a common theme--certainly it has low short-term cost, and avoids the
inference that (say) nuclear power is intrinsically disaster-prone).

He says little about computers in this book.  However, he provides enough
examples of unexpected interactions and multiple errors to make most people
hesitant about claiming to "cover all the bases" (e.g., I did not know that
dams could cause the earth beneath them to shift, though it makes perfect
sense in hindsight).

David Chase, Olivetti Research Center, Menlo Park

    [This book is one of the cornerstones of RISKS, and is worth noting
    here again every now and then, particularly for new readers.  PGN]


Space Shuttle Whistle-Blowers Sound Alarm Again (reprint)

<rdicamil@CC5.BBN.COM>
Thu, 19 Nov 87 18:16:45 -0500
         Space Shuttle Whistle-Blowers Sound Alarm Again
             (Electronic Engineering Times, 11/16/87)

                       by Richard Doherty

  HOUSTON - The first step in a concerted action by so-called technology
  whistle-blowers to increase public awareness of continuing problems with the
  NASA shuttle will be taken here this week.

  On Wednesday, former Lockheed company engineer John Maxson is due to address
  an ethics meeting of the American Society of Mechanical Engineers about the
  wider aspects of whistle-blowing.  During his presentation, Maxson is
  expected to give evidence collected over the past few months that he will
  claim supports fears that critical shuttle sub-system problems still remain.

  Maxson will share the stage with former Morton Thiokol engineer Roger
  Boisjoly, who currently has a billion-dollar suit underway against his
one-time employer and NASA.  Boisjoly has charged that Morton Thiokol
  conspired to cover up problems with the shuttle's solid rocket motors, the
  failure of which was blamed for the Challenger tragedy last year.  He also
  claims he suffered personal injury as a result of the disaster.

  Boisjoly was one of several Morton Thiokol engineers who objected to the
  launch of the Challenger before its ill-fated liftoff.

  Maxson was dismissed by Lockheed's Space Operations Co. some months after
  the shuttle was destroyed. Six weeks before the shuttle explosion, in
  December 1985, Maxson had tried to convince Senator Charles Grassley
  (R-Iowa), a supporter of such whistle-blowing, that problems with the
  shuttle launch system should preclude the launch of the shuttle Columbia.

  Columbia, which lifted off successfully after a near-disastrous mistake in
  fueling liquid oxygen tanks, was the last successful shuttle launch.

  Maxson has sued Lockheed for wrongful dismissal, and claims he is one of
  hundreds of NASA and contractor employees who were forced out "for doing our
  jobs".

  In coming months, as more evidence surfaces, Maxson aims to rally support
  among other unemployed engineers.  He hopes to help restore their jobs and
  also raise industry awareness of shuttle subsystem problems and managerial
laissez-faire attitudes that he claims threaten a scheduled safe return to
space on June 2.

My addendum comment:  As a regular watcher of CNN "Headline News," I've noticed
lately some press about another "new shuttle escape system".  This one is
some kind of rocket designed to pull a (parachute-equipped) astronaut out
of the ship by a harness.  I did notice that CNN mentioned in their
commentary that this kind of escape system would not have saved the
astronauts in the Challenger disaster.  However, I have failed to hear such
commentary included in similar stories on the major networks.

This, too, is one recent story in a seemingly continual series of press
releases about new and improved shuttle escape mechanisms.  Lots of money
is being spent, but, reported or not, upon close examination none of these
mechanisms would have prevented the death of the astronauts in a
Challenger-type disaster.

I wonder just how much additional engineering is happening for purely public
relations purposes (and at what risk, if any)?  Perhaps this is just
another clever manifestation of the "laissez-faire" attitude Mr. Maxson is
trying to expose.  It is unfortunate that such PR, spaced even over many
months, could lull the public into a false sense of security.  Also, such
news, less any critical commentary, does sell quite well (news reporting is
also a commodity).
