The RISKS Digest
Volume 8 Issue 66

Thursday, 4th May 1989

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator



o Standards == nothing
Rich Neitzel
o Traffic Alert Collision Avoidance System with "no bugs"
Henry Schaffer
o Nuclear reactor knocked offline by 2-way radio in control room
Wm. Randolph Franklin
o B-2 builders: Prototype not needed (Long Article)
Mark Thompson via Stephen W. Thompson
o American Express is watching...
Sundar Iyengar
o Telephone line security
David C. Kovar
o COMPASS '89 Program
John Cherniavsky
o Info on RISKS (comp.risks)

Standards == nothing

Rich Neitzel <thor@stout.UCAR.EDU>
Mon, 1 May 89 13:44:52 MDT
Before deciding that I enjoyed software development more than engineering, I
was involved in nondestructive testing (NDT). One of the major functions of NDT
is to determine the adherence of the item under test to standards.  Generally
these standards are related to the safety of the larger system in which the
tested item is a component. My experience directly contradicts the view expounded
above.  Anyone who orders and uses a product simply based on its "meeting"
standards is being extremely foolish.

For example, an NDT lab that I worked for was owned by a steel warehouse firm.
The parent had a contract to supply structural steel for a major office
complex. Twice the firm purchased steel only to discover that the mill
certificates were faked. In the first case, this was discovered only when
welders reported problems during fabrication. This kind of problem is more
widespread than most people would like to admit. Consider the recent spate of
reports on substandard aerospace fittings being sold with false documentation.

Further, I was several times asked to falsify NDT results to certify that items
met standards. In one case, I failed a number of welders taking a certification
examination (on the same building project noted above). The fabricator simply
took the SAME weld coupons to another lab and EVERY ONE of the welders passed.
In another case, at a nuclear weapons plant where I worked, a team of EEs was
prevented from inserting additional circuitry that would have falsified test
results into a test system only when one of them threatened to go public with
the information.

While the above examples are concerned with outright fraud, many things
involved in applying standards are open to interpretation.  Consider the RS-232
standard, notorious for loose interpretation. How does an inspector of power
plant welds determine whether an ultrasonic echo means the weld is substandard,
when it is in the "gray" zone? Many standard-compliant items are not in
compliance.

The point of all this is that standards guarantee little or nothing.  Questions
of liability are meaningless. If the profit to be made is high enough and the
risk of detection small enough, many firms will falsify certification. Worse,
the falsification may be impossible to trace. A part fails and loss occurrs,
but often the damage is such that no reconstruction of the exact cause can be
made. Since the part was certified, the search is likely to turn elsewhere
(assembly, operation, etc.). I am tired of "real engineers", who are no more
exact, informed, or methodical than programmers, pretending that engineering is
somehow less prone to exactly the same problems in project management and
control as programming. It would be a trivial exercise to compile a list of
engineering failures, just as it would be for programming failures. The real
issue is how to design and manufacture anything correctly.

Traffic Alert Collision Avoidance System with "no bugs"

Henry Schaffer <>
Mon, 1 May 89 21:43:31 EDT
  From: Assoc. Press article in the May 1, 1989 Raleigh(NC) Times:

      The TCAS II system consists of a sophisticated transponder ...
   antennae, and a computer that analyzes and displays the 
   movement of nearby planes.  ...

     "The system has no bugs," said Don Dodgen of Honeywell.

     If two computers meet, he said, orders to the pilots will be
   reconciled automatically:  if one plane is told to climb, the 
   other will be advised to descend or to stay on course.

No comment can do justice to this.

--henry schaffer  n c state univ
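
  [The advisory "reconciliation" quoted above can be sketched in miniature.
  What follows is a hypothetical simplification for illustration only: actual
  TCAS II coordination is negotiated over the Mode S data link with tie-breaking
  by transponder address, and none of the names below come from the article.]

```python
# Toy sketch of complementary-advisory selection, loosely modeled on the
# behavior quoted above ("if one plane is told to climb, the other will be
# advised to descend or to stay on course").  Purely illustrative; real
# TCAS II resolution-advisory coordination is far more involved.

COMPLEMENT = {"CLIMB": "DESCEND", "DESCEND": "CLIMB"}

def reconcile(own_advisory: str, other_advisory: str) -> str:
    """Return an advisory for the other aircraft that does not conflict
    with our own vertical maneuver."""
    if other_advisory == own_advisory and own_advisory in COMPLEMENT:
        # Both aircraft ordered the same maneuver: reverse the other one.
        return COMPLEMENT[own_advisory]
    return other_advisory  # already compatible, e.g. "MAINTAIN"

print(reconcile("CLIMB", "CLIMB"))     # both told to climb -> "DESCEND"
print(reconcile("CLIMB", "MAINTAIN"))  # no conflict -> "MAINTAIN"
```

  [A lookup table of complementary maneuvers keeps the conflict rule explicit,
  but even this toy version shows how much of the interesting behavior lies in
  the cases a simple table does not cover.]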

Nuclear reactor knocked offline by 2-way radio in control room

Wm. Randolph Franklin RPI <>
Tue, 02 May 89 20:52:32 EDT
(condensed from Albany NY Times Union Wed April 26, 1989, page B-17)

The up-again down-again Nine Mile Point 2 nuclear power plant near Oswego was
back on line Tuesday, following a weekend shutdown that "shouldn't have
happened," according to a federal official.

An employee accidentally keyed a hand-held two-way radio near circuitry for the
turbine generator monitoring system Saturday night.  The transmission shut down
the system, which in turn triggered an automatic shutdown of the entire plant.

A section chief of the NRC region 1 office said that he has never heard of a
similar accident but that most plants are sensitive and there are strict rules
to prevent this.

Replacement fuel costs $350K per day when the 1080 MW plant is down.

The plant had been up less than a week after a shutdown caused by corrosion and
loose wiring in a meter.

B-2 builders: Prototype not needed (Long Article)

"Stephen W. Thompson" <>
Mon, 01 May 89 15:19:59 -0400
"Reprinted with permission from The Philadelphia Inquirer, April 24, 1989.
Further reproduction of this article without the written permission of The
Philadelphia Inquirer is strictly prohibited."  

By Mark Thompson, Inquirer Washington Bureau

WASHINGTON - The builders of the Pentagon's B-2 Stealth bomber are boasting
that their computer-aided design for the revolutionary boomerang-shaped
aircraft is so good that the $500 million plane will leap from the computer
screen into the air by July without benefit of a prototype model to test the design.

"The first B-2 is a production aircraft," the Northrop Corp. said
in its just-released annual report.  "There are none of the prototypes
that have been required in previous generations of aircraft."

But critics warn that the Air Force decision to begin building the
$68 billion fleet of 132 sinister-looking planes before flight
testing has even started could prove disastrous.

"I think the B-2 will crash the first time it flies," said Kosta
Tsipis, director of the Program in Science and Technology for
International Security at the Massachusetts Institute of Technology.
"I wouldn't be a passenger aboard it for anything in the world."

The lack of a prototype will make the planes' first flight "pretty
exciting," agreed John Pike, associate director of the Federation
of American Scientists in Washington.

"I'm perfectly prepared to see the airplane fly more or less as
advertised," he said.  "At the same time, I'm equally prepared to see
the airplane crash more or less immediately."

But Capt. Jay DeFrank, an Air Force spokesman, said, "We're confident
that it will make a successful first flight."  The plane's two seats
will be occupied by pilots from Northrop and the Air Force for the
inaugural flight, which may occur secretly, he said.

The top-secret B-2, successor to the troubled B-1B, has been designed to
fly into the Soviet Union undetected by radar.  The not-ready-to-fly
B-2, unveiled in November, is scheduled to be operational within the
next several years, but Defense Secretary Dick Cheney said yesterday
on NBC-TV's _Meet the Press_ that full production would not start in
the 1990 fiscal year as planned.

Asked whether he would consider killing the program, Cheney replied,
"We're going to postpone actually going into full procurement because
I'm not comfortable with the program yet, there are a lot of technical
problems with it, and it is extremely expensive.  And until I have time to
review it, which I've not yet had, I'm not prepared to make that decision."

The B-2's flying-wing design is an updating of Northrop's YB-49 aircraft,
a 1940s-era prototype bomber that the Air Force killed before production
began.  The B-2's shape is naturally unstable, and the lack of a tail
means it will be much harder to control than a conventional airplane.

"It is essentially a boomerang," said James W. Kelley, a former Northrop
aerodynamicist.  "Once it goes into a spin, it cannot recover."

B-2 skeptics question both the plane's radical flying-wing design, first
revealed a year ago, and the Air Force's decision to save money by
going straight from the drawing board to the production aircraft.

Historically, new aircraft designs are tested with a series of
custom-built planes, each flown and modified until all major problems
have been eliminated.  Only then does production begin.

But in the case of the B-2, about a dozen planes are under construction,
although not a single one has flown, several sources said.

In recent years, experts have urged the military to build prototypes
to let them "fly-before-buy," confirming the designs before committing
billions of dollars to production.  Prototyping should be done "to uncover
operational as well as technical deficiencies before a decision is made
to proceed with full-scale development," the presidentially appointed
Packard commission said in its 1986 study critical of Pentagon purchasing.

But while the Air Force is requiring prototypes for its fledgling and
highly secret Advanced Tactical Fighter, it does not believe the B-2
needs them.

"It was determined because of its revolutionary technology and the
highly sensitive nature of the program that prototyping was not the best
way to go," DeFrank said.  The secret nature of the program prevented
further elaboration, he said.

Others contend the plane's radical flying-wing design and high price
tag demand prototyping.

"A $70 billion program with no prototypes?" asked an incredulous Thomas
S. Amlie, an Air Force engineer at the Pentagon, who said computers
and models could not replicate the rigors of flight.  "Of course we
should prototype.  We ought to fly one, and wring the hell out of it,
with zero-zero ejection seats so the pilots can eject at zero
altitude and zero air speed and live through it."

Amlie dismissed Air Force arguments that there were classified reasons
why prototyping the B-2 makes no sense.

"They always say there are classified things that we can't know about
because we don't have the clearance," Amlie said.  "Well, I've been in
the business for 37 years, and every time someone has told me that it
turns out they were lying."

But Northrop says its battery of high-powered computers, whose data base
contains drawings of all of the B-2's parts down to the smallest rivet,
has "systematically eliminated" most of the risk inherent in a new
aircraft design.

With the computers, design changes can be made before production begins.
Such changes are particularly painstaking aboard the B-2, where the
plane's radar-evading design requires a frozen exterior shape into which
all of the plane's systems and weapons must be crammed.

"Given all the aerodynamic and performance compromises they've had to
make to reduce the radar cross-section of the B-2, you're just
flying much closer to the margin," said Pike of the Federation of American
Scientists.  "That's precisely why you need to do prototyping."

"It's very strange that they're not being required to prototype," added
Joseph V. Foa, an aeronautical engineer at George Washington University
who first studied flying wings 40 years ago.  "When you have an aircraft
that's going to cost a half-billion dollars apiece, it's a good idea
to prototype."

Pike said recurring delays — the plane's first flight originally was set
for 1987 — showed that Northrop's computers had not eliminated the B-2's
problems.  "That tells me this thing is no different from anything else,"
he said.  "Just because it looks right on the computer screen doesn't
mean that it's going to work in the real world."

Without prototyping, the Air Force — if it discovers problems — will
argue that the $20 billion investment it already has made in the
program requires repairs instead of cancellation, Pike said.

"They're basically front-loading the program so that regardless of what
the test results are, they'll already have spent so much money on it
that it will be difficult to cancel," Pike said.  "You're paying to have
the work done twice — first time to do it wrong, and then the second
time to do it right."

  Stephen W. Thompson, (215) 898-4585   [no relation to Mark], Institute for 
  Research on Higher Education, U. Pennsylvania, Philadelphia, PA  19104

American Express is watching...

Thu, 4 May 89 15:59:53 PDT
Here is another addition to the list of risks of the information age.  There is
an article in the Thursday morning edition [May 4] of the San Jose Mercury News
titled "Member learns the hard way: American Express is watching".  It described
how American Express called a member to voice its concern that he might not be
able to pay his recent bill.  American Express was able to access his checking
account and find that he had less than what was owed.  His card was temporarily
"deactivated" after the member refused to give any financial information except
that he would pay the bill in cash when it came in.

Apparently, the card application, in fine print, declares that "[American
Express reserves] the right to access accounts to ascertain whether you are
able to pay the balance".  After some argument with the company, the member
commented: "I learned a lesson: My life is not as private as I thought".

First, this is news to me.  I hold an AmExp card, and I wasn't even aware that
my accounts are constantly being checked.  Second, how could the banks dish out
information on the account holders to third parties without proper
authorization?

Sundar Iyengar, Microprocessor Design, Intel, Santa Clara, CA 95051

Telephone line security

-David C. Kovar <corwin@daedalus.UUCP>
Mon, 17 Apr 89 15:31:57 -0400
  I was tracing the phone wires in my house yesterday afternoon trying to
find out why my phone was "off-hook" when all of the phones were actually
hung up. Just before the lines enter my house I found a gray box labelled
"Telephone Network Interface". Curious, I opened the box to find two RJ-11
modular phone jacks with black connectors in them that were held in by
clips. I popped the clip, unplugged the plugs and plugged in a normal phone.
Lo and behold, a dial tone! I wandered around the neighborhood a bit and
found a few more of these boxes. Looks like you can wander around Boston
with a phone, plug into someone's circuit, and make as many phone calls
as you like. Who needs lineman's equipment?

-David C. Kovar, Office of Information Technology, Harvard University

COMPASS '89 Program

John Cherniavsky <>
Thu, 4 May 89 15:21:06 EDT
     *            COMPASS '89                *
     *     JUNE 20th - June 22nd, 1989       *
     *                                       *
     *    NATIONAL INSTITUTE OF STANDARDS    *
     *    AND TECHNOLOGY (formerly NBS)      *
     *         Gaithersburg, MD              *
     *                                       *
     *             PROGRAM                   *

* MONDAY,  19 JUNE 1989 *

Meeting of the Tri-services Software Safety Working Group

* TUESDAY, 20 JUNE 1989 *

0900     CALL TO ORDER, General Chair---Dario DeAngelis, Logicon
         Honorary Chair---The Honorable Tim Valentine,
         Chairman of the House Subcommittee on Transportation
         Program Chair---John C. Cherniavsky, Georgetown University
         Chair, COMPASS Board---H.O. Lubbes, Space and Naval Warfare
         Systems Command

         "Computer Assurance: Safety, Security, Economics"
         Allen Hankinson, National Institute of Standards and Technology
         PANEL:    Peter Neumann, SRI International
                   Nancy Leveson, UC Irvine and MIT
           Allen Hankinson, NIST
                   Michael Brown, Naval Surface Warfare Center
1400     Special Presentation - Computer Related Risk of the Year
         "Misplaced Trust in Computer Systems"
         Peter Neumann, SRI International
1430     Minitutorial
         "Formal Analysis of Safety"
         Nancy Leveson, UC Irvine and MIT 
1600     Software System Safety in the Military
         Chair---Michael Brown, Naval Surface Warfare Center
         * Software Safety Handbook
            Archibald McKinlay VI, McDonnell Aircraft Corporation
         * Role of the System Safety Manager in Software Safety, 
        Bruce Hill, Consultant
1730     ADJOURN
1900     BANQUET 
         * "It is June 1989. Do you know what your computers are doing?"
                   Peter Neumann, SRI International

* WEDNESDAY, 21 JUNE 1989 *

         Chair---Nancy Leveson, MIT and UC Irvine
         * Software Safety Goal Verification Using Fault Tree Techniques:
           A Critically Ill Patient Monitor Example
            Brian Connolly, Hewlett Packard
         * Using Petri Net Theory to Analyze Software Safety Case Studies
        Wade Smith and Paul Jorgensen, Consultants
         * VMM Concepts Revisited
                Marvin Schaeffer, Trusted Information Systems 
         Chair --- Dolores Wallace, NIST
         * RM 2000 Approach to Software
            Major Sue Hermanson, USAF
         * Condition Testing for Software Quality Assurance
            K.C.Tai, North Carolina State University
         * Helping the Army Succeed Through Software V&V
                Richard O'Reagan and Michael Edwards, Teledyne Brown Engineering
         * Experimental Evaluation of Six Test Techniques
                Linda Lauterbach and B. Randall, Research Triangle Institute
         Chair---Richard Hamlet, Portland State University
         * Access Control and Verification in Petri-Net Based
            P. David Stotts and Richard Furuta, University of Maryland
         * Unit Testing for Software Assurance
            Richard Hamlet, Portland State University
         * Validation Through Exclusion: Techniques for Ensuring
           Software Safety
                John C. Cherniavsky, Georgetown University
         * A Simple Way of Improving the Quality of Login Security
                Khosrow Dehnad, AT&T Bell Laboratories
         Chair---Janet Dunham, Research Triangle Institute
     *Risk Analysis: Case Studies of Two Approaches with an
     Expert System Based Tool
        Jane Radatz, Logicon
1800     ADJOURN

* THURSDAY, 22 JUNE 1989 *

         Chair----Martha Branstad, Trusted Information Systems
         * Techniques for Data and Rule Validation in Knowledge Based Systems
            Jong P. Yoon, University of Florida
         * How to Qualify Knowledge Based Systems
        Claude Vogel, Cisi Ingenierie
         * Description of a Formal Verification and Validation
            Kenneth Lindsay, Magnavox Electronic Systems 
         * Taxonomy of the Cause of Proof Failure in Applications
           Using the HDM Methodology
                Kenneth Lindsay, Magnavox Electronic Systems
         Chair----Thomas F. Buckley, University of Leeds
         * Programming a Viper
            Thomas F. Buckley, University of Leeds 
     * Formal Verification of Microprocessor Systems
            Mandayam Srivas, Odyssey Research Associates
         * Prospects for Verifying the PSN Code
            Stephen Crocker, Trusted Information Systems
         * Requirements for Process Control Protection
            John McDermott, Naval Research Laboratory
         Chair---H.O. Lubbes, Naval Research Laboratory
         * Assurance for the Trusted Mach Operating System
            Martha Branstad, Trusted Information Systems
         * Verifying Asymptotic Correctness
            Mark Howard and Ian Sutherland, Odyssey Research Associates
         * Security Analysis of a Token Ring Using Ulysses
                Daryl McCullough, Odyssey Research Associates
         * Penelope: An Ada Software Assurance Editor
            Carla Marceau, Odyssey Research Associates
         Chair---John C. Cherniavsky, Georgetown University
         Panel    Thomas Buckley, University of Leeds
                  Stephen Crocker, Trusted Information Systems
                  Daryl McCullough, Odyssey Research Associates
                  Mandayam Srivas, Odyssey Research Associates        
1800     ADJOURN

* FRIDAY, 23 June 1989 TUTORIALS*

0900     TUTORIAL
         * A Guide to VIPER, A Verifiable Integrated Processor for Enhanced
           Reliability - or - Why, How, and Wherefore of Using a Formally
           Proved Microprocessor for High Integrity Control Systems
                Thomas F. Buckley, University of Leeds
                Jon Wise, Charter Technologies 
0900     TUTORIAL
         * Formal Specification and Verification of Ada Programs
            David Guaspari, Odyssey Research Associates
                Carla Marceau, Odyssey Research Associates
1200     ADJOURN

FTP KL.SRI.COM, get stripe:<risks>COMPASS.INFO.  [I edited out the coffee
and lunch breaks for brevity and nonredundancy.  PGN]
