The RISKS Digest
Volume 9 Issue 47

Friday, 24th November 1989

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Air Force Radar Risk (update)
Henry Cox
Congressional report: "Bugs in the Program"
Gary Chapman
Dave Davis
Re: Specifying vs. defining
Dave Platt
Training programmers
Lee S. Ridgway
Re: Privacy and risks in credit information
John DeBert
Re: Automated Bank RISKS
Marc Shannon
Jon Mauney
Re: Autodialing horror stories
Robert Sansom
Info on RISKS (comp.risks)

Air Force Radar Risk (update) [See Jon Jacky, RISKS-8.28]

Henry Cox <cox@pike.ee.mcgill.ca>
Thu, 23 Nov 89 10:20:29 EST
RADAR AT U.S. BASE CAN TRIGGER PLANES' EJECTION SEATS:  LETTER
[ From the Montreal Gazette, 23 November 1989 ]

Knight-Ridder Newspapers

Robins Air Force Base, Ga. - The US air force has learned that radiation
from its PAVE PAWS radar at Robins AFB could activate internal equipment
- including ejection seats and fire extinguishers - on virtually all
planes that land at the base.

The disclosure was made in an Air Force "update" letter to Senator Sam
Nunn (D-Ga.) made public this week by the senator's Washington office.

Although the air force originally said that PAVE PAWS would not endanger
electro-explosive devices other than those on the outside of its planes, a
recent review of the radar has concluded otherwise, the air force letter said.

"As a result, Air Force Space Command is co-ordinating with the Air Logistics
Center at Robins AFB to implement procedures to ensure aircraft with internal
EEDs are also protected," wrote Maj.-Gen. Burton R. Moore, the air force's
director of legislative liaison.

But Nunn, in a written reply to Moore dated Nov. 20, says that the air force
hasn't fully answered his questions of last January, and has "raised new
questions" with its latest update.

"It would be helpful to know more about the hazard to such devices, what the
devices are used for, and what aircraft are equipped with them.  I would also
like to know how the air force determined that these devices were at risk,"
said the Senate Armed Services Committee chairman in his two-page letter.

The radiation hazard to internal EEDs is the latest safety revelation
concerning the southeastern PAVE PAWS - built too close to the runway at Robins
AFB.  The radar, one of four nationwide, is designed to warn of sea-launched
missile attacks and track satellites in space.  But since November of 1987, the
air force has been turning off the north face of the Robins PAVE PAWS to protect
vulnerable planes landing on its runway 3 kilometres north of the radar.

According to air force documents obtained by Knight-Ridder Newspapers
recently under the Freedom of Information Act, one aircraft at risk from
PAVE PAWS is the Strategic Air Command's KC-135R tanker, some of which
are based at the 19th Air Refuelling Wing at Robins.

EED equipment on other aircraft includes "flare/chaff dispensers,
pylon/ejector racks, tactical missiles, cruise missiles, crew escape, and
engine start cartridges," according to air force documents.

  [This problem was noted in RISKS-8.28, 19 FEB 1989.  Details are new.  PGN]


Congressional report: "Bugs in the Program" [Benson, RISKS-9.45]

Gary Chapman <chapman@csli.Stanford.EDU>
Wed, 22 Nov 89 12:06:19 PST
I have a copy of the report David Benson's message referred to--the
congressional committee report called "Bugs in the Program"--and I have some
comments on its recommendations about software development.

There is an emphasis in the report--an overemphasis in my view--on the Federal
procurement process for software.  This is probably understandable given the
source of the report; Congress is concerned about spending money, not about
methodological issues in software engineering.  However, the attention being
given to this report, and its emphasis on the procurement process, could lead
to some misconceptions in the Congress about the nature of the problems we
face.

The fundamental problem with emerging computer risks is that computer
scientists are trying to integrate, in a systematic and predictable fashion,
three domains:  discrete state machines, the highly contingent and
variable-rich nature of the "real world," and the equally unpredictable
character of human subjectivity and intent.  This is a project similar to that
of Leibniz, for example, when he said that what philosophy should aim for is a
mathematical calculus that would provide an equation for any question--in other
words, absolute certainty for any aspect of experience, expressed in logical
and mathematical evidence.  Leibniz didn't succeed, and for the most part we
don't take this kind of enterprise seriously anymore, except in developing
complex computer systems.

Since the whole nature of the "software crisis" is really just a dilemma
produced by the inability of people to adequately predict the future down to
the level of detail required by computers and their ancillary equipment, it is
something of a wonder that so many people are puzzled by the fact that we find
ourselves in this conundrum, or that we're wasting so much money in this effort.
Not to be too melodramatic about it, but the core of the Greek tragic tradition
is a mortal protagonist trying to manipulate the world according to his
designs, and then discovering, usually by running up against the wrath of the
gods, the limits of human power.  These days the gods deliver their lessons by
burning up mountains of money.

There seems to be a cognitive block on the part of many people in the
policymaking apparatus in understanding the real nature of the problem.  The
congressional report, for example, once again hauls out the various government
reports on the Strategic Defense Initiative and its software problems, as
though the idea were legitimate to begin with and the whole thing were
something more than just an embarrassing boy's fantasy of omnipotence in space,
etc.  The Congress continues to treat the SDI, and its software difficulties,
as a real technical challenge, one that is only receiving less attention than
previously because of budget problems and the changing nature of the Soviet
threat.  The report quotes the Defense Science Board when it said that "The
Strategic Defense Initiative. . . has a monumental software problem that must
be solved to attain the goals of the initiative."  This sentence makes one
slack-jawed with amazement.  Not only is it eye-rollingly obvious, but it
harrumphs that this "software problem" is something to be solved, which by now
should be considered completely absurd.  As Jack Ruina of MIT once said, the
SDI should have all the scientific credibility of astrology or water-dowsing,
so the sentence of the DSB quoted in the congressional report should have all
the solemnity and relevance of a pronouncement on one's moon in Pisces.  The
fact that the SDI "software problem" is still discussed as a vexing issue of
public policy and an example of the growing software crisis shows how far off
the Congress is from understanding the true nature of the problem.

Another showcase example in the report is the B-1 bomber, and the citations
about the B-1 are once again from government agencies, the well-known
muckraking journal Aviation Week & Space Technology, and Jacques Gansler of
MIT, who is not only famous for his sympathy to the Pentagon, but who also runs
a private defense consulting firm.  There are no references to Nick Kotz's book
*Wild Blue Yonder:  Money, Politics, and the B-1 Bomber* (1988, Princeton
University Press), widely considered the definitive book on this fiasco.  Kotz
says, "After a continuous thirty-year defense buildup, marked by [such]
repeated excesses, the military program is totally out of control."  This
context in which large software projects are developed is completely absent
from the congressional report.  The story of the B-1 is not a story of
technical or procurement failures relating to its avionics software, although
this is certainly part of the overall narrative.  It is a sad and enraging
story of mismanagement, overselling, congressional cowardice and laziness, and
propaganda.  "Improved" software procurement procedures would be nearly
negligible in any real remedy to these problems.

The report by James H. Paul and Gregory C. Simon does provide a real service in
that it alerts the somnambulist Congress to some of the problems with software
safety, reliability, and professional education.  But if the result of the
report is to try to crack the whip on the software engineering profession in
order to produce more efficient development of code for the SDI, or for the B-1
bomber, or any number of projects that are doomed for entirely exogenous
reasons, the report will be regrettable.  The report recommends professional
certification, for example, which is an issue that I have not developed a
position on, but it seems to me completely premature to be talking about
professional certification for software professionals when the Pentagon itself
is a runaway train, largely due to Congress' gullibility over high-tech
funding requests.  It seems comic and slapstick that the Congress has overseen a
project like the B-2 bomber, for example, on which we have spent $22 billion,
and then is alerted by the Pentagon that this aircraft will cost somewhere
around half a billion dollars per copy, which puts Congress into a swirling
faint all of a sudden--and then when Congress recovers it shakes a wagging
finger at software engineers?  Or after spending $16 billion on SDI research,
Congress lethargically asks, how much will the software cost for this thing,
and the answer is eighty-eight kigillion dollars and the lifetime careers of
every single programmer in the Free World (which appears to be expanding).  And
Congress says, years into this thing, What??!!  Why weren't we told?  Let's see
a report on professional certification and the ethics of this profession!  Give
us a break!  The SDI was wasting money when it was just an idea in a paid
official's head, if he was thinking about it during working hours.

So it seems disingenuous to me, even if the intent was noble, to produce a
congressional report that quotes approvingly the following passage:  "Education
is the seedbed of good practice.  Whether computer science education in
American universities is contributing to or inhibiting good software practice
is not clear. . . ."  There is the clear call for "professionalization" of the
software field, and for the weeding out of incompetents and unethical bad
apples.  This from an institution that is a nonstop geyser of money for
crackpot ideas and mismanagement, money that will of course be willingly
deposited by people who are "professionals" at taking government money.
Congress, which sits in a bar wearing a suit bulging with cash, willing to
listen to every customer with a harebrained scheme for a big expensive system
that will save the world from disaster, growls and spoils for a fight only when
it finds itself on the street wearing nothing but boxer shorts.  The
"cautiousness" of Congress in getting religion about software reliability and
cost overruns starts to look pretty silly, like one of the rich, pompous and
street-stupid louts that the Marx Brothers were always taking to the cleaners.

Gary Chapman
Executive Director, Computer Professionals for Social Responsibility


House report on software development and regulation [another view]

USENET NEWS <news@linus.mitre.org>
22 Nov 89 20:38:49 GMT
The House Committee on Science, Space, and Technology has just published a new
comprehensive report on software development, government procurement, and user
safety risks called "Bugs in the Program."  The study was prompted by the
Therac-25 X-ray therapy accidents, which have been discussed elsewhere [in RISKS].
The well-researched report describes some of the reasons for a software crisis,
citing inadequate technical capability to predict reliability and safety as
causes of accidents due to software.  In addition, it points out that all
government agencies have difficulty in managing complex software development
due to lack of technical and management training and experience.  Also,
inadequacies in computer science education and ethical training in our
profession are noted.

The rigid hardware approach to procurements and development is criticized as
being inappropriate to software, the most flexible part of systems.  (One
almost forgets this after living with these standards for a few years.)
Prototyping and user interaction during development are recognized as promoting
success.

The report recommends that Dr. Deming's statistical process control methods be
applied to software development and testing.  Further, it recommends that a
Working Group be established under the committee to begin improving the
software development process within the government, with assistance from NASA.
More research into future software problems and solutions should be funded, and
the government should develop the capability to evaluate proposed improvements
in the development process.
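
The report does not spell out what "statistical process control" would look
like when applied to code, so the following is only a minimal sketch of the
idea, assuming Deming-style control charts over defect counts found per code
inspection (a c-chart); the inspection data below are invented for
illustration:

    # Illustrative sketch: a c-chart over defect counts per code inspection.
    # Points beyond the control limits suggest a special cause (say, a rushed
    # or unusually complex module) rather than ordinary process variation.
    import math

    defect_counts = [4, 7, 3, 5, 6, 2, 21, 4, 5, 3]   # invented data

    c_bar = sum(defect_counts) / len(defect_counts)   # center line
    ucl = c_bar + 3 * math.sqrt(c_bar)                # upper control limit
    lcl = max(0.0, c_bar - 3 * math.sqrt(c_bar))      # lower limit, >= 0

    print(f"center = {c_bar:.1f}, limits = [{lcl:.1f}, {ucl:.1f}]")
    for i, count in enumerate(defect_counts, 1):
        if not lcl <= count <= ucl:
            print(f"inspection {i}: {count} defects -- out of control")

The value of such a chart is in separating systemic weaknesses of the
development process from one-off anomalies, so that each gets the right fix.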

Dave Davis, MITRE Corp., McLean, VA


Specifying vs. defining (Davis, RISKS-9.46 on Congress, RISKS-9.45)

<dplatt@coherent.com>
Wed, 22 Nov 89 15:44:13 PST
> These two sentences seem inconsistent to me.  What's the difference
> between "spell-it-out-up-front" detailed specifications, and a "big
> up-front investment in problem definition?"  There's some fuzzy thinking
> going on here.
>
> I agree that it is impossible to precisely specify a large software
> system when the hardware environment and functional requirements aren't
> frozen.  In those situations it may be necessary to prototype the system
> in order to understand the problem.

I believe you've hit the nail on the head in your second paragraph, and
in effect answered the question you raise in the first.

The "spell-it-out-up-front", detailed-specification approach attempts to spell
out THE SOLUTION in great detail.  For example, the contract might call for a
microprocessor-based implementation using a distributed network of FooBaz 413
MPUs, running a real-time operating system capable of responding to interrupts
in no more than 0.8263 nanofidgets, and capable of processing 1763
meta-shift-cokebottle commands per cup of coffee.

These sorts of solution-specifications are often very comforting to the
contract-writer... they provide a detailed, measurable, and testable
description of the solution which must be delivered.  When the project
ends, it's relatively easy to compare the solution-as-specified with the
solution-as-delivered, and see whether there are any shortfalls.

This is all fine... as long as the person writing the specification _really_
understands what the problem is, and is actually capable of specifying a
package which will solve the problem.  If so, everybody's happy.

However... as others have pointed out, the solution is often specified
well before the problem is really understood, or by people who aren't as
aware of the problem as they should be.  The net result is a situation
we've seen all too often... a solution is delivered, it meets all of
the project specifications, and yet proves to be inadequate when
actually tested in the field, or to be incompatible with other packages
it really _should_ work with, or...

Things can get much worse... if, for example, the specification is
changed part-way through the project.  Adding a new requirement or two
to the specification can throw the whole implementation effort entirely
out of joint... leading to schedule slippage, cost overruns, or a
patchwork solution that proves to be unmaintainable.  All too often, the
solution must be thrown away, and a new one developed at great
expense... often by the same process, alas.

To paraphrase a guy I work with: "Don't hire me as a designer, and then
tell me what to implement.  If you do, you've just done my job, and you
don't need me.  Instead, tell me about your problem."

Dave Platt, Coherent Thought Inc., 3350 West Bayshore #205 Palo Alto CA 94303


Training programmers

"Lee S. Ridgway" <RIDGWAY@mitvma.mit.edu>
Wed, 22 Nov 89 13:37:24 EST
Given recent discussion about programming and programmer standards, and
the kind and quality of training programmers do or should get, I noticed
(and pondered the implication of) an ad on the Boston subway for a trade
school (CPI) in Cambridge, Mass., that announced the following training
schedule for four computer-type jobs:
Computer Operator - 20 weeks
Computer programmer - 12 weeks
Computer technician - 7 months
Data entry operator - 20 weeks

Seems a tad short a time in which to learn programming with any depth
or proficiency, except maybe in Basic.  Would any of you, in a position
to do so, risk hiring an entry-level programmer from such a training
program?

The school claims a 97% (overall) placement rate for its grads.  Nothing
about accreditation, or enrollment requirements, etc.  Too bad all the
little tear-off info cards had been torn off.


Re: Privacy and risks in credit information (Gorman, RISKS-9.46)

John DeBert <onymouse@netcom.UUCP>
24 Nov 89 07:15:57 GMT
TRW lists all information received from their customers (or "clients," as
TRW calls them, I believe).

This info includes the name of the client, your account number with them, the
type of account, the balance, the balance past due, and who has requested your
credit history, along with personal information such as social security
numbers, your residence addresses for at least the past few years, and personal
information on whoever uses your credit accounts.

TRW also has a program called TRW Credentials Service which you may subscribe
to. For thirty-five dollars per year, you can get your credit record that is on
file with them. They also send you forms to fill out and return to them that -
if you fill them out completely - contain a complete and current financial
profile of you and your family. This information is kept online along with the
regular credit file and is made available "only upon application and with your
PIN code, which you provide the person or agency which would request it."

There are some really serious privacy problems with this service. Of course,
TRW is still a popular target for crackers and it is used by government,
businesses, et cetera, looking for information. I have asked TRW to tell me
who would have access to my records without my knowledge; thus far they have
not only refused to reply but have never answered any written request for
information that I have made. (TRW also says that they will notify me
anytime someone requests a copy of my credit report but have failed to do
so: I have received an updated report in response to sending them a correction
to it and it shows some inquiries that were never reported to me.)

jd


Re: Automated Bank RISKS (Osborn, RISKS-9.46)

Marc Shannon <SYNFUL@DRYCAS.CLUB.CC.CMU.EDU>
Thu, 23 Nov 89 14:55 EST
My bank implemented a Telephone Banking service a couple of years ago, but
their implementation of security is a bit better.  To get in, you enter the
last ten digits of your ATM card and then your four-digit PIN code.  Once in,
you can check your balance, inquire on a check's status (by check number) or
review recent check history and your last deposit, transfer funds between
accounts, and even pay bills.

The only problem I've ever had with it is that occasionally I'll enter *SOS#
to talk to a Customer Service Representative and get the voice message, with
annoying pauses: "Please . hold . on . and a . customer . service .
representative . will . assist . you!" (note that there is no pause between
"and a" :-)).  Then a click, and then a dial tone!  After hitting a key on the
phone to see what it would do, I get transferred back to the telephone banking
service.  Isn't that special! :-)

(Customers can go directly to the representatives by entering a different menu
code before going to the telephone banking service.  If they try to ask for
any information on an account, the representatives will have them go back to
the service and enter their 10- and 4-digit codes for security.)
                                                                      --Marc


Re: Automated Bank RISKS (Osborn, RISKS-9.46)

Jon Mauney <mauney@cscadm.ncsu.edu>
Wed, 22 Nov 89 14:36:57 EST
In comp.risks osborn@cs.utexas.edu (John Howard Osborn) writes:

>My bank, First Interstate, has recently implemented a handy new service.
>...  The problem is that the system also allows merchants
>to check a check.  That is, will a check, for a certain amount, from a certain
>account, clear at this time?  There is no security for this procedure.

Actually, this is another case of technology making things easier, but not
making a fundamental change.  I have an acquaintance who used to manage a small
shopping center.  He performed a simple search on a deadbeat tenant's account
by calling the bank
    "I have a check for $1000.  Is it good?"
then calling a different branch
    "I have a check for $2000.  Is it good?"
etc.  He was then able to tell the tenant exactly how much back rent would be
paid that month.  It is an amusing story when told from the point of view of my
acquaintance, not so amusing from a privacy viewpoint.
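
Automated, the trick is just binary search: each "is this check good?"
question is a yes/no oracle on the balance, so an account holding up to a
million dollars falls in about twenty calls.  A minimal sketch, with the
bank's answer simulated (no real bank interface is implied):

    # Sketch: recovering a balance through a yes/no "will it clear?" oracle.
    SECRET_BALANCE = 1376  # dollars; known only to the simulated "bank"

    def check_clears(amount: int) -> bool:
        """Simulated teller: 'I have a check for $amount. Is it good?'"""
        return amount <= SECRET_BALANCE

    def probe_balance(limit: int = 1_000_000) -> int:
        lo, hi = 0, limit            # balance assumed to lie in [0, limit]
        while lo < hi:
            mid = (lo + hi + 1) // 2
            if check_clears(mid):    # one phone call per probe
                lo = mid             # balance is at least mid
            else:
                hi = mid - 1         # balance is below mid
        return lo                    # about log2(limit) probes, here ~20

    print(probe_balance())           # -> 1376
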
                                                     Jon Mauney


Re: Autodialing horror stories (John, RISKS-9.45)

<Robert.Sansom@CS.CMU.EDU>
Tue, 21 Nov 89 09:19:35 -0500 (EST)
> ...  A lady in Utah started getting anonymous phone calls in June, 1974.

The area code for Utah is 801.

> We investigated and discovered that there was a phone number that was
> one dial pulse (there were lots of pulse-dial only exchanges then)
> away from the 800 number that the computers were supposed to call.
> We called the number and discovered one very frustrated lady! Ramada
> [paid] for her to get a new phone number, and the problem went away.

The only area code that is one pulse away from 800 is 809.  This is the
area code for Bermuda, Puerto Rico, the Virgin Islands, and The Bahamas.
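
For the curious, the arithmetic: a rotary dial sends d pulses for digit d and
ten pulses for 0, so losing a single pulse in the last digit of 800 yields
809, while 801 differs by nine pulses.  A small sketch of that calculation:

    # Sketch: pulse distance between codes as dialed on a rotary phone.
    def pulses(digit: str) -> int:
        return 10 if digit == "0" else int(digit)   # "0" is ten pulses

    def pulse_distance(a: str, b: str) -> int:
        """Total difference in pulses, digit by digit."""
        return sum(abs(pulses(x) - pulses(y)) for x, y in zip(a, b))

    print(pulse_distance("800", "809"))  # 1 -- one dropped pulse
    print(pulse_distance("800", "801"))  # 9 -- Utah's 801 is nowhere close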

Did the lady in Utah have an 800 number?  Or was she having her calls
forwarded from her winter residence in St. Johns? 8-)

Robert Sansom, School of Computer Science, Carnegie Mellon University

                                    [Or is the story apocryphal?  PGN]
