The RISKS Digest
Volume 11 Issue 2

Tuesday, 5th February 1991

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Bogus draft notices are computer generated
Jonathan Rice
People working at home on important tasks
Mike Albaugh
Predicting system reliability
Martyn Thomas
Re: Patriots
Steven Markus Woodcock
Mark Levison
Hungry copiers (another run-in with technology)
Scott Wilson
Re: Enterprising Vending Machines
Dave Curry
Broadcast LANs
Peter da Silva
Scott Hinckley
Info on RISKS (comp.risks)

Bogus draft notices are computer generated

Jonathan Rice <rice@willow.cray.com>
Tue, 5 Feb 91 9:21:58 CST
Last night's local news reported that "hundreds, perhaps thousands" of draft
notices have been posted around the Minneapolis campus of the University of
Minnesota.  The official-looking notices are replete with convincing official
jargon, announcing that men in a certain age group are to report for immediate
duty.  Readers are directed to report to a room in the Hennepin County
Courthouse (the room is vacant) or to call one of two telephone numbers.  The
numbers are those of the Minneapolis Star & Tribune news desk, and of a
blameless lady in St. Paul who seemed from the interview to have kept her sense
of humor despite the barrage of calls.  Interviews with young men on campus
indicated that many had not thought to doubt the authenticity of the notices.

What caught my ear was a statement that University officials would be "checking
their computer facilities" to see if the notices had been composed and printed
there.

The risk is one that has been pointed out before: laser printers enable several
types of fraud — forged checks, phony invoices, letterheads for nonexistent
businesses — that once would have been ruled out by the need to have a
professional print shop as an accomplice.  But this is a new twist.

Jonathan C. Rice, Cray Research, Inc., 655F Lone Oak Drive Eagan, MN 55121
UUCP: uunet!cray!rice 612-683-5370


People working at home on important tasks

Mike Albaugh <albaugh@dms.UUCP>
Mon Feb 4 15:22:33 1991
An article on the accuracy of medical tests in the latest "Parade" (a
syndicated Sunday Supplement) had the following "interesting" statement.  Some
context: A woman died of cervical cancer that went undetected due to screwups in
handling her pap smear. The lab tech who mishandled the test in question was
working at home. The woman's attorney is quoted:

    "Working at home might be fine for computer programmers, but
    it's reckless when your job involves making selective judgements
    that can affect someone's life."

We can all rest easier knowing that computer programmers can have no negative
effect on people's lives.
                        Mike Albaugh


Predicting system reliability

Martyn Thomas <mct@praxis.co.uk>
Mon, 4 Feb 91 17:14:45 BST
It is hard to define what evidence we need, in order that we can have
confidence that a system meets its designed level of reliability.

(I am using reliability, loosely, to mean that the system does what you
wanted it to do, when you wanted it, for the period that you wanted it).

It seems clear that the following argument is unsound:

System X has been shown to meet its requirements.
System Y is no more difficult than System X.
Therefore System Y meets its requirements.

Therefore, even if an SDI system had been shown to work before, it would not
be evidence that a new SDI would work. A fortiori, any degree of success of
the Patriot system provides no evidence that SDI would work.

I think the confusion arises because two different arguments get conflated:

Argument 1: A system as complex as SDI could never be designed so that it
worked correctly. (I do not support this argument - any [computable] system
*could* be designed so that it worked correctly [ even by chance]. I am more
interested in how we accumulate the evidence which allows us to form the
opinion that a *particular* system has achieved its design objectives. This is
argument 2, below).

Argument 2: A system as complex as SDI can never be evaluated in a way which
would give reasonable grounds for claiming that it would work correctly when
deployed. (I believe that this is true. SDI would be too complex for formal
proof of correctness - and the specification may be wrong. SDI would be
impossible to test under operational conditions. In general you need to test
for more than 10 times the period of fault-free operation you are looking for,
with no faults found, to achieve 99% confidence that you will meet the
requirement. SDI would probably only need to work correctly for a few days, so
a few weeks of fault-free operation [under operational conditions - i.e., under
attack!] would demonstrate achievement. The time isn't the problem, but
creating the test conditions is surely impossible).

When we move the argument to some other safety-critical systems, the time
factor becomes dominant. A constant-control system, such as the A320
fly-by-wire, needs a probability of failure of 10^-8 per hour. This implies 10^9
hours of fault-free operation to justify a claim (to 99% confidence) that the
requirement has been met. This is clearly absurd, so how do we judge whether or
not the requirement has been met?

On-demand critical systems, such as spacecraft course-correction or reactor
shutdown systems, may only need to operate correctly for a few minutes or hours
during their whole design lifetime. This is clearly testable (if we can be sure
that the operational conditions can be reproduced accurately enough - which
becomes the problem).
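
The same zero-failure argument can be stated per demand rather than per hour:
to claim, with confidence C, that the probability of failure on demand is at
most p, you need roughly ln(1-C)/ln(1-p) =~ 4.6/p consecutive successful test
demands.  Again only a sketch (the function name is illustrative), assuming the
test demands are independent and representative of operational conditions -
which, as noted above, is the hard part:

    import math

    def zero_failure_demands(p_req, confidence=0.99):
        # Successful test demands (no failures) needed to claim, at the given
        # confidence, that the probability of failure per demand is <= p_req.
        return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_req))

    # A 10^-3 probability of failure on demand needs ~4600 successful demands.
    print(zero_failure_demands(1e-3))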

I would welcome further discussion of these basic questions: how should we form
an opinion about the probable future reliability of a system; what
justification is needed for that opinion, if it is to stand up to critical
appraisal by other engineers; what is the practical limit (in terms of
failures/hour) which we can realistically expect to be able to justify, and how
is this limit affected by the complexity of the system?

Martyn Thomas, Praxis plc, 20 Manvers Street, Bath BA1 1PX UK.
Tel:    +44-225-444700.   Email:   mct@praxis.co.uk


Re: Patriots (RISKS-10.83)

Steven Markus Woodcock <swoodcoc@isis.cs.du.edu>
Mon, 4 Feb 91 23:30:55 MST
Regarding the use of Patriots:

   I work for Martin Marietta, and know several people who helped design
and build the Patriot.  I also work at the National Test Bed, so I'm
quite familiar with SDI's role in the Patriot's development.

   The Patriots fired at the aircraft in Turkey missed precisely because
they were loaded with terminal defense missile software--a much simpler
interception problem than hitting an aircraft that's actively avoiding you.
If they had been loaded with their anti-aircraft software, it would
probably have been a different story (although isn't there a ground destruct
override on those things?).


Patriots, ballistic missiles and aircraft

Mark Levison <levisonm@qucis.queensu.ca>
Tue, 5 Feb 91 12:29:24 EST
   On the subject of the Patriot missile system: while the SCUDs they are
firing against are archaic, the basic problems do not change.  The only short-
to medium-range ballistic missile that changed the problem significantly was
the Pershing II.  This system had a terminal guidance phase, allowing course
changes during or after a counter-missile launch.  Improvements in ballistic
missile technology since the SCUD are in the areas of speed (not significant),
accuracy, range and payload.

   Against aircraft the problem changes significantly: while aircraft are
capable of evasive action, they are also much slower-moving targets, typically
Mach 0.6 (for an A6 or A10 on an attack run) to Mach 2+ (fast movers, e.g. the
Mirage F1, 2000, etc.) vs. Mach 3-7 for a ballistic missile.  So, ignoring ECM,
chaff and similar capabilities, aircraft should not be significantly more
difficult than ballistic missiles.  Of course we have not seen, and hopefully
will never need to see, examples of the Patriot system shooting down an
aircraft in a real combat situation.

   As should be obvious, here I am only trying to demonstrate that the Patriot
missile is probably capable of doing the job that its designers claim it can
do.  I am ignoring all issues of ECM and radars because of my scant knowledge
in this area.  I am also ignoring the question of whether you actually want to
shoot down missiles that might be armed with chemical or biological warheads.

Mark Levison                                      levisonm@qucis.queensu.ca


Hungry copiers (another run-in with technology)

Scott Wilson [CHTM] <swilson@pprg.unm.edu>
5 Feb 91 04:33:35 GMT
I too have encountered some poorly thought out additions to old technology. Get
this:

Went down to the library to copy a paper. Didn't have enough change, but saw a
bill changer on the machine. (What will they think of next?)  Put in a $5 bill.
Put the paper in place, hit <COPY>. Machine grumbles, proceeds to hang, saying
"Check Paper". I begins to smell a rat.

I track down the student assistants at the front desk - they cannot fix the
machine. They are the only attendants on duty (Murphy applies here).  I press
"Change return" and it says "Must make at least one copy".

The add-on bill changer was programmed to avoid use as a source of change by
requiring at least one copy. If the machine jams on the first copy, your bill
stays in until you can find a way to get it unjammed. If that cannot be
accomplished, then your imaginary copy costs you the bill!

I figured that if I wasn't going to get the bill back by arranging to have the
machine fixed, then I was at liberty to try other means.  Found the power cord,
unplugged said machine. Plugged back in.  Bill changer complained about "out of
order". Unplugged again.  Plugged back in. Changer starts thumping, spews out
my $5 bill.  I guess it was trying to clear itself.

Too many more "consumer improvements" and I'll scream!

Scott Wilson


Re: Enterprising Vending Machines

<davy@erg.sri.com>
Tue, 05 Feb 91 08:23:29 -0800
Most post office vending machines I've used tell you to press the button
before putting your money in, and it will tell you whether it's sold out
or not.  These are the USPS machines with the "light bar" for messages
across the top and about four tiers of stamps.

So perhaps your surmise about this being an old retrofitted machine is correct.

--Dave
           [Dave and I share a vending machine in the Menlo Park CA Post Office
           that tells you to press the button before putting money in.  But
           first-time users apparently get burned quite frequently.  Here is another
           example of ordinary mortals having to gain sophistication in the
           vagaries of automated systems in order to maintain their cool.
           (My use of "their" was intentionally ambiguous.)  PGN]


Broadcast LANs

Peter da Silva <peter@taronga.hackercorp.com>
Tue, 5 Feb 1991 04:56:03 GMT
A lesser problem that reduces the quality of the working environment is likely
to be plain old EMI. Already it's impossible to tune AM radio stations in
modern office buildings, and low-power FM ones (like the local PBS station) are
kind of marginal. I'd hate to think of what a wireless LAN would do to my
Classical or Jazz fix, let alone All Things Considered.


Re: Broadcast local area networks are a'comin (Tom Lane, RISKS-10.83)

<scott@huntsai.boeing.com>
Tue, 5 Feb 1991 09:58:37 -0600
>If I ran a corporate network, I wouldn't touch this with a 10-foot pole.

How much of your data on the average network is really a security issue?
I work here at Boeing and, at least in my area, sensitive data is not
kept on the network (and it is over 150' to the nearest area off campus
anyway, not to mention a couple of concrete/steel walls).

Even in a small company I bet most data could be sent with little to no
encryption without any danger of sensitive material being lost.

In summary, this WOULD be a good system for many (most?) networks where the
cost/difficulty of cabling would be a major deterrent, but (due to limited
channels and security risks) it would not be for everyone.

The existence of security risks does not negate the possible benefits of
technology per se, but rather is a side effect that must be acknowledged
and responsibly handled.

Scott Hinckley, Boeing AIC, 110 Pine Ridge Road #608, Huntsville, AL 35801
(205)461-2073             UUCP:..!uunet!uw-beaver!bcsaic!huntsai!scott
