Forum on Risks to the Public in Computers and Related Systems
ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator
Volume 1: Issue 29
Thursday, 12 Dec 1985
Contents
Computer-compared prescriptions (Dave Platt)
SDI: Danny Cohen and Eastport Group comments (Gary Chapman via Jim Horning)
Worms, etc. (Keith F. Lynch, Stavros Macrakis)
Passwords, etc. (King Ables, Dave Curry, Dan Bower)
Risks re computer-compared prescriptions
Dave Platt <Dave-Platt%LADC@CISL-SERVICE-MULTICS.ARPA>
Tue, 10 Dec 85 09:50 PST
Random-Quote: The race is not always to the swift, nor the battle to the strong
-- but that's the way to bet. (DAMON RUNYON)
Recently, an increasing number of pharmacies have been putting greater
amounts of drug information "on line". As I understand it, they will keep
track of all of a particular customer's prescriptions, and will alert the
pharmacist if they should be asked to fill a prescription that conflicts
with any other medication that the customer is taking. The rationale is, I
believe, that if a person is receiving prescriptions from two different
doctors (different specialists, perhaps), then neither of the doctors would
necessarily be aware of the drugs that the other had prescribed, or of any
possible unfortunate interactions between the drugs. Normally, I assume
that the pharmacist would inform the consumer and contact the prescribing
doctor for further instructions.
Several concerns come to mind:
- What is the database of drug conflicts derived from? Manufacturers'
data files? FDA reports? Articles in recent medical journals? Just
how complete is it?
- Does the database cover only drug-to-drug interactions, or is it more
complete? Might it, for example, contain contraindication information
for specific drugs (e.g., don't take this if you're pregnant)? How about
reports of unusual symptoms or side effects?
- How "intelligent" (sorry!) is the logic that compares a new prescription
with a person's medical/drug history? Is there any AI/expert-system
capability, or is it simply a look-up-a-list-of-conflicts check (a sketch of
the simple-lookup case follows this list)? Might the
code be capable of, for example, warning a person who's receiving
medication for asthma not to take doses of a specific brand of antibiotic
because that particular brand is preserved with a sulphite compound that
has been reported to trigger asthma attacks in sensitive individuals?
- If a pharmacy advertises their new drug-checking software (and some do
mention it in their ads), are they assuming any degree of responsibility
or liability for either (a) false "conflict exists" warnings that cause
a consumer not to take a necessary drug prescribed for them, or (b)
any failure to alert a customer to a conflict that does exist?
- Will doctors, pharmacists, and/or consumers begin to depend on the
correct functioning of systems such as this, at the expense of studying
the issues involved themselves?
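For concreteness, here is a minimal sketch of the simple "look up a list of
conflicts" case mentioned above. The drug pairs, warning text, and function
name are invented for illustration; a real pharmacy system would presumably
draw on a licensed interaction database rather than a hard-coded table.

    # Minimal, hypothetical sketch of a pure pairwise conflict lookup.
    # Drug names and warnings below are invented for illustration only.

    INTERACTIONS = {
        frozenset({"warfarin", "aspirin"}): "increased bleeding risk",
        frozenset({"theophylline", "erythromycin"}): "theophylline toxicity",
    }

    def check_new_prescription(new_drug, current_drugs):
        """Return (other_drug, warning) pairs for known pairwise conflicts."""
        warnings = []
        for other in current_drugs:
            note = INTERACTIONS.get(frozenset({new_drug, other}))
            if note:
                warnings.append((other, note))
        return warnings

    # A customer already taking warfarin is prescribed aspirin:
    print(check_new_prescription("aspirin", ["warfarin", "lisinopril"]))
    # -> [('warfarin', 'increased bleeding risk')]

A table like this knows nothing about dosage, patient condition (pregnancy,
asthma), or inactive ingredients such as sulphite preservatives -- which is
exactly why the question of how intelligent the comparison logic is matters.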
This particular issue is similar to the one discussed several issues back,
concerning AI/KE/expert-system tools such as MYCIN that "diagnose"
illnesses from symptoms or "suggest" treatments. However, this system
is one step further away from the doctor and closer to the consumer;
there might be a greater tendency for people to "take it at its word"
rather than simply using it as a tool.
SDI: Danny Cohen and Eastport Group comments
Jim Horning <horning@decwrl.DEC.COM>
11 Dec 1985 1340-PST (Wednesday)
[Forwarded With Permission from Gary Chapman]
Date: Tue, 10 Dec 85 16:14:34 pst
From: Gary Chapman <PARC-CSLI!chapman@SU-Glacier.ARPA>
Subject: Danny Cohen and Eastport group report
COHEN SAYS SDI CRITICS ARE "STAGNANT SUBCULTURE"
Danny Cohen, chairman of the so-called "Eastport group," told the Senate
Armed Services Subcommittee on Strategic and Theater Nuclear Weapons that
the discipline of software engineering is "an institutionalized and stagnant
subculture." He said that Dave Parnas' criticisms were misrepresenting the
facts by claiming that there are "scientific arguments" and "fundamental
mathematical" obstacles that lead to the conclusion that reliable BM/C3
software cannot be built.
However, Cohen said in his testimony that the so-called "horserace"
architecture studies have only paid "lip service" to potential battle
management problems, and he criticized these studies harshly.
Cohen said that software engineers have a fetish for mathematical
correctness, and that they "try to mimic mathematics at any cost, even if it
means sacrificing functionality and relevance. This sect grossly overrates
the perfection of Swiss clockwork, and strives to achieve it." Cohen said
that the SDI should look to the telephone system as a model of a large
system that works well with distributed, autonomous components. He said,
"The communications approach copes with imperfections and corrects for them,
rather than attempting to achieve an unattainable perfection."
Apparently the example of the telephone system is also featured in the first
chapter of the Eastport group's report to the SDIO. A December 1 draft of
the report was obtained by the editors of Military Space, and reviewed in
the December 9 issue. The report's conclusions are summarized in the
observation that computing resources and battle management software "are
within the capabilities of the hardware and software technologies that could
be developed within the next several years...with the tradeoffs necessary to
make the software tractable in the system architecture."
The panel criticized the ten Phase 1 system architecture contractors for
downplaying the problems of battle management software. The panel
apparently recommends that the SDIO conduct a broader system architecture
study, with an eye toward an "unconventional architecture whose program is
within the anticipated limits of software engineering, without overreliance
on radical software development approaches."
The panel rejects the "tight coordination" implied in the Fletcher
Commission report of 1984. The panel recommends a loose coordination, with
"robustness, simplicity and the ability to infer the performance of small
parts of the system. Otherwise, the U.S. could not test a full-scale
deployment short of actual use."
The panel also says in the report that "innovative approaches" are necessary
for managing the software development of the SDI. The panel report
recommends an independent research and technical support organization to
supervise the program, and a high-speed communications network to support
SDI contractors.
Cohen also told the editors of Military Space that he believes differences
of opinion about the SDI within the computer science community come from
different conceptualizations of the problem. Cohen said, "Critics like
Parnas take the approach they're traditionally familiar with as software
engineers--a 'waterfall' or 'top-down' approach. They look at battle
management software as one single, gigantic 'bounded' problem requiring
all-new software--instead of seeing it as a dynamic federation of many
different networks and nodes, much of which may already be out there now.
"The implication of viewing battle management as a federated network--rather
than as a monolithic, rigidly centralized process prone to single-point
software collapse and 'Trojan horses'--comes down to this: the issue is not
software, it's *protocols* between many different networks. It's
inter-computer or inter-network communications--not single-system software."
Worms, etc.
"Keith F. Lynch" <KFL@MIT-MC.ARPA>
Tue, 10 Dec 85 01:10:21 EST
To: MDAY@MIT-XX.ARPA
cc: RISKS@SRI-CSL.ARPA
    From: Mark S. Day <MDAY@MIT-XX.ARPA>

    The "solitary programmer" mentality is at least partly to blame for
    things like "unauthorized worms" -- if people expect to have their
    code read by others, who may question the reasons for doing certain
    things, it becomes enormously harder to conceal unauthorized features
    (unless the programmer can convince the inspector(s) to join in a
    conspiracy).
I disagree. How does one guarantee that the source code shown is in
fact what was compiled?
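One way to attempt such a guarantee is to rebuild from the reviewed source
and compare the result against the binary actually in use; even then, the
compiler itself must be trusted (Ken Thompson's "Reflections on Trusting
Trust" point), and real builds are often not bit-for-bit reproducible without
extra care. A minimal, hypothetical sketch, with invented file names and
build command:

    # Hypothetical rebuild-and-compare check: recompile the source the
    # reviewers saw and compare its hash against the deployed binary.
    # A matching hash still only pushes the trust question down to the
    # compiler and build environment.

    import hashlib
    import subprocess

    def sha256(path):
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    # Rebuild from the reviewed source.
    subprocess.run(["cc", "-o", "rebuilt", "reviewed.c"], check=True)

    if sha256("rebuilt") == sha256("deployed"):
        print("Deployed binary matches a rebuild of the reviewed source.")
    else:
        print("Deployed binary does NOT match a rebuild of the reviewed source.")
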
    I have little or no sympathy for people who illegally copy a program
    and then find one day that it's trashed their data. Serves 'em right.
Does the punishment fit the crime? What if it was some employee who
illegally copied the program, which then destroyed irreplaceable company
data? How certain is it that this 'protection' scheme will not go off
by accident? If it does, who is liable? Can the company which uses
it simply disclaim all liability?
    From: Aaron M. Ellison <BI467000%BROWNVM.BITNET@WISCVM.ARPA>

    Regarding Neal Macklin's "expose" of virus technology, I would only
    add that the idea is not at all new. John Brunner, a well-known
    speculative fiction writer, wrote a novel called "Shockwave Rider"
    over 10 years ago(!) predicting the blackmailing of a then corrupt
    U.S. government by a morally-upright computer hacker. ...
The idea is older than that. I don't have any references, but I
recall reading about 'computer viruses' before 1970.
...Keith
`Gary North's Remnant Review' (Worms, etc.)
Stavros Macrakis <macrakis@harvard.HARVARD.EDU>
Tue, 10 Dec 85 18:04:07 EST
This overly-long message appears to say nothing new. We are treated to some
sort of breathless and misinformed paranoia about `anti-Zionists', Soviets,
and foreigners in general, not to mention hackers. We get thriller-novel
political scenarios with countless pointless (and hardly verisimilitudinous)
details. (See below for a textual analysis of North's tract.)

We all know that computer security is hard. We know that banks and other
important institutions have often failed to apply even the most elementary
security precautions. We even know about `trapdoors', `worms', and `viruses'
(which were discussed during the design of Multics, if not earlier). We also
know that both technical people and users of computers must become more
aware of security issues.

What does North contribute? A vision of a horrible secret hidden from the
public by frightened bankers whose livelihood depends on hoodwinking the
public into believing that paper money and bank deposits have value -- a
secret which
