The RISKS Digest
Volume 1 Issue 13

Saturday, 14th September 1985

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Risks in RISKS
Peter G. Neumann
Preserving rights to Email messages
Larry Hunter
Risk Comparisons
T. Tussing
Risks history/philosophy
Nicholas Spies

Risks in RISKS

Peter G. Neumann <Neumann@SRI-CSLA.ARPA>
Sat 14 Sep 85 23:18:42-PDT
The text of the message that follows this one is taken verbatim from the
HUMAN-NETS Digest (HUMAN-NETS@RUTGERS), 11 Sep 1985, Volume 8 : Issue 29, on
the topic of risks in EMAIL.  That topic is of vital significance to the
RISKS Forum, for at least two reasons:

  (1) You should recognize the risks that might be incurred by you in 
      submitting messages to this forum, and in sending on-line messages 
      in general.

  (2) The free propagation of messages that are not copyrighted can
      itself lead to significant risks to the contributor and to the
      RISKS Forum, if those messages are false or libelous, or if they
      are altered.

In general, you and I must assume that any message on this forum may be
forwarded indefinitely and read by anyone, and could appear in print in all
sorts of strange places.  (If something worthy of inclusion in the ACM
Software Engineering Notes is at all controversial, I ask for explicit
permission before publishing it.)

What is even RISKIER is that your message can be trivially altered along
the way as it traverses the network of networks, possibly drastically
changing your intended meaning.  Use of checksums and crypto seals can
reduce the risk of undetected alteration, but does not solve the problem
entirely.
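
To make the idea concrete, here is a minimal modern sketch of such a
"crypto seal": a keyed digest computed over the message text, so that
any alteration in transit changes the digest.  The HMAC construction,
key, and message text below are illustrative assumptions of the editor,
not anything prescribed in this forum; note that the key itself must
still be exchanged safely, which is one reason the problem is not
solved entirely.

    # A minimal sketch of a keyed "crypto seal" over a message body.
    # The shared key and message text are hypothetical placeholders.
    import hashlib
    import hmac

    KEY = b"shared-secret-known-to-sender-and-reader"

    def seal(message: bytes) -> str:
        """Compute a keyed digest (a 'crypto seal') over the message."""
        return hmac.new(KEY, message, hashlib.sha256).hexdigest()

    original = b"RISKS contributions may be redistributed."
    posted_seal = seal(original)

    # An altered copy yields a different seal, exposing the tampering.
    altered = b"RISKS contributions may NOT be redistributed."
    assert hmac.compare_digest(posted_seal, seal(original))
    assert not hmac.compare_digest(posted_seal, seal(altered))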

Peter G. Neumann 

(This message is not copyrighted.)


Preserving rights to Email messages

Larry Hunter <Hunter@YALE.ARPA>
Tue, 3 Sep 85 11:08:31 EDT
Copyright-by: Larry Hunter, 1985

After consulting with several lawyer friends, the conclusion I reach is
that anything you send out over the nets is public property — i.e.,
anyone can reproduce it verbatim, for profit, and the author has no
right to control its use.  There is, however, a very simple way to
preserve the author's right to control the uses his or her messages are
put to.  The courts have held that practically any clear attempt by an
author to preserve his or her rights to a written work is sufficient to
actually preserve them.  There is no need to add the 'circled c' to
ASCII; just add a 'Copyright-by:' line to the headers produced by your
local mailer and voila! your rights are preserved.
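
For concreteness, a minimal sketch (the editor's illustration, not part
of Larry's message) of what such a header line looks like on an
outgoing message, built here with a modern mail library; the names and
addresses are hypothetical placeholders:

    # Adding a 'Copyright-by:' line to an outgoing message's headers.
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "author@example.arpa"        # hypothetical sender
    msg["To"] = "risks@example.arpa"           # hypothetical recipient
    msg["Subject"] = "A contribution"
    msg["Copyright-by"] = "Jane Author, 1985"  # the rights-preserving line
    msg.set_content("Message text goes here.")

    print(msg)  # the Copyright-by: line appears among the headers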

                                               Larry

PS. I am not a lawyer and this is only my opinion - if you have
a vital interest in some related matter, talk to a real lawyer!


Risk Comparisons

<TTussing.es@Xerox.ARPA>
13 Sep 85 13:27:12 PDT (Friday)
Someone sent me this and I thought the people on this mailing list might
be interested.

Excerpt from a pamphlet by Dow Chemical Corp., entitled "Life Is in the
Balance":

Dr. Richard Wilson, a professor of physics at Harvard, has devised a
mathematical formula that measures risks in terms of the minutes and
seconds of life lost.  Taking an average 30-year-old with a United
States life-span of approximately 73 years, Wilson says that this
statistical person cuts time from his life in the following ways:

Smoking one cigarette - minus 12 minutes
Drinking a diet soft drink - minus about 9 seconds
Driving without a seat belt - minus 6 seconds per trip
Being an unmarried male - minus 1800 days
Being male rather than female - minus 2700 days

We can also view risks by figuring which risks are equal.  For example,
the following items all pose an equal risk of increasing the likelihood
of death by one chance in a million:

Drinking half a liter of wine
Spending three hours in a coal mine
Living two days in New York
Traveling six minutes by canoe
Riding 10 miles on a bicycle
Driving 300 miles in a car
Flying 1000 miles by jet
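
For a sense of scale, here is a rough back-of-the-envelope sketch (the
editor's arithmetic, not Wilson's published formula) of how a
one-in-a-million risk of death translates into expected minutes of life
lost for the statistical 30-year-old above:

    # Expected life lost = probability of death x remaining lifetime.
    AGE = 30         # the "statistical person" cited in the pamphlet
    LIFESPAN = 73    # approximate U.S. life expectancy cited above

    remaining_minutes = (LIFESPAN - AGE) * 365.25 * 24 * 60

    expected_loss = 1e-6 * remaining_minutes   # one chance in a million
    print(f"{expected_loss:.1f} minutes")      # roughly 22.6 minutes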


Risks history/philosophy

<Nicholas.Spies@CMU-CS-H.ARPA>
14 Sep 1985 01:06-EST
This is definitely old news, but then again the facts behind the case
have only recently seen the light of day.  In his recent biography "Alan
Turing: The Enigma" (Simon & Schuster), Andrew Hodges reveals in some
detail the inner workings of the German Enigma encryption device
(arguably a "computer"), which contributed (backhandedly) to the
development of computers as we know and love them today.  (If you are
interested in Turing, computers, or WWII history, read it, if you
haven't already.)

The portion of the book devoted to Turing's stint at Bletchley Park is
peppered with lost opportunities on the German side.  Apparently with little
additional effort the Germans could have rendered the "Bletchley bombes"
completely useless. In fact the only reason the Germans were not careful was
their unswerving faith in the Enigma device. Even when there was ample
evidence pointing to a message-security problem the integrity of Enigma was
never seriously questioned by the Germans. There were, of course, countless
other factors, but the German faith in their technological answer was in
large measure responsible for their losing the U-boat war and very likely
the war itself.

Another anecdote, not related to computers (recounted in either "The
Ultra Secret" by Winterbotham or "Bodyguard of Lies" by A. Cave Brown,
two other excellent books on the secret war), gives one reason for the
German atomic bomb project never really getting off the ground.  It
seems that a German professor of the old school was in charge of finding
a material for moderating a chain reaction (that is, for slowing
neutrons without absorbing too many of them).  Graphite was tried but
failed, which meant that deuterium (heavy water) was thought to be
needed.  When it was suggested that the graphite might not be pure
enough (which, as it turned out, was the reason the test failed), the
professor reacted with rage that his authority was being questioned, and
he effectively derailed German research in reactor design.  (Later, the
plant for making heavy water built in Norway was sabotaged by British
agents, which made a reactor impossible and thereby prevented the
manufacture of fissile material.)

These examples suggest that excessive reliance on either technological
solutions or "authoritative opinion" may carry grave risks, albeit in these
cases for an evil regime. The question facing us is whether we (or the
Soviets, for that matter) have fallen into the same traps. I would say that
we (both) definitely have, for the means to power are now more than ever
technological (as opposed to political or diplomatic) and one or another
"expert" is routinely trotted out to "prove" the efficacy of this or that
technological scheme.

Indeed, how can it be otherwise? Hitler opened the Pandora's Box of applying
high-tech to warfare and it worked (at least until a higher-tech response
prevailed). After WWII a new era was born in which global political power no
longer rested on moral authority but on a command of the new applied
sciences and scientists. Engineers had provided political leaders with
instruments of war for centuries, but now scientists are looked upon as the
fountainhead of political power, by dictators, politicians and the people
alike. It may now be said truly that Knowledge is Power.

To some the risks of technology are the inevitable consequence of the
inexorable "progress" that technology itself represents. It seems to me that
this view places too great an emphasis on the growth of "technology" itself
at the expense of our ability to integrate it with human wants and needs.
It is almost to say that "technology" has a life of its own over which
we have no control.  This is manifestly untrue, because "technology" is the
mere summation of the creative acts and compulsions of a great number of
people enamored of the idea of "technology". But if "technology" does have a
life of its own it must be based on a willing denial of responsibility on
the part of each member involved in furthering it, particularly when large
populations or the world are put at risk by introducing a new "technological
development". It seems, therefore, self-evident that morality and technology
are intimately interwoven.

In the largest sense, the risks of computers are the risks of having an
increasing number of computer experts who are in a position to tell people
what computers can be safely used for.  Their expert opinions may be well
thought out or erroneous, as the case may be, but they are in fact the only
opinions that the public, financial institutions, military establishments or
politicians can depend on. The fact that any or all may place a value on
this expert information and act on it puts a heavy moral burden on the
providers of this information, whether they like it or not.

The only population that I have had direct contact with who have faced this
primal issue of the risks of technology are the Amish-Mennonites of Ontario;
I made a film about their 150th Anniversary in Canada. (I have also edited
films about the Amish in Lancaster Co., PA.) The trigger for the Amish was
rubber-tired wheels on carriages around the 1870's because this allowed the
young of courting age to "go to town" more easily, with a perceived
disruption of Amish life not far behind. To this "improvement" they said
"No". Subsequently, the Amish have taken great care to keep the increasing
technological developments surrounding them at bay, but not by pure
rejection. In short, they have evaluated the risks of adopting technologies.

For instance, gasoline engines are permitted for stationary use (and also
on horse-drawn wagons) for harvesting, threshing, baling and for powering
milk refrigerators.  There is no contradiction in using a valuable power
source so long as it is not applied to providing the means for increased
contact with the outside world.  Electricity is used if it is generated
within the farm, and public telephones may be used as well; as long as
wires (i.e., connections) to the outside world are avoided, there is no
reason to avoid using the technology.  The apparent oddity of the Amish
is based on common sense once their objectives are known.

Although the Amish reaction to technology may strike many as merely
"quaint," they do show that it is possible to halt the "inevitable"
growth of technology.  (The Amish are renowned for their success in
farming, which is not the case for many others that have embraced modern
technological ways.)

I am not advocating a general return to Amish ways (indeed this only makes
sense within the context of Amish values), but I will say that we all face a
similar confrontation with a technology that may do us great harm on many
fronts.  Unfortunately we are prone to treat our own creations (be they
buildings, cars, chemicals or computers) as if they were as benevolent as
the products of 5 billion years of co-adaptive evolution.  The more
complex and interdependent our creations become, the more they will
reveal themselves as ill-suited to the tasks they were meant to perform;
this only stands to reason, given the lack of truly global feedback in
the design process.  And how are we to judge the efficacy of our machines
if we have lost sight of the reasons we created them?
