The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 6 Issue 13

Sunday, 24 January 1988


o U.S. Fears Satellites Damaged
o Signal-light malfunction blamed in L.A. train wreck
o Big Error on Benefits by a State Computer
o London Underground Ticket Machine fraud
John Pettitt
o The responsibility of and for `bringing us C and Unix'
Geraint Jones
o Risks in technology transfer policy
Alan Wexelblat
o Technology transfer policy and Halley's Comet probe
Alex Colvin
o Non-ionizing radiation
John Nowack
Jonathan Thornburg
o Books about SDI software -- a request
Dan Jones
o Info on RISKS (comp.risks)

U.S. Fears Satellites Damaged

Peter G. Neumann <>
Sun 24 Jan 88 14:10:34-PST
Subtitle -- Soviets used lasers to cripple equipment, sources contend.

Washington, by Richard Sale (UPI, 24 January 1988).

U.S. intelligence agencies are convinced Soviet laser attacks have damaged
supersophisticated U.S. spy satellites deployed to monitor missile and
spacecraft launches, administration sources said.  These sources said they
believe the Soviets fired ground-based lasers to cripple optical equipment
attempting to scan launches at Tyuratam, the major Soviet space center, to
obtain a variety of sensitive military information.  Administration
intelligence sources said they fear that other vital U.S. reconnaissance
satellites will soon be endangered because six new Soviet laser battle stations
are under construction...  "There is no way you can protect the optical sensors
on satellites" from laser attacks, an Air Force official said. ...

Intelligence sources acknowledged that the Pentagon also has trained
ground-based lasers on Soviet spacecraft, sometimes in attempts to disrupt
their sensors. ...

  [From the San Francisco Examiner and Chronicle, front page, 24 Jan 88.  The
  article goes on to consider reports that some spacecraft malfunctions may
  have been due to laser "hosing", e.g., a KH-11 or Code 1010 satellite, which
  was permanently damaged in 1978.  Seems unlikely -- the technology was not
  very well advanced then?  PGN]

  [However, the risks of laser interference or accidental triggering are worth
  noting.  Adding to the risks of computing in SDI, might such a concerted
  attack of simultaneous laser bursts on many satellite sensors be mistakenly
  detected as the launch of a nuclear attack!?  PGN]

Signal-light malfunction blamed in L.A. train wreck

Peter G. Neumann <>
Sun 24 Jan 88 14:28:53-PST
PICO RIVERA, Los Angeles County (AP, 24 Jan 88)

A malfunctioning signal light appeared to have caused a freight train to crash
into a parked train, killing a man and igniting a fire that consumed a church
and a store, a railroad official said Saturday.  A 72-car freight train
traveling about 40 mph to 45 mph slammed into a parked 67-car freight train
at 10:30 p.m. Friday after a signal light about a mile from the impact gave 
the green go-ahead light, an official said.  Damage to the trains and
buildings was estimated at $2.3 million.

  [From the San Francisco Sunday Examiner and Chronicle.  The identical story
  appeared TWICE in the same issue on 24 January 1988 -- on page B-5 and also
  on page B-7, although with different headlines.  The headline guy must have
  been napping, or else the story was intended to illustrate the importance of
  redundancy.  PGN]

  [Ironically, the Federal Communications Commission recently approved plans
  for a nationwide computerized train-control system -- inspired by the
  collision on 4 January 1987 of three speeding Conrail locomotives and an
  Amtrak passenger train, killing 16 and injuring 176 near Chase, MD, with
  losses estimated at over $40 million.  The FCC's private radio bureau
  reported that "This terrible collision could have been avoided had the
  locomotives been under the control of a central computer."  This popular
  view assumes that such computer systems always work correctly, and that
  people always program them correctly.  PGN]

Big Error on Benefits by a State Computer

Peter G. Neumann <>
Sun 24 Jan 88 14:15:34-PST
By Perry Lang, San Francisco Chronicle, 21 January 1988.

"Thousands of Californians have been charged for unemployment and disability
insurance benefits they never received because of a computer snafu in
Sacramento.  One of the state's computers, which tallied ... benefits for 1987,
malfunctioned and moved the decimal place two spaces to the right -- producing
dollar amounts that were up to 100 times more than they should have been. ...
[A]bout 60,000 people throughout the state received erroneous statements."

   [Computer malfunction? or program error? or human error on input?  PGN]

London Underground Ticket Machine fraud

John Pettitt <>
Mon Jan 18 13:50:11 1988
Reproduced without permission from "datalink" monday 11 jan 1988

London Underground's controversial UKL 150 million computerised ticket
system could create a fare dodgers' paradise. ...  The system, based on
sophisticated real time software developed by Logica, has been criticised
because it allows adults to purchase child tickets and travel on the
Underground without being visually checked by ticket collectors. ... Now
security consultants have confirmed that the new type of ticket, which uses
a magnetic strip holding details of the fare, will be easier to forge than
the traditional printed type.

John Maxfield, an anti-hacking consultant in Detroit, says similar tickets
have already been beaten by teenagers in the US.  He said:  "San Francisco
metro caught a gang forging the tickets there last January.  The gang had
used pasteboard and cassette tape to make duplicates."

A spokesman for Westinghouse Cubic, which manufactures the new ticket
barriers, at first denied its system had been breached in the US.  But a
spokesman later admitted:  "With the right know-how, of course anything in
the world can be duplicated, including our tickets."

Can any US readers of comp.risks add any further info on the SF incident?

John Pettitt, Specialix, Giggs Hill Rd, Thames Ditton, Surrey, England, KT7 0TR
Tel: +44-1-398-9422         Fax: +44-1-398-7122          Telex: 918110 SPECIX G

    [Considering how easy this is to do in SF's BART and in DC's METRO, we 
    might just as well NOT discuss it here.  But the vulnerability -- a 
    playback copycat attack -- has been well known for many years.  PGN]

The responsibility of and for `bringing us C and Unix'

Geraint Jones <>
Sat, Jan 23 15:15:10 1988 GMT
I take issue with some of Peter Neumann's editorial comment (RISKS 6.11,
after the continuing discussion of Sun clock problems traced to mishandled
side-effects in C expressions).  I accept that the programmer was at fault,
that we should always be aware that we are at risk of being allowed to make
mistakes; but `the people who brought us C and Unix' _do_ share the fault.

    UNIX is moderately wonderful; because of that, you will scarcely find a
convenient and powerful desktop computer which does not use UNIX, and I for
one would not choose to use one.  Choose UNIX, and you get C; and that's the
fault of all `the people who brought us C and Unix', including the few
individuals who had the good (and just one or two bad) ideas in the first
place, all the universities and companies who have popularised and modified
UNIX, and those of us who use it.

    C was a pretty neat idea when you compare it with what else was about
fifteen years ago, and without it UNIX could not have been knocked up as it
was.  The technology exists, and has existed for years, to check that the
arguments of a function are side-effect free.  Ten years ago, I used a BCPL
compiler that would decide whether or not it was safe to call arguments to
`macros' (manifest functions) by substitution.  Where are the C compilers that
check for such things?  I write C programs only because `the market' has
created a near monopoly in portable programs in the community in which I work.

    The C macro-substitution mechanism cries out to be misused, and we
_should_ kick up a fuss about it.  Such things should not be allowed to
continue.  Peter Neumann reminds us to ``Know your requirements before you
start designing, programming, or simply using a computer system.'' Well, I
do; I want to write programs which correctly implement the algorithms I
design.  I want the software tools that I use to make it as difficult as
possible for me to make a fool of myself; yes, even at the expense of making
it harder to write programs. Now, where do I get them?               gj

    [RISKS are in the eye of the beholder.  ALL COMPUTING entails certain
    risks.  If you want perfectly safe programming languages and operating
    systems, you would be most unhappy with the constraints.  The only
    program you could write would be THE NULL PROGRAM, and even that would not
    be safe if nonstop real-time positive control were required.  On one hand 
    we have people who will tell us that they can produce 10 million lines of
    code that will work adequately without system testing.  On the other hand
    we have systems and languages that hinder any such efforts.  Ultimately we
    need truly gifted programmers.  Ken Thompson is one.  But there probably
    aren't more than a handful anywhere approximating him in the country.
    Besides, people that creative would be badly matched to the task of trying
    to write 10 million lines of code.  Creativity often is best exercised when
    the results are not what was expected.

    You might look at Modula 2 and C++.  But don't expect fool-proof operating
    systems.  There aren't any.  By the way, we should not trust any programs
    developed by fools -- even with perfect tools.  PGN]

Technology transfer policy and Halley's Comet probe (RISKS-6.12)

Alex Colvin <>
Sat, 23 Jan 88 14:22:06 EST
In regard to the discussion of technology transfer policy:  Scientific
American noted that on the Soviet Halley's Comet probe the only experiment
not controlled by a microprocessor was an American contribution.

                       [I presume you are implying that this is a RISK.  
                       It might even be a BLESSING IN D' SKIES?  PGN]

Non-ionizing radiation

Fri 08 Jan 1988 17:20 CDT
When I read the study about non-ionizing radiation, I seemed to remember an
article in a similar vein, and about an hour at the library dug it up.  It's
actually a series of articles published in QST, the technical magazine of
the American Radio Relay League.  The following comes from QST, Vol. LXII,
No. 9, September 1978, p. 31.  For more information on this same subject see
QST Vol. LXII, No. 6, June 1978, pp. 11-13, and for more info on the risks
of chemical exposure see part 2 of that article in No. 7, July, 1978,
pp. 37-38.  Most towns with an active ham population will have a club that
will more than likely have given a subscription to this publication to a
local public or university library.

John Nowack -- KA9EYT (aka The Black Knight)
(A member of the Society for the Prevention of Injustice to Tuna (S.P.I.T.))

MISS042@ECNCDC.BITNET <>======> Western Illinois University (A Member of
                                   the Mid-Illinois Computer Cooperative;
                                   Educational Computing Network)

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=%< cut here -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
The following disclaimer heads the article:
The publishers of QST assume no responsibility for statements made by

                 How Dangerous is RF Radiation?
                      by: J. E. Kearmen, W1XZ
                          RFD, Collinsville, CT

Workers at Motorola have recently conducted experiments of great interest to
most amateurs.  Their results have been published in several IEEE publications
(see end for info).  I'm grateful to Mr. Ronald Brecher, WA2EUN, who supplied
a copy of the March, 1977 document.
  The experimenters constructed a simulated human head and torso and exposed
it to the radiated fields from 150 and 450 MHz, 6 watt handheld transceivers.
Both radios were equipped with helical, or "rubber duck" antennas.  In
addition, tests were performed with a 1/4 wavelength antenna installed on the
450 MHz unit.  A thermal probe was used to measure temperature rise due to
exposure.  These experiments were performed because of a concern that the
newer, high-power units might pose a health hazard.  Previous measurements of
the field strength surrounding these radios had indicated that a field
intensity exceeding 10 mW/cm2 might exist.  This is a safety standard for human
exposure to RF energy at higher frequencies.
  Because the field would be concentrated by a probe causing nontypical,
localized heating, the probes were removed while the transmitter was operating.
The "dummy" was exposed for from 15 to 60 seconds.  After power was removed,
the probe was again inserted and the temperature change was determined.  Steps
were taken to prevent thermal transients caused by the insertion and removal
of the probe.  It would have been possible for heating to occur in small
areas not being monitored by a probe. To look for "hot spots", an IR
(infrared) scanner was used to take thermograms of the dummy.
  Assuming the transceiver was positioned as it would be during normal
operation, no significant heating effects were noticed on either band. Even at
450 MHz, the temperature rise was slight.  At a shallow probe depth (0.2 in.
or 5 mm), the greatest temperature rise was less than 1 degree C.  (Actually
10 degree C, at the eyebrows - jcn) At deeper probe penetrations, the
temperature rise was less.  Attempting to determine possible hazards from a
measurement of radiated field intensity may cause misleading results.  The
low total energy and high field impedance which exist when such radios are
brought in close proximity to the body will result in lower energy transfer
than field strength measurements alone would seem to indicate.  For example,
at a point two inches (50 mm) from the helical antenna of the 150 MHz
transmitter (Fig 1 (a good drawing of the measured temperatures -jcn)), a
Narda field probe measured a maximum field intensity of 168 mW/cm2.  This
value greatly exceeds the 10 mW/cm2 exposure standard.  Measurements based on
the penetrating effects at the same point indicate a maximum power flow density
in tissue of 2.8 mW/cm2.  On 450 MHz, with the same spacing from the 1/4
wavelength *whip* antenna (Fig 3), a maximum radiated intensity of 16 mW/cm2
was found.  Power-flow density was only 2.5 mW/cm2.  The radiation meter
indicates a hazardous condition, while actual measurement of the effects
shows this is not the case.  Power *absorption* in all cases was less than
1 mW/cm2.
  IR thermograms did not detect any unusual hot spots.  A health hazard
exists when the tip of the antenna is close to the eye (within 0.2 inch or
5 mm) and the transmitter is operated.  In this case, an rf burn will result
on the cornea.  The thick plastic cap on the tip of the antenna makes this
unlikely to occur.  When the radios are held in the normal position for use,
no hazard exists.
  While these tests were performed for 150 and 450 MHz, I think it safe to
assume we need not fear our 220 MHz rigs either.  These tests point out the
fallacy of using radiated field intensity as a criterion of safety.  Some
consumer publications have begun to measure field strength radiated from
CB radios.  Consumers have been warned not to stand too close to the mobile
whip while a 5-watt CB transmitter is operating, due to the high field
strength!  These papers have shown that radiated power may greatly exceed that
which is absorbed and converted into heat.  Amateurs should continue to
exercise prudence when using uhf and microwave equipment, of course.  It
does seem that our portable transceivers pose no threat to our health.

cancer, ham radio operators, and Poisson statistics

Jonathan Thornburg <>
Sat, 9 Jan 88 20:21:02 PST
Perhaps I'm missing something, but the AP story quoted in Risks 6.3
about cancer death rates among ham radio operators doesn't seem to
me to show anything abnormal --- the deviations from expectation are
about what you'd expect from random fluctuations.  For example, for
the leukemia case (29 exp vs 36 obs), *chance* *fluctuations* *alone*
will cause the number of deaths to be at least 36, about 10% of the
time.  In other words, if we hypothesise that there's no excess, then
this experiment (still) has a 10% chance of seeing excesses at least
as large as those observed.

The other rates quoted give similar results.  The probability that
all these rates would simultaneously deviate by these amounts is
rather small, but this sort of statistical "inference" is frowned on
by the pros --- it risks a "shotgun effect" in which you check (say)
100 different types of cancer, find 5%-chance-occurrence sized excesses
in 5 of them (quite unsurprisingly), then report just those 5 and say
that the chances of getting these excesses in all 5 is (5%)**5 = one
chance in 3 million.

Of course, the AP reporter may well have garbled things, but the data
in the story don't seem to prove (*) any excess death rates.

(*)     I'm using "prove" in its normal statistical sense, ie "prove
        at a 95% or better confidence level".

Books about SDI software

Thu, 21 Jan 88 22:07 EDT
I am going to be writing a report on the feasibility of the software for SDI.
Have any RISKS readers seen any good books or articles on the subject?  If so,
would you mind mailing me a reference, and maybe a few sentence abstract.  I
will post a complete list if anyone is interested.  Thanks in advance.

Dan Jones, dmj3@cisunx.uucp,

   [RESPONSES TO Dan, PLEASE.  Completed list from Dan to RISKS, please... PGN]
