The RISKS Digest
Volume 3 Issue 39

Tuesday, 19th August 1986

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Nuclear false alarm
Robert Stroud
Risk to beer production?
Robert Stroud
Re: High Tech Sex
Lindsay F. Marshall
QA on nuclear power plants and the shuttle
Roy Smith
Hackers in BITNET
Sterling Bjorndahl
Info on RISKS (comp.risks)

Nuclear false alarm

Robert Stroud <robert%kelpie.newcastle.ac.uk@Cs.Ucl.AC.UK>
Mon, 18 Aug 86 17:56:25 bst
"BT is blamed for HM's displeasure" (Computing, August 14th 1986)
by Angus McCrone (c) Computing

British Telecom (BT) is being blamed for a network fault which caused
nuclear attack sirens in Edinburgh to blare into action last month.  The
sirens disturbed thousands of people at 7.30 in the morning.  The incident
coincided with a visit by the Queen and Margaret Thatcher to watch the
Commonwealth Games.

A spokeswoman at the Home Office, which has the responsibility for civil
defence in the UK, said that BT was checking a carrier control unit in
Edinburgh. This is believed to have malfunctioned, causing the alarm to go
off.  The carrier control unit, one of about 250 around the country, has the
job of connecting the Ministry of Defence's air operations computer centre
and local police stations which activate the alarm.

The Home Office has ruled out computer error as a reason for the mistake,
and seems convinced that human error or sabotage were not involved either.
This is despite the fact that no similar mistakes have been recorded in the
past 12 years, and that the incident happened at the height of a
controversial visit to Scotland by the Prime Minister.  A BT official
confirmed that a report on the alarm had been sent to the Home Office, but
would not say whether his company accepted responsibility for the mistake.

In time of war the Home Office consults with the MoD before ordering police
stations to switch on the alarms, which warn citizens to expect air or
nuclear attack.  The incident in Edinburgh last month caused little panic
because most people switched on their radios to check there was no real
emergency.


Risk to beer production?

Robert Stroud <robert%kelpie.newcastle.ac.uk@Cs.Ucl.AC.UK>
Mon, 18 Aug 86 17:52:47 bst
"Minor bug starts mass beer panic" (Datalink, August 11th 1986)
by Dave Holmes (c) Datalink

A holidaying programmer sparked off a bizarre series of events last week
culminating in a rumour that Tetley's, Yorkshire's most famous brewery, had
stopped production.  Workmates had realised that they needed the advice of
the programmer, Richard Chubb, to sort out a problem with the control system
he was developing for Tetley.  Police were asked to help track him down on
holiday in Scotland, but a telex from Strathclyde police to seven Scottish
police divisions apparently suggested that the brewery had stopped
production because of a computer breakdown.

News of this got back to Yorkshire and last weekend Tetley was deluged with
calls from worried publicans afraid that supplies of Yorkshire's finest were
about to dry up.  David Gaskill, of the engineering company Turkington, which
was installing the control system, explained what had happened: "There was a
communications glitch between two systems we are installing at Tetley, and
the program is not fully documented yet. To go through the code was going to
take ages, but Richard could have sorted it out in 20 minutes," he said.


Re: High Tech Sex

"Lindsay F. Marshall" <lindsay%kelpie.newcastle.ac.uk@Cs.Ucl.AC.UK>
Tue, 19 Aug 86 15:21:58 bst
The interesting question it raises is what has happened to the
information on the database.  Has it been destroyed, or has it been
incorporated into the Police computer records?
                                                      Lindsay

     [The implication of the article was that indeed the records had been
      confiscated.  With a shredder in the office, it could have been what
      was on the diskettes — but more than likely there were simply 
      printouts lying around.  PGN]


QA on nuclear power plants and the shuttle

Roy Smith <allegra!phri!roy@seismo.CSS.GOV>
Tue, 19 Aug 86 11:50:39 edt
    Last night I watched "The China Syndrome" on TV.  For those of you
not familiar with this moderately trashy movie, it's about the threat of a
meltdown at a nuclear power plant.  It seems that when the plant was built,
the X-ray testing of the welds was faked, so a bad weld went unnoticed
(causing a pump to fail, etc.).
                                 [That was taken from some real cases...  PGN]

    Anyway, at one point, the hero exclaims, "but our quality control
is second only to NASA's!"  Shows you the RISKS of making comparisons,
doesn't it?  Do nuclear plants have O-rings?

                                 [No, but they do have lots of reports of 
                                  equipment failures and human errors that
                                  don't seem to get wide public view.  PGN]


Hackers in BITNET

<BJORNDAS%CLARGRAD.BITNET@WISCVM.WISC.EDU>
18 AUG 86 12:43-PST
The following is an abridged version of an article from issue 3.3 of
VM-COM, an e-magazine published and distributed in BITNET.  It has been
edited, with permission, by Sterling Bjorndahl (BJORNDAS@CLARGRAD).

               Life in the Fast Lane:  Column #2

                  Chris Condon BITLIB@YALEVM

   There are hackers in BITNET.  You aren't surprised, I'm sure.  Now, not
all hackers are slavering, demented animals waiting to break into, crash,
and destroy systems, illegally using their resources, plundering userids
that are not their own, and making a general mess out of everything.
   Only some are.
   There exists in this network a group of hackers who broke into a userid
at Fermilab via BITNET.  They used the RELAY conference machine system to
keep in contact. Administration types at Cornell University, hearing of
this, came to this conclusion:

   "The Cornell Relay has been shut down forever due to the misuse of BITNET by
   some hackers in West Germany who discussed their trade on the Relay.  It is
   Cornell's desire to not be associated with the Relay system in the future..."

   The reaction by these people might seem a bit extreme, but it could be
even worse.  There are some people in BITNET who would like to see students
completely banned from the network, or chatting banned from the network, or
both.  These are people to be reckoned with. They are in positions of power
to do such things at their own nodes, given enough reason.  For Cornell, the
hackers breaking into Fermilab turned out to be an excellent excuse. It need
not be anything so extreme.
   Our actions are a reflection on the students in BITNET.  It has been said
(not enough) that BITNET usage is a privilege.  It brings with it a great
responsibility.  Everything we do may have far reaching effects without our
knowing it.  The hackers that broke into Fermilab were not from Cornell, had
no intention of getting that Relay shut down, and probably did not consider
that it would happen.
   I posted a notice on this subject for the Usage Guidelines Group via
LISTSERV@BITNIC.  These are some of the responses (names withheld):

A. "The problem, as I see it, stems from a lack of moral and ethical 
   standards in the computer world, as well as the natural inquisitiveness 
   of young people specifically and computer type people generally."

[I disagree that "computer types" have any worse ethical standards
than the bulk of this society. They just have a lot of power. - S.B.]

B. "I don't know what, if any, audit trail is left from interactive traffic on
   the net. If there isn't any, I think there ought to be and installations
   with security concerns about chatting should monitor the traffic for
   suspicious activity."

C. "A totally restrictive policy, one that makes absolute and unbending
   restrictions, especially to undergraduate students, will have two effects.

     1: Those persons who are borderline on being responsible or abusive 
        with the system may just go the wrong way, partly in reaction to
        their perception of a "cold-hearted" system.

     2: Students will lack (unless they break in and get away with it, which
        is what we try to prevent) a practical education in how real-life
        computers are implemented.  I know these things to be true from
        first hand experience, because I used to be such a hacker. I did get
        away with it and I did learn enough to go right into an upper level
        systems programming job right out of school... The school I attended
        had a very closed policy.  They were, however, not effective in
        implementing that policy, and so some of us got into the system."

D.     "My suggestion is that a policy be established to deal
        [constructively] with "curious students" who show promise.  Just
        how you do this has to depend on your resources."

   Like it or not, someone is looking over your shoulder.  Maybe you
won't get caught when you do something irresponsible via BITNET, but
somebody will pay the consequences.  Somebody out there is looking for
an excuse to shut you, or some other student, out of BITNET...  The
actions of some students have simply led him to believe that shutting
students out is a good thing.  It will take your example to convince
him otherwise.
