The RISKS Digest
Volume 5 Issue 29

Saturday, 15th August 1987

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

RISKS submissions
PGN
Lack of user training = legal liability? — Computer SNAFU Ruled a Rights Violation
Rodney Hoffman
London Docklands Light Railway
Mark Brader
Software and system safety
Nancy Leveson
New safety MIL-STD
Nancy Leveson
Info on RISKS (comp.risks)

RISKS submissions

Peter G. Neumann <Neumann@csl.sri.com>
Sat 15 Aug 87 18:12:52-PDT
I submit that your submissions over the past weeks have driven me into
submission.  The quality of contributions and self-control exhibited by
contributors has dropped off radically.  Various complaints have been
received recently that I have suddenly become far too generous in including
too much irrelevant, unsound, incoherent, and otherwise marginal material.
Well, I currently have a HUGE backlog of messages for consideration — 30 in
the last four days — but most are minor variants of previous
contributions, and many of them will not emerge.  (Perhaps the recent deluge
is merely a summertime phenomenon?)  Well, due to my own heavily overloaded
schedule, RISKS will slow down for the next two weeks, when I will be
limited to a few short shots at NETland.  But please keep on submitting the
GOOD stuff — I'll get to it eventually.  And please excuse the slowdown.

By the way, I erred in including the MITRE notice in RISKS-5.28.  Please
don't expect me to do it again.

The responses to Denning/Saltzer/Hardie are running about three-to-one for
nonsecrecy about security flaws, with all sorts of caveats, hedges, special
cases, etc.  Logic tells you that you'd better know when — and how — YOU
are vulnerable.

The certification issue is not generating much enthusiasm either way.
(Perhaps certification is also needed for computer system users who have
authorization to act dangerously or fraudulently.)  The legal aspects of
screwing up may be a driving force toward certification to protect system
developers (or procurers?) if a big liability suit gets won by the injured
parties.  (The Rogan case that follows is small potatoes.  By the way, Donn
Parker notes that the Equity Fund case, with direct losses of $200 million,
is actually estimated at $2 billion total losses if indirect costs such as
stockholder losses are included.)

The standards issue for system safety is warming up again, but is also not
likely to generate too much enthusiasm.  (If you care, see the concluding
item in this issue.  It also implies all sorts of risks.)


Lack of user training = legal liability?

Rodney Hoffman <Hoffman.es@Xerox.COM>
14 Aug 87 11:07:01 PDT (Friday)
                                          [Also submitted by Frederick Bingham]

Excerpted from the Los Angeles Times, Aug. 13, 1987:

          COMPUTER SNAFU RULED A RIGHTS VIOLATION
    Wrong Man Repeatedly Detained on Murder, Robbery Charges
                   By Jack Jones

The Los Angeles Police Department violated the constitutional rights of
a Michigan man by continuing to list him in a nationwide crime
information computer system as wanted for murder and robbery, even after
it was clear the real suspect was using his name, U.S. District Judge
Robert Kelleher has concluded.

Terry Dean Rogan, 29, will now go to Los Angeles Federal Court to seek
monetary damages ....  Kelleher found that [the two] Los Angeles Police
Detectives... do not share liability with the city, even though on
several occasions they re-entered Rogan's name in the National Crime
Information Center system and failed to add new descriptive information
that would have established Rogan as the wrong man.

The city failed, Kelleher ruled, to properly train and supervise the
officers in the constitutional protection aspects of using the crime
center system and the necessity of adding more accurate information when
it becomes available.  The actual suspect, Alabama state prison escapee
Bernard McKandes,... began passing himself off as Rogan in 1981....

Rogan ... was first arrested in October, 1982.... He was held in jail
until a fingerprint check showed he was not the man being sought.
Nevertheless, ... Los Angeles detectives put his name back into the
computer, neglecting to add qualifying information, and he was arrested
four more times — mostly after routine traffic stops.  On two
occasions, he was taken into custody at gunpoint, handcuffed and jailed.
It was not until early 1984 that Rogan's name was removed from the
computer system — after a Saginaw News reporter told [one of the L.A.
detectives] that McKandes ... was back in prison in Alabama.
McKandes... subsequently was convicted of the California murder and
robbery charges.
                     [See ACM Software Engineering Notes 10, 3
                     (July 1985) for some further background.  PGN]


London Docklands Light Railway; Northern Line's Dot-Matrix Indicators

Mark Brader <msb@sq.com>
Thu, 13 Aug 87 12:44:03 EDT
The June issue of the British magazine Modern Railways carried this item:

     'UNAUTHORISED TESTS' CAUSED DLR CRASH

     The Managing Director of Docklands Light Railway Limited, Cliff Bonnett,
     has said the accident which occurred at Island Gardens station on
     10 March (Modern Railways, May) was primarily caused by unauthorised
     tests, carried out before required modifications had been carried out
     at the southern terminus.  The train, which ended up overhanging from
     the elevated track after crashing through buffers, would have been
     'arrested' if the protection system 'in its full and modified form'
     had been installed.  The train was being driven manually.

Incidentally, on the same page of the magazine is this:

     DOT MATRIX PROBLEMS

     A marked improvement in the performance of the Northern Line's
     gremlin-infested dot-matrix train indicators is promised by the
     autumn, but modified software cannot be satisfactorily commissioned
     for a year, says London Underground, while a new central computer
     at the line's Coburg Street, Euston control centre is awaited.
     Meanwhile, some indicators have lost the minutes-to-train-arrival
     feature, displaying only the order of train arrival.

Mark Brader, Toronto, utzoo!sq!msb


Software and system safety

Nancy Leveson <nancy@commerce.UCI.EDU>
Thu, 13 Aug 87 10:49:41 -0700
Two weeks ago I taught a 3-day continuing education class on software safety
at UCLA.  The makeup of the class says some interesting things about the
awareness of software safety issues in the U.S.  I was pleasantly surprised
to have 40 people enrolled (the average class size of software engineering
classes there is about 25).  Half the class was from outside California
which means their management was willing to invest money in sending people
across the country to take the class (implying some awareness and commitment
to the problem — the class itself also was not cheap).  It was also
interesting to note that although the majority of people came from the
aerospace industry (including someone from Morton Thiokol), there were 3
medical device manufacturers (all said that their attendance was directly
related to the Therac 25 incident — these accidents that you read about in
risks do have an effect, especially when lawsuits and media publicity are
involved); 2 commercial aircraft manufacturers and a manufacturer of
aircraft engines; the Air Force, Army, and Navy; a couple of firms that do
safety analysis on a contract basis; and one in entertainment (Walt Disney).

Many in the class identified themselves as safety engineers [see the
following message], but there were also software engineers and a fair number
of people who identified themselves as "software safety engineers" or
"software system safety engineers."  I was especially curious about the
software safety engineers and asked a few questions.  All but one had
previously been system safety engineers and had acquired this title within
the past few months.  One had been a software engineer previously and had
become a "software safety engineer" very recently.
                                                       Nancy Leveson

[Now I know where all the mickey-mouse computer systems are coming from.  PGN]


New safety MIL-STD

Nancy Leveson <nancy@commerce.UCI.EDU>
Thu, 13 Aug 87 13:31:50 -0700
   [This item may be boring to some of you, and important to others.  It is
   included for the record.  Comments To: nancy@ics.UCI.EDU, Cc:RISKS.  PGN]

A new change notice to a system safety standard (MIL-STD-882B: System Safety
Program Requirements) has just been released (July 1, 1987).  The surprise
is the amount of reference to software contained in it and the new tasks on
software safety included.  The following are some excerpts (there is lots
missing).  I am sending more information to Peter Neumann for inclusion in
the next issue of SEN.  I am curious about how most system safety engineers,
who are untrained in software engineering, will be able to accomplish these
tasks.  Since in most cases they will not, I would guess that many of these
requirements will be passed along to the software engineers to actually
perform.  The tasks could potentially also have other impacts on software
engineers working on safety-critical projects.  Those of you in the
aerospace industry should be aware of what is about to hit you and others
may find other government agencies following suit.

TASK 202 - PRELIMINARY HAZARD ANALYSIS
[includes] consideration of the potential contribution by software to 
subsystem/system mishaps, safety design criteria to control safety-critical 
software commands and responses and appropriate action to incorporate them in 
the software specifications, and software fail-safe design considerations.

TASK 203 - SUBSYSTEM HAZARD ANALYSIS
identify all components and equipments, including software, whose performance,
performance degradation, functional failure, or inadvertent functioning could 
result in a hazard or whose design does not satisfy safety requirements.

TASK 204 - SYSTEM HAZARD ANALYSIS
perform and document a system hazard analysis to identify hazards and assess 
the risk of the total system design, including software, and specifically of 
the subsystem interfaces.  

TASK 205 - OPERATING AND SUPPORT HAZARD ANALYSIS
requirements to evaluate hazards resulting from the implementation of 
operations or tasks performed by persons ... Includes identification of
changes needed in software to eliminate hazards or reduce their associated
risks along with warnings, cautions, and special emergency procedures
including those necessitated by failure of a software-controlled operation
to produce the expected and required safe result or indication.

TASK SECTION 300 - SOFTWARE SYSTEM SAFETY
[states that] Software System Safety is an integral part of the total System 
Safety Program.  The 300 series of tasks are recommended for programs which 
involve large or complicated software packages ...

TASK 301 - SOFTWARE REQUIREMENTS HAZARD ANALYSIS
The contractor shall examine systems and software requirements and design in 
order to identify unsafe modes for resolution, such as out-of-sequence, wrong
event, inappropriate magnitude, inadvertent command, adverse environment,
deadlocking, failure-to-command modes, etc. ... Software Safety Requirement 
Tracking ...  Analyze Software Requirements Specifications:... assure that the
System Safety Requirements are correctly and completely specified, that they
have been properly translated into software requirements, and that the software
safety requirements will appropriately influence the software design...
The contractor shall develop safety-related recommendations, and design and 
testing requirements and shall incorporate them in the Software Top-Level
and Software Detailed Design Documents, and the Software Test Plan.
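
To make the "unsafe modes" of Task 301 concrete, here is a minimal
illustrative sketch (not from the standard; the command, states, and limits
are all invented) of the kind of guard a derived safety requirement might
demand against an out-of-sequence command and an inappropriate magnitude:

    /* Illustrative sketch only -- not from MIL-STD-882B; all names are
     * hypothetical.  A guard against two of the unsafe modes Task 301
     * lists: out-of-sequence commands and inappropriate magnitudes. */
    #include <stdio.h>

    enum state { IDLE, ARMED, FIRING };
    static enum state current = IDLE;

    /* Reject a FIRE command unless the system has first been armed, and
     * reject thrust settings outside the assumed safe range of 0..100%. */
    int command_fire(double thrust_percent)
    {
        if (current != ARMED) {
            fprintf(stderr, "FIRE rejected: out-of-sequence command\n");
            return -1;                 /* fail safe: ignore the command */
        }
        if (thrust_percent < 0.0 || thrust_percent > 100.0) {
            fprintf(stderr, "FIRE rejected: inappropriate magnitude %.1f\n",
                    thrust_percent);
            return -1;
        }
        current = FIRING;
        return 0;
    }

The point is that the rejection paths themselves become traceable safety
requirements, which is exactly what the tracking called for above is meant
to capture.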

TASK 302 - TOP-LEVEL DESIGN HAZARD ANALYSIS
... analyze the Top-Level Design ... include definition and subsequent analysis
of Safety-Critical Computer Software Components (SCCSC), identify the degree
of risk involved, and the design and test plan to be implemented ... ensure
that all safety requirements are correctly and completely specified in the
Top-Level design. ... include analysis of input/output timing, multiple
event, out-of-sequence event, failure of event, wrong event, inappropriate
magnitude, adverse environmental, deadlocking, hardware sensitivities, etc.

TASK 303 - DETAILED DESIGN HAZARD ANALYSIS
...shall analyze the Software Detailed Design ... to verify the correct 
incorporation of safety requirements and to analyze the safety-critical CSCs
... includes relationships between safety-critical and other designated 
software at the CSCI, CSC, and lower unit levels (including subroutines,
data bases, data files, tables, and variables).  [It also includes the
requirement to include safety-related information in the Software User's
Manuals.] ... the contractor shall identify safety-critical computer
software units to the code developers, and provide them with explicit
safety-related coding recommendations and safety requirements from the
top-level specifications and design documents.

TASK 304 - CODE-LEVEL SOFTWARE HAZARD ANALYSIS
The contractor shall analyze program code and system interfaces for events,
faults, and conditions which could cause or contribute to undesired events
affecting safety... Analyze (1) safety-critical CSCs for correctness and
completeness, and for input/output timing, multiple event, out-of-sequence 
event, failure of event, adverse environment, deadlocking, wrong event, 
inappropriate magnitude, hardware failure sensitivities, etc. ... (4) proper 
error default handling for ... inappropriate or unexpected data in the input 
data stream, (5) fail-safe and fail-soft modes, (6) input overload or 
out-of-bound conditions.
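
As a small illustration of items (4) to (6), default handling of unexpected
input data, fail-safe and fail-soft modes, and out-of-bound conditions,
consider the following sketch.  The sensor, its range, and the safe default
are assumptions made purely for the example; nothing here comes from the
standard.

    /* Illustrative sketch only; the sensor, its range, and the safe default
     * are assumed.  Shows the kind of default error handling Task 304 asks
     * the analyst to look for: an out-of-bound or otherwise unexpected
     * reading is flagged and replaced by a known-safe value instead of
     * being propagated into the control loop. */
    #define TEMP_MIN          -40.0
    #define TEMP_MAX          150.0
    #define TEMP_SAFE_DEFAULT  20.0

    double read_temperature(double raw, int *fault)
    {
        if (raw != raw || raw < TEMP_MIN || raw > TEMP_MAX) {
            *fault = 1;                 /* NaN or out of range */
            return TEMP_SAFE_DEFAULT;   /* fail soft: continue on a safe value */
        }
        *fault = 0;
        return raw;
    }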

TASK 305 - SOFTWARE SAFETY TESTING
The contractor shall test the software to ensure that all hazards have been
eliminated or controlled to an acceptable level of risk.  Implementation of
safety requirements (inhibits, traps, interlocks, assertions, etc.) shall be 
verified.  The contractor shall verify that the software functions safely both 
within its specified environment, and under abnormal conditions.  
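
A minimal test in that spirit, again purely illustrative, might drive the
hypothetical input-validation routine sketched under Task 304 with abnormal
values and check that the fail-safe path is taken:

    /* Illustrative sketch only.  Assertions stand in for the "inhibits,
     * traps, interlocks, assertions" the task mentions; the routine under
     * test is the hypothetical one sketched under Task 304. */
    #include <assert.h>

    double read_temperature(double raw, int *fault);

    int main(void)
    {
        int fault;

        /* Normal reading passes through unchanged. */
        assert(read_temperature(25.0, &fault) == 25.0 && fault == 0);

        /* Out-of-bound reading is replaced by the safe default and flagged. */
        assert(read_temperature(999.0, &fault) == 20.0 && fault == 1);

        return 0;
    }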

TASK 306 - SOFTWARE/USER INTERFACE ANALYSIS ...

TASK 307 - SOFTWARE CHANGE HAZARD ANALYSIS
The contractor shall analyze all changes, modifications, and patches made to
the software for safety hazards, to include the following: All changes to
specifications, requirements, design, code, systems, equipment, and test
plans, descriptions, procedures, cases, or criteria shall be subjected to
software hazard analysis and testing, unless it can be shown to be
unnecessary due to the nature of the change... the contractor shall show
that the change or patch does not create a hazard, does not impact on a
hazard that has previously been resolved, does not make a currently existing
hazard more severe, and does not adversely affect any safety-critical
computer software component or related and interfacing code... The
contractor shall review the affected documentation, and ensure that it
correctly reflects all safety-related changes made.
