The Risks Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 9 Issue 59

Wednesday 10 January 1990


o Drawbridge opens without warning in rush-hour traffic
Jon Jacky
o Massive Electrical Failure in a Bus
Peter Jones
o What hung the computer?
o Passwords and security
Phil Ritzenthaler
Henry Spencer
Jerry Leichter
Peter da Silva
o IEEE Symposium on Research in Security and Privacy, Oakland 1990
o Info on RISKS (comp.risks)

Drawbridge opens without warning in rush-hour traffic

Tue, 9 Jan 1990 22:53:19 PST
This isn't a computer incident, but it contains several system safety lessons
that may be of interest to readers of this digest.  The accident has been
discussed in the Seattle papers almost every day since December 22.  The
following excerpts are from one of the stories that describe the cause, in
THE SEATTLE TIMES, Monday January 8, 1990, p. 1:


... Engineers found that a screw driven into one wire and a short in another
wire caused last month's fatal opening of the (Evergreen Point Floating
Bridge's) drawspan.

(The inspectors will report) to the State Department of Transportation (DOT) on
Friday, said Richard Hearth, one of two engineers from the consulting firm
of Parsons, Brinckerhoff, Quade and Douglas Inc. who inspected the faulty
Highway 520 bridge Saturday and yesterday.

The two shorted wires, one on the east drawspan and one on the west span,
combined to send current around safety systems and lift the west section of
the drawspan during a routine test December 22.

Several cars crashed into the unexpectedly rising bridge deck. ... One (driver)
was killed.  Five people were injured in the morning rush hour accident. ...

"Each fault individually did not cause the problem," said Patrick Buller, an
electrical engineer from the (Washington State) Patrol who took part in the
investigation.  "It took both problems simultaneously for the bridge to rise,"
he said.

Power from the burned cable on the east drawspan probably reached the west
side either through the metal of the bridge deck or along the metal walls of
the submerged cables which connect the two movable spans with the control
tower, said Daryl Rush, bridge maintenance supervisor for the DOT.

Picking up that current, the short on the west side, where a 3/8-inch sheet metal
screw holding an electrical box cover had pierced a wire, sent power to motors
that lifted the west span, Hearth said.

The routine tests during which the accident occurred had until Dec. 22 been run
almost weekly without incident for 18 years.

Ironically, it was a tiny heater designed to keep moisture from damaging
circuits in an electrical relay that burned plastic insulation from one of the
wires, causing the short circuit on the east span.  Such heaters protect many
of the bridge's electrical devices.

During a bridge opening, the device housing the burned wire senses the rotation
of a gear in order to tell operators how far the east draw span has retracted.

At Step 10 in the 18-step dry-run test that was going on the day of the fatal
accident, the device received power.  Though the bridge does not open during
any part of the test procedure, the shorted wire allowed power to reach motors
on the west span.

For the Evergreen Point Bridge to open, grated sections of the bridge deck
rise, allowing sections of the roadway to slide back for the passage of ships.

[ Here are further excerpts from THE SEATTLE POST-INTELLIGENCER, January
8, 1990, p. 1, ELECTRICAL SHORTS LIFTED SPAN by Mike Merrit ]

One move transportation officials made to ensure such an accident doesn't happen
again is to stop the "dry run" tests of the bridge's control systems.  Future
tests will be done only with the span closed to traffic. ...

Hearth declined to comment on whether the bridge should have been designed with
automatic mechanisms to prevent activation of its lift-deck motors unless the
span was closed to traffic.  "In hindsight, the question is a good one," (John)
Stevenson (assistant district administrator for DOT) said.  "But when the
design was being completed, it's hard to know what factors were taken into
account."  ... The span was built across Lake Washington in 1962.

... The west span short circuit, caused when a tiny screw touched wires inside
a metal box, had probably existed for some months or years.  The second short
circuit ... had probably occurred gradually over several months or a year ...
(a) heater apparently melted insulation from one of a bundle of wires.
Sometime between tests run Dec. 15 and Dec. 22, when the malfunction occurred,
enough insulation melted away to let the bare wire touch the metal cover plate.

... "The wire (with the melted insulation) is in such a place that it's not
real visible," said Ernest Frye, who oversees inspections as the bridge's lead
maintenance technician. ...  (the assembly containing the wire) is inspected
annually ... the latest inspection was April 9, 1989.  But the damaged wire is
one of a bundle of wires under a metal cover plate that is not usually
removed in routine inspections.

Charles Mayhan, state movable bridge engineer, said he did not do a detailed
examination of the ... mechanism while inspecting the bridge earlier this
year.  "If I took the time to do all that, I wouldn't be able to do more than
one bridge a year," he said.

[ The consulting engineers will recommend installation of devices to warn
operators of short circuits ]  Devices to detect small short circuits weren't
widely used when the bridge was designed in the 1950's ...

- Jonathan Jacky, University of Washington
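The two-fault mechanism described in the excerpts can be sketched as a toy
boolean model (illustrative only; the names are assumptions, not the bridge's
actual control logic):

```python
def motors_energized(east_heater_short, west_screw_short, sensor_powered):
    """Toy two-fault model: stray power reaches the lift motors only when
    BOTH latent shorts are present while the test step powers the sensor."""
    stray_power = sensor_powered and east_heater_short   # burned wire leaks power east
    return stray_power and west_screw_short              # screw short routes it to motors

# Either fault alone is harmless; together, at Step 10 of the test, the span lifts.
assert not motors_energized(True, False, True)
assert not motors_energized(False, True, True)
assert motors_energized(True, True, True)
```

The model makes the safety lesson concrete: weekly tests for 18 years exercised
only the single-fault cases, which are all safe.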

Massive Electrical Failure in a Bus

Peter Jones <MAINT@UQAM.bitnet>
Sun, 7 Jan 90 03:01:03 EST
On Friday Jan 5, 1990, as I was returning from a day's skiing, the electrical
system of the bus I was riding in suddenly went dead. The driver was forced to
pull over on the side of the expressway. Subsequent investigation showed that
the cause of the problem was a loose connection of the power cable leading to
the starter. This cable can carry several hundred amperes of starting current;
power is actually delivered to the starter motor via a nearby relay controlled
by the ignition key. According to the driver, the cable came off a loose
connection at the starter and shorted the high side of the battery to the
bottom of the oil pan, burning a hole and spattering oil on the engine.
Luckily, no fire resulted.

The driver got out and walked back to an emergency phone placed at the side of
the highway. He returned, saying that instead of being connected to the police,
he had heard a recorded message saying the phone was "not in a designated
area", and giving another number to dial. But the phone had no dial on it!

A passing motorist was able to help the driver by relaying a message to the
police.  As a result, the bus company was notified, and sent another bus to get
us home.

Some further observations:
1) The headlights and AM/FM radio still worked after we pulled over, but not
   the rear flashers or rear lights. The driver put a flare behind the bus.
2) Installation of a two-way radio was planned for the following Monday. But
   I wonder if it could operate independently of the electrical system of
   the bus, in an emergency.
3) The extreme cold weather had ended, so that no-one felt cold.

The part that is computer system related is the part about the roadside phones
giving an "error message" that the user could not respond to. I am reminded of
the customer-operated touch-screen POS terminal reported in RISKS that gave a
software error message and asked the user (customer) to hit return on a
keyboard that presumably had only been present during testing.

I see this situation as a case of losing track of the system's behaviour as
seen by the user. If you can get connected to a recording, then why not the
police station? In a potentially life-threatening situation, no other system
response is acceptable, in my view. Remember the Woody Allen film where an
attempt to call the 911 police number results in an "all lines busy" voice
recording, followed by a disconnect? This case was no joke. Nor did it seem to
have been a failure of an individual phone, which wouldn't be worth mentioning.

There's also the issue of the primary failure (electrical failure) resulting in
a failure of systems to prevent further damage (the flashers).

So, while this tale appears marginally related to computers, I feel the system
ergonomics and reliability design issues are very pertinent, and easy to
visualize in this case.

Peter Jones     MAINT@UQAM     (514)-987-3542
"Life's too short to try and fill up every minute of it" :-)

What hung the computer?

Tue, 09 Jan 90 23:16:55 -0800
A couple of nights ago I was working away merrily on my Macintosh with two cats
plopped all over the table as usual. For some reason, while I was working in a
usually reliable program, the computer froze. Couldn't figure it out; all I was
doing was typing. While I was still thinking, the Mac suddenly unfroze, and the
icon for the CDROM drive popped up on the desktop. It takes more than a minute
for this particular CDROM to come online, during which time the Macintosh is
busied out, hence the inexplicable delay.

Who had pushed in the CDROM? A cat. One of them woke briefly, stretched,
changed position, and in doing so shoved the disk into the drive. Since I am
quite used to cats stretching I didn't bother to watch him and had no idea he
needed the disk online. I was already cursing the software for crashing the
machine until the icon showed up.

Part of the blame for this confusion can be laid squarely on Apple.  Their own
Human Interface Guidelines specify that the pointer should be replaced with a
watch for a "lengthy operation" (Inside Macintosh p.I-37). They often follow
their own guidelines, but not in this case.
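The intent of the guideline can be sketched as a small busy-cursor wrapper (a
Python sketch of the pattern only; the `Display` class and cursor names are
hypothetical, not the actual Macintosh Toolbox API):

```python
from contextlib import contextmanager

class Display:
    """Hypothetical stand-in for a GUI toolkit's cursor state."""
    def __init__(self):
        self.cursor = "arrow"

@contextmanager
def busy_cursor(display):
    """Show a 'watch' cursor for the duration of a lengthy operation,
    restoring the previous cursor even if the operation fails."""
    previous = display.cursor
    display.cursor = "watch"
    try:
        yield
    finally:
        display.cursor = previous

display = Display()
with busy_cursor(display):
    # ... lengthy operation, e.g. waiting for a CD-ROM to spin up ...
    assert display.cursor == "watch"   # user sees "busy", not a frozen machine
assert display.cursor == "arrow"       # normal cursor restored afterwards
```

Had the CD-ROM mount followed this pattern, the freeze would have announced
itself as a wait rather than looking like a crash.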

   [Another hidden side-effect of the Unix "cat" command?  If blank text had
   been inserted, you might have thought the system had gone tabby.  Part of
   the problem is the emphasis on just "human" interfaces.  But perhaps your
   cat now has a Macintush.  Therein lies a tail.  PGN]

Re: The risks of not learning (Password security, RISKS-9.57)

Phil Ritzenthaler <>
5 Jan 90 18:33:10 GMT
> One student's answer:  "Passwords have the advantage that it's easier to let
> my friend use my account.  If the system uses biometric devices for user
> authentication, I have to be there when he logs in, to get him on to the
> system.  But, if the system uses passwords, I can just tell him my password,
> and don't have to be there - he can login at his leisure."

GOOD LORD IN HEAVEN!!  And these are going to be our System Managers and
administrators of the FUTURE??

Heaven help us!!

Phil Ritzenthaler  The Advanced Computing Center for the Arts & Design (ACCAD)
Systems Manager    The Ohio State University

Re: The risks of not learning?

Tue, 9 Jan 90 14:58:28 EST
>One student's answer:  "Passwords have the advantage that it's easier to let
>my friend use my account..."
>So much for that one-week unit on 'Individual Accountability'.

I hope you also spent some time talking about how formal mechanisms often
imperfectly capture the real nature of human organizations, and about the
negative effects on security when overly-rigid formal mechanisms lead to the
evolution of informal "bypass" mechanisms?  This is a perfect example.  Anyone
who has ever loaned someone else a key knows that there *are* real requirements
for such things, accountability problems notwithstanding, when the overhead of
"doing it right" is high.
                                     Henry Spencer at U of Toronto Zoology

Re: Password sharing (RISKS-9.58)

Tue, 9 Jan 90 16:20 EST
In an editorial remark added to my recent RISKS contribution, our Moderator
wrote:
    If you want individual accountability, you do not share passwords.
    The telephone system is intrinsically vulnerable to illicit use of
    credit cards.  If you wish to trust a friend, that may have low risk
    -- unless MANY people are involved; then you might like to have
    records that show who is actually using your resources....

This continues to miss the point I was making.

Consider the following two situations:

    1.  The system manager on my machine assigns a shared account to
        me and five other people (who, let us say for the sake of
        argument, are my friends).

    2.  I let the same five friends know what my password is.

Are these equivalent?  The answer is absolutely not:  While the "observable
facts" may look the same - the same people have the same information - the
RESPONSIBILITIES are radically different.  In case 1, if one of the six people
with access to the account abuses it, I will be able to deny responsibility -
as will the five others.  This is "lack of individual responsibility".

However, in case 2, if one of the six of us abuses the account, it is clear
who bears the responsibility:  Me.  My five friends are acting as my agents,
and I can complain all I like that it was they who acted - in giving them my
password, I implicitly accepted the consequences of their actions.

Now, the problem is that those of us with an interest in security can easily
see ourselves in case 1 - and we've learned that case 1 represents bad, if
all too common, system management policy.  (How many times have we heard of
systems in which each account costs money, so any number of users are assigned
to the same account to keep the accountants happy?)  However, we have much
more difficulty imagining ourselves in case 2, since WE certainly would never
let others know our password.

However, it is exactly case 2 which we are discussing here!  As long as it is
understood that the "owner of record" of an account remains responsible for
it no matter who he allows to use it - just as the owner of a telephone must
pay the charges his friends run up on it - I can see no issue of "individual
accountability" here.

The law, and our common sense of ethics, has been dealing with lending in this
way for a long, long time.  Why should computer accounts be treated in any
different way?  Admittedly, it may be worthwhile to point out to people that
lending their accounts has a greater potential for abuse than lending their
shirts, or whatever - exactly because passwords as a means of identification
are freely reproducible.  People understand this; they are more willing to lend
a key than to "lend" the combination to a lock.
                            -- Jerry
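The distinction Leichter draws can be sketched as a toy model (names and
fields are illustrative, not a real system's API):

```python
def answerable_for(account):
    """Toy model: who answers when an account is misused?"""
    if account["kind"] == "shared":
        # Case 1: a jointly assigned account -- every holder can deny
        # responsibility, so no individual is accountable.
        return None
    # Case 2: a personal account whose owner lent out the password --
    # the owner of record answers for the acts of those agents.
    return account["owner"]

shared   = {"kind": "shared", "holders": ["me", "f1", "f2", "f3", "f4", "f5"]}
personal = {"kind": "personal", "owner": "me"}

assert answerable_for(shared) is None       # no individual accountability
assert answerable_for(personal) == "me"     # the lender bears it all
```

The "observable facts" (six people know the password) are identical in both
dictionaries; only the ownership structure, and hence the accountability,
differs.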

Password sharing

Tue, 9 Jan 90 21:02:15 EST
It has always seemed to me that there is no point in trying to prevent people
from `lending' their passwords to others -- they'll do it anyway.

A better administrative strategy might be to make it plain to people that they
will be held responsible for any act committed under their login, whether they
actually did it or not.

Re: Password sharing (RISKS-9.58)

Wed, 10 Jan 90 10:09 CST
>    [Yes, some people are concerned about computer security and others are not
>    -- at least not until they've been burned.  But Reality and Sensibility
>    are in this case two radically different things.  If you want individual
>    accountability, you do not share passwords. ...

This is all true, but the amount of security you need depends on many things.
For example, an account you pay money for might be a different matter than
an account on a local BBS, which again is different from an account on a
friend's UNIX box. Work and school accounts are another matter.

I wouldn't casually hand out passwords here at work, but I might give someone
my compuserve password... and I'd change it later. I'd be even more likely
to give someone a password for a local BBS, and probably wouldn't bother to
change it.

Another variant is who you give the password to. You might be willing to swap
passwords to class accounts at school, or project accounts at work, but you
probably wouldn't hand out your CI$ password to your co-workers.

So, having the ability to lend access to a resource is valuable, and is not
always something to be feared. Security and convenience being opposed goals,
it's important to gauge the level of security to the value of the resource in
question.
                               Peter da Silva +1 713 274 5180

Security and Privacy, Oakland 1990

"Peter G. Neumann" <>
Tue, 9 Jan 1990 15:15:37 PST
     1990 IEEE Symposium on Research in Security and Privacy
                         ADVANCE PROGRAM
                         4 January 1990

MONDAY, MAY 7, 1990

  0845-0900 Welcome, Introduction, Deborah Downs, Deborah Cooper

  0900-1030 Commercial A1, Steve Lipner, Chair

A VMM Security Kernel for the VAX Architecture: Status Update, Paul Karger,
Mary Ellen Zurko, Douglas W. Bonin, Andrew H. Mason

An Architecture for Practical Delegation in a Distributed System, Morrie
Gasser, Ellen McDermott

Practical Authentication for Distributed Computing, John Linn

  1100-1200 A1 SOS, Elizabeth Sullivan, Chair

The Army Secure Operating System, Neil Waldhart

Specification and Verification of the ASOS Kernel, Ben L. Di Vito, Paul H.
Palmquist, Eric R. Anderson, Michael L. Johnston

  1330-1500 Database I, Teresa Lunt, Chair

Integrating an Object-Oriented Data Model with Multilevel Security, Sushil
Jajodia, Boris Kogan

A Little Knowledge Goes a Long Way: Fast Detection of Compromised Data in 2-D
Tables, Dan Gusfield

Extending the Brewer-Nash Model to a Multilevel Context, Catherine Meadows


  Database II, Earl Boebert, Chair

Polyinstantiation and Integrity in Multilevel Relations, Sushil Jajodia, Ravi

Naming and Grouping Privileges to Simplify Security Management in Large
Databases, Robert W. Baldwin

Referential Secrecy, Rae K. Burns


  1730-1930 Reception
  2000-2400 Poster Sessions, Hospitality Suite

TUESDAY, MAY 8, 1990

  0900-1030 Information Flow, John Rushby, Chair

Information Flow in Nondeterministic Systems, J. Todd Wittbold, Dale M. Johnson

Constructively Using Noninterference to Analyze Systems, Todd Fine

Probabilistic Interference, James W. Gray, III

Security Models and Information Flow, John McLean


  1100-1200 Access Control and Integrity

Beyond the Pale of MAC and DAC -- Defining New Forms of Access Control,
Catherine Jensen McCollum, Judith R. Messing, LouAnna Notargiacomo

Some Conundrums Concerning Separation of Duty, Michael J. Nash, Keith R.

  1330-1500 Authentication, Tom Berson, Chair

SP3 Peer Entity Identification, Bill Birnbaum

The Role of Trust in Protected Mail, Martha Branstad, W. Curtis Barker,
Pamela Cochrane

A Security Architecture and Mechanism for Data Confidentiality in TCP/IP
Protocols, Raju Ramaswamy

Reasoning about Belief in Cryptographic Protocols, Li Gong, Roger Needham,
Raphael Yahalom

  1530-1650 Verification, Deborah Cooper, Chair

The Deductive Theory Manager: A Knowledge Based System for Formal Verification,
Ben Di Vito, Cristi Garvey, Davis Kwong, Alex Murray, Jane Solomon, Amy Wu

Formal Construction of Provably Secure Systems with Cartesiana, Heinz Brix,
Albert Dietl

Verifying A Hardware Security Architecture, Joshua D. Guttman, Hai-Ping Ko

A Hierarchical Methodology for Verifying Microprogrammed Microprocessors

  1700-1800 Technical Committee Business Meeting

  2000-2400 Poster Sessions, Hospitality Suite


WEDNESDAY, MAY 9, 1990

  0910-1020 Auditing and Intrusion Detection, Jim Anderson, Chair

The Auditing Facility for a VMM Security Kernel, Kenneth F. Seiden, Jeffrey P.

Adaptive Real-time Anomaly Detection Using Inductively Generated Sequential
Patterns, Henry S. Teng, Kaihu Chen, Stephen C-Y Lu

Auditing the Use of Covert Storage Channels in Secure Systems, Shiuh-Pyng
Shieh, Virgil D. Gligor

  1020-1030 Presentation of Awards

  1030-1100 Break

  1100-1200 Database III, Cristi Garvey, Chair

Transaction Processing in Multilevel-Secure Databases Using Replicated
Architecture, Sushil Jajodia, Boris Kogan

Multiversion Concurrency Control for Multilevel Secure Database Systems, T.F.
Keefe, W.T. Tsai

Modeling Security-Relevant Data Semantics, Gary W. Smith

  1330-1700 Panel Discussions -- To Be Announced
