The RISKS Digest
Volume 4 Issue 50

Monday, 23rd February 1987

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Principles of RISKS
James H. Coombs
"Demon computer"
PGN
NSA Risks
Alan Wexelblat
Results of a recent security review
Mary Holstege
Electronic steering
Kevin J. Belles
Rick Sidwell
Kevin Oliveau
Mark L. Lambert
Info on RISKS (comp.risks)

Principles of RISKS

"James H. Coombs" <JAZBO%BROWNVM.BITNET@wiscvm.wisc.edu>
Sun, 22 Feb 87 23:32:07 EST
I have been reading RISKS for a while now and find that I have absorbed
some healthy principles.  I recently developed a program that relies on
coding to determine how to manipulate data.  After completing the first
version, I found that I had a much higher proportion of coding errors
than expected.  Since I do not want to proofread reports microscopically,
I decided to rewrite the program to analyze the data automatically (as
much as practical).  I then considered doing away with the hand-coding
altogether, but the RISKS of relying excessively on computer programs to
determine the "right results" quickly came to mind.  The second version
of the program uses the results of its own data analysis, but warning
messages are issued wherever that analysis disagrees with the coding.
This seems to me an ideal solution, and I am sure that I would not have
arrived at it so readily if I had not been reading RISKS.

I now face situations in which the program has minor deficiencies that I
tend to ignore.  Knowing that the hand-coding is right and that it would
take several hours to upgrade, I prefer to see a few warning messages
flow by.  It reminds me of all of the people who have ignored warning
lights and buzzers to their detriment, and I am preparing to work on the
program to ensure that I do not become insensitive to the warnings.

So, thank you for moderating RISKS, Peter, and thanks to all of the
contributors.  People occasionally complain about postings that do not
deal directly and exclusively with computers; in my experience, these
postings help prevent problems in software design.  --Jim

James H. Coombs, Mellon Postdoctoral Fellow in English, Brown University

   [Some of you look upon RISKS as a collection of anecdotes and nothing
    more.  The old principled codger that I am, I always look for the
    underlying principles, which you are (sometimes subliminally)
    continually confronted with when you read RISKS.  I think it would
    get tiresome to our readers if I called out the principles related
    to each contribution, but you don't have to read too carefully
    between the lines.  The "relevance" issue amuses me, because principles
    can be derived from or applied to cases in which the computer link is
    only marginal.  Many of the same principles apply irrespective of the
    degree of computer involvement.  PGN]


"Demon computer"

Peter G. Neumann <Neumann@CSL.SRI.COM>
Sun 22 Feb 87 19:01:09-PST
Once in a rare while we turn to that wonder of sources, the Weekly World
News, for a different kind of news item.  (An earlier one was the Chinese
computer developer who was electrocuted by his old computer after he built a
new one.  The WWN's shtick there involved his wife blaming the death on
jealousy on the part of the artificially intelligent computer, which had been
programmed to have human-like emotions.)  The issue of 3 March 1987 had on
page 3 the tale of a bank in Valparaiso, Chile, that had recently installed
$7.3 million worth of
computer equipment, including 13 terminals.  One of these terminals was used
by three different people who met extreme misfortune in some strange way --
deaths of two employees and the brain-dead coma of a third.  One of the
deaths was attributed to a massive stroke, the other to "unknown causes".
``At first we decided to remove the terminal'', said Jorge Montalabo (VP of
customer relations).  ``But the workman who came to carry it away fainted
when he tried to unplug it from the system.''  Since no workers will now go
near the terminal, the bank is apparently going to try exorcism! ``If the
exorcism doesn't work and someone else dies while using the terminal, we'll
have to scrap all of our computers and spend millions getting a new system.
Otherwise no one will work here.''

The issue here is of course not whether the computer terminal is possessed,
but whether there could have been some harmful attribute of that particular
terminal (electromagnetic or isotopic radiation, etc.).  (Presumably
shutting down the entire system and removing that terminal would have made
sense...)  On the human side, it is not surprising that such a sequence of
events would have such a profound effect on the surviving computer
personnel.

It would be easy to discredit this story on the basis of other off-the-wall
stories found by the WWN.  Although I don't think RISKS should indulge in
rampant speculation, it does seem plausible that some physical phenomenon
could have been involved -- electric currents, radiation emissions, etc. --
and thus some open-minded curiosity about this case is in order.  I wonder
whether there are any RISKS readers in South America who could provide any
solid information on this case!  PGN


NSA Risks

Alan Wexelblat <wex@MCC.COM>
Mon, 23 Feb 87 10:21:25 CST
One thing I'm surprised no one mentioned is the RISKS being discovered at
the NSA in the ongoing Iran-Contra affair.  The NSA spooks (and the
administration) seem to be getting burned because of *too much* backup.

For example, the false chronology of events reported by Regan and Reagan
during their testimony to Congress was discovered only when someone made
available the NSA's massive computer archive.  Apparently every file, mail
message, etc., ever created on the NSA's computers is archived there at time
of deletion.  It seems that most NSA people were not aware of the archive
(or had forgotten about it).  Messages in this archive showed how North,
Casey, and Poindexter had concocted the false story.

Just recently, Oliver North's secretary turned over to the Tower commission
floppies containing undeleted copies of the memos that North et al. had
carefully shredded before the investigators arrived.


Results of a recent security review

Mary Holstege <HOLSTEGE@Sushi.Stanford.EDU>
Mon 23 Feb 87 10:25:19-PST
Findings of a recent security review:

The environment here is a computer software vendor, which also has a number
of timesharing customers using the same computer as the programmers.
Customers rely on the computer for accounting and inventory control
applications.

I should first say that people at the company at which this security review
took place were of the general impression that their system, and in
particular the directories containing their proprietary programs and
sensitive customer data, was quite secure.  The combination of case histories
garnered from RISKS and a recently-terminated employee prompted the review,
but the general consensus before it started was that there was no cause for
concern.  A couple of programmers griped about having their "time wasted" in
such a "silly exercise."  That consensus turned out to be mistaken.

Although it is unlikely that a breakin actually occurred, part of 
the problem with one of the security defects was that such a breakin
would be untraceable.  The facts are these:

The account system of this particular computer includes the ability to
"share" files on other accounts. If the directory file of that account is
shared one is granted access to the account as a whole.  One can gain access
to an account which is not shared through the use an alias command which
will require a password.  No such password is required to access a shared
directory.  Generally, aliases, failed aliases, logins, and failed logins
are all logged. Secure accounts are protected by having a special
"PASSWORD2" program which performs additional (user-definable) verification.
This program is uninterruptible.  On this particular company's system the
password2 program demands that the user enter a six digit number that varies
with the time of day, day of the week, and an array of random numbers
presented to the user.  Since the secondary password changes with each login
attempt, it was felt that accounts protected in this way were immune to
breakin attempts.

First problem: The proliferation of shares for the convenience of
programmers. It was found that several of the programmers had permanent
shares to many "secure" accounts.  All the most important accounts on the
system were shared with accounts that were not protected by the secondary
password system.  Thus, to break into one of these secure accounts one had
only to break into one of the programmer accounts and alias.  Many shares
which had been given for some temporary project were never removed.  Thus
programmers who were denied knowledge of login passwords to sensitive
accounts had shares to them anyway.  Some *customer* accounts still had such
shares left over from temporary projects.
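
Stale shares compose: if a secure account is shared to a programmer account,
and that account is shared to a third, breaking into the third reaches the
secure one.  Below is a minimal sketch (Python, with a hypothetical data
layout -- the vendor's real share-table format is not described) of the kind
of transitive-access audit that would have flagged this:

    from collections import deque

    # Each entry: target account -> accounts holding a share to it,
    # i.e., accounts that can alias in without a password.
    shares = {
        "payroll":  {"progA", "progB"},   # "secure" account shared out
        "custdata": {"progA"},
        "progA":    {"progC"},
    }
    secure = {"payroll", "custdata"}      # accounts behind PASSWORD2

    def reachable_from(start):
        # All accounts reachable from `start` by chaining shares.
        seen, todo = set(), deque([start])
        while todo:
            acct = todo.popleft()
            for target, holders in shares.items():
                if acct in holders and target not in seen:
                    seen.add(target)
                    todo.append(target)
        return seen

    for acct in ("progA", "progB", "progC"):   # ordinary-password accounts
        exposed = reachable_from(acct) & secure
        if exposed:
            print(acct, "can reach secure account(s):", sorted(exposed))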

Second problem: Poor passwords.  Most of the users on the system (including
the programmers) had not changed their passwords in over a year.  Most
passwords were
easily guessed.  A programmer with shares to the most sensitive accounts on
the system had a two letter password consisting of his initials.  A number
of customer accounts had been set up with *null* passwords which had never
been changed! All this in spite of the fact that memos had been circulated
advising users of the need for better passwords and relatively frequent
password changes.
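
A password audit of the sort that surfaced these findings is short to write.
Here is a minimal sketch (Python; the account records are hypothetical
stand-ins for whatever the reviewers actually had access to), with checks
mirroring the findings above -- null passwords, very short passwords, and
passwords guessable from account data such as initials:

    accounts = [
        {"name": "progA",  "initials": "jd", "password": "jd"},
        {"name": "cust01", "initials": "ab", "password": ""},
        {"name": "progB",  "initials": "mh", "password": "x9kTq2w"},
    ]
    common_guesses = {"password", "secret", "123456", "letmein"}

    def complaints(acct):
        pw = acct["password"].lower()
        found = []
        if not pw:
            found.append("null password")
        elif len(pw) < 6:
            found.append("too short")
        if pw and pw in (acct["name"].lower(), acct["initials"].lower()):
            found.append("guessable from account data")
        if pw in common_guesses:
            found.append("common password")
        return found

    for acct in accounts:
        for c in complaints(acct):
            print(acct["name"] + ":", c)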

Third problem: Alias and login logging can be turned off on an
account-by-account basis.  While this does require access to the system
manager's account, by (1) and (2) it was easy enough to obtain this access.
What's more, at least one programmer, with shares to many sensitive accounts,
had turned off this logging for several accounts, for reasons which remain
unclear.  (Perhaps just because he could; perhaps because at one time he had
found a hole in the security system -- duly reported -- which allowed
access to certain hyper-secure accounts and he had wanted to pin the hole
down without having to answer a lot of embarrassing questions.)  This was
not discovered by the security review, incidentally, but by an attempt to
reconstruct the circumstances of a system crash.

Fourth problem: While the secondary password program is uninterruptibly run
automatically with terminal logins, it is not run at all for a batch job
login.  Thus it is possible to break into the secure accounts by a standard
password attack once one has gained access to any account on the system
(although admittedly with the overhead of creating batch jobs).
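
The underlying defect is a check attached to one entry point (terminal login)
rather than to the account itself.  A minimal sketch of the shape of the fix,
with every login path funneled through a single gate (all names hypothetical):

    from dataclasses import dataclass
    from typing import Callable, Optional

    @dataclass
    class Account:
        name: str
        password: str
        requires_password2: bool

    def run_password2(account: Account, respond: Callable[[str], str]) -> bool:
        # Stand-in for the interactive challenge-response check above.
        return respond("challenge") == "123456"   # placeholder rule

    def authenticate(account: Account, primary_pw: str,
                     respond: Optional[Callable[[str], str]]) -> bool:
        # Single gate for ALL login paths -- terminal, batch, or otherwise.
        if primary_pw != account.password:
            return False
        if account.requires_password2:
            if respond is None:      # a batch job cannot answer interactively
                return False         # fail closed rather than skip the check
            return run_password2(account, respond)
        return True

    payroll = Account("payroll", "pw", requires_password2=True)
    print(authenticate(payroll, "pw", None))                  # batch: False
    print(authenticate(payroll, "pw", lambda _: "123456"))    # terminal: True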

Fifth problem: A program can be created with "OWNDIR" privileges.  While it
is running, it has all the privileges associated with the account on which
it resides.  So one can grant unprivileged users access to commands that
require privileges by sharing such a program with them.  The problem with
this is that unless the program is shared execute-only instead of read-only,
it can be interrupted, granting the unprivileged user access to all the
privileges of the program's account.  A number of such programs were shared
read-only instead of execute-only.  The types of privileges thus gained
include the ability to bring down the system or terminate any job.
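
This is essentially the same failure mode as an interruptible set-uid program
on Unix.  A minimal sketch (Python; the listing format is hypothetical) of an
audit that catches it -- flag every privileged program that is shared in any
mode other than execute-only:

    # Hypothetical listing: (program name, has OWNDIR privilege, share mode).
    shared_programs = [
        ("REPORTGEN", True,  "execute-only"),  # safe: cannot be interrupted
        ("JOBKILL",   True,  "read-only"),     # unsafe: interrupt -> privileges
        ("CALENDAR",  False, "read-only"),     # unprivileged; mode irrelevant
    ]

    for name, privileged, mode in shared_programs:
        if privileged and mode != "execute-only":
            print(name + ": OWNDIR program shared " + mode +
                  "; interrupting it hands the caller the owning "
                  "account's privileges")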

Conclusions: there were several gaping holes in the security system, and it
would have been quite possible for someone to have gained access to
proprietary programs or sensitive timesharing customer data without any
record of the breakin being left.  And all this at a company which already
had in place what it considered sophisticated security policies and
effective protection against breakins.  One shudders to think what the
situation might be at places which haven't even considered the problem.

                             -- Mary Holstege   Holstege@SUSHI.STANFORD.EDU

   [This contribution is yet another example of an old problem.  
    Nevertheless, it is worth including.  "Eternal vigilance ..."  PGN]


Electronic steering

Kevin J. Belles <scubed!crash!kevinb@seismo.CSS.GOV>
Mon, 23 Feb 87 02:44:11 PST
   In reply to the comment about drive-by-wire versus the F-16 fly-by-wire
system, the levels of safety of the two systems aren't really comparable.
   The average military aircraft gets a maintenance checkout after every
flight or two to test for system malfunctions, while an average auto is lucky
to get a checkup every 50K miles.  Myself, I'd not consider purchasing a
drive-by-wire auto unless it had a much better record than computer-aided
automotive systems have demonstrated so far, especially in a putative
high-performance sports car model; instead I drive a car with no electronics
to speak of (a 1966 Volvo 122S).
   Although I am a computer enthusiast, I prefer not to stake my life on
computers more than absolutely necessary, knowing their occasionally erring
ways.  In a system like those described in previous issues, one failure would
be all it took.

                    -Kevin Belles

Kevin J. Belles - UUCP: {hplabs!hp-sdd, akgua, sdcsvax, nosc}!crash!kevinb
~~~~~ ~~ ~~~~~~ - ARPA: crash!kevinb@{nosc, ucsd} 
        - INET: kevinb@crash.CTS.COM        - BIX:  kevinb


Re: Electronic steering

Rick Sidwell <sidwell@ICSD.UCI.EDU>
Mon, 23 Feb 87 09:36:05 -0800
 <> Side note: Isn't the F-16 a fly-by-wire plane?  If electronic steering is
 <> safe, and reliable enough for combat jets, why wouldn't it be safe enough
 <> for everyday car?

Combat (and other) jets are maintained much better than cars, in general.
The FAA requires every airplane to pass a pre-flight check before the first
flight of each day.  How often does the average person give a pre-drive check
to his/her automobile before driving to work?

I was once driving a car when it suffered a complete power failure.  The main
cable from the battery had somehow worked its way next to the engine (probably
during a battery replacement), and the heat melted through the insulation,
shorting the +12 volt line to ground.  Everything stopped:  the engine, the
radio, the clock, etc.  Fortunately, the power steering and power brakes worked
even without power, and no damage occurred.  I hate to think what would have
happened if the car had had electronic steering.

I am not against electronic steering in principle; it does have its advantages
(if they design it right, it should be much easier to perform such maneuvers
as parallel parking in tight spots and making U-turns on small streets).  But
I would hope that the designers take into consideration the possibility of
a sudden and complete power loss while driving.


Re: electronic steering

Kevin Oliveau <oliveau@think.com>
Mon, 23 Feb 87 12:21:20 EST
One RISKS reader (RISKS-4.49) asks why a system that is safe and reliable
enough for combat aircraft would not be safe enough for a car.

The answer is that combat aircraft receive a great deal of care and
preventive maintenance.  Cars, on the other hand, are often driven without
being properly maintained and their systems are not repaired until they
break down.  Mechanical systems are fairly reliable and degrade fairly
smoothly.  (Brakes often make noise or become "mushy" before failing
completely.)  Electronic systems tend to simply stop working.  So in today's
car, you drive through a puddle and your engine dies (perhaps the power
steering dies as well), but you will have control of the car:  you can steer
and brake.  In tomorrow's car, you'll drive into the oncoming lane without
any control at all.
                            Kevin Oliveau

   [Electronic systems need not just stop working -- they may have failure
   modes that bear little resemblance to physical principles.  A wild
   transfer in a program or a dropped bit may result in strange behavior.

   Wet brakes may fail in either case.  For that matter, in each case there
   is the possibility of overreaction.  (I am reminded of the 1950's tale
   about the Swarthmore students who greased up a train track approaching a
   station.  The engineer applied full brakes when the train did not slow
   down; at the end of the greased section, the train and the clean track
   did not react well to one another...)  PGN]


Re: electronic steering

<markl@JHEREG.LCS.MIT.EDU>
Mon, 23 Feb 87 17:01:19 est
  >Side note: Isn't the F-16 a fly-by-wire plane?  If electronic steering is
  >safe, and reliable enough for combat jets, why wouldn't it be safe enough
  >for everyday car?

(The following may be apocryphal...) A friend of mine once told me that
the first time a prototype F-16 was taken out on the runway, its test
pilot tried to retract the landing gear while on the ground.  The gear
happily retracted, causing a fair amount of damage to the aircraft.  My
friend speculated that this might not have happened had the F-16 not had
a computer between the pilot and the landing gear.  I'm
not at all convinced that remote steering and such-like are safe at
all.  I can just see the folks at GM forgetting a couple of lines of
code in some important part of the steering program...

And there is another problem which I am not sure has even been brought up
here.  Aircraft are supposedly meticulously maintained (unless they are
owned by Eastern...).  Even with this high quality maintenance, accidents
happen.  What do you suppose will happen when you put sophisticated computer
steering equipment in a car that gets serviced when the owner feels like it?
We have enough trouble forcing cars to pass safety and emissions control
inspections without having to depend on car owners to get their on-board
steering computer inspected every year.
                                    Mark L. Lambert

MIT Laboratory for Computer Science, Distributed Systems Group
Internet: markl@jhereg.lcs.mit.edu

             [Yes, I noted the overlap in the last four messages.  But each
             made a different point, so I did not reject any...  PGN]
