The RISKS Digest
Volume 3 Issue 66

Thursday, 25th September 1986

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Follow-up on Stanford breakins: PLEASE LISTEN THIS TIME!
Brian Reid
F-16 software [concluded?]
Herb Lin
Info on RISKS (comp.risks)

Follow-up on Stanford breakins: PLEASE LISTEN THIS TIME!

Brian Reid <reid@decwrl.DEC.COM>
25 Sep 1986 0014-PDT (Thursday)
   "What experience and history teach is that people have never learned
    anything from history, or acted upon principles deduced from it."
        — Georg Hegel, 1832

Since so many of you are throwing insults and sneers in my direction, I feel
that I ought to respond. I am startled by how many of you did not understand
my breakin message at all, and in your haste to condemn me for "asking for
it" you completely misunderstood what I was telling you, and why.

I'm going to be a bit wordy here, but I can justify it on two counts. First,
I claim that this topic is absolutely central to the core purpose of RISKS (I
will support that statement in a bit). Second, I would like to take another
crack at making you understand what the problem is. I can't remember the
names, but all of you people from military bases and secure installations who
harrumphed that it was a network administration failure are completely
missing the point. This is a "risks of technology" issue, pure and simple.

As an aside, I should say that I am not the system manager of any of the
systems that were broken into, and that I do not control the actions of any
of the users of any of the computers. Therefore under no possible explanation
can this be "my fault". My role is that I helped to track the intruders down,
and, more importantly, that I wrote about it.

I am guessing that most of you are college graduates. That means that you
once were at a college. Allow me to remind you that people do not need badges
to get into buildings. There are no guards at the door. There are a large
number of public buildings whose doors are not even locked. There is no fence
around the campus, and there are no guard dogs patrolling the perimeter.

The university is an open, somewhat unregulated place whose purpose is the
creation and exchange of ideas. Freedom is paramount. Not just academic
freedom, but physical freedom. People must be able to walk where they need to
walk, to see what they need to see, to touch what they need to touch.
Obviously some parts of the university need to be protected from some people,
so some of the doors will be locked. But the Stanford campus has 200
buildings on it, and I am free to walk into almost any of them any time that
I want. More to the point, *you* are also free to walk into any of them.

Now let us suppose that I am walking by the Linguistics building and I notice
that there is a teenager taking books out of the building and putting them in
his car, and that after I watch for a short while, I conclude that he is not
the owner of the books. I will have no trouble convincing any policeman that
the teenager is committing a crime. More important, if this teenager has had
anything resembling a normal upbringing in our culture, I will have no
trouble convincing the teenager that he is committing a crime. Part of the
training that we receive as citizens in our society is a training in what is
acceptable public behavior and what is not. The books were not locked up, the
doors to the library were not locked, but in general people do not run in and
steal all of the books.

Or let me suppose instead that I am a reporter for the Daily News. I have a
desk in a huge room full of desks. Most of the desks are empty because the
other reporters are out on a story. You've seen scenes like this in the
movies. It is rare in small towns to find those newsrooms locked. Here in
Palo Alto I can walk out of my office, walk over to the offices of the Times
Tribune a few blocks away, walk in to the newsroom, and sit down at any of
those desks without being challenged or stopped. There is no guard at the
door, and the door is not locked. There are 50,000 people in my city, and
since I have lived here not one of them has walked into the newsroom and
started destroying or stealing anything, even though it is not protected.
Why not? Because the rules for correct behavior in our society, which are
taught to every child, include the concept of private space, private
property, and things that belong to other people. My 3-year-old daughter
understands perfectly well that she is not to walk into neighbors' houses
without ringing the doorbell first, though she doesn't quite understand why.

People's training in correct social behavior is incredibly strong, even
among "criminals". Murderers are not likely to be litterbugs. Just because
somebody has violated one taboo does not mean that he will immediately and
systematically break all of them.

In some places, however, society breaks down and force must be used. In the
Washington Square area of New York, for example, near NYU, you must lock
everything or it will be stolen.  At Guantanamo you must have guards or the
Cubans will come take things. But in Palo Alto, and in Kansas and in Nebraska
and Wisconsin and rural Delaware and in thousands of other places, you do not
need to have guards and things do not get stolen.

I'm not sure what people on military bases use computer networks for, but
here in the research world we use computer networks as the building blocks of
electronic communities, as the hallways of the electronic workplace. Many of
us spend our time building network communities, and many of us spend our time
developing the technology that we and others will use to build network
communities. We are exploring, building, studying, and teaching in an
electronic world. And naturally each of us builds an electronic community
that mirrors the ordinary community that we live in. Networks in the Pentagon
are built by people who are accustomed to seeing soldiers with guns standing
in the hallway. Networks at Stanford are built by people who don't get out of
bed until 6 in the evening and who ride unicycles in the hallways.

Every now and then we get an intruder in our electronic world, and it
surprises us because the intruder does not share our sense of societal
responsibilities. Perhaps if Stanford were a military base we would simply
shoot the intruder and be done with it, but that is not our way of doing
things. We have two problems. One is immediate--how to stop him, and how
to stop people like him. The other is very long-term: how to make him and his
society understand that this is aberrant behavior.

The result of all of this is that we cannot, with 1986 technology, build
computer networks that are as free and open as our buildings, and therefore
we cannot build the kind of electronic community that we would like.

I promised you that I would justify what this all has to do with RISKS. 

We are developing technologies, and other people are using those
technologies. Sometimes other people misuse them. Misuse of technology is one
of the primary risks of that technology to society. When you are engineering
something that will be used by the public, it is not good enough for you to
engineer it so that if it is used properly it will not hurt anybody. You must
also engineer it so that if it is used *improperly* it will not hurt anybody.
I want to avoid arguments of just where the technologist's responsibility
ends and the consumer's responsibility begins, but I want to convince you,
even if you don't believe in the consumer protection movement, that there is
a nonzero technologist's responsibility.

Let us suppose, for example, that you discovered a new way to make
screwdrivers, by making the handles out of plastic explosives, so that the
screwdriver would work much better under some circumstances. In fact, these
screwdrivers with the gelignite handles are so much better at putting in
screws than any other screwdriver ever invented, that people buy them in
droves. They have only one bug: if you ever forget that the handle is
gelignite, and use the screwdriver to hit something with, it will explode and
blow your hand off. You, the inventor of the screwdriver, moan each time you
read a newspaper article about loss of limb, complaining that people
shouldn't *do* that with your screwdrivers.

Now suppose that you have invented a great new way to make computer networks,
and that it is significantly more convenient than any other way of making
computer networks. In fact, these networks are so fast and so convenient that
everybody is buying them. They have only one bug: if you ever use the network
to connect to an untrusted computer, and then if you also forget to delete
the permissions after you have done this, then people will break into your
computer and delete all of your files. When people complain about this, you
say "don't connect to untrusted computers" or "remember to delete the files"
or "fire anyone who does that".

Dammit, it doesn't work that way. The world is full of people who care only
about expediency, about getting their screws driven or their nets worked. In
the heat of the moment, they are not going to remember the caveats. People
never do. If the only computers were on military bases, you could forbid
the practice and punish the offenders. But only about 0.1% of the computers
are on military bases, so we need some solutions for the rest of us.

Consider this scenario (a true story). Some guy in the Petroleum Engineering
department buys a computer, gets a BSD license for it, and hires a Computer
Science major to do some systems programming for him. The CS major hasn't
taken the networks course yet and doesn't know the risks of breakins. The
petroleum engineer doesn't know a network from a rubber chicken, and in
desperation tells the CS student that he can do whatever he wants as long as
the plots are done by Friday afternoon. The CS student needs to do some
homework, and it is much more convenient for him to do his homework on the
petroleum computer, so he does his homework there. Then he needs to copy it
to the CS department computer, so he puts a permission file in his account on
the CSD computer that will let him copy his homework from the petroleum
engineering computer to the CSD computer. Now the CS student graduates and
gets a job as a systems programmer for the Robotics department, and his
systems programmer's account has lots of permissions. He has long since
forgotten about the permissions file that he set up to move his homework last
March. Meanwhile, somebody breaks into the petroleum engineering computer,
because its owner is more interested in petroleum than in computers and
doesn't really care what the guest password is. The somebody follows the
permission links and breaks into the robotics computer and deletes things.
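
For concreteness: the "permission file" here is ~/.rhosts, which the Berkeley
r-commands (rlogin, rsh, rcp) consult on the target machine; each line names a
"host [user]" pair that is let in with no password at all. The forgotten
homework entry is one such line. Below is a small audit script that hunts for
stale entries of that kind -- a sketch in present-day Python, with hypothetical
hostnames, not anything that existed on the machines in question:

    #!/usr/bin/env python3
    # Flag .rhosts trust entries that point at hosts we no longer vouch for.
    # Unix-only (uses the pwd module).  Extend TRUSTED for your own site.
    import os
    import pwd

    TRUSTED = {"csd-vax"}          # hosts still considered trustworthy (assumed)

    for user in pwd.getpwall():
        path = os.path.join(user.pw_dir, ".rhosts")
        try:
            lines = open(path).read().splitlines()
        except OSError:
            continue               # no readable .rhosts for this account
        for line in lines:
            fields = line.split()  # .rhosts format: "remotehost [remoteuser]"
            if fields and fields[0] not in TRUSTED:
                # e.g. "petroleum-vax student", left over from last March
                print(f"{user.pw_name}: stale trust of {fields[0]} in {path}")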

Whose fault is this? Who is to blame? Who caused this breakin? Was it the
network administrator, who "permitted" the creation of .rhosts files? Was it
the person who, in a fit of expedience, created /usr/local/bin with 0776
protection? Was it the idiot at UCB who released 4.2BSD with /usr/spool/at
having protection 0777? Was it the owner of the petroleum engineering
computer? Was it the mother of the kid who did the breaking in, for failing
to teach him to respect electronic private property? I'm not sure whose fault
it is, but I know three things:

 1) It isn't my fault (I wasn't there). It isn't the student's fault (he
    didn't know any better--what can you expect for $5.75/hour). It isn't the
    petroleum engineer's fault (NSF only gave him 65% of the grant money he
    asked for and he couldn't afford a full-time programmer). Maybe you could
    argue that it is the fault of the administrator of the CSD machine, but in
    fact there was no administrator of the CSD machine because he had quit to
    form a startup company. In fact, it is nobody's fault.

 2) No solution involving authority, management, or administration will work
    in a network that crosses organization boundaries.

 3) If people keep designing technologies that are both convenient and 
    dangerous, and if they keep selling them to nonspecialists, then
    expedience will always win out over caution. Convenience always wins,
    except where it is specifically outlawed by authority. To me, this is
    one of the primary RISKs of any technology. What's special about
    computers is that the general public does not understand them well
    enough to evaluate the risks for itself.
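
As a footnote to the 0776 and 0777 modes mentioned above: a world-writable
directory anywhere the system trusts (the at spool, a directory on a
privileged search path) lets any local user substitute his own file for a
trusted one. The check is mechanical. A sketch in present-day Python, with an
illustrative directory list:

    #!/usr/bin/env python3
    # Flag world-writable directories among places the system trusts.
    import os
    import stat

    SUSPECT = ["/usr/local/bin", "/usr/spool/at", "/usr/bin", "/etc"]

    for path in SUSPECT:
        try:
            mode = os.stat(path).st_mode
        except OSError:
            continue               # not present on this system
        # World-writable with no sticky bit: anyone may remove or replace
        # entries, e.g. plant a trojan where root will execute it.
        if mode & stat.S_IWOTH and not mode & stat.S_ISVTX:
            print(f"{path}: mode {stat.filemode(mode)} is world-writable")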


F-16 software [concluded?]

<LIN@XX.LCS.MIT.EDU>
Thu, 25 Sep 1986 09:39 EDT
    From: rti-sel!dg_rtp!throopw%mcnc.csnet at CSNET-RELAY.ARPA

      > I spoke to an F-16 flight instructor about this business concerning
      > bomb release when the plane is upside down.  He said the software 
      > OUGHT to prevent such an occurrence.  When the plane is not at the
      > right angle of attack into the air stream, toss-bombing can result
      > in the bomb being thrown back into the airplane.  

    Hmpf.  *I* spoke to an ex-Air Force pilot.  He said if *any* restriction on
    bomb release is incorporated, it should be to prevent it when the plane (or
    more specifically, the bomb itself... there *is* a difference, and you had
    better realize it!) is pulling negative G's.  This was my original point...
    "upside down" or "inverted" isn't the correct thing to worry about; it is
    the wrong mindset entirely, too simple a notion.
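
To make the two mindsets concrete, here is a sketch of the two candidate
interlocks in present-day Python -- the names and thresholds are illustrative
assumptions, not anything from actual F-16 software:

    # Two candidate interlocks for inhibiting weapons release.

    def inhibit_if_inverted(bank_angle_deg: float) -> bool:
        # The "too simple" attitude test: block release when rolled past 90
        # degrees.  It blocks a safe release at the top of a loop (inverted
        # but pulling positive g, so the store falls cleanly away) and
        # permits an unsafe one in an upright pushover (negative g, so the
        # store can float back up into the airframe).
        return abs(bank_angle_deg) > 90.0

    def inhibit_if_negative_g(store_load_factor_g: float) -> bool:
        # The pilot's test: block release whenever the load factor measured
        # at the store itself is negative, whatever the aircraft's attitude.
        return store_load_factor_g < 0.0

    # The cases on which the two disagree are exactly the ones that matter:
    #   top of a loop:    inverted, +1 g   -> attitude test blocks, g test allows
    #   upright pushover: level,   -0.5 g  -> attitude test allows, g test blocks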

This dispute (well, sort of dispute anyway) is instructive -- each of us
consulted our own experts, and we came away with different answers.  It
suggests why even defining safety is so hard.  Maybe I misunderstood my
flight instructor's response, or maybe I posed the question to him
improperly, or maybe he just gave an off-the-cuff answer without thinking it
through, or maybe he's wrong...

Moral: When you are lost and ask for directions, never ask just one person
for directions.  Ask two people, and you have a better chance of getting to
where you want to go.
                                            Herb

      [On the other hand, when the two people give you DIFFERENT DIRECTIONS,
       you must realize that AT LEAST ONE of them is wrong.  So, you may
       have to ask THREE PEOPLE before you get any agreement...  A further
       moral is that you should have some justifiable trust in those who 
       are giving you advice.  PGN]
