The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 7 Issue 89

Tuesday 6 December 1988

Contents

o Computer Literacy #4
Ronni Rosenberg
o Privacy versus honesty/equality
Jerry Carlin
o Computerized speeding tickets?
Clifford Johnson
o Subways that "know" who's on board
Marc J Balcer
o Automatic toll systems -- Dallas
Andrew R. MacBride
o "Hackers", "crackers", "snackers", and ethics
"Maj. Doug Hardie"
o `hacker' is already a dictionary entry
Joe Morris
o Re: "Hackers," "crackers," "snackers," and ethics
Douglas Jones
o Re: /dev/*mem and superuser
Jeff Makey
o Info on RISKS (comp.risks)

Computer Literacy #4

Ronni Rosenberg <ronni@wheaties.ai.mit.edu>
Tue, 6 Dec 88 15:12:11 EST
What are your reactions to a proposal for a different sort of "computer
literacy" course, described below?  (I am not saying that all schools should
teach such a course.)  Is it a good or bad idea?  Why?  Should the description
be changed?  If so, how?  How do you compare this with what you know about
existing computer-literacy courses?  Who should develop such a curriculum?
Who should pay for it?  Please respond directly to me.  Thanks.

          *               *               *               *

Compared to current computer-literacy classes, the proposed course would spend
much less time on the mechanics of operating machines, the syntax of
applications software or programming languages, and the rote learning of lists,
from computer components to uses.  It would spend much more time considering
the capabilities and limitations of computers, through discussions of the
impacts of important computer applications.  This might be a standalone course
or a series of discussions interwoven into courses in, for instance, social
studies or history.

One specific example of material that could contribute to meaningful education
about computers is a multi-media presentation entitled "Reliability and
Risk: Computers and Nuclear War," produced and distributed by CPSR.  The
presentation explains how current political and military trends decrease
the time allowed for people to react to a crisis, thereby shifting critical
decision-making responsibilities to computers.  It attacks the myth of
computer infallibility by describing different types of computer errors, their
sources and consequences.  It explores the growing reliance on computerized
decision-making and how a computer error, especially in times of crisis, could
trigger an accidental nuclear war.  Lasting a half-hour, the program obviously
cannot cover the topic in great depth.  But it does present salient points
about an important and complex area of computer use, greatly heightens
people's awareness of problems that they are unlikely to learn about from
magazine articles about computers, and stimulates exciting discussions and
further thought.  The presentation uses no computers, and the intended
audience needs no previous computer experience.

The proposed course might include discussions of
 *  SDI's computing requirements -- so students could consider the concept of
    software trustworthiness and the potential for design errors in complex
    systems.
 *  The Vincennes episode -- so they could consider the difficulty of using a
    system outside the boundaries of its intended use.
 *  The FBI's National Crime Information Center (NCIC) -- so they could 
    consider the relationship between civil liberties and computer technology.
 *  The National Test Bed, war games -- so they could consider the limits of
    computer simulations.
 *  Computerized monitoring techniques -- so they could consider impacts of
    computers on the workplace.
 *  How computer science is funded -- so they could consider which sorts of
    problems society views as important.
 *  Some of the myriad RISKS stories -- so they could consider the risks of
    depending on computer systems.

And so on.  Overall, the course would emphasize the importance of the social
and political context in which a computer system is developed and used.


Privacy versus honesty/equality

Jerry Carlin <jmc@ptsfa.PacBell.COM>
1 Dec 88 20:45:26 GMT
The following is from an article: "In Sweden, the public can read 
prime minister's mail" by Eva Janzon, Associated Press.

In Sweden, government records have been open to the public since 1766.
This includes the right to read the Prime Minister's mail (except for
a few classified items). Not only that, but everyone's records are
effectively public:

    "Knowing your neighbor's date of birth is enough to gain access
    to files at the National Taxation Board which lists income and
    tax from the previous year, church membership, marital status
    and current address.

    "If you take the number to the county police, you can find out
    about any unpaid bills.  Other registers list education, state of
    health and membership in associations.

    "All this has been accepted as a price for keeping people honest
    in a society that strives for equality."

The article did state that some Swedes dislike this invasion of privacy.

Is the risk of inequality and dishonesty more important than the risk 
to privacy?

Jerry Carlin (415) 823-2441 {bellcore,sun,ames,pyramid}!pacbell!jmc


Computerized speeding tickets?

"Clifford Johnson" <GA.CJJ@Forsythe.Stanford.EDU>
Mon, 5 Dec 88 17:54:29 PST
> Alas, as a Mass police officer pointed out in an interview, you have to catch
> someone *in the act* of speeding to get them for it.  Probably something to 
> do with that annoying bill of rights...

Not so in every state, I believe.  I recall a news story some 18 years ago in a
desert state (Arizona?), in which a cop called another cop in another town to
look out for a certain car.  The defense argued that there was no way to know
for sure that the speed limit had been exceeded merely because the average
speed over the whole distance exceeded the limit.  A university mathematician
(a measure theorist) testified as to the meaning of the Mean Value Theorem, and
the speeding ticket was upheld by a presumably puzzled judge because no
counter-expert could be found to dispute him.
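
For readers whose calculus is rusty, the expert's argument presumably ran
along these lines (my reconstruction, not a report of the actual testimony):

    Let $x(t)$ be the car's position, observed at times $t_1$ and $t_2$.
    If $x$ is differentiable, the Mean Value Theorem guarantees some
    instant $c \in (t_1, t_2)$ at which

        $$ x'(c) = \frac{x(t_2) - x(t_1)}{t_2 - t_1} $$

    i.e., at some moment the instantaneous speed equals the average
    speed over the whole trip.  If that average exceeds the posted
    limit, the car must have been speeding at instant $c$.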

Does anyone know whether the Mass. "rule" is simply local?

   [And then there is the tale of the San Francisco police using computer
   records interactively to tow up-scale vehicles (on the grounds that their
   owners are more likely to pay up to get their cars back when towed).
   Yesterday they towed a car belonging to an undercover agent.  Referring
   to Ronni's item (particularly NCIC) in this issue, suppose the computer
   had contained the information that that car belonged to an undercover
   agent.  Then we would have to assume that the agent was NOT ADEQUATELY
   under cover, especially if any further identification was included.  PGN]


Subways that "know" who's on board

Marc J Balcer <balcer@gypsy.siemens.com>
Tue, 6 Dec 88 09:38:49 EST
From the Philadelphia Inquirer, Saturday, December 3:

SEPTA TURNSTILES TAKE A HIGH-TECH SPIN
by Mark Bowden, Inquirer Staff Writer

[...]  The new turnstiles still accept tokens, but they are also equipped with
magnetic scanners that enable passengers to let themselves into the station
just by sliding their new, magnetically encoded weekly and monthly passes
through.  The old turnstiles accepted only tokens.  [...]  Because each of the
turnstiles is connected to a central computer, and each card is encoded with a
serial number, use of the new turnstiles will help SEPTA compile far more
detailed records of how people use the transit system.

"If someone gets on the Elevated in the Northeast, uses the Broad Street Subway
at midday, and then commutes home at night on the Elevated, we will have an
exact record of all those trips," said [Robert E.] Wooten, SEPTA assistant
general manager for public affairs."  It will provide our operations planning
department with lots of detailed information about who gets on where, when, and
how often.
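
The privacy implication is easier to see if one imagines the record each
networked turnstile must report.  A hypothetical sketch in C follows; the
names and fields are my invention, not SEPTA's actual format:

    /* Hypothetical record a turnstile might send to the central
     * computer.  Field names are illustrative only. */
    #include <time.h>

    struct trip_record {
        long   card_serial;   /* serial number encoded on the pass   */
        int    station_id;    /* which station's turnstile was used  */
        time_t entry_time;    /* when the passenger passed through   */
    };

    /* Sorting these records by card_serial yields exactly the kind of
     * per-rider travel history Wooten describes. */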

Marc J. Balcer    [balcer@gypsy.siemens.com]      (609) 734-6531
Siemens Research Center, 755 College Road East, Princeton, NJ 08540


Automatic toll systems -- Dallas

Andrew R. MacBride <c60a-1bl@WEB.Berkeley.EDU>
Tue, 6 Dec 88 00:19:16 PST
    Regarding an earlier discussion of automatic toll systems:

    This evening (~11:45pm PST) on CNN, I caught the tail end of a
report on an automated toll-collection system being tested in Dallas.
The device consists of (and I quote) "chips and diodes and capacitors
on a board", and is apparently queried at each toll station. During a
brief statement, the president(?) of AMTECH, Inc. discussed plans
for the use of this system in many cities and in the rail network.

    Anyone have comments or more information?
    (I wish I had seen the beginning of the report...)

Andrew R. MacBride     c60a-1bl@widow.berkeley.edu (128.32.185.4)


"Hackers", "crackers", "snackers", and ethics (RISKS-7.86)

"Maj. Doug Hardie" <Hardie@DOCKMASTER.ARPA>
Tue, 6 Dec 88 09:42 EST
> Moreover, in more mature scientific fields, such as medicine, it
> is not left up to the experimenter to decide for himself what is
> ethically acceptable; he or she must convince review boards that
> include both peers and (one hopes) members of the affected public.

The cost of medical research is significant.  It is not within the resources of
your average high school student.  The cost of hacking "computer research" is
very low.  I seriously doubt that any kind of review system could be set up
that would be able to cope with the volume of this problem.  Even if you could
set it up, it would be a bureaucracy unto itself.

Also, I point out that the term hacker was in common use when I was in college
(64-69) to refer to a person who did not have any real understanding of what
they were doing, but just banged away with anything in a random pattern hoping
that something would work.  Calling an engineering student a hacker was the
ultimate put-down.


`hacker' is already a dictionary entry

Joe Morris <jcmorris@mitre.arpa>
Tue, 06 Dec 88 11:38:21 EST
In RISKS-7.87, Frank Maginnis observes:
>                                    "Hacker" and "virus" will undoubtedly
>appear very soon in standard English dictionaries with the general public's
>understanding of the terms, not the profession's -- "hacker" probably
>already has! We'll just have to adapt.

I can't speak for 'virus', but 'hacker' is already there.  From the 1986 
edition of _Webster's_New_World_Dictionary_ (Prentice-Hall) comes the
following entry:

  hack.er n. 1. a person who hacks (see hack(1)) 2. an unskilled golfer,
  tennis player, etc. 3. a talented amateur user of computers, specif. one 
  who attempts to gain unauthorized access to files in various systems

The dictionary doesn't have the verb _hack_ defined in a computer sense, but
that may be waiting on the next edition.

Can anyone point to the first use of the term?  I remember using it in 1962
(and have comments in programs to prove it) but it seemed to be well-used
by then.  


Re: "Hackers," "crackers," "snackers," and ethics

Douglas Jones <jones@herky.cs.uiowa.edu>
Tue, 6 Dec 88 13:51:24 CST
In P. G. Neumann's note of Mon, 5 Dec 88 10:27:35 PST, he points out that
we have to do something about whistle-blowing, and then gets back to questions
of hacking being dangerous, especially when we have flawed systems.  These
two statements bring to mind a sensible business practice of the early 1970's
that I have not seen used recently.

Back in the summer of 1972, I worked for Com-Share Incorporated, one of two
firms to commercialize the Berkeley Timesharing System.  Back then, I had not
yet heard the term "hacker", but we certainly knew that there were such people.
Com-Share had two interesting policies with regard to such people:

  1) All Com-Share employees were encouraged to use Com-Share facilities
      for personal use during off-hours, and the majority of personal use
      was assumed to be of a sort we would now call hacking.

  2) Com-Share had a standing reward of $500 for anyone who could expose
      a flaw in their system security, and while I was there, they raised
      the reward to $1000.

In concert, these policies encouraged hacking, but they made it into a
constructive activity.  An occasionally cited aspect of the "hacker ethic"
is that when hackers find something wrong with a system, they should report
the problem.  The difficulty is that reporting a flaw might lead to its being
fixed, which in turn might deny the hacker future access.  A reward can
overcome this disincentive to reporting bugs.

When I worked on the PLATO system at Illinois in the mid '70s, the system
administrators viewed the large community of PLATO hackers (mostly writing
and playing games, but with occasional password security attacks of the kmem
variety) as useful because they would exercise new system features long
before "legitimate" users would find them, and because they provided a heavy
system load before there was much of a legitimate user community.  As the
legitimate community grew and the excess capacity of the system diminished,
game playing and other "hacking" activities were severely curtailed, but
never eliminated.

In recent years, most computer-crime legislation I have seen has made
almost anything resembling hacking into a crime, and many system
administrators no longer appear interested in the benefits that
a carefully managed hacker community can provide.  A hacker who finds
a flaw in a system and reports it is viewed as a criminal with
a conscience instead of a benefit to society.

In a way, hackers who report flaws that they find in a system are like
whistleblowers, and this recent legal and managerial trend is quite
analogous to the "shoot-the-messenger" approach that is commonly
applied to whistleblowers.


Re: /dev/*mem and superuser

Jeff Makey <Makey@LOGICON.ARPA>
6 Dec 1988 1258-PST (Tuesday)
In RISKS 7.87, Paul E. McKenney <mckenney@spam.istc.sri.com> described
how to protect /dev/*mem on UNIX systems from uncontrolled read
access.  Unfortunately, he made a small mistake.  /bin/ps and other
programs that need access to /dev/mem should have their modes set to
2755.  Use of mode 4755 (as Paul suggested) sets the setuid bit rather
than the setgid bit.  Since /dev/mem is owned by root and Paul also
suggested changing the owner of /bin/ps to bin, there is probably no
security problem in his fix, but ps won't work.

I have done this on my 4.2 BSD system with no apparent ill effects.
In addition to /bin/ps, /usr/ucb/w needs this treatment.
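
For concreteness, the two octal modes differ only in which special bit they
set: 04000 is setuid, 02000 is setgid.  A minimal sketch of the corrected
fix in C (assuming the group ownership of the files has already been
arranged as McKenney described; it must be run by the superuser):

    /* Set the setgid bit (02000), not the setuid bit (04000), on the
     * programs that read /dev/mem.  Mode 02755 = setgid + rwxr-xr-x. */
    #include <stdio.h>
    #include <sys/stat.h>

    int main(void)
    {
        if (chmod("/bin/ps", 02755) != 0)
            perror("chmod /bin/ps");
        if (chmod("/usr/ucb/w", 02755) != 0)
            perror("chmod /usr/ucb/w");
        return 0;
    }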

Once again, we encounter a risk in (blindly) applying untested
bugfixes.  This comment, of course, applies to my own suggestions in
the paragraphs above.
