The RISKS Digest
Volume 6 Issue 28

Wednesday, 17th February 1988

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Interleaved Alert Systems
Earl Boebert
Unix Review — Safe and Secure
Aaron Schuman
Re: More info on Compuserve Macinvirus
Amos Shapir
More on LTAC — software review and warranties
Nancy Leveson
Re: Software Warranties
Barry Nelson
Computer Pornography
Joe Morris
Jay Elinsky
Jim Frost
Don Mac Phee
A bit more on the AMTRAK crash...
John McMahon
Re: Last Clasp credit cards
Jack Holleran
911
Brint Cooper
Talk on Legal Issues of Computer Graphics by Susan Nycum
Eugene N. Miya
Info on RISKS (comp.risks)

Interleaved Alert Systems

<Boebert@DOCKMASTER.ARPA>
Wed, 17 Feb 88 10:05 EST
Barbara Tuchman, in her classic _The Guns of August_, makes a strong case
that WWI started because of interleaved alert systems.  The issue then was
mobilization time in days versus flight time in minutes, but the positive
feedback effect was the same.  Worth reading by anybody interested in
interactions among large systems.


The Latest Unix Review

Aaron Schuman <human%hpinddf@hplabs.HP.COM>
Wed, 17 Feb 88 17:05:08 -0800
The Feb '88 issue of Unix Review (vol 6, #2) takes "Safe and Secure" as its
theme.  I found it to be worthwhile reading.  Especially useful were Tom
Berson's interview with Colonel Roger Schell and an article on cost
considerations of security by Gligor & Chandersekaran.  If you've got an
hour, go find yourself a copy.  Happy reading.


Re: More info on Compuserve Macinvirus (RISKS DIGEST 6.27)

Amos Shapir NSTA <amos@nsc.NSC.COM>
Wed, 17 Feb 88 09:07:01 PST
Flames aside, there is one good outcome of Richard Brandow's message: On
March 2, any Mac II user who assumes (as the Chicago Tribune reporter did)
that viruses are just an urban legend will learn otherwise the easy way,
and can take appropriate steps to protect his Mac.

    Amos Shapir             
National Semiconductor 7C/266  1135 Kern st. Sunnyvale (408) 721-8161
amos@nsc.com till March 1, 88; Then back to amos%taux01@nsc.com 


More on LTAC — software review and warranties [Re: RISKS-6.22]

Nancy Leveson <nancy@commerce.UCI.EDU>
Wed, 17 Feb 88 10:03:27 -0800
                        [Note: LTAC = Legal Technology Advisory Council.  PGN]

I have some additional information which, judging from the response I got to
my message, may be of interest to enough people to warrant putting it in Risks.

Apparently, there are committees like the IEEE Working Groups that LTAC has
formed to develop a draft of the guidelines or criteria on which the software
will be evaluated.  These working groups include representatives from all
interested parties, including those who build and sell the software.  The
guidelines are developed by a consensus process — there is no majority vote.
The criteria are discussed until all agree.  The guidelines statement is then 
sent to companies who sell that particular type of software.

If a company submits its software to be tested, it receives an
exception letter which states where the software does not meet the criteria.
This letter provides enough information so that the vendor can replicate the
erroneous behavior.  The software must satisfy all the mandatory criteria.
There are also some preferred criteria which specify additional features
that would be nice to include in such software.  LTAC has two categories:
Standard means that one half of the preferred criteria are included, and
Advanced means that two thirds of the preferred criteria are included.  The
vendor is given a chance to fix any of the problems mentioned in the exception
letter.  The same tests are used for each of the software packages of a 
certain type, e.g., all docketing programs are submitted to the same set of 
test cases. (I assume that additional test cases are written for special
claims by the vendor).  
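
As a quick sketch of the two-tier rating rule just described (only the
one-half and two-thirds thresholds come from the text; the function name and
the handling of packages below the Standard threshold are my assumptions):

```python
from fractions import Fraction

def ltac_category(mandatory_met, mandatory_total, preferred_met, preferred_total):
    """Classify a package under the rating scheme described above.

    Every mandatory criterion must be satisfied; the label then
    depends on the fraction of preferred criteria that are met.
    """
    if mandatory_met < mandatory_total:
        return "exception letter"   # vendor may fix the problems and resubmit
    frac = Fraction(preferred_met, preferred_total)
    if frac >= Fraction(2, 3):
        return "Advanced"
    if frac >= Fraction(1, 2):
        return "Standard"
    return "approved, no category"  # assumption: the text doesn't say

print(ltac_category(12, 12, 8, 12))   # 8/12 meets the two-thirds bar
```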

The reviews provided for each approved software package are extensive and do 
not just say "yes" or "no."  They are 30-60 pages long and describe the
features of the software and the detailed results of the testing process.  
The review is sent to the vendor first for comments.  If there are
errors in the review that the vendor fails to point out, and they are
discovered later, then the vendor must pay for reprinting the review.

A previous Risks message mentioned the problem of the cost of the review.  It 
IS expensive.  For example, a single-user Time, Accounting, and Billing system
will cost the vendor $27,000 to go through the review process.  On the other
hand, it seems like vendors could get the published guidelines and provide
a warranty themselves if they wanted to — I am sure that would satisfy their
customers and also save them the money.  The cost of LTAC is not covered by
the charges, by the way.  Over LTAC's three years of existence, the ABA has
contributed over $1,000,000 to it.  So LTAC is not only non-profit, it is
operating at a deficit.  One should note that the cost of getting a UL rating 
is many times greater than the cost of getting the ABA software approval.

I do not believe that an LTAC-type operation will solve all our problems with
software.  But it is an interesting phenomenon to watch the purchasers get
together and demand that vendors be truthful and accept responsibility for 
their products and their claims about their products when government is not
taking adequate steps to protect them.


RE: Software Warranties

Barry Nelson <bnelson@ccb.bbn.com>
Wed, 17 Feb 88 10:17:53 EST
RE: RISKS 6.27 Robert Kennedy <jrk%computer-lab.cambridge.ac.uk@NSS.Cs.Ucl.AC.UK>

<> Furthermore, UL, as far as I know, doesn't say whether or not the products
<> perform as advertised. They only say whether they are safe or not.

Not  even  that!   They  license  you  to  mark  your units as having met their
*minimum* safety standards, as inspected by their engineers. They do not  claim
it's  safe  or that they have looked at everything, or that they have written a
perfect standard.  They will not tell you how to make it safer, only whether or
not it meets their interpretation of a given paragraph in a standard.

From  my readings of Product Liability Cases, it appears that a manufacturer is
often held strictly liable for damage or injuries which occurred as a result of
the  product  *regardless*  of  its  adherence  to  safety  standards.  Safety
certification efforts by the vendor *DO* help disprove negligence.

Note that UL (et al) assumes *no* liability for your product or its use. If you
invoke  their  mantle during litigation, they may start their own investigation
of the incident and issue an affidavit as to any deviations found in the  unit.
This  is tantamount to an indictment, should *anything* be found, and places the
onus clearly on the defendant to now prove irrelevance of each  defect  to  the
claimed injury.  (Talk about a two-edged sword!)

The  point  is: you cannot hide behind someone else's evaluation if you are the
product expert or could have hired one.  UL does not claim to be expert,  only
an  inspector and promulgator of Standards.  The same would probably hold for a
software test agency.  It establishes a minimum acceptance, not a quality goal.

Barry C. Nelson /Senior Systems Engineer /
BBN Communications Corporation / 70 Fawcett Street, Cambridge, MA 

"This document contains statements of opinion by the author that are not
 attributable to BBN Communications Corporation or its management."

                          [Some of this was also noted in a contemporaneous
                          message from Ronni Rosenberg.  PGN]


Computer Pornography (revisited)

jcmorris@mitre.arpa <Joe Morris>
Wed, 17 Feb 88 17:20:40 EST
In RISKS 6:27, Jonathan Kamens asks:

> [...]Can the administration of a supposedly user-privacy-secure system
> censor the material that is made accessible on it?  Is the presence of 
> a filesystem on a machine evidence that the administration "supports"
> the contents of the filesystem?

The answers are, I suggest, "yes" and "it depends".  In general, the
owner/operator/manager of a computer system has the legal authority to say what
can be done with it, and has the legal responsibility to reject unlawful
activities where it is aware of them.  (There is, of course, a gray area in
deciding how much effort must be expended in discovering whether there are any
such unlawful uses being made of the system.)

For example, if the operator of a BBS is aware that a certain message contains
pirated credit card numbers and does not remove it from the system, then
the damaged parties (the credit card holder and/or the issuer) probably have a
right of action.  If it is not reasonable to expect the operator to screen the
messages (Compuserve for example) then there should be no right of action as
long as the operator has not been made aware of the improper use.  From a legal
standpoint I doubt that there is any significance in the question of whether
the data was in a private or public file.  Once the nature of the material is
known the operator may be required to act.

Even if the material is not unlawful, the operator of the computer system still
has every right to establish policy governing how that system is to be used.
If a user doesn't like the policy an attempt can be made to change it, but
that's all.  Even if the material isn't illegal, management has a valid concern
for public relations which isn't helped by allowing the facility to become
known as a repository for feelthy peechurs.  It's like a newspaper, where the
policy is set by the publisher.  If the editor doesn't like it, tough.  In the
case cited in the RISKS entry the Project Athena management was apparently
responding to negative publicity which could damage its reputation with
individuals who are in a position to affect its business.

There doesn't even have to be the extreme of "dirty" material.  If the system
management wants to declare that game programs are not to be placed on the
system, that's their prerogative.  If you insist on playing Adventure on the
system, you're not welcome.

A final note: there is a difference between the legal authority to set policy
for a system and the ethical exercise of that right.  The recent Supreme Court
decision on the Hazelwood student newspaper is a case in point: however
ill-considered the specific decision may have been, the school as publisher had
the final say on the contents of the paper.
                                                      Joe Morris


Computer pornography on Project Athena system

Jay Elinsky <ELINSKY@ibm.com>
17 Feb 88 13:05:00 EST
Maybe Project Athena lets you use their resources for any purpose you want.
Here in the corporate world, we're allowed to use company resources only for
company business.  Not that my manager can go snooping into my files (he
can't, except under certain exceptional conditions).  But if there's a disk
space shortage then I could be asked to justify the space I'm consuming.  If
I honestly say that I'm storing dirty pictures, then I'll be told that it's
not a legitimate business use of the system.  If I lie, then I deserve to be
disciplined.

Jay Elinsky, IBM T.J. Watson Research Center, Yorktown Heights, NY


RISKS in using public computers — computer pornography [RISKS-6.27]

Jim Frost <madd@bu-cs.bu.edu>
18 Feb 88 00:05:42 GMT
This isn't specifically about the xpix incident, but deals with a very
relevant RISK.  Many users of "public" computer systems (e.g., a university
mainframe) are unaware of policies governing the use of the
hardware/software.  On our systems at Boston University, anything created on
any university-owned mainframe is basically the property of Boston
University (there are possible exceptions but they aren't the subject at
hand).  This means that if a student created a nifty program, s/he would be
unable to copyright that program independently of the university.  Now, the
RISK of this is that the university doesn't make this publicly known (I
found out about it after one of my programs turned out to be valuable — I
didn't want to sell it but several people commented that the copyright
notice I put on it was invalid).

From the university's point of view (and probably that of MIT with regards to
Athena), they own the system and thus can dictate the use of its resources.
If they don't like something, they reserve the right to destroy it/alter
it/sell it/whatever.  If that is the policy with Athena, an independent user
making his files world-readable could just be shut down by the system manager.

With regards to copyrights, is it really legal for a university (or other
entity) to claim copyright to anything made on their system without the
writer's specific permission (e.g., signing a paper saying that anything done on
a company's system is the property of the company unless the company releases
it)?  I would liken the source on the machine to typing on a piece of paper.
The way something is expressed on the paper should be the property of the
person that expresses it, not that of the owner of the paper (in the mind of
this programmer, at least), which is what I thought was the idea behind the
copyright law.  This would seem to follow the common practice, too, since
people buy programs, music, books, etc but the writer maintains ownership of
the expression although the buyer owns the medium.

Food for thought.          jim frost           madd@bu-it.bu.edu


Don Mac Phee <NKK101%URIMVS.BITNET@MITVMA.MIT.EDU>
Wed, 17 Feb 88 10:13 EST
In RISKS 6.27, Jonathan Kamens speaks of a broader subject than computer
pornography. He asks what your rights ARE on a semi-public system (i.e., a
system at an institution or workplace).  I'll just stick in some of the obvious
answers after a little background. ;-)

>   I am sure you can imagine what kinds of graphics they contained.
> After the xpix directories had existed for about a week, the director
> of Project Athena announced that complaints about the boys and girls
> directories had been made by a dean; the dean had said that she had
> received complaints from students.  The xpix directory was soon
> thereafter made totally inaccessible to Athena users.

>   First of all, is what Athena did legitimate?

Who administers the system? This discussion raged for the longest of times on a
system at the University of RI. There was a communications database used by the
students for informal chats and discussion groups. The notes sent by some users
had a tendency to be abusive and affronting. After a number of users complained
to the computing center, the offensive notes, and sometimes entire discussion
groups were edited or removed by the staff. The basis for the decision was
that PARTICIPATE (the name of the database) was a system maintained resource,
so therefore was subject to editing by the staff. If you wanted to be abusive,
you had your own account space to be abusive in.

>   Was it really worth it for Athena to install the directory
> protections if there are ways to get around them and the net result is
> less efficient use of system resources?

See explanation above......

> What are the possible implications of Project Athena's decision?

It sounds to me like you have a half-way decent administrator :-) Although I
(here comes the opinion) wouldn't have allowed them in the first place.

>   Can the administration of a supposedly user-privacy-secure system
> censor the material that is made accessible on it?

If it's a system resource, they should. If it's your own files, located in the
directory space provided to you by the system, and the files are not HARMFUL to
the system, no.

>   Is the presence of a filesystem on a machine evidence that the
> administration "supports" the contents of the filesystem?

That's why the administration EDITS it. Freedom of speech applies to a LOT of
areas. This is NOT one of them. They are providing you with space and utilities
to perform a specific function. Learn. If you want pornography, go to the local
drugstore. Admittedly, a system might have a LOT of free space for nonsense
like this, but it also takes more effort to maintain it: CPU time spent copying and
reading the data, paper wasted printing it, time spent making archives of the
data, time spent restoring the data, the wear and tear on the digitizer. The
mind boggles when you consider all of this.
                                                      Don Mac Phee

p.s. All standard disclaimers apply.


A bit more on the AMTRAK crash...

John McMahon, STX/COBE (x4333) <XRJJM%SCINT.SPAN@STAR.STANFORD.EDU>
Wed 17 Feb 88 08:23:08-PDT
***> From: msb@sq.com (Mark Brader)
***> > The FCC's private radio bureau reported [of the Chase, MD, accident]
***> > that "This terrible collision could have been avoided had the
***> > locomotives been under the control of a central computer."
***> It could also have been avoided if the turnout in question had had
***> a "derail".  This device, as the name suggests, would derail one train --
***> in this case, the locomotives — rather than letting it onto the through
***> line where it could (and did) collide with,

Mark brings up a valid point.  Unfortunately, that section of track (Just south
of the Gunpowder River bridges) has no derails.  I haven't been on that section
of track, but the layout diagrams I have seen never mentioned a derail.

As I recall (since the docs are not in front of me) the track looks like this:

                                              Gunpow Bridge
        <------------A----------------*-C----------------------->
To Washington                        /                          To New York
        <------------B--------------/

The Conrail train, on track B, had ignored at least one warning signal.  It
ended up going through a stop signal right before it reached the switch.  The
engineer hit the brakes as the train went through the switch, and ultimately
stopped at point C.

At the same time, the AMTRAK train had been approaching the same point on
track A.  Its reported speed was around 100 MPH.  On some sections
of AMTRAK's Northeast corridor, 125 MPH is the speed limit.  There has been
some question as to how wise it is to run trains so fast when only some of
them are under Automatic Train Control (ATC).  All AMTRAK trains in the area
are under ATC; the CONRAIL trains aren't.

Since the CONRAIL train couldn't outrun the AMTRAK, and it couldn't back up
(an article in the Washingtonian Magazine suggested the engineer of the CONRAIL
train considered backing up until the AMTRAK came into view), impact occurred.

A derail switch would have (probably) saved the AMTRAK train.

                                              Gunpow Bridge
        <------------A----------------*------------------------>
To Washington                        /                          To New York
        <------------B--------------*--D--!

If the derail had been installed (track D), the CONRAIL train would have passed
the STOP signal and, instead of being forced onto track A, would have proceeded
onto track D.  The AMTRAK train might have shot by without even knowing there
was a problem.

The risk here is that the CONRAIL locomotive still would have crashed, the
lives of the CONRAIL train crew would have been threatened, and if the crash
were bad enough it could still have spilled back onto the "A" track.  It seems
that forcing CONRAIL into using ATC would be a better idea.
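
The routing logic John describes can be sketched as a tiny interlocking model
(the track labels follow the diagrams above; the function and scenario
values are hypothetical):

```python
def exit_from_siding(ran_stop_signal, derail_installed):
    """Where does a train leaving siding B end up?

    A derail acts as a fail-safe: a train that violates the stop
    signal is dumped onto a short dead-end track instead of being
    allowed onto the through line.
    """
    if not ran_stop_signal:
        return "B"   # held at the signal; no conflict with track A
    if derail_installed:
        return "D"   # derailed short of the main line; crew still at risk
    return "A"       # forced onto the through line, into the AMTRAK's path

# Chase, MD: the stop signal was run and no derail was present.
print(exit_from_siding(True, False))   # -> A
```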

John McMahon


Re: Last Clasp credit cards

Jack Holleran <Holleran@DOCKMASTER.ARPA>
Wed, 17 Feb 88 00:19 EST
I don't think that the magnetic clasps on purses could degauss or fully 
erase credit cards.  The magnets may introduce some noise on the magnetic
stripe but it should still be legible electronically.

First, you need a field of sufficient strength to really erase.  How much is
enough?  You have to exceed the coercivity of the magnetic stripe on
the card.  Most of the cards are using a quality magnetic stripe to 
prevent overwriting by the criminal element.

Second, why would the purse manufacturer use a "high coercivity" magnet 
to keep the purse closed?  He is probably going to use the cheapest
magnet he can find that does the job.  If it's too expensive, he'll figure out
a way to bring back snaps.

I think the damage is probably being done in the stores where everyone
seems to have an on-line reader.  No offense to the hard-working clerks,
but have you really watched how they "read" a card on the reader?  How often
have they had to reread the card and then "punch" the numbers into the
reader or cash register, or call the credit card service bureau?  The card
could be bad, but the reader might be "dirty" or the clerk could be "reading"
the card wrong.

Concerning the eelskin metallic particles introduced in the tanning process
(RISKS-6.25), the stripe on the credit card is a modified magnet.  When placed
near particles that can be magnetized, it will attract them.  The particles
could then "dirty" the reader, which in turn "dirties" another card.  Since
some of the other conversations in RISKS have been about viruses, this might be
a description of a "particle virus".

Jack Holleran


911

Brint Cooper <abc@BRL.ARPA>
Tue, 16 Feb 88 22:22:12 EST
> Several cases have been reported here recently in which calls from cellular
> telephones to the 911 emergency number have been seriously misdirected due to
> automated load shedding by the cellular nodes.  The problem arises when the
> node nearest a caller is overloaded and a call automatically gets switched to
> the next nearest node.  For example a person calling 911 in Oakville, Ont. 
> was redirected to St. Catharines, Ont which is about 85 km away. 

    A low-tech, non-computer solution is easily available.  The 911 (or
police, fire, ambulance, whatever) dispatchers in adjacent jurisdictions simply
monitor one another's radio transmissions.  While this is technically in
violation of FCC rules, the Commission knows it is done and condones it in the
interests of life and safety.  For example, state and local police here have,
in earlier days, monitored one another's transmissions to coordinate problems
as have fire departments in adjacent jurisdictions.
                                                               Brint
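
The misrouting in the quoted report is a direct consequence of "route to the
nearest node with spare capacity."  A hypothetical sketch (node names,
distances, and capacities are invented for illustration):

```python
def pick_node(caller_km, nodes):
    """Pick the nearest cell node that still has call capacity.

    nodes: list of (name, position_km, active_calls, capacity).
    When the nearest node is saturated, the call spills over to the
    next nearest one, which may feed a distant 911 dispatch centre.
    """
    available = [n for n in nodes if n[2] < n[3]]
    if not available:
        return None
    return min(available, key=lambda n: abs(n[1] - caller_km))

nodes = [("Oakville", 0, 50, 50),          # nearest node, fully loaded
         ("St. Catharines", 85, 10, 50)]   # 85 km away, lightly loaded
print(pick_node(0, nodes)[0])              # -> St. Catharines
```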


2/23 8 PM Bay Area ACM/SIGGRAPH: Legal Issues of Computer Graphics

Eugene N. Miya <eugene@ames-pioneer.arpa>
Wed, 17 Feb 88 17:23:08 pst
Legal Issues of Computer Graphics
Susan Hubbell Nycum

Date: February 23, Tuesday (4th Tuesday of the Month)
Time: 8 PM
Location: Xerox Palo Alto Research Center (PARC), 3333 Coyote Hill Road

Bay Area ACM/SIGGRAPH
Association for Computing Machinery
Special Interest Group on Computer Graphics

Ms. Nycum will speak on the legal issues involving computer graphics.  The
focus will be on proprietary protection including the recent developments
in copyright for screen displays and patents for user interfaces.

(Ms. Nycum is a partner of the international law firm of Baker and McKenzie,
resident in the Palo Alto office, specializing in the legal aspects of high
technology, including computers and communications — proprietary rights,
technology-transfer licensing, governmental regulation, privacy, computer
crime, litigation, and general advice to high-technology companies and
organizations using high-technology products and services.)
