The RISKS Digest
Volume 14 Issue 18

Thursday, 10th December 1992

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator



o Miscarriages — chip workers in the U.S., VDT users in Finland
o Programming errors affect state lottery
Mark Seecof
o Systems causing unintended changes in behaviour
Doug Moore
o ACM Code of Ethics and Professional Conduct; also, Ethics Starter Kit
o Computers do it better
Don Norman
o Traces of Memory and the Orange Book
Kraig R. Meyer
o Library sans card catalog
Patrick White
o Defence against hackers may be illegal; login banners grow
John Lloyd
o Info on RISKS (comp.risks)

Miscarriages — chip workers in the U.S., VDT users in Finland

"Peter G. Neumann" <>
Thu, 10 Dec 92 9:55:54 PST
A four-year study by the University of California at Davis reports that
women making computer chips have a 40% higher incidence of miscarriages than
other workers in the same factories.  It covered 15,000 workers at 14
factories in seven states.  A previous study by the University of
Massachusetts reported a 70% increased risk among women in a particular
factory of Digital Equipment Corp.  [Source: San Francisco Chronicle, 4
December 1992, p.1.]

Researchers in Finland have identified a statistically significant incidence
of miscarriages among women using computer video display terminals that emit
extremely-low-frequency (ELF) electromagnetic radiation — triple the expected
normal rate.  A report
is being published in the American Journal of Epidemiology.  [Source: San
Francisco Chronicle, 10 December 1992, p.A7]

Incidentally, Paul Brodeur has another article on electromagnetic radiation
effects in the New Yorker dated 7 December 1992.  [RISKS readers will recall
the previous series of three articles being discussed here, e.g., RISKS-9.06
and .07.]

      [Forewarned is not necessarily forearmed, even if you have four arms.
      But this problem really demands greater attention, even if there are
      some who say these studies are not definitive.]

Programming errors affect state lottery

Mark Seecof <>
Tue, 8 Dec 92 13:11:49 -0800
[unquoted paraphrasing and bracketed comments mine --Mark S.]

A story by Virginia Ellis in the Los Angeles Times (page A3, Tues 3 Dec 92)
reports that programming errors associated with adding a new "keno" game to
the Calif. state lottery "apparently caused some wagers made early Nov. 17 not
to be logged into the main database.  As a result terminals at retail outlets
were unable to identify those tickets as winners.  A separate programming
oversight caused a similar problem in northern Calif. on Nov. 23, when for
part of the day no tickets for any games could be cashed and validated,
officials said."

The state controller's office is said to be asking whether pre-deployment
testing of the new software was adequate.  "Lottery Director Sharon Sharp
acknowledged that the lottery pressed GTECH, the Rhode Island company that
operates and maintains its computers, to get the games on-line by mid-Nov. but
she insisted there was still time for keno to be adequately tested.  ``We felt
110% comfortable,'' she said.  ``In fact, we could have tested it for three
more months and this still could have happened.''"

The lottery director downplayed the trouble, calling programming problems par
for the course.  Players affected by the troubles took a different view.  At
least one "replay" winner who couldn't get his free ticket from the computer
was told by the lottery to use a complex by-mail procedure to collect his
winnings.  He decided that his time plus 29 cents postage was too much for a
$1 return.  The lottery director said that the system has an "elaborate backup
system" which "ensures that no winner will be lost.  She [Sharp] said she has
since appointed an internal task force of computer technicians and auditors to
investigate the problems and determining if damages should be assessed against
GTECH."  Another lottery official suggested that "operator error" may've been
a factor.

[And now the most interesting part...]

"The problems come just three months after the Calif. Lottery Commission
approved a change in the GTECH contract that decreases by hundreds of
thousands of dollars per day the maximum damages that can be assessed against
the company for errors that affect computer performance.  The change was
approved without public discussion.  Sharp said she proposed the change
because the company was being asked to get the keno game ready on a short
deadline.  ``What was important was to be able to get the... cooperation of
our supplier.  The cooperation meant much more than whatever financial
windfall you would or you wouldn't receive from... damages,'' she said.  Since
its introduction Nov. 16, Sharp said keno has added about $7 million a week to
lottery revenues."

[I think that GTECH must have known that the lottery's proposed keno
deployment schedule was too ambitious.  Otherwise, why condition cooperation
on the damage waiver?  This story documents that the Lottery Commission traded
off the risk of poor performance against the benefit of early deployment.  The
question is, did its assessment of this tradeoff calculate the costs (in time,
irritation, and lost winnings due to unreasonable transaction costs to recover
after computer failure) to players of failure, or only the costs to the
lottery of delayed deployment?  Should costs to players of failure be
considered by the lottery?  Or should it look only to its own balance sheet?]

Mark Seecof <>
Publishing Systems Department, Los Angeles Times

Systems causing unintended changes in behaviour

Doug Moore <>
Wed, 9 Dec 1992 19:00:00 -0500
A couple of items in RISKS touched upon computer systems and technology
affecting people's behaviour and causing changes in our society.  There is a
risk that some changes may be undesirable and unintended.

Sometimes the change comes about because the system lacks sufficient
information and/or isn't smart enough to handle it.  When working at Bell
Canada back in the '70s, I saw an example of that.  A system was supposed to
compare numbers of long distance operators working with the number required to
handle the load of calls.  Over the long term it was hoped the information
would help in predicting staffing requirements based on various factors.
However, it was also used to evaluate managers on their current success or
lack of success in matching the number working with the number needed.  One
serious flaw was that the program assumed the actual number of operators
working could be changed every half hour.  This assumption was at odds with
the union contract that put minimums on the number of hours in a shift and
limits on scheduling of shifts.  The result was that at the end of each day
managers would spend an hour or more doing nothing but telling operators to
unplug or plug into the system in an attempt to fool it.  The managers' and
operators' behaviour was changed in a direction the company didn't intend.
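   [The half-hour flaw is easy to reproduce in miniature.  The sketch below
   (modern Python, with invented numbers; nothing here comes from the actual
   Bell Canada system) shows how an evaluation that assumes half-hourly
   staffing changes penalizes a schedule that a shift-length rule makes
   unavoidable:]

```python
# Hypothetical figures, for illustration only.
required = [12, 12, 9, 5, 5, 5, 4, 3]   # operators needed, per half hour

# A union contract with a minimum shift length cannot shed staff every
# half hour, so the actual headcount stays at the peak all day:
actual = [12] * len(required)

# The evaluation system naively scores the half-hourly mismatch,
# blaming managers for "overstaffing" the contract forces on them.
overstaffing = sum(a - r for a, r in zip(actual, required))
print(overstaffing)  # prints 41
```

   [Gaming the metric (having operators plug in and unplug) changes only the
   numbers fed to the report, not the staffing reality; that is precisely the
   unintended behaviour change.]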

On other occasions a change can come about simply because it is in some ways
easier for a system to deal with data where quantities or levels are
significant, compared to other data, and managers may place too much emphasis
on that data.

The Metro Toronto Police may have changed their system now, but at one time,
their system reported weekly statistics on the activities of each police
officer - just in order to ensure sound management of staff resources, of
course.  Things like the number of parking tickets issued were easy to input
and report.  A wide variety of other activities, such as taking a lost child
home, or spending some time checking into a broken window at a business, could
not be as easily input or reported in meaningful ways, yet those other
activities may be worth far more.  Supervisory officers would, of course,
recognize the value of such activities in principle, but the common reaction
to the weekly report was to notice such things as few parking tickets being
issued, to require explanations when that happened, and to tell the officers
to spend more time on issuing parking tickets so that next week's report
wouldn't "look so bad".  The net result of such a system was to change the
police officers' behaviour.  While they would be unlikely to ignore other
matters that came up, the officers would nonetheless concentrate on the
activities that easily produced large numbers on the reports - such as issuing
parking tickets.

In both of these examples changes happened that were not intended by anyone.
How to predict and avoid or manage such changes may not be simple when a
system is being designed or managed, but an effort is needed.

Doug Moore          Canada Remote Systems  - Toronto, Ontario
                    World's Largest PCBOARD System - 416-629-7000/629-7044

ACM Code of Ethics and Professional Conduct; Ethics Starter Kit

"Peter G. Neumann" <>
Wed, 9 Dec 92 10:56:39 PST
Adopted by the ACM Council on 16 Oct 1992, and certainly of interest to the
entire RISKS community, not just to ACM members and Boy Scout programmer
merit-badge seekers:

1. GENERAL MORAL IMPERATIVES.  As an ACM member I will
   1.1 Contribute to society and human well-being
   1.2 Avoid harm to others
   1.3 Be honest and trustworthy
   1.4 Be fair and take action not to discriminate
   1.5 Honor property rights including copyrights and patents
   1.6 Give proper credit for intellectual property
   1.7 Respect the privacy of others
   1.8 Honor confidentiality

2. MORE SPECIFIC PROFESSIONAL RESPONSIBILITIES.  As an ACM computing
   professional, I will
   2.1 Strive to achieve the highest quality, effectiveness, and dignity
       in both the process and products of professional work.
   2.2 Acquire and maintain professional competence.
   2.3 Know and respect existing laws pertaining to professional work.
   2.4 Accept and provide appropriate professional review
   2.5 Give comprehensive and thorough evaluations of computer systems
       and their impacts, including analysis of possible risks
   2.6 Honor contracts, agreements, and assigned responsibilities
   2.7 Improve public understanding of computing and its consequences
   2.8 Access computing and communication resources only when
       authorized to do so.

3. ORGANIZATIONAL LEADERSHIP IMPERATIVES.  As an ACM member and an
   organizational leader, I will
   3.1 Articulate social responsibilities of members of an organizational
       unit and encourage full acceptance of those responsibilities.
   3.2 Manage personnel and resources to design and build information
       systems that enhance the quality of working life.
   3.3 Acknowledge and support proper and authorized uses of an organization's
       computing and communication resources.
   3.4 Ensure that users and those who will be affected by a system have
       their needs clearly articulated during the assessment and design
        of requirements; later, the system must be validated to meet [its]
        requirements.
   3.5 Articulate and support policies that protect the dignity of users
       and others affected by a computing system.
   3.6 Create opportunities for members of the organization to learn the
       principles and limitations of computer systems.

4. COMPLIANCE WITH THE CODE.  As an ACM member, I will
   4.1 Uphold and promote the principles of this code.
   4.2 Treat violations of this code as inconsistent with membership
       in the ACM.

 - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Incidentally, a computer ethics "Starter Kit" intended for computer science/
engineering teachers, libraries, and media resource centers is available from
the Research Center on Computing and Society, Southern Connecticut State
University, 501 Crescent St., New Haven CT 06515 (, fax
1-203-397-4681).  Ask for the RCCS Publications Catalog, which includes (among
many other things) three videos, (1) Computer Ethics in the Computer Science
Curriculum, (2) What is Computer Ethics?, and (3) Teaching Computer Ethics:
Strategies and Cases, a monograph (Teaching Computer Ethics, proceedings of
the teaching track from the National Conference on Computing and Values), and
the CPSR collection of course syllabi by Batya Friedman and Terry Winograd.
Prof. Terrell Ward Bynum is the Director of the RCCS.

Computers do it better

Don Norman <>
Wed, 9 Dec 1992 10:09:55 -0800
The following just appeared on AppleLink, Apple's semi-public bulletin board
and e-mail system.  I have censored out identities.  We all knew that the
common, everyday errors many of us make with e-mail can lead to disasters
affecting a huge number of people when world-wide WANs/LANs exist.
Here is an example, this one evidently triggered by the blind carbon copy
mechanism, which may reside invisibly on a message.  Invisible information is
never a good idea because it can lead to very visible results.

                       [Bad object-oriented software engineering, despite
                       parameterized information hiding (including X,
                       below) and inheritance of ancestors!  ENCAPSULATION
                       is the key to GOOD information hiding.  PGN]

In the Internet mail system, a blind cc is visible to the sender (as a line
labelled bcc: list-of-names) but is not only invisible to the receivers, it
ISN'T THERE, so that any reply by any of the receivers cannot go to the bcc
list.  That seems like the better way to do things.  (Although a bcc list with
2,800 recipients is pretty amazing.)

  Item    [nnnnnnn]                         8-Dec-92        14:48PST
  From:   [A]
  To:     Mailing List

  Please get this information to your workgroup immediately.

  An unknown error or bug has caused an estimated 2800 people to be blind
  carbon copied on a message sent by [W] called BRAZIL ACCESSORY KITS. The
  area of the LAN affected has not totally been identified, so I am sending
  this memo to all Technical Coordinators.

  It is important that you and your users DO NOT RESPOND to this memo, as each
  response is sent to another 2800 mailboxes.

  Additionally, your users should delete any message called BRAZIL ACCESSORY
  KITS.

  Various telecom and LAN managers are working with [W] and [X] to identify
  the problem. [Y] (phone number) is available if you need additional
  assistance.
                              [Rampent mispelings corekted bye PGN.]

Traces of Memory and the Orange Book

Kraig R. Meyer <>
Tue, 08 Dec 92 15:57:12 PST
Hardly an issue of RISKS goes by without a story of someone obtaining residual
data--data that the original owner thought was gone.  It's usually from a disk
(issues of medical records, bbs files, and Prodigy come to mind), although
most recently (RISKS 14.15) Tom Swiss points out that utilities exist to scour
RAM for bits.

There's a common moral to all these stories that I feel a need to interject
here: We'd all be better off (at least from a privacy perspective) if the
products we used minimally had security features equivalent to the C2 criteria
of the Orange Book ("U.S. Department of Defense Trusted Computer Security
Evaluation Criteria").

Specifically relating to residual data, it states: "No information, including
encrypted representations of information, produced by a prior subject's
actions is to be available to any subject that obtains access to an object
that has been released back to the system."
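A minimal sketch of the object-reuse idea in modern Python (illustrative
only; a real C2 system enforces this in the kernel and allocator, not in
application code, and `release_buffer` is an invented name): overwrite an
object's storage before it is released back to the system.

```python
def release_buffer(buf: bytearray) -> None:
    """Zero a buffer before giving it back, so no subject that later
    obtains the storage can read the prior subject's data."""
    for i in range(len(buf)):
        buf[i] = 0

secret = bytearray(b"patient record #1234")
release_buffer(secret)
assert all(b == 0 for b in secret)   # nothing residual left to scour
```

Copies in swap, caches, or on backup media are untouched by this, which is
why the requirement has to be stated at the system level rather than left to
individual programs.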

Of course, Tom wouldn't have been able to recover his own lost file, but one
might argue that a decent user interface would have prevented that in the
first place...
                    Kraig R. Meyer 

    [I might add that it is not just the reuse of the object that is
    risky, but the residues left over from incomplete deletion, irrespective
    of whether there is ever any reuse of the object.  If an object is
    deallocated, its memory contents (primary, secondary, tertiary, backup,
    whatever) may still remain and be accessible to penetrators, system
    administrators, and others.  PGN]

Library sans card catalog

Patrick White <>
Tue, 8 Dec 92 13:39:54 -0800
Here's a computer-related risk I didn't expect to run into at my local public
library.

Over the last few years, the public library system has been converting the
card catalog and checkout system over to a (remote) centralized computer that
provides all sorts of nifty features like dial in access, ability to check if
a book is checked out, requesting books from other libraries and much more.
Well, in the last year or so, they found that the paper card catalog was not
being kept up to date, so, since they had the computer, they got rid of it.

Last week, a transformer blew and the computer went down (since it had no UPS,
it went down hard and recovery included fixing hardware as well as restoring
data).

While the computer was down, it was still possible to check out books
(apparently they had some sort of backup procedure in place for that), but
there was no card catalog — one had to ask at the reference desk to get a
list of places to go look around in the stacks for books on their topic.

I talked with one of their computer services people and was told that they
plan to put in a UPS for next time so the machine can be taken down safely and
the data preserved, but there are no plans for anything beyond that (in
particular, no decentralization was planned).  Obviously, another blown
transformer or some downed power/phone lines (we are expecting an unusually
nasty winter this year) could take out the card catalog again.

This certainly isn't a life-threatening sort of risk, but does illustrate one
risk of computerizing an index at a site distant from the records.

Pat White (, work: (503) 578-3463)

Defence against hackers may be illegal; login banners grow

John Lloyd <JAL@VS32.dnet>
Mon, 7 Dec 92 17:41:57 PST
The following excerpt from CERT Advisory CA-92:19 suggests that it
may be a federal crime in the U.S. for computer system administrators
to take certain actions to defend their systems against a hacker.

The risk described here is not entirely about computing: it is mostly
a legal risk created, I think, by legislators and administrators that
are insufficiently aware of computing technologies and practises.  A cynic
might suggest other reasons for these kinds of legal opinions.

Lines marked with > are direct quotes; see two other comments by me later.

John A Lloyd  (enough to distinguish me from others, I hope!)
Supervisor, Technical Support
MacDonald, Dettwiler and Associates Ltd
Richmond BC Canada

>CA-92:19                         CERT Advisory
>                                December 7, 1992
>                            Keystroke Logging Banner
>The CERT Coordination Center has received information from the United States
>Department of Justice, General Litigation and Legal Advice Section, Criminal
>Division, regarding keystroke monitoring by computer systems administrators,
>as a method of protecting computer systems from unauthorized access.
>The information that follows is based on the Justice Department's advice
>to all federal agencies.  CERT strongly suggests adding a notice banner
>such as the one included below to all systems.  Sites not covered by U.S.
>law should consult their legal counsel.
>    The legality of such monitoring is governed by 18 U.S.C. section 2510
>    et seq.  That statute was last amended in 1986, years before the words
>    "virus" and "worm" became part of our everyday vocabulary.  Therefore,
>    not surprisingly, the statute does not directly address the propriety
>    of keystroke monitoring by system administrators.
>    Attorneys for the Department have engaged in a review of the statute
>    and its legislative history.  We believe that such keystroke monitoring
>    of intruders may be defensible under the statute.  However, the statute
>    does not expressly authorize such monitoring.  Moreover, no court has
>    yet had an opportunity to rule on this issue.  If the courts were to
>    decide that such monitoring is improper, it would potentially give rise
>    to both criminal and civil liability for system administrators.
>    Therefore, absent clear guidance from the courts, we believe it is
>    advisable for system administrators who will be engaged in such
>    monitoring to give notice to those who would be subject to monitoring
>    that, by using the system, they are expressly consenting to such
>    monitoring.  Since it is important that unauthorized intruders be given
>    notice, some form of banner notice at the time of signing on to the
>    system is required.  Simply providing written notice in advance to only
>    authorized users will not be sufficient to place outside hackers on
>    notice.

In other words, you must "give notice" lest you be convicted of snooping (and,
of course, being the victim of unauthorized monitoring, your attacker will go
free).  How do you ensure that snoops, crackers and disgruntled employees are
guaranteed to view a message when you can't guarantee they are prevented from
logging in in the first place?  Catch-22, eh?

>    An agency's banner should give clear and unequivocal notice to
>    intruders that by signing onto the system they are expressly consenting
>    to such monitoring.  The banner should also indicate to authorized
>    users that they may be monitored during the effort to monitor the
>    intruder (e.g., if a hacker is downloading a user's file, keystroke
>    monitoring will intercept both the hacker's download command and the
>    authorized user's file).  We also understand that system administrators
>    may in some cases monitor authorized users in the course of routine
>    system maintenance.  If this is the case, the banner should indicate
>    this fact.  An example of an appropriate banner might be as follows:
>       This system is for the use of authorized users only.
>       Individuals using this computer system without authority, or in
>       excess of their authority, are subject to having all of their
>       activities on this system monitored and recorded by system
>       personnel.
>       In the course of monitoring individuals improperly using this
>       system, or in the course of system maintenance, the activities
>       of authorized users may also be monitored.
>       Anyone using this system expressly consents to such monitoring
>       and is advised that if such monitoring reveals possible
>       evidence of criminal activity, system personnel may provide the
>       evidence of such monitoring to law enforcement officials.
>Each site using this suggested banner should tailor it to their precise
>needs.  Any questions should be directed to your organization's legal
>counsel.
Given that notices on the outside of shrink-wrap software boxes have been held
not to constitute a contract, and that system demands for "passwords" and
"identification" are not considered sufficient evidence that restricted access
is intended, I find it unlikely that any mere textual
bulletin supplied AFTER logging in would be considered sufficient notice to
unauthorized users.  Most systems let you interrupt such displays anyway.  And
are you sure that your cracker can read at 9600 baud?
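The point about timing can be made concrete.  A toy sketch in modern Python
(hypothetical; not from CERT or any real login program, and `greet` is an
invented name): the consent banner is written before the system asks for any
credentials, so even an intruder who never successfully authenticates has
already been shown the notice.

```python
import io

CONSENT_BANNER = (
    "This system is for the use of authorized users only.\n"
    "Anyone using this system expressly consents to monitoring.\n"
)

def greet(out) -> None:
    """Emit the consent banner ahead of the login prompt."""
    out.write(CONSENT_BANNER)   # notice first...
    out.write("login: ")        # ...credentials second

buf = io.StringIO()
greet(buf)
```

A banner printed only after a successful login, by contrast, reaches exactly
the people who least need the notice.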

>The CERT Coordination Center wishes to thank Robert S. Mueller, III,
>Scott Charney and Marty Stansell-Gamm from the United States Department
>of Justice for their help in preparing this Advisory.
>If you believe that your system has been compromised, contact the CERT
>Coordination Center or your representative in FIRST (Forum of Incident
>Response and Security Teams).
>Internet E-mail:
>Telephone: 412-268-7090 (24-hour hotline)
>           CERT personnel answer 7:30 a.m.-6:00 p.m. EST(GMT-5)/EDT(GMT-4),
>           on call for emergencies during other hours.
>CERT Coordination Center
>Software Engineering Institute
>Carnegie Mellon University
>Pittsburgh, PA 15213-3890
>Past advisories, information about FIRST representatives, and other
>information related to computer security are available for anonymous FTP
>from (
