The RISKS Digest
Volume 11 Issue 26

Monday, 11th March 1991

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Re: Droids/De-skilling
Michael L. Duerr
Robert Murphy
Bob Sutterfield
Eric Prebys
Steve Cavrak
Alan Wexelblat
Jeffrey Sorensen
Phil Agre
Re: High Tea at the Helmsley in New York City
David L. Smith
Re: Medical image compression
Ian Clements
Bill Davidsen
Apathy and viral spread
Rob Slade
Info on RISKS (comp.risks)

Droids (Re: Andrew, RISKS-11.21)

Michael L. Duerr <duerr@motcid.UUCP>
8 Mar 91 00:33:36 GMT
Droids are a perfect consequence of the "cash society", or capitalism.  Capital
- money, rich people - seeks to maintain power and control.  Freethinking,
independent people are too prone to upsetting the status quo.  More
importantly, they are expensive, since they are individuals instead of a bulk
commodity.  Engineering is the discipline of reducing production from that
which requires a craftsman to that which a droid can do.  Computer based
systems, cash registers that make change, and countless other high-tech
innovations exist solely for the purpose of de-skilling work.  Converting
production from a dignified act with deep psychological gratification to
assembly lines and other droid jobs is a trend clear since the beginning of the
industrial revolution.

What are the risks?  People with droid jobs who read droid textbooks in
school, electrotranquilize on droid TV, read droid newspapers, possess droid
ideologies with mouthed droid slogans ( A thousand points of light / peace with
honor / etc. ) lose their facility for critical thinking.  This explains why
subtle issues like electronic privacy, media monopolies and deregulation mania
are now beyond the grasp of average citizens.  Our computers think for us, and
we can no longer ponder the advisability of invading Panama / Grenada / Lebanon
/ Iraq / ...  ( Droids do make good soldiers, or should I perhaps say cannon
fodder. )  Droids lack the intellect to analyze the mainstream press and detect
propaganda.  The steady slip of Western "Civilization" into droidism is perhaps
the greatest risk facing us all.  We need to value intellect, culture and
learning if we are to survive, and if our society is to preserve even a shred
of respect for human dignity.


Re: But the computer person said it was OK! (Andrew, RISKS-11.21)

Robert Murphy <bobert@ceili.UUCP>
Thu, 7 Mar 91 12:17:59 PST
Nick Andrew's remarks on "droids" provide an apt name for a phenomenon
I'm sure most of us run into with some frequency.  Alas, far too many
companies and managers prefer droid employees, even in the computer
industry.  How often have you encountered the following RISKy scenarios?

1. A company wants to create a technically challenging product, but the project
has an insecure manager or martinet running the show who only hires droids who
are not capable of doing the necessary creative work, so the project fails.

2. A company hires an expert, promising her that the company "wants to do it
right".  The new employee then discovers that this was only window dressing,
and what they really want is a droid who looks like an expert.  When the
employee tries to exercise her expertise, she encounters bureaucratic obstacles
and is eventually ordered to change her "uncooperative attitude."  If she is
not such an astute political operator as to force the right technical decisions
in spite of the obstacles, then she will either leave or knuckle under, and the
project fails.


cultural adaptation to droids

Bob Sutterfield <bob@MorningStar.Com>
Mon, 11 Mar 91 21:23:40 GMT
The emergence of the "droid" class has led to some interesting cultural
adaptations.  Someone already mentioned their strategy of asking for the
droid's supervisor, then iterating until a human is discovered.

I have found that droids often expect to talk to other droids, so if one
creatively molds one's behavior into the droid pattern, often significant
advantages can be reaped.  For instance, when trying to convince a droid to
send me an XXX-component to replace a critical but broken XXX-widget here, I
have been known to resort to saying something like "I sure wish you could get
that down to the shipping dock this afternoon rather than tomorrow morning.  My
boss is breathing down my neck and he doesn't like being told that XXX isn't
working."  This invariably elicits sympathy in the droid on the other end of
the phone, who then tries to help out a fellow droid in trouble.  In real life,
I have considerable autonomy from my supervisors and whatever flexibility I
really need to get my job done (very un-droidlike attributes), but there's no
need to tell the droid that.  Let the droid think that all the world's a droid,
{s}he will feel good about doing something to help another droid, and I'll get
my XXX fixed a day faster.


RE: de-skilling (Brantley, RISKS-11.20)

Eric Prebys, CERN-PPE/OPAL <prebys@vxcern.cern.ch>
Mon, 4 Mar 91 16:41:53 +0100
In his reply to Alan Wexelblat's "dumbing-down" letter, Peter Brantley
makes two somewhat contradictory statements. The first -

>There have been many instances
>where computerization has forced the lay off of newly redundant personnel, but
>for those who receive or gain control of workplace automation, the story is
>different.  They often experience a reskilling, or more explicitly, a
>redefinition of their job tasks, with even greater required skills.

is probably true in many cases, but he then goes on to say -

>The operation of machinery does not, and should not, require knowledge of
>appropriate intervention, as Alan suggests.  Indeed, the very *point* of
>automation is to remove these tasks from the province of the worker.

I think that anyone who has had experience with building and/or programming
automated systems would strongly disagree with this statement, if for no other
reason than that it presupposes infallible systems with infallible users.
Indeed, it could be argued that some of the most important "reskilling"
required in a highly automated environment involves learning to recognize when
"appropriate intervention" is required and what that intervention should be.
In many cases, this falls under the heading of quality control, but the
required attention to detail is much more difficult to maintain in an automated
environment.

    As an example, consider an automobile assembly line. In an old-fashioned
(i.e. human) line it's difficult to imagine the workers leaving the steering
wheel out of every one of the cars they make one day (although one occasionally
hears stories like that); however, it doesn't take much imagination at all to
picture a fully automated assembly line doing just that (say on Feb. 29 of its
first leap year in operation), and the problem going unnoticed for some time.
What is "appropriate" intervention in this case?  While I wouldn't expect a
worker to run down and start installing the steering wheels himself, I would
certainly expect him to stop production until the problem was fixed---even if
his Macintosh screen was still telling him everything was all right.
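
To make the failure mode concrete, here is a hypothetical sketch in C of the
kind of calendar arithmetic that produces exactly such a Feb. 29 surprise.
The scheduling scheme and all the names are invented for illustration; no
real line controller is being described.

  /* Hypothetical sketch: a day-of-year lookup that forgets leap years. */
  #include <stdio.h>

  static const int days_in_month[12] =
      {31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31};  /* Feb fixed at 28 */

  /* 0-based day-of-year slot used to pick the day's work orders;
     wrong in any leap year from Feb. 29 onward. */
  int day_slot(int month, int day)        /* month 1-12, day 1-31 */
  {
      int i, slot = day - 1;
      for (i = 0; i < month - 1; i++)
          slot += days_in_month[i];
      return slot;
  }

  int main(void)
  {
      /* Feb. 29 and Mar. 1 collapse onto the same slot, so one day's
         work orders are silently skipped or issued twice. */
      printf("Feb 29 -> slot %d\n", day_slot(2, 29));   /* 59 */
      printf("Mar  1 -> slot %d\n", day_slot(3, 1));    /* 59 as well */
      return 0;
  }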

    This is certainly an unrealistic example, but the point is this: While
automation does in general result in fewer errors and frees workers from menial
tasks, one should not be lulled into a lethargic state. There is still a
minimum level of knowledge about any given product/task that should be
maintained. Automobile workers should still know that cars require steering
wheels and salespeople should still know that making change requires a
subtraction.
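
And the subtraction in question is small enough to write out in full.  A
minimal sketch, with invented names, working in integer cents:

  /* Minimal sketch: the entire lost skill, in one line of C.
     Amounts are in integer cents; the names are invented. */
  #include <stdio.h>

  int change_due(int tendered, int price) { return tendered - price; }

  int main(void)
  {
      /* $20.00 tendered on a $13.57 sale */
      printf("change due: %d cents\n", change_due(2000, 1357));  /* 643 */
      return 0;
  }
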
                                        -Eric Prebys, CERN, Geneva, Switzerland


Computers and Stoopid Work

Steve Cavrak <cavrak@griffin.UVM.EDU>
Sun, 10 Mar 91 8:36:09 EST
Although I subscribe to the theory that it is cars that have made us stoopid
(the more you drive, the dumber you get), I'd like to add the following to the
readings on computers and the transformation of work:

   Zuboff, Shoshana, @i(In the Age of the Smart Machine: The Future of
   Work and Power), Basic Books, New York, 1988.

The author makes the distinction between firms that "automate" and those that
"informate".  Her approach is both historical and sociological - including
material from interviews with administrators, managers, and workers.

Sherry Turkle's The Second Self: Computers and the Human Spirit, Touchstone,
Simon and Schuster, 1984, would suggest that computers in the right hands can
make you smarter.  Maybe computers are for kids; adults would be better off
without them.

  [The Zuboff reference was also noted by Professor Michael Mahoney, of the
  Princeton History and Philosophy of Science Program (mike@pucc.bitnet),
  as communicated by David M.  Laur <dmlaur@phoenix.Princeton.EDU>.  PGN]


Deskilling - reply to Peter Brantley (Re: RISKS-11.20)

<wex@PWS.BULL.COM>
Tue, 5 Mar 91 09:43:50 est
Brantley makes some interesting assertions, but he and I fundamentally
disagree.  Quoted text is from his article in RISKS-11.20.

> it is questionable whether the service sectors have been the targets of a
> "dumbing."

Let's continue to use the word "deskilling" which is a more accurate
description of the process.  "Dumb" is a pejorative and I only used it as a
reference to the discussions ongoing in today's society about the "dumbing
down" of such things as news boadcasts, political discussion, etc.

That said, I think the process of deskilling is obvious.  The article to
which I responded noted an incident, familiar to many of us, where a clerk
was unable to tally or make change or do other normal clerkly activities
while hir machine was down.  This person is clearly less skilled than hir
predecessors.  Further, this deskilling results in a loss of service to
customers and a loss of revenue to businesses.  I hardly thought this a
contentious point.

Deskilling/Dumbing-down (Re: Brantley, RISKS-11.20)

Jeffrey Sorensen <sorensen@dino.ecse.rpi.edu>
Tue, 5 Mar 91 15:39:23 EST
Peter Brantley, Department of Sociology, University of Arizona, Tucson,
on the topic of the "dumbing" of the workforce writes:

> If we fail to notice that the U.S. educational and social system
> does not support the acquisition of these skills, then we have done a grave
> disservice.  This is not something particular to automation, but to our social
> system.  Automation is a neutral force.  Society is not.

While it is unclear how the U.S. educational system could teach the skills
required for "automation", the more important point is that automation is not a
neutral force.  All technologies have characteristics that determine the nature of their
interactions with society.  The technology itself determines who will use it,
how it will be used, and its effects on individual lives.  Even more important
is that in the long run, technology determines what types of political forms
will emerge to deal with it.  To quote from Jerry Mander's interesting book
_Four Arguments for the Elimination of Television_:

  "If you accept nuclear power plants, you also accept a techno-scientific-
   industrial-military elite.  Without these people in charge, you could not
   have nuclear power.  You and I getting together with a few friends could
   not make use of nuclear power.  We could not build such a plant, nor could
   we make personal use of its output, nor handle or store the radioactive
   waste products which remain dangerous to life for thousands of years.  The
   wastes, in turn, determine that _future_ societies [his emphasis] will have
   to maintain a technological capacity to deal with the problem, and the
   military capability to protect the wastes.  So the existence of the
   technology determines many aspects of the society." p. 44

> [...] Workers were, are, and will be oppressed.

Many of the information processing skills that are becoming requisite in our
modern society could be better termed meta-skills.  Workers today must deal with
the great many changes that are occurring at higher rates as new technologies
emerge.  It is this ability to adapt and learn new skills that is now highly
sought in industry.  But paradoxically, it is these same meta-skills that make
much of the job of management redundant.  Information technology allows people
to exploit one another's experience and "borrow" the skills required for a
specific task.  It is possible to design equipment in a fashion that allows
unknowledgeable people to operate and fix it, and this is largely the goal of
graphical interfaces.

     Jeff Sorensen     sorensen@ecse.rpi.edu    (518) 276-8202


deskilling (Brantley, RISKS-11.20)

Phil Agre <phila@cogs.sussex.ac.uk>
Mon, 4 Mar 91 12:33:56 GMT
Peter Brantley's article about deskilling in RISKS-11.20 makes a number of
assertions that are not entirely uncontroversial.  I would like to flag these
here, for the sake of RISKS readers who are interested in the issues and are
thinking of doing some further reading.

    The operation of machinery does not, and should not, require knowledge of
    appropriate intervention, as Alan suggests.  Indeed, the very *point* of
    automation is to remove these tasks from the province of the worker.

This is not a very accurate account of the history.  Large numbers of factory
workers are, and have been for upwards of a century, in `machine tending'
jobs.  The prototype case of this was in textile factories, where a single
individual would have to run about keeping a dozen or more knitting machines
unjammed.  Machines go wrong.  Eliminating the need for human intervention
may be the `point' of automation, in the sense of the managers' ideal, but
in practice it is an ideal approached only slowly.

    ... as Braverman accurately noted, the age of craft work — where you
    could send Joe to "bang on it a few times" to fix it are long gone.

This is a misleading gloss of Braverman's point.  Braverman argued that the
age of craft work did not simply drift into obsolescence but was actively
brought to an end as part of the process of shifting control over the
organization of work.  To characterize craft work in terms of sending Joe
to bang on things is a caricature, part of the very ideology that justified
the whole process — a process which could have gone in other directions.

    Braverman did not note that the craft work population of the U.S. was
    always *very* small.

Quite the contrary, an appendix to Braverman's book hotly disputes the
assertion that craft workers were never numerous.  His argument, briefly, is
that statistical assertions to the contrary are based on projecting modern
de-skilled job categories back onto the very different processes of work in
earlier times.  He gives the example of farm hands, whose jobs have grown
steadily less skilled over time.

This is not to defend the simple de-skilling thesis, but simply to avoid a
shift to an equally oversimplified opposite extreme.  The picture is indeed
complicated.  RISKS readers who are interested in the subject should go hit
the literature.  The Thompson book I cited is a good place to start.

Phil Agre, University of Sussex


Re: High Tea at the Helmsley in New York City (Tashkovic, RISKS-11.23)

<dave@whoops.fps.com>
Mon, 11 Mar 91 17:17:04 PST
And not a very customer-friendly establishment!  Using the computer as an
excuse as to why something "cannot" be done is not acceptable, especially at a
place like that where exceptional service is the commodity being paid for.  The
correct response to an excuse like that is "I wasn't aware the Gold Room was
part of McDonald's.  Please take care of it and don't bother me anymore."  If
they had had to write out a special receipt by hand that is what they should
have done.  $50 for tea for two and they're going to tell you the computer's
dictating how you can pay.  Indeed!

David L. Smith, FPS Computing, San Diego        ucsd!celit!dave or dave@fps.com


Re: Medical image compression (Lane, RISKS-11.23)

Ian Clements <ian@lassen.wpd.sgi.com>
Mon, 11 Mar 91 08:24:39 -0800
 Most physicians (radiologists) will not make a diagnosis based upon digital
imagery transmitted to them via phone lines (or which has been compressed in
any way).  Why?  Because a misdiagnosis may result in a malpractice claim.  Most
physicians realize that lossy compression results in loss of information, so they
often wait until the film arrives or until they have a chance to review all the
data before making a diagnosis.
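
A minimal sketch of why lossy compression cannot be undone.  The 4-bit
quantization here is invented for illustration and far cruder than any real
medical codec, but the irreversibility is the same:

  /* Hypothetical sketch: lossy compression is not invertible.
     Keeping only the top 4 bits of an 8-bit grey level halves the data,
     but the round trip cannot restore what was thrown away. */
  #include <stdio.h>

  unsigned char squeeze(unsigned char pixel) { return pixel >> 4; }
  unsigned char expand(unsigned char code)   { return code << 4; }

  int main(void)
  {
      unsigned char original = 156;                     /* some grey level */
      unsigned char restored = expand(squeeze(original));
      printf("original %u, after round trip %u\n", original, restored);
      /* prints 156 and 144: the fine detail is simply gone */
      return 0;
  }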

 Clearly, there are risks other than those associated with data compression
which apply to medical imaging systems that rely on computers.  An example:
software in an MRI system depends on the correct orientation of the patient in
relation to the magnetic field (for the purpose of labeling an image).  What
happens when the magnet is incorrectly installed?

                                       Ian Clements  ian@sgi.com  415/962-3410


Re: Medical image compression (Lane, RISKS-11.23)

<davidsen@crdos1.crd.ge.com>
Mon, 11 Mar 91 09:03:19 EST
  We've had this discussion before. As a person who has had medical imaging and
who has worked developing medical imaging software for CAT and ultrasound, I
don't buy the idea that if the original image is not perfect it is okay to
degrade it further.

bill davidsen   (davidsen@crdos1.crd.GE.COM -or- uunet!crdgw1!crdos1!davidsen)


Apathy and viral spread

Rob Slade <p1@arkham.wimsey.bc.ca>
Sat, 09 Mar 91 21:05:01 PST
Recently, Stratford Software has started a new online information service
called SUZY.  (The service is active in Canada, and is in beta testing for
users in the United States.)  SUZY operates along lines similar to those of the
Prodigy service and the PLC BBS network in that "vendor" supplied software must
be used on both host and terminal; you cannot just dial up SUZY with your
favourite communications package.  This has allowed Stratford to market SUZY as
the ultimate in "user friendly" services; the user does not need to know
anything about protocols for connection, the "terminal" software deals with all
network connections and everything from installation to email is done with a
menu driven interface.  (It is now even "rodent compatible.")

(Lest I be seen as too enthusiastic here, I suspect everyone on this group
would find the lack of functionality somewhat restrictive.  Long time net users
will demand features it can't yet provide, but it certainly is the kind of
system that any "naive" user could access without difficulty.)

I manage the data security/anti-viral topic area (referred to as an
"Information Network", or "IN") called INtegrity.  Any SUZY user can
look at the information in the INs, but, as they "leave" the area, they
are asked if they want to "join".  This simply puts them on a mailing
list that can be used to send announcements to the "members" of an IN.
If they want to "join", they hit <ENTER>, if not, they hit <ESC>.

Using figures from a month ago, the number of SUZY users who have joined
INtegrity stood at 170.  Some others will have dropped in and looked around,
but deliberately left themselves off the list when they left the IN.  (We
"INkeepers" have no access to that information.)

The number of accounts on SUZY a month ago stood at about 6000.  However, research I
have done indicates that less than 15% actually use the system more than once a
month.  Interestingly, this figure has remained unchanged since SUZY was
released.  That means that less than 900 accounts were "active" at the time.

What does this mean to you, and to data security?  It means that less than 3%
of all, and 20% of *active* SUZY users care enough about data security to join
the anti-virus IN.  This is the *real* reason that computer viri are so
widespread today: people do not realize the danger.
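
(Checking that arithmetic against the figures above: 170/6000 is roughly 2.8%
of all accounts, and 170/900 is roughly 19% of the active ones.)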

Those of you who have studied viral characteristics, and virus protection and
functions, will realize how easy it is to protect yourselves against most viri.
But if the majority of users think they are safe, and do not take *any*
precautions, then viri have a fertile breeding ground to grow and spread in.
As my wife says, it shows not only how few people understand technology, but
how few even understand the concepts of public health.
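
The precautions need not be elaborate.  As a minimal sketch (not the method
of any particular product), even a crude checksum inventory of one's files,
recomputed and compared now and then, will flag the file modifications by
which most viri spread:

  /* Minimal sketch of integrity checking, not any product's method:
     record a checksum for each file now; any later mismatch means the
     file has changed.  A real checker would use a stronger function and
     keep its baseline list somewhere a virus cannot rewrite it. */
  #include <stdio.h>

  unsigned long checksum(const char *path)
  {
      FILE *fp = fopen(path, "rb");
      unsigned long sum = 0;
      int c;
      if (fp == NULL)
          return 0;
      while ((c = getc(fp)) != EOF)
          sum = sum * 31 + (unsigned char)c;   /* crude rolling hash */
      fclose(fp);
      return sum;
  }

  int main(int argc, char *argv[])
  {
      int i;
      for (i = 1; i < argc; i++)
          printf("%lu  %s\n", checksum(argv[i]), argv[i]);
      return 0;   /* save this output; compare it again next week */
  }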

I have been careful about identifying my affiliation, and describing the
situation for a reason.  When I first posted this on VIRUS-L, I got flamed by
someone who said my observation was invalid because a) SUZY is a
pay system, b) he knew of at least three BBSes where people were interested in
viri and c) my IN wasn't any good anyway.

SUZY is a commercial system, and this is the reason I chose it for my figures.
It is marketed to both home and business users, and therefore gives a better
"cross section" of the "whole" user community, not just the "home users and
hackers".  It is also promoted as "the system for the rest of us" as Apple
would say, and again provides access to novice as well as expert users.
(Weighted a bit heavily to the novice side, but then so is the general user
community, wouldn't you say?)

I know of a number of local BBSes that cater to interest in viral programs as
well.  I support three of them myself.  But I selected those boards on the
basis of their interest, and it would be very strange if the user population
there represented the general population.  By the sales figures, those who use
a modem at all almost automatically put themselves in the upper 10% of computer
users.

(Am I going to take John's advice about improving my IN?  I'd be delighted.
Unfortunately, it seems he doesn't use the system.  Odd ...)

I am coming to find, though, that it is often the "experts" who give those of
us who are working in this field the most trouble, viz. this recent exchange:
   Message #1678 - Anti-virus forum
   Date : 07-Mar-91 19:24
   From : Stephen Fryer
SF> I mostly have problems with the computers the instructors
SF> use;  instructors are at least as good at spreading viruses
SF> like Stoned since many of them seem to think their more
SF> exalted status (socially and educationally) makes them
SF> immune to such things.

My response?  Oh, yes.  I've seen this all too often.

Actually, I'm not so sure that it's conceit so much as a kind of frightened
fatalism.  They probably are aware that they don't know much about virus
protection, but in this business everybody has to be an expert on everything,
so they just ignore it and hope it will go away.  Strange reaction in my view,
but then again, how do they get the facts?  Courses are few and far between,
and most of the books are not very strong on how to protect yourself (besides
being "technically" out of date the instant they go to press.)  Forget the
media.  (InformationWeek printed only four articles on viri during 1990.
Computing Canada published a "Computer Security" issue in November of 1990, and
printed only two articles on viri, both so general as to be almost useless.  I
had submitted five articles to CC for that issue, and the one they picked was
on how to "define" a computer viral program.)

But again, I agree with Stephen's assessment; it's the "experts" who are often
the greatest problem.  (Last government office I worked in, the first
disinfection I had to do was on the system support operator's machine.  He had
infected himself while trying to do a disinfection for someone else!  Recently,
in teaching in a microcomputer lab at a local school board I found that two
computers were infected.  I informed the lab manager, with some difficulty, and
returned the next week to find that not only were they not disinfected, but a
third had joined them.)

I mean, with respect to information on computer viral programs you can't *give*
it away.  Quite literally.  Cheap courses I give through local school boards
get cancelled due to lack of registration.  Mid-priced courses I run through
the Federal Business Development Bank just squeak through.  It's the expensive
ones that the Center for Advanced Professional Development has me do that reach
the "break even" point for registrations two months before the course dates.
(So if you *have* to swap disks with someone, make sure he's wearing an
expensive suit. :-)

This is the first time since I started working with computers that the attitude
of the general public has really had me baffled.  People must surely realize by
now that viri are real, not just the "scare tactics" of the security industry.
The two biggest problems the world faces today are ignorance and apathy.  But
people don't know that, and they just don't care ...

Vancouver Institute for Research into User Security, Canada V7K 2G6
         Robert_Slade@mtsg.sfu.ca         (SUZY) INtegrity
