The Risks Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 5: Issue 34

Monday, 7 September 1987

Contents

o Dutch Police Hampered By Faulty Computer System
Patrick van Kleef
o Computer Psychosis
Bill McGarry
o Risks and people
Alan Wexelblat
o The influence of RISKS on car design?
Danny Cohen
o Reach out, touch someone
Scott E. Preece
o Info on RISKS (comp.risks)
---------------------------------------------

Dutch Police Hampered By Faulty Computer System

Patrick van Kleef <mcvax!cs.vu.nl!kleef@seismo.CSS.GOV>
5 Sep 87 16:49:11 GMT
Criminals are set free and innocent people get arrested due to serious problems
with the central computer of the Dutch police.  The newly installed computer
at the CRI (Central Investigation Information Dept.) in The Hague, Holland,
is infested with bugs.  Police departments all over Holland have decided not
to use the CRI computer anymore until the problems are solved.

Only one day after the computer was installed, problems started occurring,
with the system giving inaccurate or plainly incorrect information about people
the police wanted to check.  Innocent people appeared to be on a 'wanted list',
while criminals were 'cleared'.  This resulted in unjust arrests, and in people
being sent away when they should have been arrested.  The chaos was enormous.

The CRI clearly made the blunder of completely disposing of the
old system and switching to the new system all at once.  No back-up system
was maintained.  And the supplier of the software ("It's not our fault,
we've delivered what was ordered") washed its hands of the affair, blaming
the police for incorrect usage of the system.

Will they never learn?

Paul Molenaar, freelance journalist

---------------------------------------------

Computer Psychosis

Bill McGarry <decvax!bunker!wtm@ucbvax.Berkeley.EDU>
Fri, 4 Sep 87 22:38:41 EDT
United Press International [4 Sep 87]

COPENHAGEN, Denmark -- A young man became so mesmerized by his computer 
that he was hospitalized with a "computer syndrome" that made him unable to 
distinguish between the real world and computer programs, a Danish medical 
journal said.

The journal said the unidentified 18-year-old contracted the new form of
psychosis, called computer syndrome by three doctors at Copenhagen's Nordvang 
Hospital, after spending 12 to 16 hours a day in front of his computer.

The doctors said the young man began to think in programming language,
waking up in the middle of the night thinking, "Line 10, go to the bathroom;
Line 11 next."

The patient told the doctors he "discovered that man is only a machine. 
There is no difference between the computer and man."

In WEEKLY FOR PHYSICIANS, psychologist Bent Brok and psychiatrists Eva 
Jensen and Erik Simonsen said, "He merged with the computer and afforded it 
supernatural qualities."

In the end, he suffered from insomnia and anxiety and had to be 
hospitalized.  The article did not indicate his present condition.

The young man's preoccupation with computers is not unique, but his 
psychotic condition is unusual, Tuesday's report said, warning against 
"many young people's excessive preoccupation with computers."

The three doctors said that the computer is used by youths as a substitute 
for human contact because it always responds in a rational manner but that 
the stress on logic can lead to immaturity and emotional limitations.

The computer trade itself also seems to be aware of the problem.

"A large group of young people -- about 95 percent of them boys -- are 
computer freaks who live for nothing but the machine," said Lars Knudsen, 
32, manager of a Copenhagen computer firm, Professional Datainformation.

"The typical computer freak is between 14 and 16," said Knudsen, a former freak
himself.  "He gets up at 2 in the afternoon and sits in front of the screen 
until 4 in the morning.  He drinks 3 liters of Coke and has no girlfriend."

---------------------------------------------

Risks and people

Alan Wexelblat <wex@MCC.COM>
Sun, 6 Sep 87 13:42:20 CDT
Two short notes:

   The "were the flaps extended" debate has made me remember just how
   unreliable human eyewitnesses are.  Many times we depend on human
   observation to back up sensor data.  Yet we forget about the people
   who "saw" the plane on fire before it hit the ground, and the pilot
   of the plane behind who "saw" that the flaps were extended.

   The comments by Rich Neitzel about certification have also reminded
   me of the risks of natural language.  At one point in his "NO to
   certification" message, Neitzel uses "proscribed training" where he
   means "prescribed training."  Of course, this totally reverses the
   meaning.  It is not hard to imagine such an error being written
   into the specification for a system, leading to behavior totally
   opposite that which was desired.

--Alan Wexelblat
ARPA: WEX@MCC.COM
UUCP: {seismo, harvard, gatech, pyramid, &c.}!sally!im4u!milano!wex

---------------------------------------------

The influence of RISKS on car design?

<COHEN@C.ISI.EDU>
4 Sep 1987 17:43:51 PDT
Roy Smith (System Administrator, Public Health Research Institute, NYC) remarks
(in RISKS DIGEST 5.32) about Honda's 4WS (4-wheel steering) having "no 
computers, wiring or electronic black boxes" and being "mechanical and sure".

He then asks "have the Honda engineers been reading RISKS?" and "Has car
design turned a corner because of us?".

These are most interesting questions, especially since Honda has had 4WS
in Japan since 1976.
                            Danny Cohen

   [OK.  So an <illogical> conclusion would be that the Japanese invented 
   RISKS prior to 1976.  A more sensible conclusion might be that it pays
   to stick with a good design?  But I think there is a tendency to have
   to keep up with the Joneses in modernizing auto designs and other systems
   by using computers just to one-up the competition.  I'm delighted to hear 
   that Honda might have resisted that in the 4WS.  PGN]

---------------------------------------------

Reach out, touch someone

Scott E. Preece <preece%mycroft@gswd-vms.Gould.COM>
Sun, 6 Sep 87 19:56:30 CDT
  From: sclafani+@andrew.cmu.edu (Michael Sclafani)
> He says the satellite link won't work until he perfects techniques for
> making the human body act as an antenna.  But he predicts its use by
> parents, pet owners, overseas workers in potential hostage situations,
> Alzheimer's patients and police tracking criminals or parolees.
> 
> Does Dr. Man see any hint of Big Brother in all of this?
> 
> "Yes, but I don't want to go into it.  I'm more into the technical
> aspects."
> 
>    [What will it take before inventors of technology consider
>    implications of their work as part of their responsibilities?  MS]

I admit I wasn't there and didn't hear the interview happen, but this sounds
to me like a cheap out-of-context quote.  I don't think there's much
question about the desirability of the uses he suggests; it's not
unreasonable to promote them despite the potential misuses the moderator and
the interviewer fear.  There are precious few things in the world that don't
have both good and bad uses.  It's important to be aware of the potential
abuses of the things we create, but it's also important to not be so afraid
of them that we sit around creating nothing.  Once it is possible to do a
thing, and it seems likely that what Dr. Man suggests is not far from
feasibility, it will be done.  It would be nice for the creator to be one of
the first to point out the dangers and to suggest the need for their
regulation, but it is unfair to expect someone whose reputation and
financial future are tied up in the development of a product to spend as
much time plugging its dangers as its virtues.  That's what the rest of us
are supposed to be doing.  One could reasonably say that Dr. Man had done
his duty by making his product visible far enough in advance that we can
work on controlling it before it is reality.

scott preece, gould/csd - urbana         uucp: ihnp4!uiucdcs!ccvaxa!preece