The RISKS Digest
Volume 3 Issue 22

Saturday, 19th July 1986

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Nostalgia
Mike Williams
Flames about BASIC
Jim Anderson
More on risks of teaching "just" programming
Herb Lin
Responsibility for Computer Actions
George S. Cole
CDP and Certification
Andy Glew
The undetected hang-up risk (more)
Ted Lee
Info on RISKS (comp.risks)

Nostalgia

"John Michael (Mike) Williams" <JWilliams@DOCKMASTER.ARPA>
Thu, 17 Jul 86 17:34 EDT
Willis' comment in connection with computer literacy in the old days:

    >Remember when you read this:  I'm talking of the period when
    >it was all mainframes and centralized computing shops, and the programming
    >fraternity argued persuasively for and held sway in the closed-shop!

triggered all sorts of memories I'm sure Willis shares.

Surely we both remember the Bendix G-15?  the Monrobot?  the CDC 160A (that
motherless PPU)?  Yes, there were big IBM 704 and UNIVAC II shops, but there
were also IBM 604 Punched Card Electronic Calculators, and UNIVAC 40/60/120s; 
the latter I remember being used for critical airframe and weapons system
calculations by then-Douglas Aircraft in 1956, when I joined the industry.

I don't remember, if I ever knew, what computers were used to support the
Comet and Electra I designs, but perhaps there may be a connection between
their sorry record and RISKS.  In any case, the problem of distributed small
computing environments has always been with us, if on a smaller scale.

Mike Williams, System Development Corp.  McLean VA


Flames about BASIC

<JPAnderson@DOCKMASTER.ARPA>
Thu, 17 Jul 86 22:45 EDT
Those of your readership who bristle when one programming language or
another is put down for any reason might like to read what has to rank as
the ultimate rebuttal.  I refer of course to Howard E. Tompkins' paper "In
Defense of Teaching Structured COBOL as Computer Science (or Notes on Being
Sage Struck)."  It appeared in SIGPLAN Notices, Vol. 18, No. 4, April 1983.  A real
hoot!
                                        Jim


More on risks of teaching "just" programming

<LIN@XX.LCS.MIT.EDU>
Sat, 19 Jul 1986 02:23 EDT
My own feeling is that for "computer literacy" in the general populace
(rather than, say, for engineers or economists who will have to write
programs), programming is mostly irrelevant.  The most important notions for
everyone to have (that is, after all, the meaning of "literacy") are those
related to procedures: what procedures are, what input is, what output is,
how input can be related to output, and so on.  Being able to ask the
question "But how can the computer know to do X?" in a meaningful way, and
to puzzle out the answer to that question, is in my view far more
important than knowing the syntax of PASCAL or BASIC.

The problem is ultimately related to clear thinking, and how to teach
people to do THAT.

    [We have included various somewhat redundant responses on this topic
     in recent RISKS, because the points being made are IMPORTANT but
     OFTEN IGNORED.  There is no substitute for style, elegance, care,
     and — above all — understanding what you are doing.   PGN]


Responsibility for Computer Actions

<cole.pa@Xerox.COM>
31 Dec 00 16:30 PST
    Responsibility for a computer foul-up can realistically be laid at
anyone's feet, from the individual operator's (for inserting the wrong hard
disk or plugging in the wrong power supply) to the hardware
designer's (for allowing ungrounded power plugs) to the system
programmer's to the compiler design team's (a group shot) to the
application designers' (another group shot?) ... all depending on
what the "source" of the failure is.
    Presuming for the nonce that the fault lies in the application
software (not in its implementation, via the transition from high-level to
machine code or transition from electronic state to electronic state), there
still remains a problem of determining who is responsible. Who provided the
algorithm? The implementation? The specification? Did anybody perform a
mathematical theorem validation? Could such realistically be done for the
entire program? (Hah.) Often enough, hindsight allows a (relatively) easy
post-mortem showing that "this step" could have been validated (and the
error thus caught). But the program is a SYSTEM, and the safeguards are at
this point far from perfect.
    Ought they be perfect? Think how much that would cost.
    Rather than tacking terms like "responsibility" to the entire
spectrum of computer programs, it would make more sense (legally and
ethically) to designate the principles and requirements under which liability
attaches for an injury, and let the moralists be concerned with the
responsibility. (Responsibility can NEVER be attached, no matter how hard it
is thrown; it is only accepted. But I would far rather have people
programming with or for me who voluntarily accept responsibility, since they
then provide the best protection.)

    Professional licensing, which requires the establishment of minimal
standards, allows actions based on malpractice to be brought. As long as
this licensing is voluntary and not mandatory the market can help
establish responsibility — for then the product seller who hires an
unlicensed programmer to produce the core program will have to consider
whether they might be charged with negligence.

    Standard applications, however, should only be subject to strict
"products" liability where there is a standard operating environment. If a
program specifies that it is designed to operate on an Apple II-E with an
Epson MX-80 or FX-80 printer (or some set of CPU chips, terminals, and
printers with a set of standard operating systems), any user who goes to a
different environment (even if somebody else promised it would be identical,
or just compatible) has no one but himself to curse. The difference between
a hammer and a consumer computer application is (realistically) negligible
in terms of consumer law — if you use a hammer as a wedge or a support for
some scaffolding, you can hardly cry foul when it fails at a task for which
it is not designed.
    (Of course, the above is complicated by some rulings that
"foreseeable misuses" allow liability. The consumer applications computer
company will want to restrict the range by specifying where it guarantees
its product , and will want to extend the probable hardiness to a penumbra
of likely modifications beyond that to prevent mishaps.)

                    George S. Cole, Esq.


CDP and Certification

Andy "Krazy" Glew <aglew%ccvaxa@gswd-vms.ARPA>
Thu, 17 Jul 86 09:11:19 cdt
Eugene Miya asks whether the CDP is a level of professional certification.  I
do not have a CDP, but I passed the Certified Computer Programmer (CCP) exam in
Systems Programming, which is also given by the Institute for the Certification
of Computer Professionals (ICCP).

Does passing the exam itself indicate any level of competence? No - I would
expect first-year engineering students to be able to pass it with no
difficulty. However, the fact that someone is serious enough about
`professionalism' to go out and get certified probably indicates something
about his character, if not his abilities. Obviously, the certification process
must become more stringent - the new requirement for periodic recertification
is a step in the right direction.

A secondary effect of `professional' certification is that you are expected to
subscribe to a code of ethics. Many people deride these, but I know that I, at
least, have them in the back of my mind when I consider systems whose failure
can harm people. `Empty symbology' has a powerful psychological effect: wearing
an Iron Ring reminds me of an oath I took with much rattling of chains that
I would never "pass bad workmanship". The ancient Greeks used to pour libations
to gods they knew weren't there.

Why take something like the CCP? For frankly mercenary reasons - I took it to
increase my chances of getting a job. But also because I am familiar with the
history of engineering as a profession in Canada and Great Britain (engineering
isn't a profession in the United States yet, is it?) and thought that the ICCP
might be the beginning of something similar for software engineering / computer
science / programming. 

What would distinguish such a profession from the present situation? Purely and
simply, liability. A professional is liable for his actions, not just to the
best of his ability, but to the limits of knowledge in his field.

Liability is a great incentive for taking proper care of your work. To the
extent that care, the highest reasonable level of care that we can expect
humans to provide, can reduce the chance of failure in software systems,
professionalism is a good thing.

Andy "Krazy" Glew. Gould CSD-Urbana.    USEnet:  ihnp4!uiucdcs!ccvaxa!aglew
1101 E. University, Urbana, IL 61801    ARPAnet: aglew@gswd-vms


The undetected hang-up risk (more)

<TMPLee@DOCKMASTER.ARPA>
Fri, 18 Jul 86 03:08 EDT
When our local GTE Telenet office finally installed its 2400 baud service I
discovered the same problem referred to in the penultimate Risks:  if the
line dropped, there was a very good chance the local Telenet machine did not
detect it and one could later dial back in. Several times I dialed in and
found myself in the middle of someone else's connection; I also, of course,
after several hours (almost a day one time, I seem to remember) was able to
dial back in and find myself connected to my original host system.  It took
several weeks of trouble reports, as well as calls from "high government
officials" (the computer I was using was this one:  the folks at the
National Computer Security Center were not, as one would hope and expect,
pleased) before Telenet acknowledged there was a problem and did something
about it.  I seem to remember that it was simply an ill modem, but the
experience was enlightening.
                                               Ted
