Forum on Risks to the Public in Computers and Related Systems
ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator
Volume 12: Issue 11
Tuesday 30 July 1991
Contents
DEFSTAN 00/55-56- Victoria Stavridou and Anders Ravn
Computer problems at BCCI- David Shepherd
Census data in the Land of Oz- Michael Panosh
Soft Eng Cntrl aids during Hydraulic Failure (Science News)- Jeffrey Sorensen
Risks of human error in Soviet nuclear "industry"- Ken Mayer
Cardphone Problems in Ireland- D.P.O'Donoghue
Book review: Practical Unix Security- Clifford Stoll
*WRONG* ftp-address in Brunnstein: Index of Known Malware- Eibo Thieme
Re: Smart cockpit with no backup- Henry Spencer
Re: The limits of simulation- Henry Spencer
Re: Licensing of Software Engineers- Henry Spencer
Data entry is NOT software engineering- Thomas P. Blinn
New Jersey "software engineering" registration legislation- Arthur Rubin
A. Padgett Peterson
Christopher R Riley
Joseph Beckenbach
Flawed assertion in RISKS-12.08- Mark Seecof
Re: Risks of Posting to RISKS- Jerry Hollombe
Info on RISKS (comp.risks)
DEFSTAN 00/55-56
<Victoria.Stavridou@prg.oxford.ac.uk>
Tue, 30 Jul 91 13:11:43 BST
A COMMENT ON THE REVISED UK MOD STANDARDS 00-55/56
Victoria Stavridou (RHBNC, Univ of London) and Anders Ravn (DTH, Technical Univ of Denmark)

Our overall impression is that both interim standards have been substantially
improved as a result of the public review exercise undertaken last year. This
is particularly true of 00-55, which has been transformed from a long, confused
and heavily prescriptive document to a much more sensible 2-part volume which
describes the desired goals (Part 1/1 -- Requirements), suggests techniques for
achieving these goals and identifies problems associated with certain practices
(Part 2/1 -- Guidance).

We have found that the formal methods content has been fully retained in this
second version and has in fact been improved, since its interactions with
non-formal design and development aspects have, in many cases, been identified.
For instance, the roles of the formal specification and the English commentary
have been properly assigned (1/1 29.1) and appropriate uses are suggested (the
requirement for verifying the English commentary has been removed!).

Furthermore, this second version of the standard is far less prescriptive.
Instead it identifies the goal and then leaves the designer/programmer free to
use whatever techniques he wants so long as an overall requirement is
satisfied. For instance, instead of banning dynamic memory allocation, there is
now a requirement that the storage bounds during normal operation must be
analysed and peak memory utilisation should not exceed 50% (2/1 30.50.2). The
effect of these changes is that ProCoS programs (which is of particular
interest to us) would now comply with the standard, which was not the case for
the previous version of 00-55.

We are, however, less happy with the very weak links between 00-55 and 00-56.
1/1 8.2 on safety integrity analysis requires that this is carried out in
accordance with 00-56; there is, however, no explicit way of relating the
state/event spaces of the hazard analysis stage with those of the specification
and/or program. This is a giant leap which assumes perfect physical components
such as unfailing actuators and sensors! Current work between RHBNC and DTH is
investigating this issue for the ProCoS SL_0 and fault tree analysis, using a
gas boiler as the running example.

As a minor point, 00-56 states (6.8.1 Table 9) that the probability of human
failure to act correctly in reasonable time after the onset of a high stress
condition is between 0.3 and 1.0. It seems to us that a probability of 1.0 is
pretty probable, as in this case one knows exactly what the human is going to
do! You just take the negation!
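[A minimal sketch, in Python and purely our own illustration rather than text
from the standard, of the kind of static storage-bound analysis 2/1 30.50.2
asks for: tabulate each component's worst-case allocation and check the total
against half of the memory actually fitted. The component names and figures
below are hypothetical.]

# Illustrative only: check that the analysed peak memory utilisation during
# normal operation stays within the 50% limit suggested by the 00-55 guidance.

FITTED_MEMORY = 256 * 1024            # bytes of RAM on a hypothetical target

worst_case_allocations = {            # hypothetical worst-case figures, bytes
    "control loop buffers": 48 * 1024,
    "sensor input queues":  32 * 1024,
    "logging ring buffer":  16 * 1024,
    "task stacks":          24 * 1024,
}

peak = sum(worst_case_allocations.values())
budget = FITTED_MEMORY // 2
print(f"peak {peak} of {budget} byte budget "
      f"({100.0 * peak / FITTED_MEMORY:.1f}% of fitted memory)")
assert peak <= budget, "peak memory utilisation exceeds the 50% limit"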
Computer problems at BCCI
David Shepherd <des@inmos.com>
Mon, 29 Jul 91 11:20:42 BST
Recent UK newspaper reports have commented that the receivers who are trying to
sort out the fraud at the now closed Bank of Credit and Commerce International
(BCCI) are being greatly hindered by the fact that all the data is stored on an
"archaic" computer system. They are also concerned that someone tipped BCCI off
about the investigation before the bank was closed and that the computer
records have been deliberately confused.
david shepherd: des@inmos.co.uk or des@inmos.com tel: 0454-616616 x 379
inmos ltd, 1000 aztec west, almondsbury, bristol, bs12 4sq
[A strong suspicion is that the records had been deliberately confused
all along. PGN]
Census data in the Land of Oz
Michael.Panosh, Unix.Education, Australia <MWP.MICHAEL@melpn1.prime.com>
Tue, 30 Jul 1991 12:49:36 +1000
Australia is coming up to another census exercise. Overheard on a breakfast radio news broadcast (paraphrase): "Confidentiality of data is assured as all the census papers are shredded." Reassured me no end!! Michael Panosh, Prime Computer, Australia -- mwp.michael@melpn1.prime.com
Science News Article: Soft Eng Cntrl aids during Hydraulic Failure
Jeffrey Sorensen <sorensen@spl.ecse.rpi.edu>
Mon, 29 Jul 91 12:20:05 EDT
[Jeffrey drew PGN's attention to a Science News article, 27 July 1991 (V 140, N 4, p 63), entitled "Software may ensure safer landings". It discusses recent problems with hydraulics (including the 1989 Iowa failure), and discusses a new software system:] The new software system has been tested on various flight simulators, including ones for the McDonnell Douglas F-15 and the Boeing 720. These simulators showed that with only manual control of the engines, crews could maneuver their planes but would have great difficulty landing. With software-controlled engines, however, pilots repeated simulated safe landings -- even in turbulence and crosswinds. [For the full article, contact Science Service, Inc., Editorial and Business Offices, 1719 N St. N.W., Washington DC 20036 (202-785-2255)]
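[The article does not give the control laws, but the flavour of
propulsion-only control can be sketched very roughly. The toy Python fragment
below is our own illustration, not the system tested on the F-15 and 720
simulators: pitch is commanded by moving all throttles together, roll/yaw by
splitting thrust between the left and right engines. Gains and trim values are
invented.]

# Toy illustration only: map normalised pitch/roll commands (-1..1) onto
# left/right throttle settings (0..1) using collective and differential thrust.

def throttle_commands(pitch_cmd, roll_cmd, trim=0.6,
                      pitch_gain=0.2, roll_gain=0.1):
    collective = trim + pitch_gain * pitch_cmd     # more thrust -> nose up
    differential = roll_gain * roll_cmd            # asymmetric thrust -> turn
    left = min(max(collective + differential, 0.0), 1.0)
    right = min(max(collective - differential, 0.0), 1.0)
    return left, right

# Example: a slight nose-up command while turning gently to the right.
print(throttle_commands(pitch_cmd=0.3, roll_cmd=0.4))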
Risks of human error in Soviet nuclear "industry" (Blinn, RISKS-12.08)
Ken Mayer <ken@visix.com>
Mon, 29 Jul 91 17:27:44 -0400
Whenever I see the words "human error" I always think "human interface design error." I bristle at the idea of blaming people instead of poor design because the operators are the ones who are least able to defend themselves. The most highly skilled, trained human being will still make mistakes every now and then. It is up to designers to build systems that tolerate mistakes, and provide an unambiguous path to a solution.
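[A small everyday illustration of designing for operator slips, our own sketch
and nothing to do with any particular plant: a destructive command that asks
for an unambiguous confirmation and keeps an undo path, rather than acting on
the first keystroke and blaming the operator afterwards.]

# Sketch of an error-tolerant interface for a destructive action: require an
# explicit confirmation and make the operation reversible.

import os
import shutil

def remove_with_undo(path, trash_dir=".trash"):
    answer = input(f"Really remove {path}?  Type the file name to confirm: ")
    if answer != os.path.basename(path):
        print("Not confirmed; nothing done.")      # ambiguous input -> no action
        return None
    os.makedirs(trash_dir, exist_ok=True)
    dest = os.path.join(trash_dir, os.path.basename(path))
    shutil.move(path, dest)                        # recoverable, not irreversible
    print(f"Moved to {dest}; restore it from there to undo.")
    return dest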
Cardphone Problems in Ireland
"Parsifal aka D.P.O'Donoghue" <8614903@ul.ie>
Mon, 29 Jul 1991 14:20 GMT
The recent thread about disturbances in phone software brought to mind the disruption earlier in the year to the Cardphones installed by Telecom Eireann (the Irish state phone company) around this campus. For some reason (never explained or publicised), for a period of approximately two weeks users could use the Cardphones on campus free of charge. Somehow someone discovered that by dialing the free number "17" (used to test ringback etc.), hanging up, and then lifting the receiver when the ringback occurred, a dialtone was available. This news spread like wildfire amongst the non-Irish students here, who took the opportunity to ring home. A certain amount of discreet observation meant that other people found out the method and rang for nothing. About two weeks later all the cardphones were out of order, with a notice stating that a "network upgrade" was taking place, and that was the end of that.
Desmond P.O'Donoghue   8614903@ul.ie
Book review: Practical Unix Security
Clifford Stoll <well!cliff@fernwood.UUCP>
Wed, 24 Jul 91 14:11:43 pdt
Practical Unix Security, by Simson Garfinkel & Gene Spafford
O'Reilly & Associates, Inc. ISBN 0-937175-72-2
Now, here's a book that's long overdue. If you're managing a Unix system, get
this book. You'll learn much more than just how to secure your system. Garf
and Spaf walk you through networks, file systems, and Unix internals, a tour
customized for finding security weakness.
Previous Unix security books were aimed at stand alone systems; this is the
first that discusses Unix security in a networked environment. It's a
practical book ("Never use
Set-User-Id-Shell Scripts") with an underlying current of "here's how Unix
works -- be careful of this wrinkle".
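[As one concrete illustration of the "never use set-user-id shell scripts"
advice, our own sketch rather than an example from the book, an administrator
might periodically sweep a filesystem for interpreter scripts with the set-uid
bit on:]

# Rough sketch: walk a directory tree and report set-user-id regular files
# that begin with "#!", i.e. interpreter scripts, which should never be set-uid.

import os
import stat
import sys

def find_setuid_scripts(root="/"):
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.lstat(path)
                if not stat.S_ISREG(st.st_mode):
                    continue
                if not (st.st_mode & stat.S_ISUID):
                    continue
                with open(path, "rb") as f:
                    if f.read(2) == b"#!":
                        print(path)
            except OSError:
                continue                  # unreadable or vanished; skip it

if __name__ == "__main__":
    find_setuid_scripts(sys.argv[1] if len(sys.argv) > 1 else "/")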
Plenty of good stuff here: A chapter on how to discover a break-in, another on
legal issues and privacy. A list of sensitive files in Unix. How to figure
out Unix log files. Security implications of X-windows and NFS. Kerberos and
Secure RPC. Most important, this is the first book to show you how to secure
your computer in a networked environment.
Today, most Unix computers have no system administrator. For others, a single
manager handles dozens of workstations. Old-time security -- strict
isolation -- just won't work. Instead, security depends on understanding
acceptable network interactions. For a harried system manager, this book will
pay for itself in aspirin.
A few omissions: secure ID cards, intrusion detection expert systems, and the
Andrew File System. A few chapters are wasted on obvious things: backing up
your data, changing your passwords, RS-232 pinouts, and the old shopworn
arguments about what a hacker is.
It's sad that this book needs 500 pages. Their security checklist is twelve
pages long -- nobody will ever go through the entire list. Is this the fault
of the authors? Or of Unix? Hard to say, but I sure wish there were a faster
path to a tight system.
-reviewed by Cliff Stoll 22 July 1991
*WRONG* ftp-address in Brunnstein: Index of Known Malware (RISKS-12.08)
Eibo Thieme <eibo@rosun1.informatik.uni-hamburg.de>
Mon, 29 Jul 91 16:51:00 +0200
The address of our ftp server contained a small error; here is the correct
version. But remember that our line is really slow!
VTC documents (Index of Known Malicious Software: IMSDOS.791; Index of Virus
Catalog: Index.791; all entries classified up to now) are now available from
FTP:
Our FTP server: ftp.informatik.uni-hamburg.de
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Login anonymous
ID as you wish (preferably your name)
dir: directory of available information
cd pub/virus: VTCs documents
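[For those who prefer to script the retrieval: a rough sketch using Python's
standard ftplib. The server name, anonymous login and pub/virus directory are
as given above; everything else, including the password string, is our own
assumption.]

# Rough sketch of fetching the announced VTC documents by anonymous FTP.

from ftplib import FTP

with FTP("ftp.informatik.uni-hamburg.de") as ftp:
    ftp.login("anonymous", "your.name@example.org")   # ID "as you wish"
    ftp.cwd("pub/virus")
    print(ftp.nlst())                                 # see what is available
    for fname in ("IMSDOS.791", "Index.791"):
        with open(fname, "wb") as out:
            ftp.retrbinary(f"RETR {fname}", out.write)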
Re: Smart cockpit with no backup
<henry@zoo.toronto.edu>
Tue, 30 Jul 91 12:54:16 EDT
>... In the event of a computer failure, the only good that mechanical backup
>instruments would do would be to let the pilot watch the altitude ticking off...
There is some justice in this; "if the computer fails, you're dead". But...
"The" computer? *Which* computer? "Fails"? Fails *how*? It is not at all
inconceivable to have a failure in which some control remains but sophisticated
instrument presentations are scrambled or absent. Indeed, one would hope that
the last-ditch emergency fallback mode of the software would be to provide
basic control and nothing more.
Henry Spencer at U of Toronto Zoology
Re: The limits of simulation
<henry@zoo.toronto.edu>
Sun, 28 Jul 91 19:58:59 EDT
> simulations of the [motor's] firing dynamics did not reveal subtle factors
Actually, this may not quite be true, it turns out. I'm told, by folks I
shouldn't identify, that some of the simulation work *did* at least hint at
problems. This was not followed up, possibly as a deliberate decision by
managers who didn't want to hear bad news.
Henry Spencer
Licensing of Software Engineers
<henry@zoo.toronto.edu>
Tue, 30 Jul 91 13:06:41 EDT
>It only requires licensing of those who offer themselves for hire as software
>ENGINEERS. Hire and engineer are both key words...
An excellent point. Many of the readers of this list may not be aware that
it is *already* illegal to offer yourself for hire as, say, a "mechanical
engineer" without being a licensed professional engineer. The NJ law is
closing a loophole in existing practice, not introducing something new and
radical. "Engineer" is the crucial word; that term is legally protected
and cannot be used frivolously... in any other field.
(One exception to this: if you are working for somebody else, *they* can call
you an "engineer" without requiring such qualification. [There may be some
slight restrictions on this, I'm not up on details.] The more activist
engineers have been known to agitate for removal of this rather large
exemption.)
Far too many people call themselves "software engineers", many of them having
nowhere near the background and demonstrated competence expected of a licensed
engineer in more traditional fields. While implementing the NJ law is going to
be, um, a learning experience for all concerned, it's a step in the right
direction.
Henry Spencer at U of Toronto Zoology
Data entry is NOT software engineering.
"Dr. Tom @MKO, CMG S/W Mktg" <blinn@dr.enet.dec.com>
Tue, 30 Jul 91 09:04:08 PDT
In RISKS 12.10, Bill Murray and Bob Frankston comment on the NJ legislation requiring the registration of "software engineers". On the whole, I agree with Bill Murray. In spite of the fact that I program, and have even done a certain amount of "software engineering", I doubt that I could qualify as a "registered professional software engineer", despite my formal education (at the doctoral level) in computer and computing science and applied statistics.

I find it amusing that Bob Frankston asks "What about a VCR programmer?" The stories of unusable human interfaces in commercial VCRs abound, but I'd hardly characterize the "data entry" aspect of most home VCR use as "programming". In fact, if "VCR programmers" were really software engineers, they probably would have learned at least a modicum of human factors considerations, and the products extant in the marketplace might be more approachable.

Bob assumes the legislature is attempting "to codify what is not understood." Actually, the registration and certification of professional engineers is well understood, and it is the very lack of such registration that evidences the non-professional status of our business.

Dr. Thomas P. Blinn, Digital Equipment Corporation, Digital Drive -- MKO2-2/F10
Merrimack, New Hampshire 03054   ...!decwrl!dr.enet.dec.com!blinn   (603) 884-4865
New Jersey "software engineering" registration legislation
<a_rubin@dsg4.dse.beckman.com>
Mon, 29 Jul 91 13:18:49 PDT
.... and risks of not looking at your posting software. The original file uses backspace characters to simulate underlining. Most of the underlined text was "design/designing" replacing "engineer/engineering". 2165888@mcimail.com 70707.453@compuserve.com arthur@pnet01.cts.com (personal) a_rubin@dsg4.dse.beckman.com (work)
Software Engineering Registration (NJ)
A. Padgett Peterson <padgett%tccslr.dnet@uvs1.orl.mmc.com>
Mon, 29 Jul 91 17:06:29 -0400
The concept of registration for "software engineering" is novel in that
the proposal seems both to go too far and not far enough (thanks Bill).
The concept that all programming be regulated, even all commercial programming,
is ludicrous. At the same time, certain categories of programming cry out for
regulation.
As a licensed Professional Engineer, my primary responsibility is to
ensure that certain engineering tasks are done in accordance with regulation
and in a safe manner. The state of Florida has decided that I (as a result of
experience and testing) am competent to determine this. One of the
not-so-evident responsibilities is to not accept work that I am not qualified
to perform.
In the past, I have had the opportunity to work on many projects that
did not require licensing including digital flight controls for several
aircraft and a communications topology for the FAA National Airspace Plan, two
areas that probably should have been covered by such licensing.
Other areas that come to mind are many medical software elements,
computer assets used for road and traffic control, and emergency
telecommunications networks. Certainly, IMHO in recent months we have seen
several examples of what happens when software is developed without evident
control.
That complex software is difficult to debug does not seem to be an
adequate defense for mistakes yet as more and more software replaces mechanical
processes, the potential for danger increases. Certainly the computer in my
wife's car is easily overridden since mechanical linkages from the wheel to the
steering and from the accelerator to the throttle plate still can override any
electrical command. A computerized highway control system is another matter.
Consider the implications if a traffic signal were to display green in all
directions simultaneously. (Yellow might be worse).
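[The all-green display is exactly the kind of state a certifying engineer
would expect to see excluded by an explicit interlock. A trivial sketch of
such a check, ours and purely illustrative, follows: before any command
reaches the lamps, verify that no two conflicting approaches are released at
once.]

# Toy conflict interlock for a four-way signal controller.  A proposed lamp
# state is rejected if any two conflicting approaches are not red together.

CONFLICTS = [("north", "east"), ("north", "west"),
             ("south", "east"), ("south", "west")]

def safe_to_display(state):
    """state maps approach name -> 'red' | 'yellow' | 'green'."""
    return all(state[a] == "red" or state[b] == "red" for a, b in CONFLICTS)

assert safe_to_display(
    {"north": "green", "south": "green", "east": "red", "west": "red"})
assert not safe_to_display(
    {"north": "green", "south": "green", "east": "green", "west": "green"})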
Consequently, as more sophisticated systems come into use, a formal
method needs to be established to determine that adequate safeguards are
provided. The problem is that often, only the designer or design team has the
expert knowledge of a particular system required to determine its safety.
This is the reason that registration of engineers came about in the
first place: since every critical design cannot be validated, we have to
validate the designer. It is not the perfect answer, merely the best choice
from what we have.
The major problem that comes about is in designing a certification
process that achieves its goals, not an easy task in any discipline but even
more so in software since it is still evolving. In electrical engineering, the
processes involved in providing adequate power for a building are well defined
and codes have been developed that set out these rules. Nothing similar exists
for software.
To make matters more difficult, while electrical quantities are reasonably
well defined (alternating current usually means either 60 or 400 hertz for
most purposes, and leads and lags are well defined), good computer software must
consider the platform, clock speed, memory speed, bus speed, race conditions,
failure conditions and a host of other variables, something many programmers
are insulated from.
Consequently, at some point, critical designs must be examined by
someone who understands not merely the software, but the compiler, the
operating system, the CPU, and the installation as well. I would not feel very
safe near a nuclear power facility using a control program designed in Visual
BASIC by someone who only understood Windows (trademarks acknowledged), though
the approach might be well suited to balancing my checkbook.
I can see a very valid need for a counterpart in software to the same
certifications a licensed engineer makes when signing off on an engineering
design: (in English)
1) I am competent to decide if this design is safe and meets applicable design
standards.
2) I have examined this design in sufficient detail to make this determination.
3) Based on study and in my professional opinion this design is safe &
meets all applicable standards.
4) By affixing my seal, I personally certify that this design and my study
of it meet the above criteria.
While the general public is often only aware of (3), all elements are
actually present, and failure of any element is grounds for censure,
suspension, or revocation of a professional license - in fact, most of the
board actions that
I see result from defects in (1) or (2).
It should also be mentioned that in many organizations, often only the
Chief Engineer needs to be licensed. I would suspect that a Software
Engineering license would be much the same.
In short, I can see a very real need for such a licensing requirement,
not globally but for those engaged in approval of critical or safety-related
projects. The major problem will come from the certification process itself
given the bewildering array of platforms, embedded micro-controllers, and
languages. It will not be trivial to implement but is something that needs to
be done.
A. Padgett Peterson, P.E.
Re: Licensing of Software Engineers
Christopher R Riley <chris@mtuxo.att.com>
Tue, 30 Jul 91 11:52:52 EDT
I think there is a misunderstanding of who is to be licensed. I have seen
copies of the text that says "software 1[engineer] ________1" and
another that says "software 1[engineer] designer1". The footnote at the
bottom of page 1 of my copy says:
EXPLANATION--Matter enclosed in bold-faced brackets [thus] in the
above bill is not enacted and is intended to be omitted in the law.
Matter underlined thus is new matter. Matter enclosed in
superscript numerals has been adopted as follows:
1 Assembly ACP committee amendments adopted June 13, 1991.
2 Assembly floor amendments adopted June 24, 1991.
Some people's text does not have the word designers, but just the
underlining (probably from backspace-underscore pairs in the text).
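[The mangled copies come from the old overstrike convention, in which an
underlined character is transmitted as underscore, backspace, character (or
the reverse). A rough sketch of recovering the plain text, our own
illustration of the convention rather than anything specific to the bill, is:]

# Rough sketch: undo nroff-style overstrike underlining, where underlining is
# simulated with "_<BS>c" or "c<BS>_" sequences, and note which characters
# were underlined.

def strip_overstrike(line):
    plain, underlined = [], []
    i = 0
    while i < len(line):
        if i + 2 < len(line) and line[i + 1] == "\b" and "_" in (line[i], line[i + 2]):
            plain.append(line[i + 2] if line[i] == "_" else line[i])
            underlined.append(True)
            i += 3
        else:
            plain.append(line[i])
            underlined.append(False)
            i += 1
    return "".join(plain), underlined

text, marks = strip_overstrike("software _\bd_\be_\bs_\bi_\bg_\bn_\be_\br")
print(text)    # -> "software designer"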
I think the bill is intended to license software designers. My question is,
does this mean that anyone who writes any piece of software code is a
software designer, and thus must be licensed?
Chris Riley chris@mtuxo.att.com
Re: New Jersey "software engineering" registration legislation (J.M.Ritter)
Joseph Beckenbach {Adapter Software Release Engr} <jerbil@ultra.com>
Mon, 29 Jul 91 18:48:09 PDT
The two most relevant points here:
1) How does this Act differ from any other Engineering Licensure within NJ?
Answer: I don't know, but someone with a reasonable lawyer in New
Jersey could probably find out some information for us. The language seems
loose (regulating all practicing "software engineering"), but then we need
to compare it to other language being used to regulate, e.g., civil engineers.
To my mind, it's a large step in the right direction. Not everyone
who's wrapped a bandage needs to be a licensed medical practitioner, just
those who intend to make a living of it as a main provider. At least, that's
a simplification of my understanding of licensure.
2) How does this affect the "common practitioner"?
Answer: Again, I don't know. However, looking at the parallels with
other professions, I would venture my layman's interpretation: only those
with proper qualifications may legally use the word "Engineer" to refer to
themselves professionally.
It does mean that all the job descriptions titled "Software
Engineer" will need to turn into "Member of Software Technical Staff".
I'll try to look over it a bit more before adding more humble opinion
to the network. Meanwhile, it needs comparison to other Engineering
Licensure statutes, such as for civil engineers and architects.
Joseph Beckenbach
Californian "software-engineer-wannabe"
Test programs not programmers, but license software ENGINEERS!
Joseph Beckenbach jerbil@ultra.com 408-922-0100 x246
Flawed assertion in RISKS-12.08
Mark Seecof <marks@capnet.latimes.com>
Fri, 26 Jul 91 11:49:15 -0700
I'm catching some (well-deserved) heat about my assertion that it would
"obviously" be impossible for a large majority of drivers to be "(much) better
than average." This was poorly expressed, especially if "average" means
"arithmetic mean" (of scores in some simple metric).
I should have written out my opinion that an "ability score" versus population
plot would reveal a typical "bell-shaped curve," that most drivers would have
scores near the middle, that few drivers would score very far above the median,
that by definition half of them would score at or below it, and that biases
(thank you, Mr. Tanner <mtanner@gmu.edu>) in drivers' perceptions of other
drivers on the road would factor into their self-assessment of superiority in
such a way as to render it over-optimistic.
[Messages were received from Tim Smith <ts@cup.portal.com>:
"1 1 8 8 8 8 9 9 9 9. Note that the average of these 10 numbers is 7. Note
that a large majority of them are above average. This is obviously possible."
... and Jeremy Grodberg <lia!jgro@fernwood.mpk.ca.us>:
"And when it looks like statistics are *not* being abused, remember that
97.325% of all statistics are made up." ]
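[Tim Smith's ten numbers make the mean/median distinction concrete; a short
check, illustrative only:]

# Most values can exceed the arithmetic mean of a skewed sample, but by
# definition at most half of them can exceed the median.

from statistics import mean, median

scores = [1, 1, 8, 8, 8, 8, 9, 9, 9, 9]
m, med = mean(scores), median(scores)             # mean 7, median 8
print(sum(s > m for s in scores), "of", len(scores), "are above the mean")      # 8
print(sum(s > med for s in scores), "of", len(scores), "are above the median")  # 4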
Re: Risks of Posting to RISKS (Dunlop RISKS-12.06)
The Polymath <hollombe@ttidcb.TTI.COM>
Thu, 18 Jul 91 15:42:44 -0700
There seems to have been an enormous misunderstanding here. That, or lawyers
have gotten into the works, somewhere. I hope I can clear things up to
everyone's satisfaction.

}In RISKS 12.02, Jerry Hollombe describes our publication of his 1989 RISKS
}posting about the "censorship" of rec.humor.funny at Stanford University. Mr.
}Hollombe's piece was reprinted (with his permission) in Charles Dunlop and Rob
}Kling (eds), _Computerization and Controversy: Value Conflicts and Social
}Choices_ (Boston, Academic Press, 1991, ISBN: 0-12-224356-0). (See pp.376-379).
[Further details omitted]

Absolutely true. I gave my permission. I had no complaints then and I have none
now. In fact, I'm quite proud of having been included in the book and have been
calling my friends' attention to it. Some have expressed interest in obtaining
copies for themselves. (I'm reading through it as time permits. So far I've
found it interesting and thought provoking).

}... However, we did not effectively anticipate this new controversy
}about computerization: one's ability to fairly reprint RISKS (or any BBS)
}postings after posters have given explicit permission!

I certainly never intended to raise such an issue. Even if I did have a change
of heart (which I didn't) I like to think I'm honorable enough to live with the
consequences of my actions. My word, once given, stands. I would not retract it
even if I wanted to and could. You had my permission to publish. That stands.

} Unfortunately, Mr. Hollombe attributes his problem with the reprinting of
}his RISKS posting solely to publishers and editors, and he conveniently ignores
}his control over the publication. In RISKS 12.02 he writes:
}
} >The risk? The words we exchange here aren't as ephemeral as they may
} >appear on a VDT screen, so be careful what you say and how you say it.
} >You never know who might decide to package and ship it to a customer.
} >(-:
}
} This complaint strikes us as unfair. ...

This was not intended as a complaint and I'm amazed, and horrified, to see it
interpreted as one. As I said in my posting, and have repeated here, I gave my
permission for my words to be published. I've had no regrets about that, then
or since. Very much the contrary, in fact.

What I was attempting to convey was the idea that many people read, keep copies
of and even disseminate what's posted here. It therefore behooves us to write
thoughtfully, at least. On a more mundane level, I might have been more
reluctant to give permission to use my posting if it had been full of spelling
and grammatical errors. I was NOT complaining about the fact of its
publication.

}... We believe that ...
}we were VERY FAIR to Mr. Hollombe.

I believe you were more than fair. I'd say you were outright generous. I'm not
complaining. Tell your lawyers to relax, if that's the problem. I'm not going
to sue you. I like being published. Really.

I don't know what more I can say at this point. Please feel free to contact me
directly if you have any further questions about the matter. I'll be happy to
reconfirm my permission to publish, in writing if that will help.

(Alas, it seems the old risk of being misunderstood in written media is alive
and well. That's no one's fault. It's just the nature of the situation).

Jerry Hollombe, Citicorp, 3100 Ocean Park Blvd. Santa Monica, CA 90405
{rutgers|pyramid|philabs|psivax}!ttidca!hollombe (213) 450-9111, x2483
