The RISKS Digest
Volume 5 Issue 28

Wednesday, 12th August 1987

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Certification of software engineers
Nancy Leveson
Re: Secrecy About Risks of Secrecy
Maj. Doug Hardie
Russell Williams
Jeff Putnam
Eliminating the Need for Passwords
Lee Hasiuk
Re: Risks of automating production
Richard A. Cowan
James H. Coombs
'Mustn't tire the computer!'
Scott E. Preece
Rick Kuhn
Re: NASA wet computers
Eugene Miya
Halon
Dave Platt
Steve Conklin
Jack Ostroff
LT Scott Norton
Scott Preece
Railway automation
Stephen Colwill
Employment opportunities at MITRE
Marshall D. Abrams
The RISKS Forum is moderated. Contributions should be relevant, sound, in good
taste, objective, coherent, concise, nonrepetitious. Diversity is welcome.
Contributions to RISKS@CSL.SRI.COM, Requests to RISKS-Request@CSL.SRI.COM.
FTP back issues Vol i Issue j from F4.CSL.SRI.COM:<RISKS>RISKS-i.j.
Volume summaries for each i in max j: (i,j) = (1,46),(2,57),(3,92),(4,97).
Info on RISKS (comp.risks)

Certification of software engineers

Nancy Leveson <nancy%cb6.uci.edu@ROME.UCI.EDU>
Wed, 12 Aug 87 21:16:16 -0700
I would like to raise the issue in this forum of certification of software 
engineers. Certainly, someone who has built a model aircraft would not be 
considered (or consider themselves) competent to design a commercial jet.  
Yet those who have written a few BASIC programs as a hobby often seem to 
have no such qualms about their education and abilities.  High school students 
tell me about their jobs building important software systems.  In contrast,
engineers must usually satisfy minimal educational requirements, and 
engineering projects for critical systems often have a requirement for a 
professional engineer to be involved.  Although the ACARD report suggestions 
were regarded as extreme by many in the computer science field, they are not 
necessarily much more than is routinely expected in other related professions.

For example,  System Safety Engineering is a field that is about the same
age as computer science or perhaps even somewhat younger.  There are, however,
already certification programs.  And as an example of what can be done
for critical systems,  there is a MIL-STD-1574A (USAF): System Safety Program 
for Space and Missile Systems that mandates that Air Force space and
missile projects have a "Qualified System Safety Engineer" to manage the
system safety program and be "responsible for the effective implementation 
of the following tasks ... [too many to cite, but essentially the tasks 
necessary to implement a system safety program]" and who is "directly 
accountable to the program manager [i.e., the US Air Force program manager] 
for the conduct and effectiveness of all contracted safety effort for the 
entire program."

A Qualified System Safety Engineer is defined as someone who must meet the 
following requirements:
    "A technically competent individual who is educated at least to the
     bachelor of science level in engineering or related applied science
     and is registered as a professional engineer in one of the states
     or territories of the United States or has equivalent experience
     approved by the Purchasing Office.
        This individual shall have been assigned as a system safety
     engineer on a full-time basis for a minimum of four years in at
     least three of the six functional areas listed below:
             System Safety Management
             System Safety Analysis
             System Safety Design
             System Safety Research
             System Safety Operations
             Accident Investigation"

I don't know of any similar requirements in government standards for managers
of software engineering programs on critical projects nor any similar 
assignment and acceptance of direct responsibility and accountability to 
the customer for the *effectiveness* of the software engineering effort.
Does anybody else?  Am I wrong in my observation that under-qualified people 
are sometimes doing critical jobs and making important decisions?  Do you 
agree that some type of certification is necessary?  If so,  there are 
still many open issues, e.g., should there be minimum educational 
requirements and what kind, should there be minimum experience requirements, 
should there be examinations, should certification be in particular application 
areas (dp, scientific, safety-critical real-time, etc.) or perhaps in 
particular subareas (systems analysis, design and coding, QA, etc.), should 
there be continuing education requirements, should there be minimum 
requirements for those teaching software engineering and certification of
educational programs, ...

Nancy Leveson


Re: Secrecy About Risks of Secrecy (RISKS-5.26)

"Maj. Doug Hardie" <Hardie@DOCKMASTER.ARPA>
Wed, 12 Aug 87 12:46 EDT
> From: Jerome H. Saltzer <Saltzer@ATHENA.MIT.EDU>
> So the decision must be accompanied by a plan to shore things up, either
> by fixing the flaw, getting the information moved to a safer place, or--
> a common solution--putting other barriers around the flawed system...

This is practical when you are dealing with a single system.  However, the
number of DEC VAXs (as an example) that handle sensitive information is too
large to enumerate.  There currently exists no way to even tell all the
administrators using any particular system that they have a problem.  While
additional barriers are probably not prohibitive for one system, when
multiplied by the number of systems in use throughout the government, I
expect that cost would be unbearable.
                                                   — Doug

     [... and the cost of each administrator and user on each system
     keeping his head in sand may be even greater...  PGN]


Secrecy of Secrecy: A vote for disclosure

Russell Williams <ucbcad!ames.UUCP!lll-tis!elxsi!beatnix!rw@ucbvax.Berkeley.EDU>
Tue, 11 Aug 87 08:18:31 pdt

I just want to vote, very strongly, for disclosure of operating system
security flaws.  Non-publication of these flaws assumes that there is some
other system for distributing such information only to those of us in
industry responsible for design and implementation of reasonably secure
operating systems.  If there is such a system, please let me in on it (after
a suitable security check, of course); if not, I'm confident our customers
would prefer the situation where 10 potential penetrators *and* I know
the information, to the situation where only 5 potential penetrators know it.

Russell Williams, ELXSI Super Computers, San Jose
...{ucbvax!sun,lll-lcc!lll-tis,altos86}!elxsi!rw


security discussions

Jeff Putnam <putnam%thuban.tcpip@ge-crd.arpa>
12 Aug 87 11:42 EST
The question about open discussion of computer security flaws brings
up an interesting point that I have not seen mentioned: the people
who understand such flaws (often the people who try to find them) are
not, as often as we might like, the people who administer the systems.
In many cases I have seen, the system managers are not 
prepared, either by interest, temperament, or background, to understand
the security problems and react properly to them.  (Not that this is
the case everywhere, but it is certainly the case in many situations.)

Such system managers are the types who tend to react to security holes
by trying to suppress any information about them - not by trying to
fix them.  They are also quite likely to respond inappropriately by 
imposing severe security measures in the wrong places - usually impeding
the real purpose of the machines, and rarely actually fixing the problems
(indeed, sometimes this "added security" makes the problem worse).  

The question of publicizing security holes is thus further complicated.  If
the problem is publicized, will it spur some people to take (often
inappropriate) measures that will inconvenience (or worse) the users of the
machine in a misguided attempt to patch the problem?
                                                        jeff putnam

    [We have also had various cases of people who tried vainly to convince
    their administrators that there were problems, and then got fired when
    they demonstrated the problems...   PGN]


Eliminating the Need for Passwords

Lee Hasiuk <ucbcad!ames.UUCP!rochester!srs!lee@ucbvax.Berkeley.EDU>
Wed, 12 Aug 87 10:07:33 EDT
Without trying to continue the discussion on passwords, access codes, credit
card numbers, etc, I would like to relate a very interesting article about
eliminating the need for people to ever put their actual code in the open.
The article was about 'Zero Knowledge Proofs' and appeared in the Science 
Times section of the New York Times earlier this year.  

It appears that an Israeli scientist has developed an authorization system 
based on such proofs that allows a person to verify that they are the 
owner of an account, without having to give away the 'account number' or any
other information which allows them to be impersonated.  The basic idea is to
give the account owner a complex graph which is successfully colored so that
no two connected nodes use the same color.  It is easy to generate such a
graph, but given an arbitrary complex graph, it is not, in general, feasible
to find a successful coloring.

A device checking authorization gets to see the graph, but the color of the
nodes is hidden.  The device picks two connected nodes at a time and is
shown that they are, indeed, colored differently.  After each pair has been
shown, the colors used in the graph are interchanged so that after many tries
the device can be quite sure that the graph is colored, but have no clue
as to a coloring scheme that works (which when combined with the graph,
represents the actual account number).
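
To make the protocol concrete, here is a minimal sketch of one
commit-challenge-reveal round in Python.  The hash-based commitment and
all of the names below are illustrative assumptions on my part, not
details from the article.

    import hashlib, os, random

    def commit(value):
        # Hash-based commitment: returns (commitment, opening nonce).
        nonce = os.urandom(16)
        return hashlib.sha256(nonce + value.encode()).digest(), nonce

    def zk_coloring_round(edges, coloring):
        # The prover freshly permutes the three colors, then commits to
        # the (hidden) color of every node.
        perm = dict(zip("RGB", random.sample("RGB", 3)))
        hidden = {node: perm[c] for node, c in coloring.items()}
        sealed = {node: commit(c) for node, c in hidden.items()}
        # The verifier challenges one edge; the prover opens only its
        # two endpoints.
        u, v = random.choice(edges)
        for node in (u, v):
            com, nonce = sealed[node]
            assert hashlib.sha256(nonce + hidden[node].encode()).digest() == com
        return hidden[u] != hidden[v]   # connected nodes must differ

    edges = [("a", "b"), ("b", "c"), ("a", "c")]
    coloring = {"a": "R", "b": "G", "c": "B"}
    print(all(zk_coloring_round(edges, coloring) for _ in range(100)))

Because a fresh permutation is chosen each round, the verifier learns
only that the two opened nodes differ; over many rounds a cheater is
caught with probability approaching one, yet the coloring itself is
never revealed.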

The article stated that such a scheme could be applied to passwords, and
hypothesized a computer that, rather than ask for a login password, asked
a series of questions that would change each session.  The user would know
a scheme, based on the questions and his 'password', to produce answers that
would let the computer know with very high certainty that they were the
authorized user.
                                   Lee Hasiuk

     [Also note crypto authenticators such as the Sytek Challenge mechanism.
     But also note that once access has been gained, some underlying operating
     systems may force you to assume that everyone is equally trusted (or
     untrusted)...  PGN]


Re: Risks of automating production

Richard A. Cowan <COWAN@XX.LCS.MIT.EDU>
Mon 10 Aug 87 03:47:58-EDT
    From: "James H. Coombs" <JAZBO%BROWNVM.BITNET@wiscvm.wisc.edu>
    The folklore has always been that computers, especially in the form of    
    robots, would one day render blue collar workers obsolete...

It is true that automation has not rendered workers obsolete, or resulted in
huge cost savings.  As you note, the main impact of automation so far is
that it results in greater central *control* over the production process.
In fact, this was one of the principal motives for automation from the
start: management wanted greater control over labor.  (By "management" here
I mean high-level managers who are removed from day-to-day production.)

Even though the worst fears have not been realized yet, that is no reason to
be optimistic.  Although automation thus far has been rather awkwardly
implemented (requiring nearly as many workers as before), it has set the
stage (by routinizing work and thus deskilling workers) for a new phase in
automation that actually gets rid of the work force.  As David Noble said in
his (highly recommended) book Forces of Production, "Men behaving like
machines paved the way for machines without men."

    ...  The speculation now is that managers are more likely to be displaced
    than workers, since it is the managers who tend to be responsible for
    inventory management and accounting.

The "managers" to be replaced here are probably quite low-level supervisors.
But the idea that automation hurts management *in general* (while the
workers make out fine) would be a bit silly.  In the end, I believe that
automation will cause incredible disruption and suffering unless there is
also a dramatic shortening of the work week.

Check out the Noble book — Oxford University Press, 1984!
                                                                -rich


Re: Risks of automating production

"James H. Coombs" <JAZBO%BROWNVM.BITNET@wiscvm.wisc.edu>
Mon, 10 Aug 87 13:09:17 EDT
   From: Richard A. Cowan <COWAN@XX.LCS.MIT.EDU>
   Although automation thus far has been rather awkwardly implemented 
   (requiring nearly as many workers as before), it has set the stage 
   (by routinizing work and thus deskilling workers) for a new phase in
   automation that actually gets rid of the work force.

I have been confused from the beginning about what it is that the workers do
now.  I suppose that it could be little more than filling parts buckets for
machines or something like that, in which case their work would be even more
routine than before.  I had this hope, however, that they were servicing the
machines, in which case their work might be significantly more skilled and
interesting.  Can anyone give more details on this?

Also, I wonder what happens to the managers at various levels.  Does their
work become more interesting and rewarding, or less?  I'm sure that for
some, the "planners," work becomes less stressful.  What about the others?
--Jim 


'Mustn't tire the computer!'

Scott E. Preece <preece%mycroft@gswd-vms.Gould.COM>
Wed, 12 Aug 87 09:34:52 CDT
  "A. N. Walker" <anw%maths.nottingham.ac.uk@Cs.Ucl.AC.UK>
> "It is true that we tell counter staff at branches to make enquiries if
> cheques are cashed on three consecutive days.  But to instruct the
> computer to do this on ATMs would be far too time-consuming," he said.

Well, while it's true that it shows great lack of insight not to do anything
about a known potential problem (especially when the withdrawals were at the
daily max, which should always be a flag for caution), it's less than clear
what they should do.  A human teller can ask for ID; most existing ATMs
can't.  If you just fail the transaction, you're going to make customers
unhappy (how often we object to restrictions intended to protect us).  It
would be a good idea for all ATMs to have cameras and for human intervention
to be triggered by any of a number of suspicious circumstances (I'd say max
withdrawals on two successive days was suspicious), but that's also going to
raise the cost of service...
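
To illustrate the trigger (my own sketch in Python, not any bank's
actual logic; the limit and threshold are made-up parameters), flagging
max withdrawals on successive days needs only a running count per
account:

    from datetime import date, timedelta

    DAILY_MAX = 200        # illustrative daily withdrawal limit
    SUSPICIOUS_RUN = 2     # consecutive days at the limit before flagging

    def needs_review(withdrawals):
        # withdrawals: dict mapping date -> total withdrawn that day
        run, day = 0, min(withdrawals)
        while day <= max(withdrawals):
            if withdrawals.get(day, 0) >= DAILY_MAX:
                run += 1
                if run >= SUSPICIOUS_RUN:
                    return True    # trigger the camera / human follow-up
            else:
                run = 0
            day += timedelta(days=1)
        return False

    # Max withdrawals on two successive days trips the flag:
    print(needs_review({date(1987, 8, 10): 200, date(1987, 8, 11): 200}))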

scott preece, gould/csd - urbana, uucp: ihnp4!uiucdcs!ccvaxa!preece

     ["The real risk here is having people who do not know anything make
     pronouncements about what computers can and cannot do, including 
     the bank manager...  Why would anybody store the information unsorted
     and unorganized and then search brute force through every record?!"
     ... anonymous commenter on Andy Walker's message]


Re: "Mustn't Tire the Computer"

Kuhn <kuhn@ICST-SE.ARPA>
Wed, 12 Aug 87 10:45:24 edt
[...] When I worked on ATM and systems software for banks and savings &
loans [for UK readers: US savings & loans are descendants of your Building
Societies], I learned that some of them run what they call a "Kiting Suspect
Report" that attempts to spot check-kiting schemes.  One of my customers
detected a check kiter three or four days after he had started cashing 8-10
checks a day at area supermarkets against his accounts at my customer and
another S&L.  I don't know if charges were ever filed, but the other S&L and
all the supermarkets were notified and the scheme stopped.
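
For what it is worth, a crude version of such a screen is easy to
express.  The sketch below is my own guess in Python; the threshold and
record layout are illustrative assumptions, since the actual report
logic is not described above:

    from collections import Counter

    CHECKS_PER_DAY_LIMIT = 5    # illustrative threshold

    def kiting_suspects(checks):
        # checks: a list of (account_id, date_string) records, one per check
        per_day = Counter(checks)   # (account, day) -> number of checks
        return {acct for (acct, day), n in per_day.items()
                if n > CHECKS_PER_DAY_LIMIT}

    records = [("acct-17", "1987-08-10")] * 9 + [("acct-22", "1987-08-10")]
    print(kiting_suspects(records))    # prints {'acct-17'}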

--Rick Kuhn, National Bureau of Standards


Re: NASA wet computers

Eugene Miya <eugene@ames-nas.arpa>
Tue, 11 Aug 87 23:43:31 PDT
[...] Practically all of the major computing facilities in NASA (not
the really small distributed machines) have wet covers.  This is because
most Centers were constructed before halon came into wide use, and so fire
protection is provided via "wet" systems.  The problem comes in upgrading.
Mission Control Centers are used practically all the time, making upgrades
tough.  I asked this question at a meeting today for RISKS.  It comes from
"very reliable sources."

Also note that lots of Centers are in hurricane areas as well.
                                                                 --eugene


Halon risks in computer rooms, etc.

Dave Platt <dplatt@teknowledge-vaxc.arpa>
Wed, 12 Aug 87 10:43:35 PDT
    [Many messages on this subject... I have excerpted to reduce overlap.  PGN]

As I understand it, Halon's fire-killing ability comes from two
characteristics of the gas:

1) It excludes oxygen, when used in sufficiently large amounts.  This
   is one of the ways in which carbon dioxide kills flames (its cooling
   effect being the other), and thus any oxygen-exclusion risks from a
   Halon system would also probably exist in a CO2 system delivering a
   similar volume of gas.

2) Halon interferes, chemically, with the combustion process.  I
   believe that it ties up the free (ionized) radicals in the flame,
   and blocks one or more of the partial-oxidation steps.  Because this
   is a chemical effect, and not a physical one such as oxygen
   exclusion, it can occur at relatively low concentrations of Halon in
   the atmosphere.

I saw a demo at NCC a couple of years ago, in which a Halon-system
manufacturer's sales-rep stood in a closed booth smoking a cigarette.
He then let loose a 1-second burst of Halon, and the cigarette went
out.  For the next several minutes, he stood in the booth, breathing
normally, and demonstrated that he was unable to relight his cigarette
(he stuck a Bic lighter outside the booth through an arm-sized hole,
flicked it alight, moved it gently back within the booth, and it went
out the moment it entered the Halon-charged atmosphere).

I've been told that the major risk to humans from being in an area that
has been Halonized isn't the Halon itself... instead, it's the risk of
inhaling the rather nasty partial-combustion products of the plastics,
other petrochemicals, and miscellaneous combustibles that were burning
when the Halon was released.  Because the combustion was interrupted,
these chemicals won't have been burned to completion (to water, carbon
dioxide, etc.);  they may include lots of carbon monoxide, hydrochloric
or other acids, and similar things that aren't conducive to long life
and respiratory health.

Please don't take this as gospel... and kids, don't try this at home.

X-Uucp:  /decwrl|hplabs  \
        { seismo|uw-beaver}!teknowledge-vaxc.arpa!dplatt
         \ucbvax|sun     /
X-Usnail:  Teknowledge Inc., 1850 Embarcadero Road, Palo Alto CA  94303
X-Voice:   (415) 424-0500


HALON

<uwvax!seismo!ingr!tesla!steve@ucbvax.Berkeley.EDU>
Wed, 12 Aug 87 11:57:41 EDT
Several years ago, at a previous employer, I designed fire control systems,
including many HALON systems. HALON does -not- extinguish the fire by
displacing the oxygen in the air. It interferes with the actual
combustion process. The amount of HALON required to disable combustion is
only around 3%-7%, which leaves plenty of oxygen for humans. (The following
is all from memory and may not be entirely accurate.)  There are some effects
on humans from prolonged ( >1 hour ) exposure to HALON at this concentration.
I think that they are on the order of headaches, etc., and disappear when
exposure ceases.  There were no long-term effects known at that time.
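
As a rough check of that claim (my arithmetic, not from the posting):
even at the top of the quoted range, displacing 7% of room air still
leaves most of the normal 20.9% oxygen, so suppression is chemical
rather than oxygen starvation.

    # Residual oxygen at the quoted HALON design concentrations.
    O2_FRACTION = 0.209            # oxygen share of normal air
    for halon in (0.03, 0.07):
        print(f"{halon:.0%} HALON -> {O2_FRACTION * (1 - halon):.1%} oxygen")
    # 3% HALON -> 20.3% oxygen
    # 7% HALON -> 19.4% oxygen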

The greatest danger in using HALON that I know of results from exposure
to the gas as it is discharged from the nozzles into the room. HALON systems
are designed to get the gas dispersed in the required concentration as
rapidly as possible. The high flow rate of the gas and the expansion at
the nozzle results in very low temperatures in the discharged gas.
I heard a story that a government agency required a full discharge test
of a HALON system in a computer room, and to shield some installed equipment,
someone held a sheet of plywood between one of the nozzles and the equipment.
He suffered severe frostbite on all eight fingers.

Steve Conklin, Intergraph Corp., Huntsville, AL 35807, (205) 772-6618
{uunet,ihnp4}!ingr!tesla!steve


halon, sprinklers, etc.

Jack Ostroff <OSTROFF@RED.RUTGERS.EDU>
12 Aug 87 11:48:01 EDT
I used to be a volunteer fireman when Halon became popular, [...]

At one computer center I know, the fire extinguisher system consists of
CO2 jets below the artificial floor.  It has never gone off, but rumor has 
it that the jets will throw the floor tiles a small way, before flooding the
entire basement with CO2 in several minutes.  This DOES replace the oxygen,
and everyone who works there is told to RUN LIKE HELL if they hear the fire
alarms go off.

Jack (OSTROFF@RED.RUTGERS.EDU  or  ...!rutgers!red.rutgers.edu!ostroff)


Halon to protect computer installations

"LT Scott A. Norton, USN" <4526P%NAVPGS.BITNET@wiscvm.wisc.edu>
Wed, 12 Aug 1987 15:10:03 PDT
[...] Of course, Halon does have some risks.  Nothing's perfect.  The
immediate risk to people is that at very high temperatures, Halon is
chemically changed to phosgene, a strong poison that burns the lungs.  The
temperature in a small class-A fire in a computer room would not produce
hazardous amounts of phosgene, but a large conflagration might.

The reason why the fire department is not satisfied with just Halon is that
it is a one-shot system.  That's O.K. if a fire breaks out in the protected
area, the Halon system activates promptly, and the fire goes out.  But there
are some failure modes:

1.  If the room is not sealed, the Halon will be blown away.  If the initial
source of combustion is still present, the fire will reflash.  Normally,
Halon systems shut off ventilation and air conditioning before discharging
the Halon, but do you trust those relays to work?

2.  If the fire spreads to the area from outside, the Halon system will only
push it back once.  Once the Halon dissipates, the fire can move in again.

3.  Halon systems are complicated.  Contrast that with a sprinkler system:
Water in a pipe, Wood's metal keeping it there.  In a fire, the metal melts
and water sprays out.

So, the tradeoff in computer room fire protection is that Halon is very safe
for both the equipment and the people in the event of an accidental discharge,
while sprinklers are safe for the people but not the equipment.  In a small
fire under normal conditions, Halon is effective at extinguishing the fire,
doesn't hurt the equipment, and poses a low risk to the people (probably
lower than that of the carbon monoxide from the fire).
But in the case of a big fire, Halon by itself can not be counted on to
fully extinguish the fire.

When I was Tactical Data Systems Officer on a cruiser, I worried
about firefighting water, since the Navy uses sea water to fight
fires.  So don't complain about fresh water spraying into your
equipment, it will just wash troublesome dust off the circuits. :-)

LT Scott A. Norton, USN
Naval Postgraduate School
Monterey, CA 93943-5018
4526P@NavPGS.BITNET


Fire protection in the computer room

Scott E. Preece <preece%mycroft@gswd-vms.Gould.COM>
Wed, 12 Aug 87 09:45:10 CDT
[...]  We had a false alarm set off the halon in our machine room.  The
amount of warning the system gives before discharging the gas is impressive.
So is the effect on the computer room (lots of gas, lots of air motion, lots
of dust and displaced ceiling tiles).  Much preferable to water. (But also
much more expensive — that false alarm was not cheap by any stretch of the
imagination.)

scott preece, gould/csd - urbana, uucp: ihnp4!uiucdcs!ccvaxa!preece


railway automation

Stephen Colwill <mcvax!praxis!steve@seismo.CSS.GOV>
Wed, 12 Aug 87 11:09:00 WET
  > Scott E. Preece <preece%mycroft@gswd-vms.Gould.COM>
  > A train is a large object whose position is only approximately known
  > at best, whose speed is hard to change and is changed using systems
  > whose performance is subject to wild fluctuations with age, maintenance
  > and track conditions, and whose surroundings are totally invisible to
  > the controlling system.

In the context of the automation of an existing network using old rolling
stock, I take your point.  My original posting focussed, however, on the
London Docklands Light Railway.  The LDLR was purpose-built from scratch, the
rolling stock is new and also purpose-built.  It seems to me that the designers
had ample opportunity to install precisely the kind of sensors you describe.
I cannot believe that these sensors are beyond modern technology.

Stories of trains jumping buffers give the lie to any hope that the
opportunity afforded by the design of a virgin system has been properly
taken.  It is this that I find so disappointing.

Steve Colwill, Praxis, Bath.


Employment opportunities at MITRE

Marshall D. Abrams <abrams%community-chest.mitre.org@gateway.mitre.org>
Wed, 12 Aug 87 15:40:37 -0400
We are looking for people to join the Information Security Group at The
MITRE Washington Center.  As you probably know, MITRE is a not-for-profit
Federally Funded Research and Development Center, founded at the request of
the Air Force.  We work almost exclusively for the government, both for
civil and military agencies and departments.

MITRE support to sponsors includes:  security requirements analysis and
definition; security assessment of existing systems; information security
design; procurement support including document development, evaluation, and
contractor supervision; policy development support; prototype implementation;
verification; and certification and accreditation planning.  We have a good
mix of theory, policy, and real applications which has proven to be very
synergistic.  We are currently involved with network and database security
both for development of evaluation criteria, and for applications to
operational systems.

I would welcome the opportunity to discuss our work program and staffing
requirements with people at all levels of qualifications.  U.S. citizenship
is required.

Marshall D. Abrams, phone: (703) 883-6938, The MITRE Corporation, 
7525 Colshire Drive, Mail Stop Z670, McLean, VA   22102
