The RISKS Digest
Volume 5 Issue 30

Friday, 21st August 1987

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Role of NISAC in Reporting Vulnerabilities
Bruce N. Baker
Indemnification of ATC manufacturers
Bill Buckley
Bank Computers and flagging
Joseph I. Herman
Re: Certifying Software Engineers
Mark Weiser
Nancy Leveson
Info on RISKS (comp.risks)

Role of NISAC in Reporting Vulnerabilities

Bruce N. Baker <BNBaker@KL.SRI.Com>
Tue 18 Aug 87 14:30:40-PDT
In all the discussions that have taken place since Peter Denning's submission
about the paradox of reporting vulnerabilities in computer systems, I have 
been surprised that no one has mentioned DoD Instruction 5215.2, dated
September 2, 1986, "Computer Security Technical Vulnerability Reporting Program (CSTVRP)."

The Instruction:

     1.  Establishes a Computer Security Technical Vulnerability Reporting 
Program under the direction of the National Security Agency, National 
Information Security Assessment Center (NISAC).

     2.  Establishes procedures for reporting all demonstrable and repeatable
technical vulnerabilities of Automated Information Systems (AIS).

     3.  Provides for the collection, consolidation, analysis, reporting or
notification of generic technical vulnerabilities and corrective measures in
support of the DoD Computer Security requirements in DoD Directive 5200.28,
"Security Requirements for Automatic Data Processing (ADP) Systems," 
December 18, 1972.

     4.  Establishes methodologies for dissemination of vulnerability
information.

Some pertinent excerpts are as follows:
...
     The program shall be focused on technical vulnerabilities in commercially
available hardware, firmware and software products acquired by DoD...
...
     The reporting portion of the program is also available on a voluntary
basis for the non-DoD AIS community.
...
     The NISAC shall maintain a central repository of computer vulnerability
information.
     Information on the technical vulnerabilities of AIS's shall be protected
from unauthorized disclosure while ensuring it is disseminated to individuals
responsible for the security of an AIS.
...
     The NISAC shall:
...
     Establish procedures to encourage the voluntary submission of technical
vulnerability information from non-DoD AIS users.
     Establish and maintain a data base of technical vulnerability information
having security commensurate with its sensitivity.
     Establish procedures for transmitting technical vulnerability information
to affected manufacturers for corrective action.
...
     Vulnerability summaries in narrative form should be addressed to:
       National Security Agency
       Ft. George G. Meade, MD 20755-6000
       Attn:  Chief, S2
     An inner envelope should be marked:
       Attn:  CSTVRP/S2093
...
     Any technical vulnerabilities in products appearing on the Evaluated
Products List will be referred to the responsible vendor for correction.
Appropriate warnings will be disseminated.
...
     Vendors may be provided the technical details of reported vulnerabilities
to make corrections, but shall not be provided information about the specific
site(s) concerned, methods of discovery, or other information which could lead
to increased site vulnerability without the express written approval of the
Head of the DoD Component or the DAA.

Comments:  It should be noted that this Instruction applies primarily to
systems that have been evaluated against and certified to be technically
compliant with the DoD Trusted Computer System Evaluation Criteria (the Orange
Book) and, of course, is directed primarily to Defense related agencies.
Nonetheless, I assume that they wish to receive vulnerability information on
any commercially available product.  All vulnerability information received
under the Instruction is classified at least CONFIDENTIAL, which affects the
way the information is transmitted to NSA, especially for any organizations
authorized to handle classified information.
     A copy of DoD Instruction 5215.2 can be obtained from:
       U.S. Naval Publications and Forms Center
       5801 Tabor Ave.  Attn: Code 301
       Philadelphia, PA 19120
                              Telex # 834295; Phone # (215) 697 3321

     [The CONFIDENTIAL aspect is of little help to most RISKS readers, and
     probably explains why nothing has appeared here before.  But there are
     unclassified evaluations, and the Evaluated Products List is worth noting.
     In particular, it contains evaluations of Gould's UNIX-based system
     UTX/32S and VAX VMS 4.4, for example.  A very useful article for people
     interested in UNIX vulnerabilities and how Gould has sought to fix them 
     is Gary Grossman's "Gould Computer Systems Division Secure UNIX Program
     Status", 9th National Computer Security Conf., NBS, Gaithersburg MD,
     September 1986, Addendum pp 27-37.  I get a lot of requests for the
     kind of information found in that article, so I thought I might as
     well mention it here.  (The 10th National is coming up in Baltimore,
     21-24 September 1987.)  PGN]


Indemnification of ATC manufacturers

<cmpuchm@lll-winken.arpa>
Mon, 17 Aug 87 11:23:32 pdt
The July 20 issue of AW&ST had an editorial viewpoint from a former FAA
official describing a situation where manufacturers of air traffic control
equipment are shying away from bidding on ATC contracts because of the large
and vague liability issues. This industry has formed a group to try to define
liability and provide indemnification. This example reflects the more general
situation where the liability risks of 'doing business' will prevent automation
in a growing number of 'critical application' industries. As the liability
issue comes to a head in other industries we will have a situation where the
non-use of computers, instruments, and software will increase the risks to
society.

We are and should be responsible for our actions and products as computer
professionals. But if liability is undefined and indemnification non-existent,
how many of us will be willing to work on 'critical' applications, when our
companies, jobs, etc. are 'on the line' owing to potential errors or
negligence of third parties, in addition to our own errors or omissions?

Bill Buckley, Compuchem, Inc., Hayward, CA.  ATT : 415/489-6514
      {lll-lcc,wucs1,uwvax,pyramid,isis,princeton,uunet}!lll-winken!cmpuchm


Bank Computers and flagging

Joseph I. Herman (Joe) <DZOEY@UMD2.UMD.EDU>
Sun, 16 Aug 87 14:13:06 EDT
I have been reading with interest (no pun intended) the stories of ATM
problems.  I find that the data processing problems of the bank I deal
with, 1st National, are not confined to their ATM system (which has actually
been extremely reliable).  The day before I went on vacation, I transferred
some money between a couple of my accounts.  I did this the old fashioned
way.  I walked into the bank, sat down at one of the bank officer's desks
and filled out the paper work to transfer funds between accounts.  I then went
merrily on vacation.    When I returned, I found that the funds had been
withdrawn from one account and (according to my ATM card) had never appeared
in my other account.  I went in and talked to the bank manager.  They
produced a printout of my account and it showed the money was there.   Feeling
a bit foolish, I thanked them and went back out to the ATM machine.  The ATM
still showed the balance as being off.  I went back to the bank manager and
she looked more closely at the printout.  What happened is that when the
bank officer tried to post my transaction, the computer went down.  This
caused the transaction to be what they called "double flagged".  Withdrawals
are always instantaneous, but deposits and transfers take a day to clear.
When the transaction was double flagged, the money never got posted.  It
was still my money and it was in the account earning interest, but I couldn't
get to it.  That's why the ATM reflected a lower balance, since ATMs
reflect the available balance, not the actual balance.

I asked the bank manager if I had to come in personally to the bank every
time their computer crashed to check my account.  She said no, double
flags usually expire in 48 hours.  For some UNKNOWN reason, they don't
always expire.  I asked what would have happened if I hadn't come in.
She said that the money would have stayed in my account earning interest,
but I would never have been able to access it.  I asked her why
exceptions like mine (double flagged for > 100 hours) weren't noted.
She said it doesn't occur often enough for them to check for the condition.
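
To make the mechanism concrete, here is a minimal sketch (in Python, and
purely hypothetical rather than anything the bank actually runs) of how an
"available" balance can differ from the actual balance while funds are
flagged, and of the kind of stale-flag audit that would catch a flag stuck
past 48 hours.  The names, data layout, and the 48-hour window are
assumptions taken only from the description above.

    from datetime import datetime, timedelta

    HOLD_EXPIRY = timedelta(hours=48)   # flags "usually expire in 48 hours"

    class Account:
        def __init__(self, balance):
            self.balance = balance      # actual balance (still earning interest)
            self.holds = []             # list of (amount, time_flagged) pairs

        def available(self):
            # What the ATM reports: the actual balance minus any flagged funds.
            return self.balance - sum(amount for amount, _ in self.holds)

    def audit_stale_holds(accounts, now=None):
        # The exception check the bank apparently never runs: report every
        # hold that should have expired by now but is still in place.
        now = now or datetime.now()
        stale = []
        for acct_id, acct in accounts.items():
            for amount, flagged_at in acct.holds:
                if now - flagged_at > HOLD_EXPIRY:
                    stale.append((acct_id, amount, now - flagged_at))
        return stale

Run nightly over all accounts, a check like audit_stale_holds would have
surfaced a hold flagged for more than 100 hours long before the customer did.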

Personally, I always thought that's what was so nice about computers.  They
can check lots of things very quickly and note extreme cases that a human
being might miss.  And what really got to me is that they (the bank manager
and apparently the programming department) DON'T KNOW why some double flags
are never removed.  That really worries me.  I think someone in their
programming department should be shot.
                                            Warily yours, Joe Herman


Re: Certifying Software Engineers

Nancy Leveson <nancy@commerce.UCI.EDU>
Thu, 13 Aug 87 16:15:48 -0700
  [From Mark Weiser, weiser.pa@Xerox.COM] 
  The article by Nancy Leveson about certifying software engineers mixes
  apples and oranges a bit.  The first part talks about safety engineers,
  the latter part about software engineering managers.  I know of no
  certification for any managerial level person (other than MBA programs,
  if you call that certification).  There are such things as certified
  computer programmers, with specializations in areas like systems.  I
  think the CCP program is a useful one, although it perhaps does not
  go far enough, being based only on a test.  Although I have a PhD in
  computer science, I took the CCP exam to see what it was like.  I
  passed, but I suspect many of my colleagues would not.
                                                            -mark

Response by Nancy:  I obviously was unclear in my message.  The Qualified
System Engineer requirement is only for the person who assumes the
responsibility for the system safety program (i.e., who acts as technical 
manager or technical lead); the other safety engineers are not required to
satisfy the same requirements, and this person is certified as to
technical ability, not management ability.  It is similar to the
certification as Professional Engineer — very few engineers are so
designated, but safety-critical and important projects often include the
requirement for a Professional Engineer to be involved. The idea is to ensure
that at least SOMEBODY is responsible and accountable for things being done
right and that there is somebody on the project with a chance of knowing HOW
to do things right.  I don't know anything about the CCP certification, but I get
the impression that it is a minimal qualification test rather than a 
process to identify the very best.

   [I have grave doubts about some of the certification ideas — and about
   the notion of system safety people, especially if done in the absence of
   intelligent system designs!  PGN]
