The RISKS Digest
Volume 14 Issue 77

Wednesday, 21st July 1993

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Another "time bomb" — MOSS
Dave Horsfall
Flood forecast risks
Douglas W. Jones
Re: Earthquake `early' warning system
Richard Stead
Re: Strasbourg A320 crash: "Pilot Error"
Lars-Henrik Eriksson
Re: ATM Fraud/Databases/Ouch!
Scott Schwartz
Credit Cards on the Internet
Paul Robinson
Re: Medical Reimbursements ...
Michael T. Palmer
Re: DSS as Stamp Tax
A. Padgett Peterson
Steve Bellovin
Jim Bidzos
CPSR Secrecy Statement
Dave Banisar
Announcements of NIST documents
Dolores Wallace
Info on RISKS (comp.risks)

Another "time bomb" — MOSS

Dave Horsfall <dave@eram.esi.com.au>
Wed, 21 Jul 93 21:39:48 EST
A colleague has just informed me that thousands of MOSS systems (a graphics
package) all over the world crashed on 15 Jul, due to some authentication
problem to do with the date.  The "quick fix" was to set the date backwards,
but this introduces its own problems; a lot of software relies on
monotonically-increasing timestamps (were there any problems reported at the
Leap Second on 0000 UTC 1st July?), and some Apollo systems use the time value
to generate "unique" (sic) file system descriptors...
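
The Apollo-style failure mode is easy to sketch: any scheme that derives
"unique" identifiers from the current clock reading hands out duplicates as
soon as the clock is set back.  A minimal illustration (Python; make_descriptor
is a hypothetical stand-in, not the actual Apollo or MOSS code):

    issued = set()

    def make_descriptor(now):
        # Hypothetical stand-in for any scheme that derives a "unique"
        # identifier from the current clock reading.
        return int(now * 1000)          # millisecond resolution

    # Clock readings: ticking forward, then set back as a "quick fix".
    clock_readings = [100.000, 101.000, 102.000, 100.500, 101.000]

    for now in clock_readings:
        d = make_descriptor(now)
        if d in issued:
            print("collision: descriptor", d, "was already handed out")
        issued.add(d)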

Dave Horsfall (VK2KFU)    VK2KFU@VK2RWI.NSW.AUS.OC
dave@esi.COM.AU           ...munnari!esi.COM.AU!dave


Flood forecast risks

Douglas W. Jones <jones%pyrite@uunet.uu.net>
Wed, 21 Jul 1993 16:48:42 GMT
Living on the edge of the flooded Iowa River, I've been following the river
reports from the Corps of Engineers quite regularly in recent weeks.  They
offer a recording you can call at their local office that gives the current
river flow (in cubic feet per second), the current river stage in town, and
the current statistics for the Coralville Reservoir ten miles upstream from my
house.

This recording was well maintained until the reservoir went over the emergency
flood spillway for the first time ever on the night of Monday, July 5, and
then things began to break down.  Normally, the recording is a deadpan recital
of the numbers, but that night, the woman who recorded the message sounded
like she was in a state of panic, and soon after that, they discontinued the
recording for a week.

By midweek, even the data they were giving to the newspapers was sketchy: they
were reporting the outflow of the reservoir and the current water level, but
not the inflow.  Furthermore, their predictions began to get very sketchy --
it turns out that lightning hit the gauging station upstream from the
reservoir that they use to measure inflow, and without that data, they
couldn't run their computer models to predict outflow.  To quote one corps
official I chatted with while he was looking at the floodwaters down the block
from my house, they'd come to rely too much on their computer models and on
their automatic data collection system.

Another problem they face is that they know the capacity of their reservoir up
to the emergency flood spillway, but nobody ever bothered to calculate the
capacity of the reservoir when the water is 4 feet above the spillway; I
gather that the working assumption has been that once water spilled over the
spillway, it would stop rising.  The spillway is 400 feet wide, so that seemed
intuitive, but this is an awfully big flood!
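
A back-of-the-envelope weir calculation shows why that intuition fails for a
big enough flood.  Treating the spillway as a simple weir (Q = C*L*H^1.5 is
the standard textbook approximation; the coefficient and the assumed inflow
below are illustrative numbers, not Corps data), a 400-foot crest passes far
less water at low head than a major flood delivers, so the pool keeps rising
until the head is substantial:

    # Rough weir-flow check on the "water stops rising at the spillway" idea.
    # The coefficient and the inflow figure are illustrative assumptions only.
    C = 3.0                # weir coefficient, ft^0.5/s (textbook range ~2.6-3.3)
    L = 400.0              # spillway crest length, ft (from the text)
    INFLOW_CFS = 25000.0   # assumed flood inflow, cubic feet per second

    def spillway_discharge(head_ft):
        """Broad-crested weir approximation: Q = C * L * H**1.5, in cfs."""
        return C * L * head_ft ** 1.5

    for head in (1, 2, 4, 6, 8):
        q = spillway_discharge(head)
        trend = "still rising" if INFLOW_CFS > q else "holding/falling"
        print("head %d ft: spillway passes %7.0f cfs -> pool %s" % (head, q, trend))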

Clearly, the hydraulic models of river systems that the corps uses to manage
their flood control reservoirs need to be made more robust so that they can
handle missing data, and the assumption that water will never rise
significantly over dam spillways needs to be modified!

The Des Moines Register reported a similar but even more disastrous story
about the floods in Des Moines in their Sunday July 18 chronology of the
flood.  On the weekend of July 11, the waterworks director phoned the US
Weather Service for a flood forecast.  They apparently gave him a forecast of
a 22-foot crest on Tuesday, and when the director then told them that the
river was already over 23 feet, they were taken by surprise.

In that case, there was no provision made to compare the flood forecast coming
out of the computer model with the current river data.  This data comes in
from stream gauging stations around the United States by satellite, and to
make a flood forecast, they combine this data with rainfall data.  If flood
forecasts could be computed instantly, there would be no problem, but they
aren't instantaneous and there was apparently no provision made for such a
sanity check on the forecast!
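
A sanity check of the kind described here is tiny compared with the forecast
model itself: compare the model's predicted crest against the latest gauge
reading before anyone repeats the number.  A sketch (Python; the margin and
field names are assumptions for illustration, not Weather Service practice):

    def check_forecast(forecast_crest_ft, observed_stage_ft, margin_ft=0.5):
        """Reject a flood forecast already contradicted by observation.

        A predicted crest below the stage the river has already reached
        (allowing a small margin for gauge noise) cannot be right; hold it
        for human review instead of publishing it."""
        if forecast_crest_ft + margin_ft < observed_stage_ft:
            return ("REJECT: forecast crest %.1f ft is below the observed "
                    "stage of %.1f ft" % (forecast_crest_ft, observed_stage_ft))
        return "OK"

    # The Des Moines case: a 22-foot crest forecast while the river was
    # already over 23 feet (23.2 is just an illustrative reading).
    print(check_forecast(forecast_crest_ft=22.0, observed_stage_ft=23.2))
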
                  Doug Jones  jones@cs.uiowa.edu


Re: Earthquake `early' warning system (Owens, RISKS-14.76)

Richard Stead <stead@seismo.CSS.GOV>
Wed, 21 Jul 93 15:09:23 EDT
Early warning systems have been kicking around the seismological community for
a decade at least.  Most studies of proposed early warning systems have shown
the economic return to be too small to justify development.  California
commissioned such a study and chose not to pursue early warning on that
basis.  Of course, such analyses cannot include intangibles such as peace of
mind or people temporarily in hazardous situations (e.g., holding a pot of
boiling water).

It is true that the system must be automated to achieve any warning.  The
system depends on detecting and identifying the quake in close proximity to
the source.  Then it relies on the speed of light exceeding that of elastic
waves.  The most damaging waves travel at an average velocity of no more than
about 4.5 km/s, so their travel time grows with distance: roughly 22 seconds
at 100 km, or about 45 seconds at 200 km.
However, the system itself takes time.  Without getting into the details, it
is safe to say that time from the quake to the sensors and then processing and
communicating the results might typically take 30 seconds.

This means that sites closest to the quake, those most adversely affected by
it, gain no benefit, and sites far away (more than 200 km or so in
California's case) do not benefit because the shaking is probably manageable.
So there is a narrow range of distance that benefits.  Even so, the system may
be justified.
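
Put as arithmetic, the usable warning at a site is just the wave travel time
minus the detection-and-notification latency.  A sketch using the figures
quoted above (purely illustrative):

    WAVE_SPEED_KM_S = 4.5     # average speed of the damaging waves (from the text)
    SYSTEM_LATENCY_S = 30.0   # detect + process + broadcast (rough figure above)

    def warning_seconds(distance_km):
        """Seconds of warning a site gets before strong shaking arrives."""
        return max(0.0, distance_km / WAVE_SPEED_KM_S - SYSTEM_LATENCY_S)

    for d in (50, 100, 150, 200, 300):
        print("%3d km from the quake: %5.1f s of warning" % (d, warning_seconds(d)))

    # Inside about 135 km the latency eats the whole travel time; beyond
    # about 200 km the shaking is usually manageable anyway -- hence the
    # narrow band of distances that actually benefits.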

To answer the risks mentioned - Of course, it would be a rare system that
could truly function without false alarms.  One need only keep the false alarm
rate to a manageable level, and make sure that any actions taken on the basis
of the alarm not be hazardous.  Slowing a train or halting elevators once or
twice a year should not be a problem.  In fact, I would think periodic tests
of the system would be important as well.  Predicting the damage should be
straightforward.  The magnitude of the quake can be estimated to sufficient
accuracy - then a standard magnitude/distance relation will return a rough
expected intensity.  It would not be hard to be more accurate - one need only
measure the site response at the place where an action will be taken based on
the warning, then program a site correction into the receiver.  It is entirely
up to the receiver to determine expected intensity and at which intensity to
take which actions.  This receiver can be as sophisticated as needed, but it
will receive only magnitude, location and time.  (Later, "verification"
broadcasts could be made that rely on more data but take longer to produce.
Any actions taken at the first notice could be modified then.)
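
The receiver-side logic described here is small: apply a magnitude/distance
relation, add the locally measured site correction, and map the resulting
intensity onto pre-programmed actions.  A sketch (Python; the coefficients,
thresholds and action names are invented placeholders, not a published
attenuation relation):

    import math

    # Placeholder coefficients; a real receiver would use a published
    # intensity-attenuation relation calibrated for its region.
    A, B = 1.5, 3.0

    def expected_intensity(magnitude, distance_km, site_correction=0.0):
        """Generic shape of an attenuation relation: intensity grows with
        magnitude, falls off with log(distance), plus a correction measured
        once at this particular site."""
        return A * magnitude - B * math.log10(max(distance_km, 1.0)) + site_correction

    # Actions are entirely the receiver's business; the broadcast carries
    # only magnitude, location and origin time.
    ACTIONS = [(7.0, "shut gas valves, brake trains"),
               (5.0, "slow trains, halt elevators"),
               (0.0, "log only")]

    def act_on_warning(magnitude, distance_km, site_correction=0.0):
        i = expected_intensity(magnitude, distance_km, site_correction)
        for threshold, action in ACTIONS:
            if i >= threshold:
                return action

    print(act_on_warning(magnitude=7.5, distance_km=80, site_correction=0.5))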

I see two risks.  The first is that most of the scientists involved seem to be
trying to pull a "dual-use" out of this system.  They want to deploy
sophisticated sensors that are good for seismological research, and save and
process mountains of data.  I think that if the system is to work well, such
efforts should be dropped.  Early warning long ago ceased to be a seismic
problem; it is an engineering and political problem.  The sensors to be
deployed should be as simple, reliable, and inexpensive as possible, and should
transmit a minimum of data to cut communications costs.  By cutting physical
and communications costs, the number of sensors can be increased and the cost
of the entire program reduced.  More sensors mean more warning time and more
people benefit (up to the limit of the depth of the quakes).

The second risk is that people may rely on the existence of the system.
I would hate to see someone designing something and saying "well, if there
was a quake, this would be a problem, but we have the early warning".

Security will be seen by many as a risk, but I think that is addressable by
isolating the system from phone communications and networks and encrypting all
communications.
                    Richard Stead  stead@seismo.css.gov


Re: Strasbourg A320 crash: "Pilot Error" (Mellor, RISKS-14.74)

Lars-Henrik Eriksson <lhe@sics.se>
Wed, 21 Jul 93 07:32:29 +0200
Such a device (ground proximity warning system - GPWS) already exists and is
mandatory on international flights. The Strasbourg A320 was only used
domestically in France - where GPWS is (was?) not required.  For some reason
the airline decided not to install that equipment (or even had it disabled - I
have some recollection that GPWS is standard equipment on the A320).

GPWS will alert the pilot to five different conditions (modes) that are
considered unsafe at low altitude. Excess rate of descent is one of them.
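
For a sense of what one of those modes amounts to, the "excess rate of
descent" check is essentially a comparison of sink rate against an envelope
that tightens as the aircraft nears the ground.  A toy sketch (Python; the
envelope numbers are invented, not the certified Mode 1 boundary):

    def excessive_descent_warning(radio_altitude_ft, sink_rate_fpm):
        """Toy version of the GPWS "excess rate of descent" mode.

        The permitted sink rate shrinks as the radio altimeter reading
        decreases; this envelope is invented for illustration and is not
        the certified Mode 1 boundary."""
        allowed_sink_fpm = 1000 + 2.0 * radio_altitude_ft
        if sink_rate_fpm > allowed_sink_fpm:
            return "SINK RATE - PULL UP"
        return None

    # A steep descent close to the ground trips the alert:
    print(excessive_descent_warning(radio_altitude_ft=800, sink_rate_fpm=3300))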

Lars-Henrik Eriksson, Swedish Institute of Computer Science, Box 1263
S-164 28  KISTA, SWEDEN                  +46 8 752 15 09  lhe@sics.se


Re: ATM Fraud/Databases/Ouch! (Smith, RISKS-14.76)

Scott Schwartz <schwartz@groucho.cse.psu.edu>
Tue, 20 Jul 1993 21:47:48 -0400
> Hope YOUR financial institutions protect your PINs as well...

Hah.  My brother lost his wallet six months ago.  Since then he's been dealing
with massive credit fraud, and he's been told that his easiest recourse is to
get a new social security number and a new set of accounts.


Credit Cards on the Internet

"Tansin A. Darcos & Company" <0005066432@mcimail.com>
Wed, 21 Jul 93 01:54 GMT
I sent the following to the Privatising the Internet List in
response to a question about credit card transactions via E-Mail:

>From: Paul Robinson <TDARCOS@MCIMAIL.COM>
Organization: Tansin A. Darcos & Company, Silver Spring, MD USA

Marianne Sweet <sweet@alexia.lis.uiuc.edu>, writes:

> On 17 June 1993 I posted a request for information to this list
> regarding credit card payment over the Internet.  Since I received
> no responses, I am trying one more time with a slightly different
> request.
>
> Does anyone know of (commercial) services on the Internet that
> accept online credit card payment.  I am currently working on a
> commercial application for the Internet and am interested in
> discovering what other services' charge displays look like.

The simple fact of the matter is that using the Internet to handle
charge-card transactions is not very common for several reasons:

(1) Soliciting CC transactions might violate the Acceptable Use
    Provisions (doesn't apply if your feed is from a commercial
    internet connection.)

(2) Electronic mail is sent *in the clear* across the network.  While
    it is part of a series of packets, it still can be seen by
    (a) the administrator of your site (b) the administrator of the
    destination site (c) anyone having access to the network connections
    used between the two sites who wants to monitor traffic.

In theory, some bright boy could put a "mail scanner" on the system and
watch for any messages that contain a string of 12 or 16 digits beginning
with 4 or 5 and capture those messages.

This means that someone could watch for a message whose text
contains something like:

I'd like to order a 300 meg hard drive.  Ship to
  Hathan Nale,
  1 Patriot Lane,
  Fourth July, MD 21776

Charge 4000 1776 1492 1993  exp 7/4/2076

Now someone supposedly could watch for this and keep such information.
Rare, but it could happen.
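
The scanning itself is trivial to build, which is the point.  A sketch of the
sort of pattern-match involved (Python; an illustration of the idea, not any
real monitoring tool):

    import re

    # 16-digit numbers beginning with 4 or 5, allowing the spaces or dashes
    # people type between groups of four, as in the example above.
    CARD_PATTERN = re.compile(r"\b[45]\d{3}(?:[ -]?\d{4}){3}\b")

    def looks_like_card_number(message_text):
        return bool(CARD_PATTERN.search(message_text))

    print(looks_like_card_number("Charge 4000 1776 1492 1993  exp 7/4/2076"))  # True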

For those who are paranoid, the answer, of course, is to have
privacy-enhanced mail made freely available.  Once people have the
ability to send mail encoded with their own private key and the
recipient's public key, so that only the recipient can decode it,
and so the sender can't deny it was sent, this particular problem
largely goes away.
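
The mechanics being described are sign-with-the-sender's-private-key plus
encrypt-with-the-recipient's-public-key.  A compressed sketch using the modern
Python `cryptography` package (illustrative only; the PEM and PGP tools of the
day wrapped the equivalent operations in their own message formats, and a real
system would sign and encrypt the combined message rather than handle the
pieces separately):

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    sender = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    recipient = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    order = b"Charge 4000 1776 1492 1993  exp 7/4/2076"

    # Sign with the sender's private key: the recipient can prove who sent it.
    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)
    signature = sender.sign(order, pss, hashes.SHA256())

    # Encrypt with the recipient's public key: only the recipient can read it.
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    ciphertext = recipient.public_key().encrypt(order, oaep)

    # The recipient decrypts with their private key and verifies the signature;
    # verify() raises an exception if either has been tampered with.
    plaintext = recipient.decrypt(ciphertext, oaep)
    sender.public_key().verify(signature, plaintext, pss, hashes.SHA256())
    print(plaintext.decode())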

There is probably close to 100 megabytes of transfers going across the
Internet every day; 40 meg of this is Usenet news alone.  While your
little message might never be noticed, there is a chance it could be.  Or
you could mis-mail it.

For example, say you intended to mail this order to "info-fax@sdi.com"
(for example, Soft-Disk Inc., a hard-disk seller) but instead you miss and
send it to "info-vax@sri.com" (it's only off by two characters).  Instead
of sending the message with your credit card number to the one person at
Soft-Disk Inc, you've just mailed it to the 75,000 readers of Info-Vax,
which is gatewayed into "comp.os.vms".

PS: the above credit card number is fictitious.

Paul Robinson - TDARCOS@MCIMAIL.COM


Re: Medical Reimbursements ... (Frankston, RISKS-14.76)

<m.t.palmer@LaRC.NASA.GOV>
Wed Jul 21 09:01:51 1993
Bob_Frankston@frankston.com wrote:
>Simple ideas like replacing bills that contain a single amount ...

Actually, one of our local doctors has initiated a billing system very close
to what you desire (after years of confusing us with what insurance payment
went with what service, etc).  The general form of the bill now looks like:

  Date     Date/Code    Description            Amount      Amt Due    Balance

01/01/93    123456      Office Visit            35.00
           01/01/93     Patient Copayment       10.00CR
           02/24/93     Insurance Payment       17.50CR
           02/24/93     Write-off                7.50CR
            234567      Laboratory Work         25.00
           02/24/93     Insurance Payment       18.00CR
                        Coinsurance Due                      7.00       7.00

02/10/93    345678      Consultation             50.00
           04/02/93     Insurance Payment        37.50CR
           04/02/93     Write-off                 4.00CR
                        Coinsurance Due                      8.50      15.50

Several features of this format deserve special mention.  First, note that
events are only listed in chronological order within each "block," and that
the specific office code for each billable service is given.  In the above
example, even though the consultation occurred before the insurance payments
for the original office visit, those payments are listed with the service
that generated them.  Also, notice the existence of explicit write-offs.  Our
doctor is a "preferred provider" with a major health insurance company, and
has agreed to accept the company's Usual, Customary, and Reasonable (UCR)
amounts as payment in full.  Even though the insurer doesn't always *pay* the
UCR amount, which means we still have copayments and coinsurance, there is
still usually a difference between what the doctor "charges" and what he will
accept as full payment.  This billing format makes that distinction explicit
for the first time.

How does this format reduce RISK?  Simple.  A recent bill (the first in this
new format) seemed a tad high (like $200 or so too much), so we re-created
our entire service history with that doctor in a spreadsheet and used this
format.  We quickly noticed where he had overlooked a previous account credit,
and where he had forgotten to include a write-off for a specific service.
It's probably easy to appreciate the level of frustration and helplessness
we had previously felt with regard to keeping track of the myriad of bills,
copayments, coinsurance, insurance payments, etc. for a couple of doctors
and a dentist (we don't even have KIDS yet!).  I will now keep track of
all health service accounts using the new billing format, and will encourage
our other doctors to do the same.
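
The spreadsheet check described above automates easily: for each service code,
the charge minus copayments, insurance payments and write-offs should equal
the coinsurance actually billed, and any leftover residual is the question to
raise with the office.  A sketch (Python) using the figures from the sample
bill:

    # Ledger entries from the sample bill: charges positive, credits negative.
    entries = [
        ("123456", "Office Visit",       35.00),
        ("123456", "Patient Copayment", -10.00),
        ("123456", "Insurance Payment", -17.50),
        ("123456", "Write-off",          -7.50),
        ("234567", "Laboratory Work",    25.00),
        ("234567", "Insurance Payment", -18.00),
        ("345678", "Consultation",       50.00),
        ("345678", "Insurance Payment", -37.50),
        ("345678", "Write-off",          -4.00),
    ]
    coinsurance_billed = {"123456": 0.00, "234567": 7.00, "345678": 8.50}

    balances = {}
    for code, _desc, amount in entries:
        balances[code] = balances.get(code, 0.0) + amount

    for code, residual in sorted(balances.items()):
        expected = coinsurance_billed.get(code, 0.0)
        if abs(residual - expected) > 0.005:
            print("service %s: ledger says %.2f due, bill says %.2f -- query it"
                  % (code, residual, expected))
        else:
            print("service %s: reconciles at %.2f" % (code, expected))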

Michael T. Palmer  m.t.palmer@larc.nasa.gov


DSS as Stamp Tax - Other historical precedents (Seecof, RISKS-14.76)

A. Padgett Peterson <padgett@tccslr.dnet.mmc.com>
Wed, 21 Jul 93 07:57:59 -0400
>From: Mark Seecof <marks@wimsey.latimes.com>
>Subject: DSS as a stamp tax

Possibly a more exact parallel might be found in the "Teapot Dome" incident
of the early 1920s, in which Harry Sinclair and a group of oil magnates
induced Interior Secretary Albert Bacon Fall to grant an exclusive license
for federal oil reserves to their companies (remember Dino the Sinclair
dinosaur?  Sinclair is now part of BP).

Mr. Fall was subsequently sentenced to prison for his part in the activities.

                        Padgett


DSS as a stamp tax

<smb@research.att.com>
Wed, 21 Jul 93 12:00:12 EDT
There seems to be a lot of misunderstanding about what happened between
NIST and PKP.  NIST didn't ``give away'' anything to PKP.  Rather,
PKP had NIST over a legal barrel; some sort of deal was necessary.
You can argue over the specific terms, but practically speaking, NIST
had little choice.

The problem is this:  the DSS appears to infringe several different
patents owned by PKP.  (I'm not going to discuss the propriety of
algorithm patents here, a decision that I'm sure our esteemed moderator
will applaud, given his comments about the load.)  These include the
Diffie-Hellman exponential key exchange patent and the Schnorr digital
signature patent.  It could be argued that for various reasons, these
don't apply.  Maybe — but that's far from obvious.  At the very least, the
standard would be tied up in court for many years.

NIST can't ignore such patents.  Their own policy supports the propriety
of such things (DSS itself is being patented.)  And the government is
not allowed to ignore private patent rights; that would, I'm told,
constitute an illegal taking of property, as per the Fifth Amendment.

That left NIST with several choices, none great.  They could do nothing,
and not promulgate any standard.  But that wouldn't meet the government's
own need for one.  They could adopt RSA, since the government has free
rights to that patent.  The rest of us would still have to pay royalties
for its use.  Besides, RSA can be used for secrecy, which NIST didn't
want.  (They cite exportability; others cite NSA's desire to spy.
Pick your rationale.)  Or they could ask NSA for a totally new signature
scheme.  Maybe NSA has one — but I doubt very much they're going
to reveal a totally new way to do cryptographic operations.

Finally, NIST could cut a deal with PKP.  That's what they chose to do.  It's
not in any way a plot to enrich PKP; it's simply the only way to preserve the
DSS.
    --Steve Bellovin


Re: DSS as a stamp tax

Jim Bidzos <jim@RSA.COM>
Wed, 21 Jul 93 13:38:09 PDT
Mark Seecof really should have read the DSS announcement more carefully
before drawing any conclusions.  The announcement said that DSS would be
royalty-free for government use.  Only private-sector use would require
licensing.  He compares it to a "tax."

He writes:

> forcing us to pay PKP everytime we sign something digitally ...

Very wrong.  When someone in the private sector licenses DSS, they pay a
royalty once for the manufacture and sale of the product, not its use. (It
says you can license even the chips, the lowest cost component of a system.
There is quite a difference between a "use tax" and a "royalty.")

If he was referring to the $1 per certificate, that comes only from those who
provide certificates as a service, not from the user, and not for
"use" of a signature product.


CPSR Secrecy Statement

Dave Banisar <banisar@washofc.cpsr.org>
Fri, 16 Jul 1993 8:27:56 EST
  CPSR Secrecy Statement

    Computer Professionals for Social Responsibility (CPSR) has called for a
complete overhaul in the federal government's information classification
system, including the removal of cryptography from the categories of
information automatically deemed to be secret.  In a letter to a special
Presidential task force examining the classification system, CPSR said that
the current system — embodied in an Executive Order issued by President
Reagan in 1982 — "has limited informed public debate on technological issues
and has restricted scientific innovation and technological development."

       The CPSR statement, which was submitted in response to a task force
request for public comments, strongly criticizes a provision in the Reagan
secrecy directive that presumptively classifies any information that "concerns
cryptology."  CPSR notes that "while cryptography — the science of making and
breaking secret security codes — was once the sole province of the military
and the intelligence agencies, the technology today plays an essential role in
assuring the security and privacy of a wide range of communications affecting
finance, education, research and personal correspondence."  With the end of
the Cold War and the growth of widely available computer network services, the
outdated view of cryptography reflected in the Reagan order must change,
according to the statement.

       CPSR's call for revision of the classification system is based upon the
organization's experience in attempting to obtain government information
relating to cryptography and computer security issues.  CPSR is currently
litigating Freedom of Information Act lawsuits against the National Security
Agency (NSA) seeking the disclosure of technical data concerning the digital
signature standard (DSS) and the administration's recent "Clipper Chip"
proposal.  NSA has relied on the Reagan Executive Order as authority for
withholding the information from the public.

       In its submission to the classification task force, CPSR also called
for the following changes to the current secrecy directive:

     * A return to the "balancing test," whereby the public interest in
     the disclosure of information is weighed against the claimed harm
     that might result from such disclosure;

     * A prohibition against the reclassification of information that has
     been previously released;

     * The requirement that the economic cost of classifying scientific
     and technical information be considered before such information may
     be classified;

     * The automatic declassification of information after 20 years,
     unless the head of the original classifying agency, in the exercise
     of his or her non-delegatable authority, determines in writing that
     the material requires continued classification for a specified period
     of time; and

     * The establishment of an independent oversight commission to monitor
     the operation of the security classification system.

       The task force is scheduled to submit a draft revision of the Executive
Order to President Clinton on November 30.

       The full text of the CPSR statement can be obtained via ftp, wais and
gopher from cpsr.org, under the filename cpsr/crypto/secrecy_statement.txt.

       CPSR is a national organization of professionals in the computing
field.  Membership is open to the public.  For more information on CPSR, or a
full copy of the July 14 letter from Marc Rotenberg (CPSR Washington Director)
and David L. Sobel (CPSR Legal Counsel) to the Information Security Oversight
Office, contact <cpsr@cpsr.org>.


Announcements of NIST documents

Dolores Wallace <wallace@swe.ncsl.nist.gov>
Wed, 21 Jul 93 12:09:48 EDT
To order any of the following three documents, contact the Superintendent of
Documents, U.S. Government Printing Office (GPO), Washington, DC 20402, (202)
783-3238.

* Wendy W. Peng and Dolores R. Wallace, Software Error Analysis, NIST Special
Publication 500-209, GPO Stock Number SN003-003-03212-3.  $7.00.

This document provides guidance on software error analysis.  Software error
analysis includes error detection, analysis, and resolution.  Error detection
techniques considered in the study are those used in software development,
software quality assurance, and software verification, validation and testing
activities.  These techniques are those frequently cited in technical
literature and software engineering standards or those representing new
approaches to support error detection.  The study includes statistical
process control techniques and relates them to their use as a software quality
assurance technique for both product and process improvement.  Finally, the
report describes several software reliability models.


* Dolores R. Wallace, Laura M. Ippolito, D. Richard Kuhn, High Integrity
Software Standards and Guidelines, NIST Special Publication 500-204, GPO Stock
Number SN003-03171-2.  $6.50.

This report presents results of a study of standards, draft standards, and
guidelines (all of which will hereafter be referred to as documents) that
provide requirements for the assurance of software in safety systems in
nuclear power plants.  The study focused on identifying the attributes
necessary in a standard for providing reasonable assurance for software in
nuclear systems.  The study addressed some issues involved in demonstrating
conformance to a standard.  The documents vary widely in their requirements
and the precision with which the requirements are expressed.  Recommendations
are provided for guidance for the assurance of high integrity software.  It is
recommended that a nuclear industry standard be developed based on the
documents reviewed in this study with additional attention to the concerns
identified in this report.


* Dolores R. Wallace, Wendy W. Peng, Laura M. Ippolito, Software Quality
Assurance: Documentation and Reviews, NISTIR 4909, FREE (as available).
Contact Dolores Wallace at (301) 975-3340 or wallace@swe.ncsl.nist.gov, or
Laura Ippolito at (301) 975-5248 or ippolito@swe.ncsl.nist.gov or either at
[FAX] (301) 590-0932.

This study examines the contents of a software quality assurance standard for
nuclear applications.  The study includes recommendations for the
documentation of software systems.  Background information on the standard,
documentation, and the review process is provided.  The report includes an
analysis of the applicability, content, and omissions of the standard and
compares it with a general software quality assurance standard produced by the
Institute for Electrical and Electronics Engineers.  Information is provided
for the content of the different types of documentation.  This report
describes information for use in safety evaluation reviews.  Many
recommendations in this report are applicable for software quality assurance
in general.

   - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

* Dan Craigen, Susan Gerhart, Ted Ralston, NIST Formal Methods Report

The National Institute of Standards and Technology has published a survey of
twelve case studies of the use of formal methods in industrial projects.  The
report is published as NIST GCR 93/626 and is available in paper form from the
National Technical Information Service.  It is also available in electronic
form; for electronic version, contact wallace@swe.ncsl.nist.gov .

The formal methods report contains the complete final version of "An
International Survey of Industrial Applications of Formal Methods" sponsored
by the U.S. National Institute of Standards and Technology, the U.S. Naval
Research Laboratory, and the Canadian Atomic Energy Control Board.  This
report consists of two separate volumes.  Volume 1 describes the purpose,
approach, analysis, and conclusions of the survey; Volume 2 describes the case
studies.

ORDER FROM: National Technical Information Service,
     5285 Port Royal Road, Springfield VA 22161     Phone (703) 487-4650

YOU MUST use the PB numbers to order:

     "An International Survey of Industrial Applications of Formal
     Methods  Volume 1 Purpose, Approach, Analysis and Conclusions"
     NIST GCR 93/626-V1       PB93-178556/AS
     Hard Copy: A07/$27.00  Microfiche: A02 $12.50

     "An International Survey of Industrial Applications of Formal
     Methods  Volume 2  Case Studies"

     NIST GCR 93-626-V2      PB93-178564/AS
     Hard Copy: A09/$27.00  Microfiche: A03/$12.50

     For electronic version, contact wallace@swe.ncsl.nist.gov
