The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 7 Issue 65

Saturday 15 October 1988

Contents

o Vendor introduces "safe" Ada subset
Jonathan Jacky
o Re: ethics of conflict simulation
Sean Malloy
o Re: Assault on Privacy
Ronni Rosenberg
o Software warranties and Trade Practices in Australia
B L Coombs, annotated by "cbp"
via Lee Naish
o RISKS of EPROMS
George Sukenick
o Info on RISKS (comp.risks)

Vendor introduces "safe" Ada subset

jon@june.cs.washington.edu <Jonathan Jacky, University of Washington>
Fri, 14 Oct 88 09:04:38 PDT
From ELECTRONIC ENGINEERING TIMES, 26 Sept 1988, p. 25:

Ada SUBSET ADDRESSES SOFTWARE SAFETY

Southampton, England - (A subset of Ada called Spark) is reported to overcome
the drawbacks of (Ada) in applications where software integrity is critical.
...  Spark was developed at the University of Southampton with the sponsorship
of the British Ministry of Defence.  It is now being marketed by Program
Validation Ltd.

(A representative of Program Validation) said that the use of Ada for safety
critical programming poses some serious problems.  There is no formal
definition of the language and the precise meaning of some of its constructions is
unclear.  According to Program Validation, the resulting uncertainties make
formal verification of Ada programs impossible and cast doubts on the integrity
of the compiled code.  A further complication is that the richness of Ada
allows programs to be constructed that are apparently simple, but hide great
underlying complexity.

... To achieve Ada integrity, Spark has introduced several restrictions.  It
does not allow the use of tasks, exceptions or generic units.  Access types 
are also omitted, as these are considered unacceptable in real-time safety
critical applications.  ... Certain features - such as "go to" statements
and "declare" statements - are totally barred.
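
To make the restriction on access types concrete, here is a minimal sketch
of the reasoning -- in C rather than Ada, since pointers obtained from
malloc() are the nearest everyday analogue of Ada access types, and with
the function names and the MAX_READINGS bound invented purely for
illustration:

    /* Dynamic allocation: the failure mode appears only at run time. */
    #include <stdlib.h>

    struct reading { double value; struct reading *next; };

    struct reading *log_reading_dynamic(struct reading *head, double v)
    {
        struct reading *r = malloc(sizeof *r);  /* may fail in the field */
        if (r == NULL)
            return head;                        /* reading silently dropped */
        r->value = v;
        r->next = head;
        return r;
    }

    /* Static allocation, the style a subset like Spark forces: storage is
       fixed at compile time, so exhaustion can be analyzed before the
       program ever runs. */
    #define MAX_READINGS 100
    static double readings[MAX_READINGS];
    static int n_readings = 0;

    int log_reading_static(double v)
    {
        if (n_readings >= MAX_READINGS)
            return -1;                          /* bound known in advance */
        readings[n_readings++] = v;
        return 0;
    }

The point is not that the dynamic version is wrong, but that its worst case
cannot be established by static analysis -- exactly the kind of uncertainty
a verifiable subset is meant to exclude.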


Re: ethics of conflict simulation

Sean Malloy <malloy@nprdc.arpa>
Thu, 13 Oct 88 13:40:12 PDT
From RISKS-FORUM 7.74 (Scott Wilde):
> The problem is not some nebulous fear of the Pentagon "poisoning" the
> industry as a whole, but rather that they would interfere _with the
> particular game under consideration_.

In fact, one of the games designed by Simulations Publications, Inc. (SPI)
before they were bought out by TSR was _ordered_ by the Army.  _Firefight_ was
intended as a simulation of warfare in Europe, to teach tactics to infantry
and armor commanders. Within a number of simplifying abstractions, it modeled
the weapons systems available to a unit commander in Germany.

SPI later made this game available as part of their regular line. It soon
became apparent that the game was not only useful for teaching tactics but was
also a device to build confidence and improve morale -- the way the rules and
weapons systems data were set up, it was almost impossible for a Soviet player
to pull anything better than a draw out of the game. The game mechanics were
biased so that an American player could win by using the `right' tactics
(`right' in the Army sense -- the approved Army tactics for a given situation),
rather than encouraging the players to come up with their own tactics.
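
That kind of bias is easy to build in mechanically.  The following C sketch
is purely hypothetical -- none of _Firefight_'s actual rules or data appear
in it, and every name is invented -- but it shows the shape of the trick:
the combat-results routine rewards the approved tactic and simply never
returns a Soviet victory.

    #include <stdio.h>

    enum result { US_WIN, DRAW, SOVIET_WIN };

    enum result resolve_combat(int us_strength, int soviet_strength,
                               int used_approved_tactics)
    {
        /* The approved tactic earns a flat bonus to the odds. */
        int odds = us_strength - soviet_strength
                 + (used_approved_tactics ? 2 : 0);

        if (odds > 0)
            return US_WIN;
        return DRAW;    /* SOVIET_WIN is unreachable: bias by construction */
    }

    int main(void)
    {
        /* Even outnumbered ten to one, the US player cannot lose outright. */
        printf("%d\n", resolve_combat(1, 10, 0));   /* prints 1, i.e. DRAW */
        return 0;
    }

A player who reads only the rulebook sees odds and modifiers; the missing
outcome is invisible unless the results table itself is inspected.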

From the Army's point of view, it was a very good simulation. From the
opinions expressed about it in the gaming community, it flopped
miserably as a _game_.

Sean Malloy, Navy Personnel Research & Development Cntr, San Diego CA 92152-6800


Re: Assault on Privacy

Ronni Rosenberg <ronni@VX.LCS.MIT.EDU>
Thu, 13 Oct 88 13:36:10 edt
Thanks to Anthony Atkielski for providing information on privacy legislation
in France.  I hope that France's legislation closes some of the loopholes in
U.S. privacy legislation.  But it is worth pointing out that laws that may
sound good on the books often do not translate into tough action.

For instance, the Fair Credit Reporting Act (1971) specifies expiration
periods for bankruptcy data (14 years) and other adverse data (7 years), but
"adverse data" itself is not well defined.  Where legislation contains vague definitions,
applying it may be left to the judgement of the agency being regulated.

The FCRA also requires credit agencies to provide you with the data in their
file about you, on request, and to allow you to correct it.  Sounds good.
But you can get such info. for free only after you have been denied credit on
the basis of it.  If you want to get the info. before you have a problem, it's
not too expensive, but you'll have quite a time trying to find all the private
organizations that maintain files about you.  If you make a correction, there
is no guarantee that it will be propagated to other files based on this one
and to other organizations that obtained the false data previously.  And if
you lost something, such as a mortgage, because of false data, tough luck.

The Privacy Act (1974) makes it easier for people to know about their files
(in government agencies and the private organizations with which they do
business).  But publication of the existence of records is done in the Federal
Register, which is not exactly handy.

Agencies are restricted from releasing personal data to another agency without
written permission of the person who provided the data, except for "routine"
purposes.  In 1979, the Office of Personnel Management released lots of its
data to other agencies.  What was the "routine" purpose?  "To protect the
legitimate interests of government."  Similar definitions can be used to
"justify" the collection of any sort of info.

Atkielski thinks that individuals in France can insist that a credit bureau
erase its file about them.  But if society is structured so that many of
the normal transactions of life depend on credit ratings, how real a "choice"
do you have about participating?

I wish much more of the burden were on the organizations that maintain (and,
in many cases, profit from) data banks.  I'd like to see organizations held
responsible for notifying individuals directly about the existence of files
about themselves; requesting permission from individuals every time info. is
released; guaranteeing that corrections will be made and propagated quickly;
assuming liability for losses based on false data; and so on.


Software warranties and Trade Practices in Australia

Lee Naish <lee@munmurra.mu.oz.au>
Wed, 12 Oct 88 13:43:36 EST
        [This was picked off the net in Australia, from "cbp",
        including and commenting on a letter from B L COOMBS.  Lee]

Software Warranties - The Truth

[The Trade Practices Commission recently sent the following letter to 2000
Australian computer companies.  Permission has been obtained from Ian Searle
of the TPC to reproduce this letter here.]

RISKS of EPROMS

George Sukenick <sukenick%ccnysci%cucard@nyu.edu>
Mon, 10 Oct 88 15:35:08 EDT
>  RISKS of EPROMS (Daniel Klein)
>The UV erasable EPROMS that are found in many smaller computers are also
>subject to failure when their picture is taken.  Yep, you read that correctly.

(Due to camera-shy EPROMS? :-))

Electronic flashes draw a lot of current in a short time.  The unshielded
system might have been crashing due to EMP rather than light interfering with
the EPROMs.  I guess that the test would then be to see what happens with
various combinations of covering the EPROMs' windows (they were open in the
machine?) and shielding the flash.
                    -george
