The RISKS Digest
Volume 12 Issue 4

Tuesday, 9th July 1991

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Clip-Art Confusion Causes City Change
Christopher Davis
Hiding a face on television
Tim Smith
A RISKy night in Georgia
Robert E. Van Cleef
Risks of HR 1400 to modem community
Jim Thomas
Dissemination of confidential information
Hugh Cartwright
Review of "TERMINATOR 2: Judgment Day"
R. Mehlman
Re: Computers and Exporting
Vadim Antonov
Re: Formalism vs Experimentation
Vadim Antonov
Daniel Palumbo
Disk based crime plan
Rob Boudrie
Deleting vs. Shredding
Brad Templeton
Re: The Risks of Undelete and the Law
Steven Tepper
William Ricker
Info on RISKS (comp.risks)

Clip-Art Confusion Causes City Change

Christopher Davis <ckd@eff.org>
Tue, 9 Jul 91 18:53:28 -0400
From the _Boston Herald_, July 2, 1991, from "Paul Sullivan's CELEBRITY" page:

   "Mayor Dinkins' birthday invites sky-high error"

   The Big Apple must _really_ be turning sour because even an
   invitation to Mayor *David Dinkins'* upcoming birthday party
   features the skyline of _Boston_ instead of the skyline of New York City.

   [... it's for a July 10th fundraising party (my birthday too!) --ckd]

   Prominently displayed in the sketch is the famous Citgo sign in
   Kenmore Square, the old John Hancock Building, the new Hancock Tower
   and the Prudential Tower.

   [...mayor's spokeswoman passes buck to organizing committee]

   A committee spokeswoman blamed it on that old standby, computer
   error.  "When you punch in 'skyline,' that's what came out," she
   said.  [phrasing as original; either she's confused about past &
   present, or the quote got messed up.  No [sic] in original --ckd]

From my reading of that, it appears that there's some sort of "clip art"
package involved, probably with titles and/or keywords to select "appropriate"
pictures.

Is this a case of the RISKS of using a single-keyword search?  More the RISKS
of blindly accepting the results, in my opinion.  If the photo included with
the article is of the card (it's not clear whether it is or not), then someone
definitely should have looked twice.  (The Mets fans should have at least
recognized the Citgo sign from the '86 Series, and the Yankees fans don't need
to go that far back.  The RISKS of not watching enough baseball?)

Garbage in, Gospel out, once again.  This one's just another old story
to RISKS readers...

Christopher Davis <ckd@eff.org>, System Manager & Postmaster
Electronic Frontier Foundation, Cambridge, MA


Hiding a face on television

<ts@cup.portal.com>
Tue, 9 Jul 91 00:40:41 PDT
Being too lazy to change the channel, I'm being subjected to the pseudonews
program _Hardcopy_.  As I type this, I'm watching a little boy testify in court
about the possible murder of his mother.

To protect the child from being recognized, they are doing something to the
video of his face so that it consists of several large squares that change as
he moves.  This seems to be the standard way to hide things on TV now.

Is this safe?  It seems that there should be enough information here to
reconstruct the hidden face (or other body parts — they seem to be using this
process to cover up nudity now too).
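
To see why it might not be, here is a minimal sketch of what the mosaic
appears to do, assuming a plain block average over a grayscale frame (the
block size and array representation are my assumptions, not anything known
about the show's actual equipment):

    # Minimal sketch of TV-style "mosaic" hiding, assuming each block is
    # simply replaced by its average brightness.  The frame is a 2-D
    # grayscale NumPy array; the block size of 16 is an arbitrary guess.
    import numpy as np

    def mosaic(frame, block=16):
        out = frame.astype(float)
        h, w = frame.shape
        for y in range(0, h, block):
            for x in range(0, w, block):
                out[y:y+block, x:x+block] = frame[y:y+block, x:x+block].mean()
        return out

    # Each mosaic frame is a coarse average of the true face, but the block
    # grid is fixed to the screen while the face moves.  Two frames in which
    # the face has shifted by half a block therefore average *different*
    # pixel groups, and a stack of such frames yields many independent
    # linear constraints on the original pixels - raw material for
    # reconstructing what the squares were meant to hide.
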
                                                   Tim Smith


A RISKy night in Georgia

Robert E. Van Cleef <vancleef@nas.nasa.gov>
Tue, 9 Jul 91 12:32:23 -0700
Computer Crime (Information Weekly, July 8, 1991, page 6)

A Computer Systems Protection Act went into effect last week in Georgia. The
Act provides the same punishment for computer thievery as for other types of
theft crimes. The bill calls for prison terms of up to 15 years for
"computer-assisted theft, trespass, invasion of privacy, and forgery." Under
the Act, stealing someone's computer password in Georgia can get you a $5,000
fine or one year behind bars.


Risks of HR 1400 to modem community

<TK0JUT1@MVS.CSO.NIU.EDU>
Wed, 03 Jul 91 21:13 CDT
What are the risks of Bush's proposed crime bill?  HR 1400, the House version
of the "Comprehensive Violent Crime Control Act of 1991," contains a provision
that should concern anyone worried about the potential erosion of
Constitutional protections of privacy and association.  The current version
would also revise 18 USC (sect) 2709, which authorizes the FBI to obtain
"subscriber information and toll billing records information or electronic
communication transactional records" from any "wire or electronic
communication service provider."

The subject of the request need not be the person under investigation; the
request can be made of anybody who is perceived to possess information
relevant to an investigation.  The language of existing law is sufficiently
vague that
it seems to include (or could be interpreted by zealous agents to include)
any private documents that one may have on a university mainframe that
might contain "transactional information" (a broad term with potential for
widest possible definition).  This could be construed to mean that if somebody
on the Internet received private e-mail from the target of an FBI
investigation, then the recipient could be subject to having a variety of
private material turned over to the FBI.  Current language in HR 1400 also
expands the definition of acts subject to investigation by broadening the
scope of counter-intelligence.

This is already the current law. The proposed revision adds:
      "(c) PENALTY FOR DISCLOSURE.-No wire or electronic communication
    service provider, or officer, employee, or agent thereof, shall disclose
    to any person that the Federal Bureau of Investigation has sought or
    obtained access to information under this section. A knowing violation of
    this section is punishable as a class A misdemeanor."
    (From HR 1400, Sec. 743. COUNTERINTELLIGENCE ACCESS TO TELEPHONE RECORDS)

Not only is this provision a threat in itself, but there is neither a
reasonable length of time after which the fact of the request may be
disclosed, nor any exception for disclosing the information to another (the
information shall not be disclosed to **ANY PERSON**), including priests or
doctors.

Given the rumors of "national security implications" that arose in some of
the Secret Service raids last year, it takes little imagination to see the
"act first, apologize later (if ever)" mentality in action, snooping through
records and vague "transactional information."  The proposed wording
compounds the threat by adding a level of secrecy to investigative power.

Jim Thomas


Dissemination of confidential information

<HCART@vax.oxford.ac.uk>
Tue, 9 JUL 91 09:13:52 BST
 Klaus Brunnstein <brunnstein@rz.informatik.uni-hamburg.dbp.de>
in RISKS 12.03 argues quite reasonably for a Code of "Discourse Ethics"
and comments on

  "... the trust which I assume my communication partners follow..."

While his proposed Code would meet a real need, I am afraid Klaus's
own position is weakened when he writes:

> I personally just received Bill Gates memo on Microsoft's
> performance and future problems; .... I assume
> that Bill Gates will not be glad that I had it.

  Probably not.

> I am highly sure that the
> community in which I received this information is trustable, and they and I
> will not uncover any details...

  Except that this "trustable community" is already circulating what they
know to be confidential information to Klaus, and, presumably, to others.

  Doubtless it was inept of Microsoft to allow their e-mail to be intercepted,
but if the purpose of those publicising the interception is to expose flaws in
the e-mail system, surely the right course is to deal with Microsoft, not to
disseminate the information more widely.  Those, like Klaus, who work in
security have a justifiable interest in security holes uncovered by others.
However, circulation of the actual information pulled through these holes in no
way helps to seal them.  Indeed, it must give rise to serious doubts about the
motives of those who retransmit mail to which they have no legitimate access.

Hugh Cartwright.  Physical Chemistry, Oxford University, UK.


Review of "TERMINATOR 2: Judgment Day"

<rmehlman@grumpy.span.nasa.gov>
Sun, 7 Jul 91 22:44:23 PDT
The computers have no BUGS.
                                    [You were expecting, maybe, realism?  PGN]


Re: Computers and Exporting

Vadim Antonov <avg@hq.demos.su>
Sat, 15 Jun 91 17:15:46 +0300 (MSD)
>Take for instance the DES export restriction. Sources for DES have been
>posted on Usenet.

The source code and formal descriptions were publicly available in the USSR
long before that posting.  I first saw them around 1982, as a student hacking
on some Unix sources.  Isn't it stupid to keep insisting on export
restrictions for such well-known technology?

(I remember our military instructors (military education was mandatory in the
USSR, sigh) describing the tactical characteristics of Soviet aircraft by
citing American intelligence sources!  Inside the USSR, of course, these data
were "secret"!  Familiar scenario, isn't it?)

Vadim Antonov, DEMOS, Moscow, USSR


Re: Formalism vs Experimentation

Vadim Antonov <avg@hq.demos.su>
Sat, 15 Jun 91 19:34:11 +0300 (MSD)
I tend to agree that formal methods are useful for practical programming,
but it's silly to limit CS education to (or focus it on) formal methods.
Maybe I'm a heretic, but there is no such thing as a "programmer".
Programming is always a marginal discipline - any real program deals with

a) hardware (I wonder why programmers are so ignorant of hardware design
methods; say, the modern buzzword "object-oriented" is nothing more than the
sixty-year-old method of modular hardware design - see the sketch after this
list.  Anyone with experience in digital hardware design has no trouble with
parallel programming, etc.  Understanding HOW hardware works is necessary for
any good programming.)

b) humans (I dunno why, but programmers often tend to design really
anti-human user interfaces.  Psychology is the thing most programmers need to
be familiar with.  I also think most programmers should at least have some
sense of taste; I am tired of looking at the ocean of tasteless Messy-Dossy
bells and whistles.  Good English (or any other language :-) is not the least
part of good documentation.)

c) mathematics (Absolutely necessary in numerous well-investigated fields
like numeric computation and syntax analysis AND useful as a means to improve
analytical thinking.)

d) poligraphy (Text-processing is the daily routine of most programmers).
                                                        [Yes, NOT POLYGRAPHY!]

e) business and management (Teamwork, planning, and market estimation are
necessary to make something successful - who wants to spend his life creating
programs nobody wants to pay for?  Besides, management is often the second
profession of ex-programmers.)

f) specific knowledge of the application domain.  (If you're creating a
program for a robot controlling a machine, you need to know some mechanics,
don't you?)

*) After all, anybody could name a dozen more things useful in programming.
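
To illustrate point (a), here is a toy sketch (hypothetical, my own): a
software object is just a packaged "chip" - a fixed set of external pins with
the internal wiring hidden:

    # Toy illustration of point (a): an object is a packaged "chip" with
    # fixed external pins and hidden internal wiring, like any 4-bit
    # counter on a data sheet.  (Hypothetical example, not from anywhere.)
    class Counter4Bit:
        def __init__(self):
            self._q = 0               # internal state, invisible outside

        def clock(self):              # the CLK pin: advance one count
            self._q = (self._q + 1) % 16

        def reset(self):              # the RST pin
            self._q = 0

        def q(self):                  # the Q output pins
            return self._q

    c = Counter4Bit()
    for _ in range(5):
        c.clock()
    print(c.q())   # 5 - users see only the pins, never the wiring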

As you can see, the "ideal" programmer should be almost universally educated,
and modern education is overly concentrated on the formal side of
programming.  Someone mentioned the Soviet system of education; well, it is
really mathematically based and produces good puzzle-solvers.  "Educational"
programming suffers from the puzzle nature of the problems students are
accustomed to solving.  Real problems are different.  Even systems programmers
very seldom need to invent algorithmic tricks.  The best solution is the
simplest one, not the most tricky and "efficient".

The formalized CS education we have in the Soviet Union yields really awful
results - for example, only about 2-3% of the graduates of the CS Dept. of
Moscow University (not the worst one, be sure) are capable of writing real
programs - and those students who CAN program are all self-educated hackers
who, as a rule, have had terrible conflicts with the educational authorities.
Some of the most talented programmers here are still students in their 30s.
Thus practice argues against Dijkstra.

Let me state that programming is not the science of coding but the art of
finding solutions to non-formalized problems and of expressing those
solutions in an explicit and clear way.

   [Paragraph on gender-related matters deleted.  PGN]

Vadim Antonov, DEMOS, Moscow, USSR


Re: Formalism vs. Experimentation

Daniel Palumbo <Daniel_Palumbo.SVMB@air25.larc.nasa.gov>
28 Jun 91 13:29:45
Having just concluded an effort to experimentally verify clock
synchronization theory, I was drawn to this recent RISKS discussion.
However, I found very little substance relating to what I thought was at
issue here.

Our group at NASA Langley is concerned with validating/verifying digital flight
control systems on aircraft.  One of our battle cries has been that testing
(experimentation) is inadequate to demonstrate that a system is 'bet your life'
correct.  Formal methods (which in the U.S. means proof of correctness) are
championed as an alternative.

During the course of my work with clock synchronization theory, I came to
believe that experimentation is an absolutely vital part of formal methods.
Experimentation can even be considered a formal method if done in a rigorous,
scientific manner.  My more formally oriented co-workers and I have debated
this issue with the general consensus (from my perspective) that
experimentation is needed any time a design bumps up against the real world.
Some have even suggested that experimentation is useful in establishing that a
purely logical relationship is not obviously untrue before a proof is
attempted.
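
To illustrate the kind of experiment I mean, here is a minimal sketch (my
illustration only; the drift and read-error figures are assumptions, not our
testbed's): simulate drifting clocks running a simple averaging
resynchronization, and measure the worst skew actually observed.

    # Minimal experiment sketch: N clocks drift, then each round every
    # clock resets itself to the average of its (noisy) readings of all
    # clocks.  All parameters are illustrative assumptions.
    import random

    N, ROUNDS, PERIOD = 4, 1000, 1.0
    DRIFT = 1e-4        # relative rate-error bound per round (assumed)
    READ_ERR = 1e-5     # error in reading a remote clock (assumed)

    clocks = [0.0] * N
    worst = 0.0
    for _ in range(ROUNDS):
        # each clock advances at its own, slightly wrong, rate
        clocks = [c + PERIOD * (1 + random.uniform(-DRIFT, DRIFT))
                  for c in clocks]
        worst = max(worst, max(clocks) - min(clocks))
        # each clock resets itself to the mean of its noisy readings
        clocks = [sum(c + random.gauss(0, READ_ERR) for c in clocks) / N
                  for _ in range(N)]
    print("worst observed skew:", worst)

Running this against the bound a proof predicts is exactly where the design
bumps up against the real world: the read-error model, not the algebra, is
what the experiment tests.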

The question which remains is, "What is the best recipe for mixing formal
methods and experimentation to yield the most confidence in a design at the
lowest cost?"

  [dlp@air12.larc.nasa.gov]


Disk based crime plan

Rob Boudrie <rboudrie@encore.com>
Fri, 5 Jul 91 14:01:35 EDT
There is another "dark side" to this business of using "disk data" of alleged
"crime plans" as evidence against a suspect.  Unlike typewriting (traceable to
a machine); photocopies (also traceable) and handwriting, the digital nature of
computer data lends itself to tampering.  There is now a virtually detection
proof mechanism whereby an overzealous cop can embellish the evidence if the
case is weak but (s)he "knows" that the suspect is guilty and wants to prove
it.  To those who say "they would never do that", I would point out that two
Boston police officers were recently convicted of perjury for fabricating an
informant to get a search warrant for drugs.


Deleting vs. Shredding

Brad Templeton <brad@looking.on.ca>
Wed, 3 Jul 91 15:17:12 EDT
Is there an expectation of privacy with a shredded document?

After all, it seems to me that a tool to scan in and paste together slices
from a single-slice shredder (as opposed to the multi-slice ones that just
leave little bits of chaff) would not be hard to create.  I fully expect that
the intelligence types have already built such tools, although it is unlikely
that they would be released to the public.
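
The core of such a tool is only a few lines.  Here is a sketch, assuming each
strip has already been scanned into a grayscale array (a real tool would add
skew correction and global, rather than greedy, matching):

    # Sketch of strip reassembly.  Each scanned strip is assumed to be a
    # 2-D grayscale NumPy array (rows x strip_width), all the same height.
    import numpy as np

    def edge_cost(a, b):
        # dissimilarity between strip a's right edge and strip b's left edge
        return np.abs(a[:, -1].astype(int) - b[:, 0].astype(int)).sum()

    def reassemble(strips):
        strips = list(strips)
        page = [strips.pop(0)]             # seed with an arbitrary strip
        while strips:
            costs = [edge_cost(page[-1], s) for s in strips]
            page.append(strips.pop(int(np.argmin(costs))))
        return np.hstack(page)             # the re-combined page image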

I wonder how the courts would react to evidence that was a recombined
shredded document.


Re: The Risks of Undelete and the Law (RISKS-12.02)

Steven Tepper <greep@speech.sri.com>
Wed, 3 Jul 91 12:59:46 PDT
When I worked for the government (a number of years ago) I was told that to
dispose of a magnetic tape containing classified information you either had to
write over the tape twenty times or burn it.  -greep
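
Concretely, the twenty-pass rule amounts to something like the following
sketch, applied here to an ordinary file rather than a tape device (the pass
count follows the anecdote above; the alternating fill patterns are my
assumption, and remapped or spare blocks would escape any such overwrite):

    # Sketch of a twenty-pass overwrite of a file; a tape would be written
    # through its device file in much the same way.
    import os

    def overwrite(path, passes=20):
        size = os.path.getsize(path)
        with open(path, "r+b") as f:
            for i in range(passes):
                fill = bytes([0xFF if i % 2 == 0 else 0x00]) * 4096
                f.seek(0)
                done = 0
                while done < size:
                    n = min(4096, size - done)
                    f.write(fill[:n])
                    done += n
                f.flush()
                os.fsync(f.fileno())   # force each pass onto the medium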


Re: The Risks of Undelete and the Law

William Ricker <wdr@wang.com>
Mon, 8 Jul 91 19:51:43 EDT
In comp.risks Ron Dippold writes:  [...]
>The court soundly, and IMO correctly, rejected this claim, analogizing the
>retrieval of the deleted file data (by an FBI agent who was a computer expert)
>to deciphering a coded message in a diary, after the diary was obtained under
>a valid subpoena.

Since you say "IMO" [In My Opinion] not "IMHO" [~ Humble ~], do you mean to
imply you are an attorney specializing in Constitutional Appellate matters, or a
professor of same?

Do you say "correctly" because
 A* there was sufficient other evidence to convict, and thus he
    shouldn't be let free on a technicality?
 B* the police had specifically listed the computer on the search warrant,
    and thus the "expectation of privacy" was breached legally under
    warrant, after due-process consideration of probable cause in the
    warrant hearing?
 C* the new supreme court would gladly shed some light into the penumbra
    of the 9th amendment and the right to privacy, anywhere "stare
    decisis"*1* doesn't apply as well as some places where it does, and
    thus this probably won't be overturned on appeal?
       [*1* "let the decision (precedent) stand"]
 D* the _Katz_ "expectation of privacy" should be based on what a
    technically competent expert witness would expect, not what a common
    user would expect?

This particular case does sound, IMHO, as if "harmless error" could be the
finding on the privacy issue, for the first and second reasons (A&B).  The
allusion to locked/encrypted diaries seized under warrant as precedent makes me
suspect B.  I would be disheartened, however, if the finding were that the
technical accuracy of the user's expectations was actually material to their
coverage by _Katz_.

Spurious claims to privacy (e.g., the very recent case of a paper bag of
drugs in a car, 89-1690, California v. Acevedo, where the accused granted
permission to search the car, but claimed that no permission to open the bag
was implied and that the warrantless search of the bag violated the 4th) are
to be rejected, IMHO.

However, again IMHO, where even guilty parties really did believe they had
privacy, such as the instant Pennsylvania felony kidnap & murder case and
Poindexter and North deleting their incriminating ContraGate PROFS messages
only to have the IBM mainframe backup tapes read by the House/Senate
committees, the 4th/9th penumbra should grant them *criminal* evidentiary
protection commensurate with their expected privacy.  The Senate of course has
the right to read government property, and civil/commercial litigation has much
looser rules of evidence, where I would expect backups & restorals to be
admissible.
   (The ContraGate tapes may have been subpoenaed specifically, in which case
the Diary Under Warrant might apply, and void the expectation; I would have to
read (a) the diary precedent and (b) the subpoena to have any confidence in an
opinion.)

I wonder if the appellant convict briefed any surveys on how many users read
their manuals or know about UNDELETE utilities?
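
(For readers who haven't met UNDELETE: on DOS's FAT filesystem, DEL merely
overwrites the first byte of a file's directory entry with a marker and
leaves the data clusters untouched until they are reused - presumably what
made the recovery in such cases possible.  A simplified sketch of the
mechanism, with a hypothetical single entry:)

    # Why DOS deletion is recoverable: DEL marks the directory entry and
    # leaves the file's clusters on disk until they are reused.  This is a
    # simplified, single-entry model of the real on-disk structures.
    DELETED = 0xE5                     # FAT's "deleted" marker byte

    def dos_delete(entry: bytearray):
        entry[0] = DELETED             # only the name's first byte changes

    def dos_undelete(entry: bytearray, first_char: str):
        # UNDELETE must ask the user for the lost first character;
        # everything else is still sitting on the disk
        if entry[0] == DELETED:
            entry[0] = ord(first_char)

    entry = bytearray(b"CRIME   TXT".ljust(32, b"\0"))   # 32-byte entry
    dos_delete(entry)
    dos_undelete(entry, "C")
    assert entry[:11] == b"CRIME   TXT"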

I wish the convict could appeal this one to the old Warren court; I'd like to
know whether Douglas would have found this within his Penumbra, as I think he
might have (depending on the facts).  The Rehnquist court probably won't even
look at it, except as a vehicle to chip away at the penumbra — which would be
patently abusive, since the case can easily be disposed of as harmless error:
the physical evidence was enough to convict, so the original poster tells us.

[Caveat: I'm not an attorney, let alone one specializing in constitutional
issues.  Hence IM*H*O above. But I did take two classes on it in college and
have tried to keep up on recent opinions since; opinions.supreme-court from
UUNET helps there greatly, especially the *.S syllabus files.]

/s/ Bill Ricker                wdr@wang.wang.com
