The RISKS Digest
Volume 15 Issue 20

Monday, 1st November 1993

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Breakdown in computerised voter support, Oslo
Reidar Conradi
Norwegian hackers fined
Øystein Gulbrandsen
White House distributes STONED 3 virus
Andrew Klossner
Police feedback
Graeme Jones
Taurus Project
E. Kelly
Magnetic Fields in Subway Cars
Bob Drzyzgula
Report on Software Product Liability
Charles Youman
Re: Dangers of Fibre Optic cable
John Gray
Re: CERT Reports and system breakins
Mike Raffety
Re: CERT (was "security incident handling")
A. Padgett Peterson
Re: Ethernet addresses as "port" ids
A. Padgett Peterson
3rd SEI Conference on Software Risk, 5-7 April 1994, Pittsburgh
Ellen Ayoob
Workshop on IT Assurance and Trustworthiness
Marshall D. Abrams
ISOC Symposium on Network and Distributed System Security
Danny Nessett
Info on RISKS (comp.risks)

Breakdown in computerised voter support, Oslo

<Reidar.Conradi@idt.unit.no>
Mon, 1 Nov 1993 23:33:25 +0100
Computer error in parliamentary election, Oslo, 13 Sept. 1993

The parliamentary election in Oslo on Monday 13 Sept. 1993 attempted to use a
new, computerised system for electorate management ("manntallskontroll" in
Norwegian).  The system was to keep track of electorate composition and
participation — but not of what was voted for.  With this system, the city
hoped to reduce the number of election-day employees from 3000 to 1500, and
had planned accordingly.

The electorate management system had been developed since 1988 by the
government's computing center (SDS) on a Bull mainframe connected to local
PCs.  The software had been tried out successfully in smaller elections both
in Norway and Denmark, but had not been exposed to a full-scale test.  It is
still unclear who was responsible for this lack of realistic testing — the
city of Oslo or SDS.

However, the electorate management system developed severe "breathing"
problems only half an hour after the election started at 09:00 on 13 Sept.
1993.  It effectively broke down, owing to a programming error in the local
communication controllers, which used a character-based instead of a
line-based transmission protocol.
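
   [The report gives no further detail of the SDS/Bull protocol, but the
   general failure mode is easy to illustrate.  The following minimal Python
   sketch is entirely hypothetical — the record format and function names are
   invented and bear no relation to the actual system — and simply contrasts
   a character-oriented link, which pays per-transmission overhead for every
   byte, with a line-oriented one, which pays it once per record:

      # Hypothetical illustration only -- not the actual SDS/Bull code.
      # One transmission (with its protocol overhead) per character,
      # versus one transmission per complete record.

      def send_character_based(record, transmit):
          for ch in record:          # a round trip for every single byte
              transmit(ch)

      def send_line_based(record, transmit):
          transmit(record)           # a single round trip per record

      if __name__ == "__main__":
          sent = []
          record = "voter-id=1234567890;precinct=042;status=voted\n"
          send_character_based(record, sent.append)
          print("character-based transmissions:", len(sent))  # one per byte
          sent.clear()
          send_line_based(record, sent.append)
          print("line-based transmissions:", len(sent))       # exactly one

   Under load from every polling station at once, that difference in
   per-record overhead is the kind of thing that can swamp a link.]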

The error was fixed at 14:00 on election day, but by then the municipal
election board had already decided to revert to manual backup procedures.
These worked reasonably well, all factors considered.  In fact, a group of
specially invited international observers from more "low-tech" countries was
very impressed by the city's capacity for manual improvisation.

However, in the ensuing chaos there were some irregularities, and 1200 votes
eventually had to be discarded or were lost.  The municipal election board in
Oslo therefore unanimously decided to recommend a reelection 2-3 months
later.  This recommendation was turned down by the newly elected parliament
at its own constitutive session — by a vote of 119 to 27 on 8 Oct. 1993.  The
reasoning was that the acknowledged election irregularities in Oslo were
judged to be insignificant for both the municipal and national election
outcomes.  On the other hand, a reelection would very likely have caused
changes among the elected deputies.

Digression: The ballots were counted optically, *after* all the votes had been
cast.  The ballots had uniquely punched holes along the paper borders,
resembling a punched card.  The associated computerised system (also from SDS)
worked without problems.  However, the right for constituencies (counties) to
use entirely electronic voting is still awaiting parliamentary approval ...


Norwegian hackers fined

<oysteing@taskon.no>
Fri, 29 Oct 93 16:45:23 -0100
In Baerum (a small, wealthy area just outside Oslo, the capital of Norway),
two hackers were accused of stealing telephone services and of several other
forms of fraud.  The elder (23) received an 18-day suspended sentence and a
2000 NKR ($300) fine for having, last year, used a phony name to sign out a
modem and assorted computer-related items from a transport company.  He told
the court that he was acting on behalf of another person he had got in touch
with on a BBS.  He was told to check a mailbox (a physical one) and pick up
the transport papers there.  He did so, met with the transport company,
identified himself mainly with the acquired papers, and signed out the goods.
He paid with a stolen Eurocard number.  He left some of the acquired items in
a public place, to be picked up by the other person involved, and kept some
for himself.  In court it also came out that he used to work at a gas
station, where he wrote down all the credit-card numbers used and mailed them
around the world.

The younger (16) had committed the same scam with the transport company a
couple of times for a "Calvin", whom he had met on a French BBS.  He was
fined 2000 NKR ($300).

Neither of the boys was convicted of telecom fraud, on technicalities.  The
court also found that the boys had been roaming international databases, but
did not consider this a computer crime, as they had not destroyed or modified
anything.  (I personally would like to see a burglar getting off the hook
because he did not find anything worth stealing!)

The defendants and their lawyers were very satisfied with the verdict.

Øystein Gulbrandsen   Taskon A/S


White House distributes STONED 3 virus

Andrew Klossner <andrew@frip.wv.tek.com>
Fri, 29 Oct 93 10:23:19 PDT
Heard on the Rush Limbaugh radio show of 10/29/93, not confirmed:

The White House distributed the 1300-page health care legislation proposal
widely on floppy disk.  Copies went to legislative staffs and to the press.

It seems that each disk was infected with the STONED 3 virus, which causes a
PC to display "Your PC is STONED.  Legalize marijuana."

The commentator drew the obvious ironies and puns.  (No doubt our esteemed
moderator will find non-obvious puns.)

  -=- Andrew Klossner  (andrew@frip.wv.tek.com)

          [People who live in grass browses shouldn't know STONED?  PGN]


Police feedback

<JONES_GE_3@prime1.central-lancashire.ac.uk>
Mon, 01 Nov 93 13:27:18
Bearing in mind the CERT-related issue of passing sensitive information on to
those who could really use it, I wondered whether the British police have an
official policy of handing over *all* such information to manufacturers of
technical products with exploitable weaknesses.  I have been told (don't
quote me though!) that, for instance, although forged-banknote detectors are
in use in even our local 20ft by 20ft store, the notes can simply be sprayed
with hairspray (one brand works particularly well!) to defeat the
ultraviolet-light check.  Similarly, the so-called high-security coded car
stereos, touted at a premium because thieves can't use them, can apparently
just be placed in the freezer for a while...

Makes me wonder what else clever(?) thieves could achieve if put in charge
of a DTI department to boost our exports by undermining foreign products!!
Only joking - I've just watched "Rising Sun"...

Graeme


Taurus Project

<EKELLY@dit.ie>
Mon, 1 Nov 93 15:17 GMT
In the October 1993 issue of the CACM Inside Risks column, I was surprised
that there was no mention of the London Stock Exchange Taurus project, which
was abandoned in March 1993 after a total expenditure of £400 million (about
US$600 million).  The British journal "Computing" carried several articles on
it.  The principal function of the Taurus project, as far as I know, was to
computerize the share-certificate settlement system.


Magnetic Fields in Subway Cars

Bob Drzyzgula <m1rcd00@frb.gov>
Fri, 22 Oct 93 12:47:44 -0400
Commuting on the Washington, D.C. Red Line this morning, I noticed, out of the
corner of my eye, something on the floor flash. As I looked closer, I
understood that what I saw was a paper clip. But I could have sworn that it
moved. A minute later, it did. It stood up on end, about 60 degrees from the
plane of the floor. It did this for about 5 seconds, and then fell to the
floor again. I watched this go on for about a half an hour... every time the
train would accelerate or decelerate, the paper clip would stand up at a rigid
60 degree angle until the train operator disengaged the (electric) drive
motors. It probably did this 50 times during my trip. Given that I had some
floppy disks in my pack, it made me kind of nervous. And knowing that I have
many times, under more crowded circumstances, stood just where that paper clip
was (almost exactly in the center of the rail car's floor) with my pack
resting on the floor, I was kind of dismayed. And here I always blamed the
cheap floppies my office buys.

So I guess... well, you've been warned.

Bob Drzyzgula rcd@frb.gov


Report on Software Product Liability

Charles Youman <youman@umiacs.UMD.EDU>
Sat, 30 Oct 93 21:50:03 -0400
I ran across a report that may be of interest to RISKS readers.  It is an SEI
report: Software Product Liability (CMU/SEI-93-TR-13) by Jody Armour (School
of Law, U. of Pittsburgh) and Watts S. Humphrey (SEI Fellow, Software
Engineering Institute).  It is available (PostScript, but without figures)
via anonymous FTP from ftp.sei.cmu.edu in directory pub/documents/93.reports
as file tr13.93.ps.  The abstract opens with a reference to an accident
involving a radiation machine; although the machine is not specifically
identified, it is likely the Therac-25 accident already extensively discussed
in RISKS, so I have omitted that passage.  The rest of the abstract follows:

     Software defects are rarely lethal and the number of injuries
     and deaths is now very small.  Software, however, is now the
     principle controlling element in many industrial and consumer
     products.  It is so pervasive that it is found in just about
     every product that is labeled "electronic."  Most companies
     are in the software business whether they know it or not.
     The question is whether their products could potentially
     cause damage and what their exposures would be if they did.

     While most executives are now concerned about product
     liability, software introduces a new dimension.  Software,
     particularly poor quality software, can cause products to do
     strange and even terrifying things.  Software bugs are
     erroneous instructions and, when computers encounter them,
     they do precisely what the defects instruct.  An error could
     cause a 0 to be read as a 1, an up control to be shut down,
     or, as with the radiation machine, a shield to be removed
     instead of inserted.  A software error could mean life or death.


Re: Dangers of Fibre Optic cable (Kenny, RISKS-15-19)

John Gray <grayjw@cs.aston.ac.uk>
Thu, 28 Oct 93 12:48:02 GMT
Robin Kenny mentions the danger of handling optical fibre in RISKS-15.19.  I
can believe the telecom-worker story, but that problem probably arose from
careless handling.  "Raw" optical fibre is too brittle for practical use
(partly because a fragment snapped from it would be dangerous), so *all*
fibres are given a flexible plastic coating before they are put onto drums.
Thus, what the people were handling was perfectly safe (you need to bend a
coated fibre very tightly before it snaps).  Provided the ends of the fibre
are not exposed, it is safe to handle.

In order to splice fibre ends, you have to strip off the plastic coating, and
then any pieces of fibre core which are cut off must be disposed of safely. I
suspect that the Telecom worker concerned was working with the bare, uncoated
fibre, which is brittle and dangerous.

Possibly the risk here is that of connecting unrelated scenarios?  I wouldn't
stop children from changing the channel on a TV just because a TV service
engineer had been electrocuted while servicing one....

John Gray


Re: CERT Reports and system breakins (Peterson, RISKS-15.17)

Mike Raffety <miker@il.us.swissbank.com>
Thu, 28 Oct 93 12:44:05 CDT
> It would be very difficult (well, nothing is impossible but this
> would be close) for software to forge an address using commercial
> equipment and collisions should be obvious.

Sorry, it's trivial; every Sun workstation can change its Ethernet address
(see the ifconfig command).  And in fact, any computer that can do DECnet must
be able to do this, since DECnet requires a direct relationship between the
Ethernet address and the DECnet address (dumb, but true).

> Given this number and a database to correlate the ethernet address to a
> particular system/location, it is possible to identify not only the user
> with conventional means, but also determine whether the access is from a
> known terminal.

Sorry, this isn't useful in real life.  The Ethernet address will not cross
routers; this "solution" would only be useful if both ends of the data flow
are on the same network segment (possibly bridged, but not routed).


Re: CERT (was "security incident handling") (Moran, RISKS-15.19)

A. Padgett Peterson <padgett@tccslr.dnet.mmc.com>
Thu, 28 Oct 93 09:06:20 -0400
I am not connected with CERT (other than knowing a number of the people
involved) and can understand Mr. Moran's position. It is true that generally
CERT is "input only" and, while I do not necessarily agree with their
position, it is arguable.

CERT does not and cannot provide solutions; it is not funded to do so.  It is
also CERT's policy not to discuss reported problems with anyone other than
the developer of the product in question, and to produce advisories only when
a fix for the problem is available.

Having tried before to convince manufacturers that a problem exists, IMHO
CERT plays a very necessary role in this matter, since CERT does not have to
establish its credentials.

I do have an advantage over CERT in that, as a hobbyist, I can create and
distribute a "fix" with no guarantees or warranties, something neither CERT
nor the manufacturer can do (one of the problems with a litigation-happy
society). Of course since the "bad guys" enjoy this freedom also, it is a
difficult matter. I can state categorically that it is *much* more difficult
to write an anti-virus program than a virus, much easier to hack/crack than to
protect in a manner inoffensive to legitimate users, but then I am egotistical
enough to accept those handicaps.

Back to the subject at hand, for example there is currently what I consider a
severe problem in Novell Netware 3.x and 4.x that will not be discussed openly
just yet since there is no fix. Novell has been contacted and hopefully a new
"feature" will soon appear - for Novell the fix should just require a simple
change to a single program (maybe 2).

My advantage is that for me this is an ethical choice and not a policy
or business dictate, a freedom which neither CERT nor the vendor enjoys.
I do know that many people within such organizations do not necessarily
agree with such decisions but have no choice in the matter.

Thus I do feel that CERT plays a very valuable role in the process of
computer security though it is not often visible as such.

    Padgett


Re: Ethernet addresses as "port" ids (Bob Rahe, RISKS-15.19)

A. Padgett Peterson <padgett@tccslr.dnet.mmc.com>
Thu, 28 Oct 93 09:33:06 -0400
>From: bob@hobbes.dtcc.edu (Bob Rahe)
>  Unfortunately, there are two problems here.  The first is probably the
>most damaging - the ethernet address is the address of the transmitting unit
>ON THAT ETHERNET segment.  If the unit is not on that segment and is sending
>via a router, for example, then the ethernet address will be that of the
>router's ethernet transmitter, and not the originator's physical address.

While Mr. Rahe is correct as far as a PING is concerned, the actual packets
*must* contain the actual hardware address of the sender in order for
the host/server to respond. The fact that the real address may be buried
a bit in the packet does not mean that it is not there.

Further, the little .COM program I mentioned runs on the client itself, so
routing has nothing to do with it; it was designed to be run as part of a
login script.

My concept was simple: if all "approved" addresses are known, unapproved
addresses are easy to spot. Further, even using the PING method, if I have
(and most do) just one or two bridges/routers leaving my reservation then
*anything* with their address header should be subject to closer scrutiny.
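
  [A rough illustration of that concept — all addresses and names below are
  invented, and the actual .COM program is not reproduced here — is that a
  registry of approved hardware addresses plus the handful of known
  bridge/router addresses reduces the check to a couple of set lookups:

      # Hypothetical sketch: classify an observed source hardware address
      # against a registry of approved stations and known gateways.
      APPROVED = {
          "08:00:20:12:34:56": "lab-workstation-1",
          "08:00:20:ab:cd:ef": "lab-workstation-2",
      }
      GATEWAYS = {"aa:00:04:00:11:22"}   # traffic via a bridge/router

      def classify(src_mac):
          if src_mac in APPROVED:
              return "approved station: " + APPROVED[src_mac]
          if src_mac in GATEWAYS:
              return "off-segment traffic: subject to closer scrutiny"
          return "unknown hardware address: flag for review"

      for mac in ("08:00:20:12:34:56", "aa:00:04:00:11:22",
                  "de:ad:be:ef:00:01"):
          print(mac, "->", classify(mac))

  The point is not that the address proves anything (see the previous
  message), only that it cheaply narrows down what deserves a closer look.]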

The problem here is not a matter of too little data but too much (as anyone
who has ever used an unfiltered "sniffer" knows). What I am suggesting is a
means of reducing that data to manageable proportions.

    Padgett


3rd SEI Conference on Software Risk, 5-7 April, 1994, Pittsburgh, PA

Ellen Ayoob <ea@SEI.CMU.EDU>
Mon, 01 Nov 93 15:04:53 EST
Sponsor: Software Engineering Institute
Contact: Julie Walker, SEI, Carnegie Mellon University,
     Pittsburgh, PA  15213-3890
     phone  (412)268-5051
     FAX    (412)268-5758
     e-mail  jaw@sei.cmu.edu

Theme:   Risk Management in Practice

Please call me if you have any questions or need more information.
Ellen M. Ayoob, (412) 268-6932


Workshop on IT Assurance and Trustworthiness

(Marshall D. Abrams) <abrams@smiley.mitre.org>
Mon, 01 Nov 93 16:21:46 EST
          *******  REQUEST FOR PARTICIPATION  *******
                   Invitational Workshop on
                  Information Technology (IT)
                 Assurance and Trustworthiness

                       March 21-23, 1994
                    Williamsburg, Virginia

                         Sponsored by:
            Aerospace Computer Security Associates
                        Co-sponsored by
             National Computer Systems Laboratory,
        National Institute of Standards and Technology

The purpose of this workshop is to provide input into the development of
policy guidance on determining the type and level of assurance appropriate in
a given environment.  Much of the existing guidance is rooted in the Yellow
books, which are based on computer and communications architectures of a prior
decade.  Technological changes such as local area networks, the worldwide
Internet, policy-enforcing applications, and public key cryptography, mandate
a review and revision of policy guidance on assurance and trustworthiness.

This invitational workshop is intended to identify the crucial issues and to
make recommendations.  The audience for the results includes those who deal
with information having sensitivity with respect to national security,
privacy, commercial value, integrity, and availability.  Potential
participants will submit a paper expressing a technical or policy position.
These position papers will be used to identify working sessions and to help
identify specific participants who should be invited.  The submission of the
papers and all communication surrounding this workshop will be handled
primarily through electronic means.  [...]

If you are interested in submitting a paper or just want additional
information, please contact Marshall Abrams, abrams@mitre.org.


ISOC Symposium on Network and Distributed System Security

Danny Nessett <nessett@ocfmail.ocf.llnl.gov>
Mon, 1 Nov 93 09:50:01 PST
Thursday, February 3  [Breaks etc. removed by PGN]

8:30 A.M.
  Opening Remarks
9:00 A.M.
  Session 1:  Electronic Mail Security, Chair: Steve Kent (BBN)
  Certified Electronic Mail, Alireza Bahreman (Bellcore) and Doug Tygar
    (Carnegie Mellon University), USA
  Privacy Enhanced Mail Modules for ELM, Selwyn Russell and Peter
    Craig, Queensland University of Technology, Australia
  Management of PEM Public Key Certificates Using X.500 Directory
    Service: Some Problems and Solutions, Terry Cheung, Lawrence
    Livermore National Laboratory, USA
11:00 A.M.
  Session 2: Panel: Public Key Infrastructure, Santosh Chokhani (MITRE),
    Michael Roe (Cambridge University), Richard Ankney (Fischer, Intl.)
                       Chair: Miles Smid (NIST)
2:00 P.M.
  Session 3:  Protocols, Chair: Tom Berson (Anagram Labs)
  Paving the Road to Network Security, or The Value of Small Cobblestones,
    H. Orman, S. O'Malley, R. Schroeppel, and D. Schwartz, University of
    Arizona, USA
  A Complete Secure Transport Service in the Internet, Francisco Jordan
    and Manuel Medina, Polytechnical University of Catalunya, Spain
3:30 P.M.
  Session 4:  Internet Firewall Design and Implementation
                       Chair: Jim Ellis (CERT)
  Inter-LAN Security and Trusted Routers, Pal Hoff, Norwegian Telecom
    Research, Norway
  Trusted to Untrusted Network Connectivity:  Motorola Authenticated
    Internet Access — MANIAC(TM), Bill Wied, Motorola, USA
  BAfirewall: A Modern Firewall Design, Ravi Ganesan, Bell Atlantic, USA
  WhiteHouse.Gov: Secure External Access and Service for the Executive
    Office of the President, Frederick Avolio and Marcus Ranum, Trusted
    Information Systems, USA
7:00 P.M.   Banquet

Friday, February 4

8:30 A.M.
  Session 5:  Panel: All Along the Watchtower: Experiences and Firefights
    Managing Internet Firewalls, Brian Boyle (Exxon Research), Brent
    Chapman (Great Circle Consulting), Bill Cheswick (AT&T Bell Labs),
    Allen Leibowitz (Warner-Lambert), Marcus Ranum (TIS)
                       Chair: Frederick Avolio (TIS)
10:30 A.M.
  Session 6:  Issues in Distributed System Security
                       Chair: Cliff Neuman (USC-ISI)
  CA-Browsing System — A Supporting Application for Global Security
    Services, Denis Trcek, Tomas Klobucar, Borka Jerman-Blazic, and Franc
    Bracun, Jozef Stefan Institute, Slovenia
  The X.509 Extended File System, Robert Smart, CSIRO Division of
    Information Technology, Australia
  Auditing in Distributed Systems, Shyh-Wei Luan (VDG, Inc.) and Robert
    Weisz (IBM Canada Laboratory), USA/Canada
1:30 P.M.
  Session 7:  Authentication, Chair: Dave Balenson (TIS)
  The S/KEY(tm) One-Time Password System, Neil Haller, Bellcore, USA
  A Technique for Remote Authentication, William Wulf, Alec Yasinsac,
    Katie Oliver, and Ramesh Peri, University of Virginia, USA
  Remote Kerberos Authentication for Distributed File Systems:  As
    Applied to a DCE DFS-to-NFS File System Translator, Thomas Mistretta
    and William Sommerfeld, Hewlett-Packard, USA
3:30 P.M.
  Session 8:  Panel:  IP Security Alternatives, K. Robert Glenn (NIST), Paul
    Lambert (Motorola), David Solo (BBN), James Zmuda (Hughes)
                       Chair: Russell Housley (Xerox)

PROGRAM CO-CHAIRS
  Russell Housley, Xerox Special Information Systems
  Robert Shirey, The MITRE Corporation

GENERAL CHAIR, Dan Nessett, Lawrence Livermore National Laboratory

PROGRAM COMMITTEE

Dave Balenson, Trusted Information Systems
Tom Berson, Anagram Laboratories
Matt Bishop, University of California, Davis
Ed Cain, U.S. Defense Information Systems Agency
Jim Ellis, CERT Coordination Center
Steve Kent, Bolt, Beranek and Newman
John Linn,  Geer Zolot Associates
Clifford Neuman, Information Sciences Institute
Michael Roe, Cambridge University
Robert Rosenthal, U.S. National Institute of Standards and Technology
Ravi Sandhu, George Mason University
Jeff Schiller, Massachusetts Institute of Technology
Peter Yee, U.S. National Aeronautics and Space Administration

Contact nessett@ocfmail.ocf.llnl.gov (Danny Nessett) for registration and
other information, or write ISOC Symposium, C/O Belinda Gish, L-68, Lawrence
Livermore National Laboratory, Livermore, CA 94550.
