The RISKS Digest
Volume 19 Issue 55

Monday, 19th January 1998

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Navy discharge case based on illegally gained AOL data?
David Sobel via PGN
"Dirty Secrets" of chip industry
Edupage
Maine Emergency Broadcast System lost power
Jason Yanowitz
Yet another risk of *not* trusting the technology
Rob Slade
TCAS near-miss
Steve Bellovin
Scares in the air blamed on hand-held gadgets
Ben Low
Design flaw in Microsoft Word?
Nick Brown
ActiveX controls — You just can't say no!
Richard M. Smith
Risks of anti-spam measures
Nick Brown
A thought on backup and recovery after Y2K
PGN
Re: Easter Eggs in Commercial Software
Larry Werring
USENIX SECURITY SYMPOSIUM reminder
Cynthia Deno
Quality Week '98, Download Call for Participation
Info on RISKS (comp.risks)

Navy discharge case based on illegally gained AOL data?

"Peter G. Neumann" <neumann@csl.sri.com>
Fri, 16 Jan 98 12:02:34 PST
Excerpted from a 15 Jan 1998 press release of the
ELECTRONIC PRIVACY INFORMATION CENTER <http://www.epic.org>,
by David L. Sobel, EPIC Legal Counsel, (202) 544-9240

       SAILOR SUES NAVY FOR ONLINE PRIVACY VIOLATION;
       GOVERNMENT AGREES TO DELAY PENDING DISCHARGE

A highly decorated Navy Senior Chief Petty Officer today filed suit
challenging a pending discharge based upon information the Navy illegally
obtained from America Online.  The lawsuit, filed in U.S. District Court in
Washington, charges that Naval investigators violated the federal Electronic
Communications Privacy Act (ECPA) when they requested and received
confidential subscriber information from AOL, the nation's largest online
service.  [...]

Navy officials had ordered the discharge of the sailor, Timothy R. McVeigh
(no relation to the convicted Oklahoma City bomber), effective tomorrow
morning (Eastern time) on the ground that McVeigh violated the military's
"Don't Ask, Don't Tell" policy on homosexuality.  The Navy's proposed action
is based entirely upon information obtained from AOL linking the sailor to a
"screen name" on the system in which the user's marital status was listed as
"gay."

The information was received from AOL in clear violation of ECPA, which
prohibits the government from obtaining "information pertaining to a
subscriber" without a court order or subpoena.  In addition to the privacy
protections contained in ECPA, AOL's contractual "Terms of Service" prohibit
the company from disclosing such information to *any* third party "unless
required to do so by law or legal process."

According to EPIC Legal Counsel David L. Sobel, McVeigh's lawsuit is the
first case to challenge governmental access to sensitive subscriber
information maintained by an online service.  "This case is an important
test of federal privacy law," Sobel said.  "It will determine whether
government agents can violate the law with impunity, or whether they will be
held accountable for illegal conduct in cyberspace."  He noted that the
incident also raises serious questions concerning the adequacy of
contractual privacy protections like those contained in the AOL subscriber
agreement.

In a letter sent to Navy Secretary John Dalton yesterday, the Electronic
Privacy Information Center urged a postponement of McVeigh's discharge
pending an investigation of the Navy's conduct.  EPIC noted that, "Any other
result would make a mockery of federal privacy law and subject the American
people to intrusive and unlawful governmental surveillance."  [...]

  [Also noted by Jim Griffith <griffith@netcom.com> from a CNN report.
  See also an article by Noah Robischon in The Netly News
     (http://cgi.pathfinder.com/netly/opinion/0,1042,1692,00.html).
  The postponement was granted.  PGN]


"Dirty Secrets" of chip industry

Edupage Editors <educom@educom.unc.edu>
Tue, 13 Jan 1998 11:48:05 -0500
A six-month investigation by *USA Today* has concluded that the microchip
industry commonly endangers workers, many of them women and minorities, by
failing to fully train them about the hazardous, sometimes deadly, chemicals
with which they work. It also charges the industry with various other
infractions of environmental health regulations.  (*USA Today*, 13 Jan 1998;
Edupage, 13 Jan 1998)


Maine Emergency Broadcast System lost power

Jason Yanowitz <yanowitz@Taz.nineCo.com>
Thu, 15 Jan 1998 11:27:14 -0500
According to a recent AP wire report, the Emergency Alert System in Maine
(which should have warned residents of the oncoming ice storm) failed
because a radio station lost power.  The Maine Public Radio Station in
Bangor, Maine, that is responsible for signaling TV and radio stations to
send out an emergency broadcast message lost power, and the signal never
went out.  The system was installed just last year (for a surprisingly small
amount — $6000), but apparently nobody bothered to think about powering the
system in an emergency (the time when it would most need power).  The risks
seem obvious.

Jason


Yet another risk of *not* trusting the technology

"Rob Slade" <Rob.Slade@sprint.ca>
Wed, 14 Jan 1998 10:34:55 -0800
A little background here.  Vancouver, despite being in Canada, is not snowed
in for six months of the year.  In fact, snowfall is a rarity here, and
people aren't prepared for it.  So, while parts of Maine and New York, much
of eastern Ontario, and almost all of Quebec are covered by a sheet of ice,
yesterday (Tuesday, January 13) Vancouver had its own "Storm of the
Century": a whopping 10 cm (0.026 fathoms, for the metrically challenged) of
snow.

The ten-year-old Skytrain forms the backbone of the Vancouver area transit
system.  Skytrain is completely automated: there are no drivers on the
trains, and usually no attendants on the platforms.  BC Transit security
people do ride the odd train, or gang up for the occasional ticket check.
The system even has sensors to detect people on the track, although a couple
of suicides have managed to jump in front of the train at the last minute so
that it doesn't have time to stop or slow down.

The Skytrain has had problems with snow in the past.  The first year, snow
built up on the power strips, which are mounted vertically alongside the
track.  (A small roof took care of that.)  Later, it was found that enough
snow would trigger the detectors and report people on the track.  (I believe
that is now covered by having small heaters melt the snow on the detector
plates.)  So when the Skytrain was packed, yesterday, everyone assumed that
there was another technical problem.

BC Transit, however, was not reporting a problem.  In fact, the BC Transit
spokesman, reporting to one of the radio stations about the status of the
system, let slip the real reason.  Trains were operating normally, but were
being held back, while the authorities scrambled to find staff to put on the
trains.

Nobody has said *why* the staff were needed.  They are not required to drive
the trains.  In normal operation (and, except for the massive crush of
people trying to get on, operations were normal) they do not require any
attendants.  If the system has a problem, usually it affects the line as a
whole, and not an isolated train.  In any case, if anything *did* happen to
a train, it isn't likely that a single attendant (on a four car train) could
do anything about it.  If a train broke down between stations, the system
would report where it was, and help could be dispatched to the spot.

The end result was that, with demand for transit greater than normal, as
people left cars at home, only half the trains were running.  The others
were held back since staff couldn't be found to put on them.

Interestingly, the media did not pick up on this.  Late in the day, I was
waiting for my wife at one of the stations, when I noticed a friend who
works as a news cameraman for a local station.  He was preparing for a live
feed to the TV station: a "talking head" was going to be reporting on the
congested transit situation.  When I started discussing the cause with him,
he was astonished.  The TV station news research staff, in the absence of an
official pronouncement from Transit, had simply assumed technical
difficulties.  No one had looked at the staff issue at all.

rslade@vcn.bc.ca     rslade@sprint.ca     slade@freenet.victoria.bc.ca
virus, book info at http://www.freenet.victoria.bc.ca/techrev/rms.html


TCAS near-miss

Steve Bellovin <smb@research.att.com>
Fri, 16 Jan 1998 07:24:39 -0500
TCAS — the Traffic Collision Avoidance System — has almost caused an
accident in southern California.  A Southwest Airlines plane was flying over
Van Nuys airport on approach to Burbank Airport.  Someone on the ground
switched on a transponder; the TCAS system on the plane overhead decided
that an aircraft had suddenly appeared 3000 feet below it, and suggested that the
pilot climb.  But this brought it into the path of a small Cessna that
really was in the air.


Scares in the air blamed on hand-held gadgets

Ben Low <ben@snrc.uow.edu.au>
Fri, 16 Jan 1998 10:28:06 +1100
Today's Sydney Morning Herald discusses a report from the (Australian)
Bureau of Air Safety linking numerous incidents of "in-flight interference"
to various electronic devices (cd players, laptops, camcorders, etc).
  [http://www.smh.com.au/daily/content/980116/pageone/pageone7.html].
Apparently some 30 cases have been recorded over the past two years;
however, the BAS stated that no "connectivity" has been established between
the interference events and electronic devices.

The article notes that captains now have the power to ban the use of such
devices if they believe air safety is at issue. The fine for ignoring these
directions is $A2,750. However, this brings to mind a recent flight I was
on, where my colleague was asked to turn off a palmtop PC. This particular
device doesn't have an "off" button, merely a sleep mode where some (not
much?) electronic activity is still present. So, how "off" is "off"? Was my
colleague at risk of a hefty fine? Will the day come when we have to present
all batteries to the cabin crew?

ben@snrc.uow.edu.au

  [The same article was also noted by pod@ms.com (Paul O'Donnell).  PGN]


Design flaw in Microsoft Word?

BROWN Nick <Nick.BROWN@coe.fr>
Mon, 19 Jan 1998 08:52:14 +0100
  [Although specific bugs are not always appropriate for RISKS,
  this one illustrates a common problem.  PGN]

Last week we finally tracked down a bug which was causing users to get an
error message ("file permission error") when saving Microsoft Word 97
documents under Windows NT.  The error could occur on the user's local hard
disk or even a diskette, so we knew that "file permission" wasn't the
problem.

It turns out that the problem is caused by a combination of Word's
multi-stage save procedure (essentially, this is "save to temp file",
"rename old file", "rename temp file to new file") and Microsoft Outlook's
file management functions.  If Outlook has a window open on the directory
containing the Word file being saved, it can open the saved temporary file
before Word has a chance to rename it (Outlook displays information from the
header of Word documents, such as "Author" and "Keywords", which can only be
obtained by opening the file); if Word then tries to rename it while Outlook
has it open to get this information, the error occurs.
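
For readers who like to see the race spelled out, here is a minimal sketch
of the save sequence described above (illustrative only, not Word's actual
code; the file names are invented):

  # Sketch of a "save to temp, rename old, rename temp" sequence.  On
  # Windows NT, step 3 fails with a sharing violation if another process
  # has the temporary file open at that moment.
  import os

  def multi_stage_save(path, new_contents):
      tmp = path + ".tmp"            # hypothetical temporary-file name
      old = path + ".old"
      with open(tmp, "w") as f:      # 1. save to temp file
          f.write(new_contents)
      os.rename(path, old)           # 2. rename old file out of the way
      # If another process (say, a mail client reading the document
      # header) has opened `tmp` by now, the next rename fails and is
      # reported to the user as a vague "file permission error".
      os.rename(tmp, path)           # 3. rename temp file to new file
      os.remove(old)

The sequence is safe only as long as nothing else can touch the temporary
file between the steps.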

The problem does not occur under Windows 95, presumably because Word does
not yield the CPU until all of the file operations are complete.  But under
Windows NT, all bets are off, and Outlook can wake up at any time and
preempt Word.

The Risk is a common one: Word makes assumptions ("I won't get preempted")
that aren't valid in a new operating environment for which it wasn't
designed.

Nick Brown, Strasbourg, France.


ActiveX controls — You just can't say no!

"Richard M. Smith" <rms@pharlap.com>
Tue, 13 Jan 1998 22:29:12 -0500
I have had a glimpse of the ActiveX future and it is not a pretty picture.
The MSNBC Web site (www.msnbc.com) uses an ActiveX control called the MSNBC
NewsBrowser.  Because of this control, going to this site in Internet
Explorer is hell.  The problem is that the NewsBrowser control is present on
almost every HTML page of the site.  If you choose not to install the
NewsBrowser control on your PC, Internet Explorer will
redownload the control and ask you to accept it every time you go to a new
page on the site!  On a 28.8K modem that means a page takes a minimum of 1
minute to load and IE keeps bugging you to take the control.  The cynical
side of me wonders if Microsoft isn't trying to force everyone to accept
ActiveX controls whether they like them or not.

The problem here is a design flaw in the ActiveX Authenticode system.  It
shouldn't keep asking over and over again to accept a control that has been
rejected in the past.  Worse yet, it shouldn't keep downloading rejected
controls.  It's just plain silly.
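
What one would expect is for the browser to remember the verdict, roughly
along the following lines (a hypothetical sketch, not IE's actual logic;
all names are invented):

  # Remember which controls the user has already refused, so the browser
  # neither re-downloads nor re-prompts for them.
  rejected = set()    # in practice this would be persisted across sessions

  def maybe_install(control_id, download, ask_user, install):
      if control_id in rejected:
          return False                   # refused before: no download, no prompt
      package = download(control_id)     # fetch only if not previously refused
      if ask_user(control_id, package):  # ask the user once
          install(package)
          return True
      rejected.add(control_id)           # remember the refusal
      return False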

There is a simple solution to this problem in the ActiveX Authenticode
system.  Simply use Netscape Navigator, which doesn't support ActiveX
controls.  Ironically, www.msnbc.com is a Web site best viewed by Netscape
Navigator!

Richard M. Smith


Risks of anti-spam measures

BROWN Nick <Nick.BROWN@coe.fr>
Tue, 13 Jan 1998 16:48:44 +0100
There is a piece of trialware (Fundi Mail Guard, available at
http://www.fundi.com/fmg.html) which claims to be "a new paradigm, a
different way of combating unsolicited commercial e-mail".  For those of you
still reading after hitting the word "paradigm", the program appears to work
by declaring all mail to be "spam" except messages from senders who have
replied "I agree" to a challenge to declare that they are not spammers.
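
Read that way, the scheme amounts to something like the following sketch
(my reconstruction from the description, not FMG's code; the grace period
and the challenge wording are guesses):

  import time

  approved = set()          # senders who have answered the challenge
  pending = {}              # sender -> deadline for answering
  WINDOW = 7 * 24 * 3600    # assumed grace period; FMG's default is unknown

  def send_challenge(sender):
      print("To %s: reply 'I agree' to confirm you are not a spammer" % sender)

  def classify(sender, now=None):
      now = time.time() if now is None else now
      if sender in approved:
          return "inbox"
      if sender not in pending:
          pending[sender] = now + WINDOW
          send_challenge(sender)
          return "quarantine"
      if now > pending[sender]:
          return "spam"         # no reply in time: banished "forever"
      return "quarantine"

  def handle_reply(sender, body):
      # Note: anyone who knows the expected wording, including a spammer,
      # can send "I agree" and be whitelisted from then on.
      if "I agree" in body:
          approved.add(sender)
          pending.pop(sender, None)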

The description of FMG on this site is a textbook exercise in Risks.
Here are some of the most obvious ones:

- By default, if the sender doesn't respond to the challenge within a
certain time, he or she is declared to be a spammer and banished "forever".
So, if your uncle sends you birthday greetings, switches off his PC, and
goes on vacation, you'll never hear from him again.  Of course, the user can
periodically wade through the list of "spammers" and resuscitate family
members - just another regular system maintenance task which PC users are so
good at.

- The system seems to be based on the premise that spammers use fake return
addresses, from which an appropriate reply to the challenge will never be
received.  This has two major weaknesses.  Firstly, not all spammers use
fake addresses, so they could reply to FMG's challenge.  FMG's makers answer
in this case that the spammer will not send a reply, because that would
"expose themselves to the dual risk of criminal charges and civil suits"
(really!).  A second weakness is that, if FMG becomes widely used,
it would be a trivial exercise to send an "I agree" message with the
appropriate format, to arrive shortly after the original spam and thus keep
FMG happy (and double the number of spam mails for everyone else).

- I can think of many people (and organisations) who will not consider
issuing challenges to everyone who sends them E-mail an appropriate way to
present themselves.  I certainly do not shout "prove you are not an
encyclopedia salesman" through my front door before opening it.

- Depending on one's reason for disliking spam, the program may not even
be of any real benefit.  It runs on your PC and talks to your provider's
POP-3 mail server, so bad luck for AOL users (the most spammed of all),
and it actually downloads all messages anyway - mails declared to be
"spam" are just put in a separate location.  So if your main objection
to spam is the time it takes to download it, you don't gain very much.

Nick Brown, Strasbourg, France


A thought on backup and recovery after Y2K

"Peter G. Neumann" <neumann@csl.sri.com>
Thu, 15 Jan 98 13:26:49 PST
Suppose that a version-management system screws up massively in the
development of a new system in the year 2000, picking up supposedly
most-recent modules dated 99 instead of 00, and inadvertently overwriting
the newer versions.  (Oh, yes, we should instead all have Plan-9-like file
systems that never overwrite!)  In such a case, what is the likelihood that
the backup systems would work correctly to restore the correct versions,
without inducing further damage?  More generally, what is the likelihood
that all of the second-order software that is not normally used in-line
(such as testing environments, debuggers, configuration managers, system and
network monitors, automated report generators, etc.) will have been
properly upgraded for Y2K compliance?  Perhaps there is second-order stuff
that cannot even be tested adequately until 1/1/00?
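
The underlying failure is easy to illustrate: any tool that orders
two-digit-year date stamps gets the century boundary backwards
(illustrative snippet only):

  # Date stamps with two-digit years sort the 1999 module ahead of the
  # 2000 module, so the "most recent" version is chosen wrongly.
  stamps = ["981231", "991115", "000107"]             # YYMMDD
  print(max(stamps))                                  # -> '991115'

  # Four-digit years restore the intended ordering:
  print(max(["19981231", "19991115", "20000107"]))    # -> '20000107'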

Incidentally, I don't recall previously noting here that the Social Security
Administration recently found ANOTHER 30 million lines of code that is not
Y2K compliant, in addition to what had been found on previous passes.  Does
anyone imagine that there might be more?


Re: Easter Eggs in Commercial Software (RISKS-19.53)

Larry Werring <EM3405@cgi.ca>
Mon, 19 Jan 1998 13:16:20 -0500
This follow-up to the Easter Egg contribution in RISKS-19.53 is submitted as
a result of the significant number of responses from readers.
Acknowledgement is given to E. Potter, M. Pack, M. Kohne, D. Porter,
C. Finseth, K. Quirk, B. Tober, N. Brown, M. Richards, J. Rubin, S. Murphy,
R. Kohl, D. Rae, D. Glatting, B. Ellsworth, D. Honour, P. Scott,
D. Phillips, and F. Chase, who were among the many who responded to and
commented on this subject.

In brief, examples of two Easter Eggs were presented in RISKS-19.53
accompanied by some expressions of concern about the hidden costs of wasted
disk resources, time (both programmer and user), and development, poor
quality control and configuration management, and the potential risk of
hidden features in commercial software.  Responses ranged from "Get a life",
"You're paranoid", and "Why are you slamming Microsoft" to "What's a trusted
environment?", "I want to know more", and "I'm writing an article on this
very problem."

Representative extracts (snippets) from the numerous e-mail are included as
follows:

<Snippet> I think you're on the right track with your essay on Easter Eggs.
A nice compilation would be good.  You've made a good beginning.

<Snippet> Perhaps someone should start a web site to highlight the problem
and tell people how to activate the eggs.  Is there such a site already?

<Author> See http://www.eeggs.com and
http://www.cnet.com/Content/Features/Howto/Eggs/

<Snippet> I don't think the consumer pays anything more for the existence of
Easter eggs.  They don't take long enough to write to make a noticeable
difference in the cost of the software to end users.  And while people may
look for Easter eggs on company time, I don't think that it's the fault of
the eggs! Having participated in a number of corporate environments, I know
that things like Easter eggs can be a short but welcome break from work, and
those who waste excessive time on Easter eggs would waste the same time
doing something else if they didn't exist.

<Author> Ten minutes wasted on the part of an employee is ten minutes billed
to you by the company.  Case-by-case may be minimal but add it all up and it
costs. One million employees wasting one minute is one million minutes
wasted.

<Snippet> Aha, here we see your target.  Was this whole post an effort to
bash Microsoft?

<Author> Read it again.  I only used MS as an example because their eggs can
be elaborate. I didn't say I don't like MS. In fact, I'm a heavy user of MS
apps.

<Snippet> Hello.  I read your article in comp.risks.  I saw the IE Egg.  It
was LONG!!  I never finished it.  I got past the first intermission that
described features that never made it in the product and stopped there. Is
there maybe a transcribed edition :)

<Author> I searched for the text contents of the egg and couldn't find
them. I assume they are coded into a DLL.

<Snippet> ... What other unknown features are embedded within commercial
products?  Lots and lots in almost any program ever released.  The shortened
name for them is "bugs."

<Author> But is there anything else?

<Snippet> I would suggest that the (nearly omnipresent in major software)
Easter eggs are not manifested risks of technology, but instead the natural
result of asking people to do highly creative work (software design and
construction) without giving them credit. Software development is still a
craft more than a science, and its practitioners often feel (rightly, in my
opinion) that they deserve not only a paycheck but a way to see their name
on their work. If a software development team doesn't have an authorized way
to put the team members' names on the product, they'll invent an
unauthorized one, do it on their off hours, and find a way to get it in that
won't show up in testing.  I've seen a situation where the Easter egg *was*
found in testing, and the team member told to remove it simply found a way
to hide it more deeply.

<Author> An excellent example of how eggs get embedded in the software.  I
agree that programmers should be given credit for their work.  I disagree
with the way they are doing it.  Obviously the configuration control wasn't
up to par; otherwise the egg would have been found again before the software
was released.

<Snippet> The way around the "problem" would be for software manufacturers
to stop trying to pretend that there are no individuals at the corporation
-- simply include "credits" in the product.  Movies do this, games do this.
Do it in an "authorized" fashion and you have some hope of controlling it.
Don't do it and you get the unauthorized versions.

<Author> I agree, but I hope it's done in a way that lets me remove the
credits once I've read them.

<Snippet> From the user standpoint, I think Easter Eggs are a fine treat,
especially when well-done.  (Kind of like a "real" Easter Egg, not
especially useful but delightful.)  From the security standpoint, I agree
with your points completely.  Since these are hidden programs, there could
be any number of things in commercial software which are not quite so
benevolent as the pretty little toys we tend to hear about with the
Microsoft products.  (Think Inslaw's PROMIS software!)  Regarding cost to
consumers/employers: On one hand, the typical programmer of the toys
probably "lives" at their employer's workspace and so the separation of
work/personal time can be rather murky (i.e. programmer's idea of fun is
writing fun programs).  On the other, where is the auditability (corporate
accountability) for product content?  Can we trust government to be
independent in software evaluation?  Can we trust EDP auditors whose main
business is retaining the client?

<Author> Nice summary.

<Snippet> "Can You Really Trust Trusted Third Parties: A Study of Internet
Security Issues and Solutions" coming Spring 1998 from Bloor Research --

<Author> I look forward to reading this.

<Snippet> If you are using off-the-shelf commercial software in a trusted
environment without validating either the product itself, or the vendor's
internal procedures, you get what you deserve. Commercial software is just
that - commercial.  Microsoft doesn't make "good" software, they make "good
enough" software - it's good enough to insure a continued market presence
for Microsoft, and frankly that's all that matters The Easter eggs are in
fact not useful to the end user, but they are amusing, and Microsoft
probably feels that letting their smart guys play around a little will
ensure continued loyalty and continued working of long hours.

<Author> I agree fully with your first comment, but that is my point.  In an
environment where sensitive information about the public was being processed
and stored, specific direction from management (despite my expressions of
concern) was that vendor software could be trusted.  Why?  Management
considered it too expensive to actually validate the product or the vendor
and preferred to trust the "Name".

<Snippet> As a software developer, this gives me some great ideas for
including some type of credit in my own stuff.  I will, however, take your
concern to heart, keeping the bloat to a minimum.

<Author> Thanks.

<Author's follow-up>....In searching for the source code for an Easter Egg
in a commercial product (to remain unnamed to avoid upsetting anyone
further) we did a text search for the names of some of the developers listed
in the egg and found a 3+ meg DLL.  We renamed the DLL to no effect - the
egg still ran.  In fact, in the past month the program has yet to notice
that the DLL has been changed and subsequently removed. (Could this be an
early version of the Easter egg which was never removed?)  Why should I
still be concerned?  Well, considering that we have 40,000 employees in the
organization I currently work with, most of whom are using the product, that
works out to (40,000 X unused 3+ meg DLL) + (40,000 X Space Occupied by EE)
= a heck of a lot of wasted disk space for one application (for just the
unused DLL alone we're talking 120 Gigabytes).  Now add on the space
occupied by the eggs in the other four to six egg-ridden applications and
you begin to get the scope of the problem in one organization.  Now multiply
this by the number of organizations world-wide (or a number based on the
size of an average organization) and check the result.  <Any mathematicians
want to try this?>
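
For what it's worth, the back-of-the-envelope arithmetic runs roughly as
follows; only the 40,000 employees and the 3-MB DLL are observed figures,
while the per-egg size and the number of affected applications are guesses:

  employees     = 40000
  unused_dll_mb = 3        # orphaned 3+ MB DLL found in one application
  egg_mb        = 1        # assumed average size of an egg itself
  egg_apps      = 5        # "four to six egg-ridden applications"

  dll_waste_gb = employees * unused_dll_mb / 1024.0
  egg_waste_gb = employees * egg_mb * egg_apps / 1024.0
  print("orphaned DLL alone: %.0f GB" % dll_waste_gb)   # ~117 GB
  print("eggs in %d apps: %.0f GB" % (egg_apps, egg_waste_gb))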

Although the emphasis in this matter seems to be software bloat, the risks
relate to the lack of vendor control over software development and the
possibility of hidden but malicious code inserted in commercial software.
 Imagine the hacker prestige and the possible resulting damage should a
programmer within the program team successfully embed hidden code in a
highly popular and freely available software package which, for example,
records account and credit information entered during a secure Internet
session and then sends it to the hacker during the next insecure session.

Larry Werring - IT Security Consultant


USENIX SECURITY SYMPOSIUM reminder

Cynthia Deno <cynthia@usenix.org>
Tue, 13 Jan 1998 14:07:50 -0800
Time is running out.  Register now.

USENIX SECURITY SYMPOSIUM, January 26-29, San Antonio, Texas

Review the program.  See the quality. Register on-line.  Last day for
on-line registration: January 20:  http://www.usenix.org/events/sec98/
Last day for faxed/postal registrations:  January 21.  Fax:  714.588.9706
Call 714.588.8649 if you'd like to speak to someone about the conference.

USENIX is the Advanced Computing Systems Association.  Its members are the
computer technologists responsible for many of the innovations in computing
we enjoy today.  To find out more about USENIX, visit our Web site:
http://www.usenix.org.


Quality Week '98, Download Call for Participation

Software Research <sr@netcom.com>
Mon, 12 Jan 1998 16:02:01 GMT
       ELEVENTH INTERNATIONAL SOFTWARE QUALITY WEEK 1998 (QW'98)
                   CALL FOR PAPERS AND PRESENTATIONS
                  Conference Theme: Countdown to 2000
              San Francisco, California — 26-29 May 1998

QW'98 is the eleventh in the continuing series of International Software
Quality Week Conferences that focus on advances in software test technology,
quality control, risk management, software safety, and test automation.
Software analysis methodologies, supported by advanced automated software
test methods, promise major advances in system quality and reliability,
assuring continued competitiveness.

Download a copy of the CALL FOR PARTICIPATION in PostScript or in PDF format
from the conference Web site:

    http://www.soft.com/QualWeek/QW98

Or, request a printed copy with e-mail to qw@soft.com.
