The RISKS Digest
Volume 15 Issue 02

Friday, 3rd September 1993

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

BETTER airline/travel-agent computer hide-and-seek
Mark Seecof
Lost Canadian crime statistics data
Luis Fernandes
Risk of incorrect Daylight Saving conversion
Arthur David Olson
The Risk of Discussing "the risks of discussing risks" in RISKS
Jeffrey S. Sorensen
The risks of CERT teams vs we all know
Fredrick B. Cohen
Potential risk in terminal buffer storage
Robert S. Richardson
Electronic documents
Mich Kabay
Re: Dorothy Denning and the cost of attack against SKIPJACK
Bill Murray
Re: Mars Observer tank testing
Donald Arseneau
Re: Dial 1 first
Mark Brader
COMPASS '94 CALL FOR PAPERS
John McLean
Info on RISKS (comp.risks)

BETTER airline/travel-agent computer hide-and-seek

Mark Seecof <marks@wimsey.latimes.com>
Thu, 2 Sep 93 20:22:54 -0700
On page B-1 of the Wall Street Journal, 1 Sep 93, a story headlined "Fliers
find scarcity of choice seats" by James Hirsch describes how major airlines
are now manipulating seat assignments in coach class to favor frequent flyers
and full-fare passengers.  They manage this through their computer
reservation systems (CRS's).  Some travel agents are struggling to get good
seats for their disfavored clients by fooling the CRS's or even by using other
computers to poll CRS's looking for "openings" to reassign better seats to
their clients.

[What follows is Seecof's appreciation of the story, not quotation; errors and
opinions belong to Seecof.]

Airlines used to give out coach seats on a first-come-first-served basis.
Other classes of ticket (1st, business) got different, better seating.  But
not all coach seats are equal in flyers' eyes: people prefer aisle or window
seats near the front of the cabin.  Airlines in search of ways to
differentiate their service are now giving "full-coach" (fare) and
very-frequent-flyer passengers the better seats.  However, they have not
announced their policy to the public.  Instead, people holding low-fare
tickets who request seat assignments are told that the good seats are already
taken--and are offered less desirable seating.  Depending on how the message
is phrased, it comes across as baffling ("the seats are unavailable") or
misleading ("the seats are already assigned").  Travel agents can figure out
which seats are really assigned by testing different sample data against the
seat assignment algorithm for each flight in a CRS... if an agent tries to
seat a passenger on a cheap ticket and is told that only middle seats at the
rear are available, he can attempt to reserve seats for a hypothetical
full-fare passenger to see which seats will be available to such a flyer.
According to Hirsch, many airlines "free up" unfilled-but-restricted better
seats as departure time approaches so that late-booking flyers can get seat
assignments.
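
To make the probing tactic concrete, here is a rough Python sketch of the
comparison an agent's software might make.  The available_seats() function is
a hypothetical stand-in (with invented data) for a proprietary CRS query;
nothing here reflects any real CRS interface.

  def available_seats(flight, fare_class):
      """Hypothetical stand-in for a proprietary CRS availability query."""
      if fare_class == "full":
          return ["4C", "4D", "17E", "18B"]   # invented demonstration data
      return ["17E", "18B"]                   # discount fares see only the rear

  def probe_held_back_seats(flight):
      cheap = set(available_seats(flight, "discount"))
      full = set(available_seats(flight, "full"))
      # Seats offered to the full-fare query but withheld from the discount
      # query are the "good" seats being held in reserve.
      return sorted(full - cheap)

  print(probe_held_back_seats("LAX-JFK 101"))    # -> ['4C', '4D']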

According to the WSJ, some travel agents are retaining the frequent-flyer ID's
of favored passengers and using those to reserve seats for less favored
flyers.  Agents also apply other schemes--including booking a flight on a
full-fare ticket, then later revising the reservation to a lower-fare ticket.
Associated Travel Management, of Santa Ana, Calif. is said to use a
proprietary computer system which monitors the availability of seats for
specified flights and passengers by polling a CRS--when a "better" seat
"becomes available" the system requests it for the client passenger.  A client
may have his seat changed many times without human intervention as the system
grabs seats grudgingly offered by the CRS.  The WSJ says the airlines don't
like travel agencies bucking airline seat assignment algorithms.
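
The polling system attributed to Associated Travel Management suggests a loop
of roughly the following shape.  This is only an illustrative Python sketch;
query_open_seats() and request_seat() are hypothetical placeholders for the
proprietary CRS interface, and the seat ranking is invented.

  import time

  PREFERRED = ["4C", "4D", "5C", "5D", "6C", "6D"]   # invented example ranking

  def rank(seat):
      """Lower is better; seats not on the preference list rank last."""
      return PREFERRED.index(seat) if seat in PREFERRED else len(PREFERRED)

  def query_open_seats(flight):
      """Hypothetical stand-in for polling the CRS for open seats."""
      return []        # a real system would query the reservation computer

  def request_seat(flight, passenger, seat):
      """Hypothetical stand-in for asking the CRS to assign a seat."""
      return False     # a real system would attempt the reassignment

  def poll_for_better_seat(flight, passenger, current_seat,
                           interval=300, max_polls=12):
      """Repeatedly ask the CRS for a better seat, as the WSJ describes."""
      for _ in range(max_polls):
          candidates = [s for s in query_open_seats(flight)
                        if rank(s) < rank(current_seat)]
          if candidates:
              best = min(candidates, key=rank)
              if request_seat(flight, passenger, best):
                  current_seat = best
          time.sleep(interval)          # wait, then poll the CRS again
      return current_seat

With many agencies running loops like this against the same CRS, the lottery
effect discussed below follows naturally.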

This is fascinating.  Leaving aside the irritating deception practiced by the
airlines (they avoid telling low-fare passengers *why* they can't get good
seats), the seat-assignment algorithms are perfectly rational--you want a good
seat, you gotta pay more.  (You may wonder why the airlines don't break out
the price of better seats--just charge 5% more for good seats.  I think it's
'cause they don't want to give full-fare passengers discounts for bad seats.
When a 'plane is full even favored passengers can be assigned bad seats--if
the seat price differential were explicit on the up-side, those flyers would
demand reciprocal discounts on the down-side.)  The travel agents' tactics are
also rational--the CRS gives good seats to reservations with certain
qualities...  then tell the CRS what it wants to hear!  But the travel
agencies are forced into either deceiving the CRS on behalf of flyers (by
falsely claiming that the flyer will pay full fare, or is a very frequent
flyer), or using computers of their own to tussle with the CRS for seats.  If
agencies other than Assoc.  Travel Mgmt. start to use similar systems to try
and get good seats for their clients, seat assignments could become a kind of
lottery.  With many agent systems polling a CRS looking for a seat, when they
all try to pounce on one at the same time it'll be chancy which transaction
gets through.

I see two risks arising from the way airlines and travel agents are exploiting
computers to complicate seat assignment.  The first is that travel agents and
flyers generally are being taught to lie to the CRS's.  That is, a premium has
been placed on falsifying input data to the computer.  Now, people gain
economic advantage by lying all the time.  But in this case they are lying at
low or zero (depending on the exact tactics they adopt) risk and in a
situation where their conscience does not bite.  Before computers, you had to
lie to a human being and unless you were a sociopath you probably felt guilty
about it.  Now a computer has been programmed to implement a sufficiently
harsh seating regime that you feel prompted to lie your way past it
(especially since the airlines are lying to you about the price differential
for good seats--because they don't want to discount bad ones), and you are not
likely to feel any guilt over lying to a computer, a mere machine, albeit
there are human consequences downstream--those are very remote.  The effect of
lying to the machine is GIGO--you provide bad input, the machine provides bad
output.  Airlines will find that their revenue estimates are wrong, because
many full-fare bookings will be mere subterfuges to obtain good seats, and
so-on down the line.  To avoid this, airlines will design and implement
complex, costly, and likely quite punitive algorithms to validate input more
effectively.  This will prompt even more sophisticated schemes to evade those
algorithms, likely by more subtle input data distortion...

The second risk is that only flyers using agents with systems designed to beat
on a CRS until it coughs up a desired seat will get good seats.  This will
harm flyers without such agents, and prompt the development of more such
polling systems.  As that in turn loads down CRS's, they will cost more to run,
and unless they are reprogrammed, seat assignment will become like a lottery,
which many flyers will think less fair than either first-come-first-served or
pay-more get-more.  Either way, costs will be passed on to flyers whether or
not they use agents with fancy CRS-pounding systems.

In the final analysis it might be cheaper for airlines and flyers both to use
a simple seat-assignment scheme with more perceived fairness and skip the
whole computer arms race.

Mark Seecof <marks@latimes.com>


Lost Canadian crime statistics data

luis fernandes <elf@ee.ryerson.ca>
Fri, 3 Sep 93 17:25:59 EDT
Toronto Star, Aug. 31, 1993 [p. A9]

  TORONTO-- Statistics Canada reported a dramatic drop--almost 12 percent--in
  violent crime across Metro from 1991 to 1992. But according to Metro police,
  violent crimes [assault, sexual assault, robbery, etc.(!) ], except
  homicides, continued to climb last year.  For example, Statistics Canada
  cited 24,408 assaults (both sexual and non-sexual) in Metro last year...But
  the Metro police annual report cited 29,071 assaults reported last year...

  Officials at Statistics Canada and Metro police could not explain
  the discrepancies yesterday.  A Statistics Canada official said the
  figures were provided by Metro police...

The next day (Sept. 1, 1993), the following report appeared [p. A2]:

  Statistics Canada has likely lost computer data, causing a major
  miscalculation of Metro's violent crime rate, Metro police say...
  Puzzled StatsCan officials said they may know today what's wrong.
  [Gordon MacKay of the Canadian Centre for Justice Statistics, which
  compiled the figures for StatsCan] said that one possibility is a
  problem with data they received via a recently installed computer link-up.

  Both Metro police and Statistics Canada officials said yesterday there were
  no problems when the calculations were done manually from typed reports.

  This year's federal crime survey marked the first time Metro's figures were
  calculated using computer tapes provided by the force. The system was
  supposed to speed-up calculations and do away with paperwork...

  MacKay said StatsCan usually sends preliminary findings to each police force
  for verification. But Metro police didn't receive the crime figures from the
  agency until yesterday-- hours after it had made its findings public, [said
  Mike Dear, Metro police's director of records and information security.]

The Thursday edition did not follow up.

           [An earlier problem with the Metro Police handling of crime
           data was contributed by Doug Moore to RISKS-14.18.  PGN]


Risk of incorrect Daylight Saving conversion (Eggert, RISKS-14.87)

Arthur David Olson <ado@elsie.nci.nih.gov>
Mon, 30 Aug 93 15:09:43 EDT
Paul Eggert's note that

  On August 28, 1993, at 2 AM local time, workstations and PCs in Israel
  that are running Sun's Solaris 2.2 will suddenly lose an hour.

might be riskier than expected.  Several years ago Mubarak Awad was one
of the Carey Memorial Lecturers at Baltimore Yearly Meeting of the
Religious Society of Friends (Quakers); a transcript of the lecture
should be available from the yearly meeting.  Mubarak stated that as a
form of resistance some Palestinians used their own rules for starting
and ending Daylight Saving Time, and that folks whose watches ran on
Palestinian time risked having them broken if stopped for questioning.

  Arthur David Olson  ado@elsie.nci.nih.gov


The Risk of Discussing "the risks of discussing risks" in RISKS

Jeffrey S. Sorensen <sorenjs@pb.com>
Fri, 3 Sep 93 13:43:21 -0400
The discussion of the risks of discussing risks in RISKS is not without its
risks.  I am very meta-concerned with this issue.  Remember that excessive
scrutiny of a digest like RISKS might, in the process of debate, provide
material for those who would censor such a discussion.  If we, even
momentarily, entertain the notion that RISKS is perhaps a hotline for crackers
to exchange tips, many administrators may try to limit distribution without
waiting for the discussion to end.

There is also the _even greater_ danger of being caught in an infinite regress
until every megabyte of space on the net is filled with the text "risks of
risks of risks of risks..."

Or even _worse_, some readers may find the discussion of the discussion of
risks to be so abstract and pointless that they unsubscribe to this group as
they are reading this sentence.  We must balance the risk of discussing risks
against the risks of alienating RISKS readers.

Jeff Sorensen  sorenjs@pb.com


The risks of CERT teams vs we all know

Fredrick B. Cohen <fc@Jupiter.SAIC.Com>
Fri, 3 Sep 93 06:08:16 PDT
The problem with restricting information to CERT teams, etc., is that it:
1 - creates a techno-elite
2 - limits distribution far too much

I expand upon it:

    Creating a techno-elite makes it impossible for the average person or
the interested novice to get involved.  Most of the major breakthroughs in
information protection over the ages have come from one of these types and NOT
from the techno-elite.  We are creating an inbreeding situation that could be
a fatal flaw.

    Limiting distribution to these groups means that the vast majority of
those who actually perform these protection functions are denied the facts
they need to get the job done.  Suppose the attacker takes out the phone lines
to your CERT.  You become helpless because you are a sheep.  If you know how
things work on your own, at least you have a chance to defend yourself.

FC

P.S. In my exchange, you may not dial a 1 for local calls, and you must dial a
1 for non-local calls EXCEPT for international calls.  Dialing a 1 before
everything doesn't work.  Does anyone have a universal list of exchanges and
which other exchanges are considered local to them?  I think not!  Without
this, how can I automate the process?  Wait for a disconnect and assume it was
from a failure to dial/not dial a 1?


Potential risk in terminal buffer storage

"robert s. richardson" <bob@CSOS.ORST.EDU>
Fri, 3 Sep 1993 01:28:07 -0700 (PDT)
I use a terminal program on my computer that automatically allocates free
memory as you use it to store a copy of your on-line session.  I have this
buffer configured rather large (about 1/2 meg) so that I may scroll back
through news and such if I remember a point I want to look up.  The other
day a co-worker used my terminal for a couple of hours, then I took over
again.  I went to check the buffer for something I was doing, and noticed
that a copy of their entire session, which included email and other
personal and business-related items I should not have had access to, was
in the buffer.  I deleted the buffer immediately, but the potential RISK
here is that if you are using an unfamiliar terminal, or even your own,
it is possible that information you or others wish to be secret can be
sitting in a buffer for hours or even days.  Flush that buffer at the
end of each session!

Bob Richardson, OmiCo Industries, PO Box 1404 Corvallis, OR 97339
bob@kira.csos.orst.edu              503-758-5018


Electronic documents

"Mich Kabay / JINBU Corp." <75300.3232@compuserve.com>
03 Sep 93 07:52:52 EDT
A recent article deals with several RISKS of depending on electronic
documents:

  Hayes, B. (1993). The electronic palimpsest; Digital documents for all
  occasions: erasable, correctable, reproducible, forgeable.  _The Sciences_
  (NY Academy of Sciences) 33(5):10 (Sept/Oct 1993).

I enjoyed reading Brian Hayes's article in the new issue of this fine magazine.
It is not only informative and up to date, but also elegant, amusing and
beautifully illustrated with various paintings.  Summary follows:

"As a writing instrument, the computer is not su much a better pencil as a
better eraser."  You can eliminate all traces of your early versions at the
stroke of a key.

This easy erasability leads to difficulties of authentication.  How can one
prove who wrote an electronic document?  Digitized signatures make the problem
worse, since anyone can scan a real signature and then print it on any
document.  However, digital signatures are a good method of authentication.
The public key cryptosystem allows you to encrypt a document with your private
(secret) key; only the corresponding public key decrypts the message.  The
encrypted version is as big as the original, though: a nuisance.  A
refinement, the digital signature, encrypts a digest of only 160 bits and
provides the same confidence of authentication.
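
For readers who want to see the mechanics, here is a minimal present-day
sketch in Python using the "cryptography" package.  It is not the particular
160-bit scheme Hayes describes, only an illustration of hashing a document and
signing the digest rather than the document itself.

  from cryptography.hazmat.primitives import hashes
  from cryptography.hazmat.primitives.asymmetric import rsa, padding
  from cryptography.exceptions import InvalidSignature

  private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
  public_key = private_key.public_key()

  document = b"Text of the electronic document to be authenticated."

  # sign() hashes the document internally and signs only the digest,
  # so the signature stays small regardless of document size.
  signature = private_key.sign(
      document,
      padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH),
      hashes.SHA256(),
  )

  try:
      public_key.verify(
          signature, document,
          padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH),
          hashes.SHA256(),
      )
      print("signature verifies: document is authentic and unaltered")
  except InvalidSignature:
      print("signature does NOT verify")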

Another problem is forgery.  If we pay the rent with an electronic cheque,
what stops a crook from using copy after copy of the same cheque?  We will
need unique serial numbers on electronic cheques.
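
A toy sketch of the serial-number idea (entirely hypothetical; no real payment
system is implied): the bank simply refuses any cheque whose serial number it
has already honoured.

  honoured = set()    # serial numbers of cheques the bank has already paid

  def deposit(serial, amount):
      """Refuse a cheque whose serial number has been seen before."""
      if serial in honoured:
          raise ValueError("cheque %s has already been paid" % serial)
      honoured.add(serial)
      return "paid %s on cheque %s" % (amount, serial)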

What about proving _when_ a document was created?  Here we have to rely on
a time-stamping service.  Scientists at BELLCORE have invented the
time-stamp equivalent of the digital signature.  You submit a digest of the
document that needs to be time-stamped to a trusted time-stamping computer;
it generates a cryptographically-sound certificate which includes the time
of receipt.

To prevent fraud at the time-stamping computer (where someone might change the
system clock long enough to produce fake time-stamps for a specific crime),
every certificate is merged mathematically with all the others issued during
the same weekly period.  The summary time-stamp is then published in _The New
York Times_.
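
The flavour of the scheme can be conveyed with a simple hash chain in Python.
This is only a sketch of the linking idea, not the actual Bellcore protocol;
the details here are invented for illustration.

  import hashlib, time

  chain = []                     # certificates issued in the current period
  prev_hash = b"\x00" * 32

  def timestamp(document_digest):
      """Issue a certificate binding the digest, the receipt time, and all
      earlier certificates; back-dating one would mean rewriting them all."""
      global prev_hash
      t = time.time()
      cert_hash = hashlib.sha256(prev_hash + document_digest +
                                 repr(t).encode()).digest()
      cert = {"digest": document_digest, "time": t, "hash": cert_hash}
      chain.append(cert)
      prev_hash = cert_hash
      return cert

  def weekly_summary():
      """The value one would publish widely, e.g., as a newspaper notice."""
      return prev_hash.hex()

  cert = timestamp(hashlib.sha256(b"text of my document").digest())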

The legal system will have to adapt to the increasing use of electronic
documents.  Historians will also have more trouble piecing together the
creative process if only the final version is published or physically
available.  And what about the rapid changes in computing technology and
storage devices?  Who will be able to read today's diskettes a hundred years
from now?  Or even ten?  Archivists must think about these issues.


Re: Dorothy Denning and the cost of attack against SKIPJACK

<WHMurray@DOCKMASTER.NCSC.MIL>
Thu, 2 Sep 93 22:20 EDT
On page 14 of the August 30, 1993 issue of Government Computer News, Kevin
Power reports that Dorothy Denning told the Computer System Security and
Privacy Advisory Board that SKIPJACK would not be compromised by exhaustive
attack methods in the next 30 to 40 years.

I am reminded of a story, perhaps apocryphal.  In the middle seventies Fortune
magazine was working on a feature about computer crime.  Most of the experts that
they interviewed told them that the security on most of the nation's
commercial time sharing systems was pretty good.  However, they admitted that
one convicted felon and hacker, Jerry Schneider, would tell them otherwise.
Of course Fortune had to interview him.  According to the story, the interview
went something like this:

  Fortune:  Mr. Schneider we understand that you are very critical of the
  security on the nation's commercial time sharing systems.

  Jerry:  Yes, that is right.  Their security is very poor.

  Fortune:  Could you break into one of those systems?

  Jerry: Yes, certainly.

  Fortune:  Well, could you demonstrate for us?

  Jerry:  Certainly, I'd be happy to.

At this point Jerry took the reporters into the room where his "Silent 700"
terminal was.  He connected to the system that he normally used but
deliberately failed the logon.  When he deliberately failed again at the retry
prompt, the system disconnected.  Jerry dialed in again, failed a third time,
and this time he broke the connection.  He dialed a third time but this time
he dialed the number of the operator.

  Jerry:  This is Mr. Schneider.  I seem to have forgotten my password.
  Can you help me?

  Operator:  Sorry, Mr. Schneider, there is nothing that I can do.  You
  will have to call back during normal business hours and talk to the
  security people.

  Jerry:  I am sorry too, but you do not seem to understand.  I am working
  on something very important and it is due out at 8am.  I have to get on
  right now.

  Operator:  I am sorry.  There is nothing that I can do.

  Jerry:  You still do not understand.  Let me see if I can clarify it for
  you.  I want you to go look at your billing records.  You will see that
  you bill me about $800- a month.  This thing that I am working on; it is
  why you get your $800-.  Now, if you do not get off your a-- and get me
  my password so that I have this work out at 8am, by 9am there is going to
  be a process server standing on your front steps waiting to hang paper
  on the first officer through the door.  Do I make myself clear?

Apparently he did.

  Operator:  Mr. Schneider, I will call you right back.

At this point the operator appears to have done one or two things right.  He changed the
password, called Jerry back at the number where his records said that he
should be, and gave him the new password.  Jerry dumped two files and then
turned to the reporters.  With a triumphant smile he said "You see!"

  Fortune (obviously disappointed):  No, No, Mr. Schneider!  That is not
  what we wanted to see.  What we wanted to see was a sophisticated
  penetration of the software controls.

  Jerry:  Why would anybody do THAT?

The cost of an exhaustive attack is an interesting number.  It gives us an
upper bound for the cost of efficient attacks.  However, it is never, itself,
an efficient attack.  It is almost always orders of magnitude higher than the
cost of alternative attacks.  The very fact that its cost can be easily
calculated ensures that no one will ever use it to encrypt data whose value
approaches the cost of a brute-force attack.
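
For scale, a back-of-the-envelope calculation on SKIPJACK's 80-bit key space;
the key-trial rates below are assumptions picked purely to illustrate the
arithmetic, not estimates of any real hardware.

  keyspace = 2 ** 80                       # SKIPJACK uses 80-bit keys
  seconds_per_year = 60 * 60 * 24 * 365

  for trials_per_second in (1e9, 1e12, 1e15):
      years = keyspace / trials_per_second / seconds_per_year
      print("at %.0e keys/sec: about %.1e years to cover the key space"
            % (trials_per_second, years))

  # At an assumed 10**9 trials per second the search takes tens of millions
  # of years; even at 10**15 it is still measured in decades.  Cheaper
  # attacks on the keys and their keepers will always be found first.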

History is very clear.  "Black Bag" attacks are to be preferred; they are
almost always cheaper than the alternatives.  After those are attacks aimed
against poor key management.  These attacks will be very efficient when the
keepers of the keys already work for you and where their continued cooperation
and silence are assured.

William Hugh Murray, 49 Locust Avenue, Suite 104; New Canaan, Connecticut
06840    1-0-ATT-0-700-WMURRAY; WHMurray at DOCKMASTER.NCSC.MIL


Re: Mars Observer tank testing (RISKS-15.01)

Donald Arseneau <asnd@erich.triumf.ca>
Fri, 03 Sep 1993 01:25:19 PST
> It [testing to destruction] also tells you nothing about other tanks.  PGN

What a silly statement!  It tells you a lot: it says the design is not
completely wrong; it says that the manufacturing can be done properly; it even
says that another tank, made the same way without error, will work.

Maybe Peter is too absorbed in software validation lately, where his strong
statement may be true.

Donald Arseneau                         asnd@reg.triumf.ca

    [I was obviously a little less than precise.
    Destruction testing tells you something about the fabrication process
    and, yes, it tells you something about the design.
    Its usefulness assumes that there were no design changes between tanks,
    and no changes in the fabrication process.
    But it does not tell you that the second tank does not have an
    implementation flaw.  Yes, I need to be very precise.  So do the
    tank fabricators.  T'anks.  PGN]


Re: Dial 1 first (Cohen, RISKS-14.89)

Mark Brader <msb@sq.sq.com>
Fri, 27 Aug 1993 23:10:59 -0400
This topic arises in comp.dcom.telecom as regularly as its moderator lets it
in there.  For the benefit of foreign readers, this pertains to a significant
variation in the methods of dialing phone calls in different parts of the US
and Canada.  In some areas, a leading 1 in a number means that an area code
follows; in others, the ones discussed here, it means that the call is long
distance.  These two ways of classifying a call are in general independent of
each other.  Further variations, not discussed here, apply to exactly when
you have to dial the area code.

>    Until recently, I had the good fortune of living in a telephone
> exchange without this problem ...

It's not a bug, it's a feature.  (But see below.)

>    Why do I have to dial 1 before some numbers and not before others?
> The computer at the phone company politely tells me to dial 1 first, or
> to not dial 1 first depending on where the call is made to, but this is
> little comfort for my autodialer making a series of several thousand calls
> to send out FAXes.  If they can tell me to add 1 or not, why can't they just
> add it or not for me?

Because, in places where the rule is "dial 1 first if and only if the call
is long distance", it is assumed that

    (1) nobody wants to dial a long-distance call by accident
    (2) everybody knows what numbers they can reach with a local call
and (3) local and long distance are the only kinds of calls.

Assumption 1 justifies the "dial 1 if it's long distance".  Assumption 2
justifies the "only if" part of the rule, because of this reasoning:
if you dialed a local number *with* the 1, then you must obviously have
meant to dial some *different* number which would *not* have been local.
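
Under those assumptions the rule itself is easy to automate, though only if
the caller maintains a table of local exchanges by hand, since no universal
list exists.  An illustrative Python sketch (the prefixes are invented):

  LOCAL_PREFIXES = {"555", "556", "762"}     # hypothetical local exchanges

  def dial_string(number):
      """Add or omit the leading 1 for a 7-digit number in the caller's own
      area code, under the 'dial 1 iff long distance' interpretation."""
      if number[:3] in LOCAL_PREFIXES:
          return number               # local: a leading 1 would be rejected
      return "1" + number             # long distance: the 1 is required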

I said it was a feature, and for some people it is.  For others, such as Mr.
Cohen, it is a major nuisance.  It's even worse for people who regularly
travel about and take modems with them and want to program phone numbers into
the modems.

Modern phone systems should be able to configure one or another dialing method
according to the *customer's* preference, but nobody has offered such a
feature.  I suspect the view is that either there would be little demand or
that people would find it too confusing when using a phone not their own.
(Yet, from my office, I have to dial 9 for a local call, and 81 for
long-distance...)

There are also places where the rule is implemented incompletely, either
because assumption 3 does not hold or due to some local quirk.  These, I
suspect, are a nuisance to everybody.

Mark Brader, SoftQuad Inc., Toronto   utzoo!sq!msb, msb@sq.com


COMPASS '94 CALL FOR PAPERS

<mclean@itd.nrl.navy.mil>
Mon, 30 Aug 93 16:41:43 EDT
                CALL FOR PAPERS
                   COMPASS '94
9th Annual IEEE Conference                  June 27-30, 1994
on COMPuter ASSurance                   Gaithersburg, MD

The purpose of this conference is to bring together researchers,
developers, and evaluators who work on problems related to specifying,
building, and certifying high-assurance computer systems.  What
distinguishes COMPASS from similar conferences is its emphasis on
bridging the gap between research and practice.  Researchers are
provided an opportunity to present results, new theories, and new
technologies to both other researchers and practitioners who can put
them to practice.  They can also learn from practitioners of new
research problem domains and of problems encountered in building real
systems.  Practitioners have an opportunity to share lessons learned,
to learn of new research, and to influence future research.

Papers should present advances in the theory, design, implementation,
evaluation, or application of high-assurance systems, or report on
experiments, evaluations, and open problems in the use of new
technologies for computer assurance.  Special consideration will be
given to presentations (either single papers or paper pairs) by
practitioners and researchers who have worked on the same problem.
There will also be a tools fair.  Papers, panel session proposals,
tutorial proposals, and tools fair proposals are solicited in the
following areas:

   Software Reliability     Software Safety       Computer Security
   Formal Methods           Tools                 Process Models
   Real-Time Systems        Networks              Embedded Systems
   V&V Practices            Certification         Standards

INSTRUCTIONS TO AUTHORS:
Send five copies of your paper, panel session proposal, tutorial
proposal, or tools fair proposal to John McLean, Program Chair, at the
address given below. Abstracts, electronic submissions, late
submissions, and papers that cannot be published in the proceedings
will not be accepted. Papers submitted from outside North America
should be sent via overnight courier service.

Papers must be received by January 15, 1994 and must not exceed 7500
words.  Authors are responsible for obtaining prior to acceptance any
and all necessary clearances for publication.  Authors will be notified
of acceptance by March 11, 1994. Camera-ready copies are due not later
than April 22, 1994.

Papers that use technology presented at a previous COMPASS conference
are eligible for a special award.  Papers of exceptional quality and
appropriate subject matter are eligible for inclusion in a special
issue of the Journal of High Integrity Systems or the Journal of
Computer Security.

Limited financial assistance will be available for student authors.


                             PROGRAM COMMITTEE

Paul Ammann, George Mason Univ. (USA)      John Marciniak, CTA (USA)
George Dinolt, Loral (USA)                 John McDermid, York Univ (UK)
Jan Filsinger, Booz-Allen Hamilton (USA)   Jon Millen, Mitre (USA)
Virgil Gligor, Univ. of Maryland (USA)     John McHugh, Portland State U. (USA)
Li Gong, SRI (USA)                 David Parnas, McMaster Univ. (Canada)
Connie Heitmeyer, Naval Res. Lab. (USA)    John Rushby, SRI (USA)
Jeremy Jacob, York Univ. (UK)          Ravi Sandhu, George Mason Univ. (USA)
Carl Landwehr, Naval Res. Lab. (USA)       Jeannette Wing, Carn. Mellon U. (USA)
Teresa Lunt, SRI (USA)

FOR FURTHER INFORMATION CONCERNING THE SYMPOSIUM, CONTACT:

Jan Filsinger, General Co-Chair          John McLean, Program Chair
Booz-Allen Hamilton, Inc                 Naval Research Laboratory
8283 Greensboro Dr.                      Code 5543
McLean, VA 22102                         Washington, DC 20375
Tel: (703) 902-5302                      Tel: (202) 767-3852
Fax: (703) 902-3131                      Fax: (202) 404-7942
filsing@itd.nrl.navy.mil                 mclean@itd.nrl.navy.mil
