The RISKS Digest
Volume 15 Issue 22

Friday, 5th November 1993

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents

Direct Subscribers: Can you alternatively read RISKS as a newsgroup?
RISKS
Prague computer crime
Mich Kabay
Master of Disaster Phiber Optik sentenced
Mich Kabay
Mass. state police confuse car owners with gun carriers
Brian Hawthorne
Overenthusiastic automated investment programs
John R Levine
RISKS of unaccountable computerized elections
Dave Hart
Re: Safety-critical software
David Parnas
Re: InterNet Mailing List
JS McBride [2]
Security of the internet
Bill Murray
PGN
Yu, "Automated Proofs of Object Code..." available
Jim Horning
"Research Directions in Database Security" ed. by Teresa Lunt
Rob Slade
Re: CERT Reports and system breakins
Phil Karn
A. Padgett Peterson
Call for Participation, FIRST Incident Response Tools Wk Grp
Michael S. Hines
Info on RISKS (comp.risks)

Direct Subscribers: Can you alternatively read RISKS as a newsgroup?

RISKS Forum <risks-request@chiron.csl.sri.com>
Fri, 5 Nov 93 16:27:35 PST
Folks, New subscriptions are pouring in at a remarkable rate, and BARFmail is
increasing at an even more rapid rate.  The network seems flakier than ever.
If you are getting RISKS via a direct subscription (check your mail headers),
PLEASE check with your local netnews wizards to see if you can now read it as
a newsgroup, BBoard item, or local redistribution.  If you are able to read it
that way, PLEASE do so, and send me a message so that I can remove you from my
list (actually plural, multiple lists, because otherwise my mail system dies).

[Yes, I know, I should use an automatic LISTSERV, but that creates its own set
of headaches.  Are there any really robust, moderator-friendly, portable UNIX
LISTSERVers I could use for direct mailings of RISKS, despite the fact that I
continually get requests from sites that cannot be answered?!*%&!!?]

PGN


Prague computer crime

"Mich Kabay / JINBU Corp." <75300.3232@compuserve.com>
04 Nov 93 17:36:33 EST
  CZECH TRANSITION SPURS BOOM IN ECONOMIC CRIME, By Bernd Debusmann
  PRAGUE, Nov 3 (Reuter, 2 November 1993) - The Czech Republic's
  transition to a market economy has led to a boom in economic crime ranging
  from embezzlement to tax evasion, as criminals exploit money-making
  opportunities denied them under communism.
    According to the latest police statistics, economic crime jumped 75.2
  percent in the first nine months of the year compared with the same period
  in 1992 — a steeper increase than any other criminal activity.  [From the
  Reuter newswire via the Executive News Service (GO ENS) on CompuServe]

The article goes on to state that with the growth of an economy, the
opportunities for economic crime are increasing apace.  Although police claim
to solve 75% of the cases of fraud reported to them, there seem to be many
more unreported cases.  In a recent case, "Martin Janku, a 23-year-old
employee of the Czech Republic's biggest savings bank, Ceska Sporitelna, is
accused of transferring 35 million crowns ($1.19 million) from various
corporate accounts to his personal account over an eight-month period."

In a typical hacker's excuse, Janku claims to have done this to demonstrate
the bank's poor security.  He wrote software himself to be able to tamper with
client accounts--but only, he said, after repeatedly warning his bosses of
weak security precautions.  The theft was not detected by the bank itself
until Janku withdrew part of the money. He was arrested as he was in the
process of stuffing half a million dollars' worth of banknotes into a
briefcase.

The problems of inefficient bureaucracies are compounded by poor laws and
indifferent enforcement.  The problem is so widespread that about 30% of the
residents of Prague own a country home--and a large percentage of those are
claimed by analysts to have been built through illegal economic activity.

Michel E. Kabay, Ph.D., Director of Education, National Computer Security Assn


Master of Disaster Phiber Optik sentenced

"Mich Kabay / JINBU Corp." <75300.3232@compuserve.com>
04 Nov 93 17:37:14 EST
  Mark Abene, 21, widely known as Phiber Optik, was sentenced to a year and a
  day in prison.  He will serve 600 hours of community service.  He pleaded
  guilty last July to conspiracy, wire fraud and other federal charges
  relating to his activities as one of five Masters of Disaster indicted for
  breaking into telephone, educational, and commercial computer systems.
  [Perhaps in a few years more, they will be Doctors of Disaster?]  [PGN
  Excerpting Service, drawn from the Associated Press and Reuters, both on 3
  November 1993]

The Reuter article gives background information, including

o the charges against MoD marked the first use of wiretaps to record both
  conversations and datacomm by accused hackers.

o the hackers attacked phone switching computers belonging to Southwestern
  Bell, New York Telephone, Pacific Bell, U.S. West and Martin Marietta
  Electronics Information and Missile Group.

o they broke into credit-status reporting companies including TRW, Trans
  Union and Information America, stealing at least 176 TRW credit reports.

o the young men were apparently competing with each other and other
  hacker groups for "rep" (reputation) and were also interested in
  harassing people they didn't like.

o the Reuter article mentions that "they wiped out almost all of the
  information contained on a system operated by the Public Broadcasting
  System affiliate in New York, WNET, that provided educational
  materials to schools in New York, New Jersey and Connecticut" and
  left the message, "Happy Thanksgiving you turkeys, from all of us at MOD."

Michel E. Kabay, Ph.D., Director of Education, National Computer Security Assn


Mass. state police confuse car owners with gun carriers

SunSelect Strategic Marketing <Brian.Hawthorne@east.sun.com>
Fri, 5 Nov 93 13:26:43 EST
My wife received a letter yesterday from the Massachusetts state police,
informing her that it was time to renew her "License to Carry Firearms".
It included a renewal form that she was to take to her local licensing
authority, the police station in our case.

A bit of background: In Massachusetts, a "License to Carry" allows you
to carry or transport a handgun. My wife not only has never had such a
license, but does not even have the prerequisite Firearms ID card, which
allows ownership and transport of rifles and shotguns.

Concerned that someone had used her name and address to get a carry permit, my
wife called the phone number indicated. The person answering ("State Police")
explained that my wife shouldn't worry. Everyone else who got that letter by
mistake was also concerned.

They have not yet figured out exactly what happened, but apparently someone
loaded a tape containing the list of car owners who needed to renew their
automobile registration instead of the list of gun owners needing to renew
their carry permits. They generated and mailed many thousands of these
letters, and never did any sanity checks.
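
A sanity check of the sort that was missing here need not be elaborate.  The
sketch below is purely illustrative (the record layout and type names are
invented): before any letters go out, a sample of the loaded records is
checked against the license type the mailing is supposed to concern.

    # Before any letters are generated, check that a sample of the loaded
    # records really is of the expected license type.  Record layout and
    # field names are invented.

    EXPECTED_TYPE = "FIREARMS_CARRY_PERMIT"

    def sanity_check(records, expected_type=EXPECTED_TYPE, sample_size=100):
        """Reject the whole batch if any sampled record has the wrong type."""
        bad = [r for r in records[:sample_size]
               if r.get("record_type") != expected_type]
        if bad:
            raise ValueError(f"{len(bad)} sampled record(s) are not "
                             f"{expected_type}; refusing to generate letters")

    batch = [{"record_type": "AUTO_REGISTRATION", "name": "J. Doe"}]
    try:
        sanity_check(batch)
    except ValueError as e:
        print("batch rejected:", e)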

They assured my wife she would get a letter explaining what had happened as
soon as they figured it out.

Fortunately, in order to actually renew a carry permit, my wife would have
to present the form in person at the local constabulary, who would know
that she did not have such a permit. Unless, of course, someone loaded
the wrong tape when updating the local police records...

In retrospect, since a car is a much more dangerous weapon than a handgun,
I suppose this is not a very big RISK after all.


Overenthusiastic automated investment programs

John R Levine <johnl@iecc.com>
Tue, 2 Nov 93 22:34:05 EST
The story about the Dutch computer that nearly oversold its portfolio reminds
me of a similar situation.  (This isn't a friend of a friend story — I
actually know the person involved.)

Some years back, he'd been doing some of the earliest experiments in
computerized commodity trading.  At that point, it was still common to send in
orders by Telex, since it left a much better log than did phone calls, and
he'd recently gotten a lashup that let his computer make its own Telex calls
so it could automatically calculate and enter the day's trading orders.

So the computer put some money into something (potatoes, I think.)  The next
day, the prices had moved favorably, so it put in some more.  Next day, the
same thing.  Before the end of the week, he got a phone call from the
government regulators.  The potato futures market isn't all that big, and his
computer had apparently cornered it, which is a definite no-no.  He unwound
his potato positions and adjusted his program never to buy potatoes again, no
matter what.
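
For readers who have not seen such a program, the toy sketch below (it is not
the program described above; the numbers are invented) shows how a rule as
simple as "add to the position after every favorable day" can swallow a thin
market, and how a crude position limit of the circuit-breaker sort stops it.

    # A caricature of trend-following: add to the position (roughly doubling
    # it) after every favorable day.  Market size, prices, and the 5%
    # position limit are invented; without the limit this toy trader ends up
    # holding far more than the entire (small) market.

    MARKET_OPEN_INTEREST = 10_000        # hypothetical contracts outstanding
    MAX_SHARE_OF_MARKET = 0.05           # a crude position-limit rule

    def next_order(prices, position):
        """Add to the position after every favorable day, up to the limit."""
        if len(prices) >= 2 and prices[-1] > prices[-2]:
            desired = max(1, position)
            room = int(MAX_SHARE_OF_MARKET * MARKET_OPEN_INTEREST) - position
            return max(0, min(desired, room))
        return 0

    position, prices = 0, [10.0]
    for day in range(20):
        prices.append(prices[-1] * 1.01)           # a steadily rising market
        position += next_order(prices, position)

    print(f"final position: {position} lots "
          f"({position / MARKET_OPEN_INTEREST:.1%} of open interest)")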

It surprised everyone involved that the computer had been able to distort that
market so quickly.  Lord only knows what computers do to commodity markets
these days; as I recall he managed to do his cornering with a PDP-11.

The general problems with automated trading programs are well known.  They had
a lot to do with the day the market dropped 500 points, as all of the programs
cranked away in a financial environment that their programmers had not really
anticipated.  There are now ``circuit breaker'' rules that limit how much
automated trading can be done, but I wouldn't place a great deal of faith in
claims that computerized market distortions have thereby been cured.

John Levine, johnl@iecc.com, {spdcc|ima|world}!iecc!johnl


RISKS of unaccountable computerized elections

Dave Hart <davehart@microsoft.com>
Fri, 5 Nov 93 12:17:38 PST
There's a good article on the risks of computerized election systems which
leave no paper trail for recounts in the 30 October 1993 *Science News*
[Vol. 144 No. 18 Pg. 282-3].  A couple of quotes:

  The situation is exacerbated by state and local election officials, whose
  primary concern is keeping election costs down and who put a premium on
  speed and convenience.  As a result, `the vendors don't particularly care
  about computer security because the marketplace doesn't care,' Greenhalgh
  insists."  [Gary L. Greenhalgh, former director of the Federal Election
  Commissions' National Clearinghouse on Election Administration]

The piece quotes heavily from our very own PGN.  My favorite:

  `It takes trustworthy systems and trustworthy people to avoid
  tampering; it takes even more to avoid accidents from user operation or
  misuse,' Neumann says.  `Our trusting of people and systems that are
  not trustworthy is an open invitation to disaster.'

    [Thanks!  PGN]


Re: Safety-critical software (Mellor, RISKS-15.19)

David Parnas <parnas@qusunt.eng.McMaster.CA>
Fri, 5 Nov 1993 14:33:43 -0500
Pete Mellor wrote, "Prof. Cliff Jones of Manchester characterised the
complexity of software in terms of the number of branch points it may contain,
and hence the number of possible paths through it.  The combinatorial
explosion of possible paths makes exhaustive testing impossible in all but the
simplest programs.  It may be difficult to achieve with 50 Lines of code and
10 branch points.  With 10,000 LOC and the same density of branch points, the
testing time would exceed the time elapsed since the big bang.  As he pointed
out, the Sizewell B Primary Protection System contains 100,000 LOC."
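
The arithmetic behind those figures is easy to check.  The rough sketch below
is only an illustration; it assumes that each branch point doubles the number
of paths, that the stated branch density (one per five lines) carries over to
the larger programs, and that a path can be tested every microsecond.

    import math

    # Rough check of the figures above, assuming each branch point doubles
    # the number of paths, the same branch density throughout (1 branch per
    # 5 LOC), and one path tested per microsecond.

    LOG10_AGE_OF_UNIVERSE_S = math.log10(13.8e9 * 365.25 * 24 * 3600)  # ~17.6

    def log10_testing_time(branch_points, tests_per_second=1e6):
        """log10 of the seconds needed to exercise every path once."""
        return branch_points * math.log10(2) - math.log10(tests_per_second)

    for loc, branches in [(50, 10), (10_000, 2_000), (100_000, 20_000)]:
        t = log10_testing_time(branches)
        print(f"{loc:>7} LOC, {branches:>6} branches: ~10^{t:.0f} s "
              f"(10^{t - LOG10_AGE_OF_UNIVERSE_S:.0f} x age of universe)")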

It is worth remembering that were John von Neumann still alive, he might
remind us that program state and data state are interchangeable, and that
the number of sequences of data states in such programs is even larger
than the number of sequences of control states.  Even if we did test
every possible path, we would not have done exhaustive testing.  We should
never imply that such a test would be exhaustive.

Dave Parnas


InterNet Mailing List

"JS McBride & Co. PostMaster" <jim_mcbride@netmail.com>
Wed, 03 Nov 1993 09:56:28 pst
Here is the CORRECT info on the InterNet Mailing List.

Addresses are extracted from news feeds, list servers, and other sources. NO
personal information is collected. The following is the ONLY information we
collect.

    1. Electronic mail address
    2. User name
    3. Search keywords
    4. Date info was collected

The search keywords are limited to products.

Example: xwindows,unix,dos,ms-windows,emacs

To have your name removed from the list, send a message to DELETE@NETMAIL.COM
Please place ANY addresses that you want removed from the list in the body
of the message.

To get more info on how and why we are building the list, send a message to
LISTINFO@NETMAIL.COM .   [Just see next message.  PGN]

Comments should be sent to TMANNING@NETMAIL.COM

Thank You, James McBride, NetMail, 415-949-4295


Auto Reply [What you get from LISTINFO. PGN]

"JS McBride & Co. PostMaster" <jim_mcbride@netmail.com>
Wed, 03 Nov 1993 22:11:07
Thank you for your mail to Jim McBride at JS McBride & Company. Due to the
volume of mail being handled by this account, this is an automatic reply.

PLEASE READ CAREFULLY!!

1. JS McBride is NOT collecting demographic information on email addresses.
   Due to the controversy surrounding this practice, we have discarded
   the product demographics we collected. We are however still collecting
   email addresses and user names.

2. The information collected (name and email address) will be offered in
   a printed white pages directory and in a white pages server on the net.

3. You DO NOT need to ask to have your name removed. BEFORE your name is
   used in the directory, you will receive mail asking for your permission.
   If you reply to the inquiry, your information will be used. If you do
   not reply, your name will NOT be used.

4. Comments regarding the white pages should be sent to Tom Manning at
   JS McBride & Company. <tmanning@netmail.com>

5. Mail to Jim McBride should be sent to <jimm@netmail.com>

6. Information regarding the purchase of the white pages directory should
   be sent to <listinfo@netmail.com> or telephone us at 415-949-4295

   Thank you for your time,
   Jim McBride

[Thanks to all of you (too many to note) who forwarded this to RISKS.  PGN]


Security of the internet

<WHMurray@DOCKMASTER.NCSC.MIL>
Thu, 4 Nov 93 06:50 EST
Our esteemed moderator complains as follows (aside, but in normal voice from a
high pulpit):

>....which is that system and network security stinks in most
>systems, particularly those on the Internet.

Not true, Peter.  System security stinks on one system in five in the
internet.  This is not "most."  However, it is sufficient to put the whole net
at risk.

The level of security in the internet is what it is.  That is to say, it is a
given; the laws associated with large numbers make it resistant to change.

It is sufficient for most of the applications or uses of the net.  Otherwise,
by definition, the uses would not take place.  At the same time it is
insufficient for many of the applications.

Users of the net must understand that it is an "open" net.  They may not rely
upon the security of such a network.  They may not rely upon the apparent
origin or destination of the messages.  They may not rely upon the behavior of
privileged users (system managers et al.) within the net.  They may not rely
upon the polite behavior of users of the net.

This is not to say that the origin and destination of many messages are
forged, that many privileged users are malicious, or that most users are rude.
If this were the case, the net would simply disintegrate.  Rather, it is simply
in the nature of an open network that some will be.

If it is important to your application that a message came from where it
appears to have come from, then you had better have sufficient evidence,
independent of that which the net provides you, that that is where it came
from.  If it is important to you that your message not be seen by anyone other
than its addressee, you had better talk in a code that only you and he
understand.

It is now relatively simple to automate such protection for your traffic at
the application layer.  Once automated, its use will be simple and transparent.
You will be able to enjoy both the wide connectivity and economy provided by
the net and the security required for your application.
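
In outline, such application-layer protection can be as simple as a keyed
checksum over each message, computed with a secret shared out of band.  The
sketch below is only an illustration of the idea (using a modern standard
library): it gives the receiver evidence of origin and integrity independent
of anything the net claims; confidentiality would additionally require
encryption.

    import hashlib
    import hmac

    # Sketch: sender and receiver share a secret exchanged out of band, and
    # every message carries a keyed MAC.  The receiver's evidence of origin
    # is then independent of anything the net itself claims.

    SHARED_SECRET = b"exchanged out of band, never over the open net"

    def seal(message):
        """Return the message together with a MAC over it."""
        tag = hmac.new(SHARED_SECRET, message, hashlib.sha256).hexdigest()
        return message, tag

    def verify(message, tag):
        expected = hmac.new(SHARED_SECRET, message, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, tag)

    msg, tag = seal(b"please wire 35 million crowns")
    print(verify(msg, tag))                                # True
    print(verify(b"please wire 70 million crowns", tag))   # False: tampered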

It is unrealistic to expect to get both, by default, from the same mechanism.
The real world does not work that way.

William Hugh Murray, Executive Consultant, 49 Locust Avenue, Suite 104; New
Canaan, Connecticut 06840 1-0-ATT-0-700-WMURRAY; WHMurray@DOCKMASTER.NCSC.MIL


Re: Security of the internet

RISKS Forum <risks@csl.sri.com>
Fri, 5 Nov 93 16:40:18 PST
Bill, Consider the network as a system in the large.  If almost all of those
systems use passwords, their security stinks.  [Only a few systems today use
token authenticators.]

If a Trojan horse in my system captures a password on your system as a result
of an FTP or TELNET from my system to yours, then YOUR system is now
vulnerable to an attack that might permit me to Trojan horse your system,
which in turn can compromise all of the systems that you FTP or TELNET to.  It
is as simple as that.  By induction, virtually the entire net is at risk
sooner or later, by iterative closure [cloture?].
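
The induction can be pictured as reachability in a graph of who logs in where.
The sketch below uses invented host names; once one node is Trojan-horsed,
everything reachable from it eventually falls.

    # PGN's iterative closure as a toy reachability computation: if users on
    # host A log in to host B, a Trojan horse on A can capture a password for
    # B, so compromise of A eventually reaches B, and so on.  Host names and
    # edges are invented.

    logs_into = {
        "alpha":   {"beta", "gamma"},
        "beta":    {"delta"},
        "gamma":   {"delta", "epsilon"},
        "delta":   set(),
        "epsilon": {"alpha"},
    }

    def compromised_closure(start):
        """All hosts reachable from one initially compromised host."""
        seen, frontier = {start}, [start]
        while frontier:
            host = frontier.pop()
            for nxt in logs_into.get(host, ()):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
        return seen

    print(sorted(compromised_closure("alpha")))
    # ['alpha', 'beta', 'delta', 'epsilon', 'gamma'] -- sooner or later, all of it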

Peter


Yu, "Automated Proofs of Object Code..." available

<horning@src.dec.com>
Thu, 04 Nov 93 11:28:54 -0800
The following SRC Research Report is now available via FTP on
gatekeeper.dec.com in: pub/DEC/SRC/research-reports/ or in hardcopy
(send an e-mail request to src-report@src.dec.com)

This report is based on Yuan Yu's Ph.D. dissertation, supervised by Bob
Boyer at the University of Texas at Austin.

"Automated Proofs of Object Code for a Widely Used Microprocessor",
Yuan Yu,   Report #114, October 5, 1993. 122 pages.

Computing devices can be specified and studied mathematically. Formal
specification of computing devices has many advantages; it provides
a precise characterization of the computational model, and allows for
mathematical reasoning about models of the computing devices and
programs executed on them.  While there has been a large body of
research on program proving, work has almost exclusively focused on
programs written in high-level programming languages.  Here we address
the important but largely ignored problem of machine-code program
proving.  This work formally describes a substantial subset of the
MC68020, a widely used microprocessor built by Motorola, within the
mathematical logic of the automated reasoning system Nqthm a.k.a.
the Boyer-Moore Theorem Proving System.  Based on this formal model,
we mechanized a mathematical theory to automate reasoning about object
code programs.  We then mechanically checked the correctness of
MC68020 object code programs for binary search, Hoare's Quick Sort,
the Berkeley Unix C string library, and other well-known algorithms.
The object code for these examples was generated using the Gnu C, the
Verdix Ada, and the AKCL Common Lisp compilers.


"Research Directions in Database Security" ed. by Teresa Lunt

"Rob Slade, Ed. DECrypt & ComNet, VARUG rep" <roberts@decus.arc.ab.ca>
5 Nov 93 9:55 -0600
BKRDDBSC.RVW   931014

Springer-Verlag, 175 Fifth Ave., New York, NY 10010, 212-460-1500, 800-777-4643
or 8 Alexandra Road, London   SW19 7JZ, UK  44-81-947 5885
"Research Directions in Database Security", Lunt (ed.), 1992, U$39.50

Generally, we speak of security in binary terms.  You either allow access to
the system or you don't.  You allow a file to be modified, or you don't.
There are, of course, some very complex issues to be faced, and access
situations can certainly become complicated.  But by and large, access
security can be resolved to a series of yes/no questions.

Not so with database security.  The situation is almost the reverse of access
security: there is no black or white, only shades of grey.  In database
security you have to assume that everyone needs and has access to the
database, but that certain answers are not to be given to certain people.
That's a fairly simple problem to deal with.  What about the situation where
many people can update, but you don't want two people simultaneously updating
the same record and thus corrupting the data?  Again, there are some reasonably
simple solutions, although when we add together many "simple" solutions, we
start to build a fairly "complex" system.
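
One of those reasonably simple solutions can be sketched as optimistic
concurrency control; the fragment below is an illustration, not something
taken from the book.

    # Each record carries a version number; an update presented with a stale
    # version is refused rather than silently overwriting someone else's work.

    class StaleRecordError(Exception):
        pass

    class Table:
        def __init__(self):
            self._rows = {}                      # key -> (version, value)

        def read(self, key):
            return self._rows.get(key, (0, None))

        def update(self, key, expected_version, new_value):
            version, _ = self._rows.get(key, (0, None))
            if version != expected_version:
                raise StaleRecordError(f"{key}: record changed underneath you")
            self._rows[key] = (version + 1, new_value)

    t = Table()
    t.update("acct-42", 0, {"balance": 100})

    v, _row = t.read("acct-42")                  # two clerks read version 1
    t.update("acct-42", v, {"balance": 150})     # the first update succeeds
    try:
        t.update("acct-42", v, {"balance": 90})  # the second is refused, not lost
    except StaleRecordError as e:
        print("refused:", e)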

Let's return to the question of access to information, using the example of the
census data.  There is no problem with anyone and everyone knowing how many
people are unemployed in Canada.  An aggregate number for British Columbia, or
even for North Vancouver, should still present no problems of confidentiality.
However, no one should know that Robert M. Slade is unemployed unless Robert M.
Slade chooses to divulge that datum.  Even then, Robert M. Slade should have
control over who should know that fact.  Therefore, we have a situation where
the individual records should not be divulged, but queries reported over a
range can be.  (Just to ensure that the issue doesn't get any easier, we have
to build in safeguards that would prevent the indirect revelation of
information by such means as generating queries of intersecting sets and
watching the changes.)
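
The intersecting-sets problem is easy to demonstrate in miniature.  In the
sketch below (with invented data), each aggregate query individually passes a
minimum-result-set rule, yet the difference between two permitted queries
still reveals one individual's datum--which is why the safeguards have to be
considerably more elaborate.

    # Two permitted aggregate queries differ by exactly one person, so their
    # difference reveals that person's datum.  The data are invented, and the
    # minimum-result-set rule below does not stop the attack.

    people = [
        {"name": "A", "unemployed": True},
        {"name": "B", "unemployed": False},
        {"name": "C", "unemployed": False},
        {"name": "Robert M. Slade", "unemployed": True},
    ]

    MIN_QUERY_SET = 3          # crude safeguard: refuse queries over tiny sets

    def count_unemployed(predicate):
        matching = [p for p in people if predicate(p)]
        if len(matching) < MIN_QUERY_SET:
            raise ValueError("query set too small; refused")
        return sum(p["unemployed"] for p in matching)

    everyone = count_unemployed(lambda p: True)
    all_but_one = count_unemployed(lambda p: p["name"] != "Robert M. Slade")
    print("individual datum leaked:", everyone - all_but_one == 1)   # True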

Such are the questions addressed in this book.  The contents are basically the
results of a three-day symposium held in 1988.  The symposium was sponsored by
the military, and many of the papers specifically address "classified" data,
but a number of the concepts have practical business applications as well.  (The military
involvement may also explain the four-year lead time until the book was
published.)

The book covers questions at all levels of the computing enterprise, from
computer architecture through operating systems to data base architecture to
conceptual approaches.  Not all possible topics are covered, but there is a
good range.

This is, quite definitely, for the database professional, and prior database
security background would be helpful.  The "alphabet soup," as Dorothy Denning
notes, flies thick and fast.  Most of the papers discuss the TCB: the book is
almost finished before a paper tells you what it is (trusted computing base).
The acronyms multiply, even using other acronyms:  halfway through one paper on
"A1 Secure DBMS (database management system) Architecture" the authors start
talking about "ASD".

As with all such omnibus volumes, the interest will vary with the topic, and
the quality varies with the author.  One essay examines the "man in the loop"
question in regard to the feasibility of automatic classification of data.
After seven pages spent clarifying the question, the answer is basically,
"No, computers can't understand text yet."  Generally, however, the title is
accurate.  These are the "cutting edge" (or perhaps *slightly* behind) issues
in data security, and an interesting discussion piece for those issues.

copyright Robert M. Slade, 1993   BKRDDBSC.RVW   931014
Permission granted to distribute with unedited copies of the Digest

          ======================604-984-4067=====================
DECUS Canada Communications, Desktop, Education and Security group newsletters
Editor and/or reviewer ROBERTS@decus.ca, RSlade@sfu.ca, Rob Slade at 1:153/733
DECUS Symposium '94, Vancouver, BC, Mar 1-3, 1994, contact: rulag@decus.ca


Re: CERT Reports and system breakins

Phil Karn <karn@qualcomm.com>
Thu, 4 Nov 93 10:23:44 -0800
Ethernet addresses are *hardly* the basis of an effective authentication
scheme. On most controllers I know, the manufacturer-assigned Ethernet address
is contained in a PROM, and the driver software must copy it to a register in
the Ethernet controller itself. And nothing prevents the software from writing
any address it likes, though of course in normal operation there is no reason
not to.

I understand that IEEE 802.3 actually requires this capability (no doubt at
DEC's behest since DECNET uses its own Ethernet addresses and ignores the
PROM).

It would not be sufficient to argue that modifying the Ethernet address on
"most" systems is "difficult" — it's quite trivial to do it on many others,
particularly PCs, for which networking hardware and software sources are
readily available.
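
To make the point concrete, here is a modern illustration (not from Karn's
message): where raw packet access is available, the source address in an
Ethernet frame is just six bytes that the software fills in, PROM or no PROM.

    import struct

    # An Ethernet II frame is just bytes the software assembles; the source
    # address field can be anything.  (Sending it on Linux needs a raw
    # AF_PACKET socket and root, shown commented out; the addresses,
    # interface name, and payload are invented.)

    def build_frame(dst_mac, forged_src_mac, ethertype, payload):
        return dst_mac + forged_src_mac + struct.pack("!H", ethertype) + payload

    frame = build_frame(
        dst_mac=bytes.fromhex("ffffffffffff"),         # broadcast
        forged_src_mac=bytes.fromhex("020000facade"),  # not the card's PROM value
        ethertype=0x88B5,                              # an experimental EtherType
        payload=b"hello from nowhere in particular",
    )
    print(len(frame), "byte frame carrying a forged source address")

    # import socket
    # s = socket.socket(socket.AF_PACKET, socket.SOCK_RAW)
    # s.bind(("eth0", 0))
    # s.send(frame)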

We need strong security mechanisms based on good cryptography and well thought
out protocols. They're underway, but they will take time to develop. Although
it's tempting to toss out little quick hacks that might complicate a cracker's
life for, oh maybe 15 minutes or so, this sort of thing only diverts us from
the effort required to provide meaningful security in the long run.

Phil


Re: CERT Reports and system breakins

A. Padgett Peterson <padgett@tccslr.dnet.mmc.com>
Thu, 4 Nov 93 13:35:32 -0500
Sorry, the original post was made in haste, and with the aid of 20-20
hindsight I am regretting that now - not that it was made, but that I did not
go into sufficient detail.

What I should have made clear is that if the hardware addresses are known for
"approved" systems, then "unapproved" addresses will stand out and
"unapproved" system could include the case of the router/bridge to the outside
world.
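
Padgett's discriminant amounts to an inventory lookup on the source hardware
address.  The sketch below uses invented addresses; as the previous message
notes, this screens out the casual case but authenticates nothing.

    # Screen traffic by source hardware address against an inventory of
    # approved systems; addresses are invented.

    APPROVED = {
        "08:00:2b:11:22:33",                 # known internal workstation
        "08:00:2b:44:55:66",                 # known internal workstation
    }
    OUTSIDE_GATEWAY = "aa:00:04:00:12:34"    # router/bridge to the outside

    def screen(src_mac):
        mac = src_mac.lower()
        if mac in APPROVED:
            return "ok: approved internal system"
        if mac == OUTSIDE_GATEWAY:
            return "note: traffic arriving from outside the site"
        return f"ALERT: unapproved hardware address {mac}"

    for mac in ("08:00:2B:11:22:33", OUTSIDE_GATEWAY, "02:00:de:ad:be:ef"):
        print(screen(mac))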

Certainly it is possible to change a hardware address (though I had not
realized just how easy it was), but whoever changes it has a choice: picking
another "unapproved" number really does no good, while picking an
"approved" number risks collision.  Further, if the hardware address is
masked when it leaves my site, changing the number does no good.

So we are left with the case of an "approved" address that I can
retrieve, and now there are other checks possible once the "approved"
system identification is known, e.g., manufacturer, type of equipment,
and embedded system information.  All of this is still spoofable, but it is
now getting to be a major undertaking that must start from the inside.

Add on the ability to retrieve which hub the contact is on, and we are
approaching the point where it is easier for the intruder to use the spoofed
machine than it is to spoof it.

I thoroughly agree that effective encryption solves most authentication
problems in one motion - just look at what I was writing in InfoSecurity News
and a couple of other places over the last few years. The difference is that
hardware address usage is simple, effective as a discriminant, and adds a
layer of security that was not there before.

Padgett


Call for Participation - FIRST Incident Response Tools Work Grp

"Michael S. Hines" <MSHINES@freh-03ms.adpc.purdue.edu>
3 Nov 93 13:27:33 EST
* * * * * * * *  S E C O N D    N O T I C E * * * * * * * * * *

The Incident Response Tools Working Group (IRTWG) of the Forum of Incident
Response and Security Teams (FIRST) has been formed for the purpose of
developing a catalog to assist incident response teams (often called
Computer Emergency Response Teams or CERTs) in the selection and
acquisition of tools for use in incident response tasks.  The catalog
will be available in electronic form to anyone who wants a copy.

David Curry of the Purdue CERT (PCERT) is chairing the group.  My name is Mike
Hines, also of the PCERT.  I am a Senior Internal Auditor for Information
Systems at Purdue.  I have volunteered to coordinate compilation of a mailing
list of potential providers of tools for use in incident response situations.

At this point I need two pieces of information from you:

(1) An indication of whether you would like to assist me in compiling this
mailing list.  We want to get as broad a coverage as possible in this
task.  If you happen to have a source of several addresses, I would like your
assistance.  This is mostly a matter of providing me leads so that we achieve
as wide a coverage as possible.  I will be creating and maintaining the
mailing list here at Purdue.

(2) Any and all leads for sources of tools for incident response handling.
Areas we are focusing on are:

  (a) Incident Detection... tools such as virus scanners, file integrity
checkers, auditing systems, and intrusion detection systems... tools which
monitor systems for signs of security violations.

  (b) Incident Response...tools such as keystroke monitoring systems, network
packet capture, program disassemblers, and source code fingerprinting...tools
which can be used to gather information during an incident.

  (c) Incident Recovery...tools such as virus eradicators and file integrity
checkers...tools which can be used to determine the scope of the damage done
during an incident and which can help restore the system to its pre-incident
state.  (A minimal sketch of one such file-integrity checker appears after
this list.)

  (d) Incident Tracking...tools such as specialized database systems of one
sort or another...tools which can be used to maintain statistics about
incidents and archives of known attacks and defenses.
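
The fragment below is the minimal sketch referred to in category (c): a
baseline of file hashes compared against the current state.  Paths are
placeholders and it is not any particular existing tool.

    import hashlib
    import os

    # Record a baseline of file hashes, then report anything that changed,
    # disappeared, or appeared.  A real tool must also protect the baseline
    # itself from tampering.

    def hash_file(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def snapshot(root):
        """Map every regular file under root to its hash."""
        result = {}
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                result[path] = hash_file(path)
        return result

    def compare(baseline, current):
        changed = [p for p in baseline
                   if p in current and baseline[p] != current[p]]
        missing = [p for p in baseline if p not in current]
        added   = [p for p in current if p not in baseline]
        return changed, missing, added

    # Usage sketch:
    #   baseline = snapshot("/usr/local/bin")      # taken while trusted
    #   ... after a suspected incident ...
    #   changed, missing, added = compare(baseline, snapshot("/usr/local/bin"))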

For each vendor/publisher/creator of tools in the above categories, please
send the following information:

Contact Name:

Company Name:

Street Address:

City:

State:

Zip/Postal Code:

Country:

E-Mail Address of Contact:

Product Name(s):

Also if you know of another person who would be a good contact as a source
of leads, please send their name and e-mail address along.  I will contact
them with this message to solicit new leads.

Thank you for your assistance.

Michael S. Hines, Internal Auditor-EDP, Purdue University, 1065 Freehafer
Hall, West Lafayette, IN 47907-1065 mshines@ia.purdue.edu (317) 494-5845
