The RISKS Digest
Volume 18 Issue 49

Wednesday, 25th September 1996

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator


Contents

Minnesota State Senate candidate photo "mistake"?
PGN
CIA disconnects home page after being hacked
PGN
Cracker Bill Passes Senate
Edupage
AOL Resumes Junk E-Mail Block, Settles Class Action Suit
Edupage
Massachusetts welfare fraud investigators fired: tax-record misuse
Saul Tannenbaum
Heart monitor software
Jim Garrison
Automated toll collection test fails
George C. Kaplan
Warning! NT 4.0 utility wipes system configuration
Alan Wexelblat
Re: An unlosable casino game
Hal Lockhart
FTC gets involved in P-trax debate
Bear R Giles
Re: Lexis-Nexis P-Trak
Robert Ellis Smith
Re: Cracker Attack Paralyzes PANIX
Stephen Tihor
The RISKS of using "personal" info in authentication
Roger Moar
More ATM Risks
Roger Altena
Re: When is -32768 != -32767-1 ?
Bear Giles
Sidney Markowitz
Peter Jeremy
Mark Brader
Henry G. Baker
Erling Kristiansen
FWISC96 San Jose, CA
Mich Kabay
Info on RISKS (comp.risks)

Minnesota State Senate candidate photo "mistake"?

"Peter G. Neumann" <neumann@csl.sri.com>
Tue, 24 Sep 96 11:27:50 PDT
A photograph of Minnesota State Senate candidate John Derus appeared
(without his name) on primary day in the *Star Tribune*, seemingly in
connection with an article on a Philadelphia charity fraud case.  Derus
charges that the *Star Tribune* did this intentionally.  (A lawyer for Derus
cited a sworn statement from a caller who, upon complaining to the paper,
was told that the use of the photograph ``may have been a personal act of
vengeance by an individual employee for which the *Star Tribune* is not
responsible.''  The *Star Tribune* had previously criticized Derus, and
opposed him when he ran for mayor in Minneapolis in 1993.)  The newspaper
apologized and published an erratum the next day, claiming the mix-up was an
innocent mistake resulting from a computerized photo-numbering system from
which earlier photos had not been removed.

This case could represent another RISKS example of how muddy the line can be
between an innocent mistake and a malicious act that masquerades as an
innocent mistake — a topic that I discuss in my book, *Computer-Related
Risks*.  [Source: *San Francisco Chronicle*, 24 Sep 1996, A10.]


CIA disconnects home page after being hacked

"Peter G. Neumann" <neumann@csl.sri.com>
Mon, 23 Sep 96 11:29:54 PDT
The CIA web site (http://www.odci.gov/cia) was penetrated by a group of
Swedish hackers on 18 Sep 1996, causing the CIA to pull the plug the
following day.  The altered home page said, "Welcome to the Central
Stupidity Agency."  It also had valid links to Playboy and hacker netsites,
and fictional links to ``news from space'' and ``nude girls''.  Apparently,
the Swedish intruders were protesting a Swedish court case against a group
of youths who were caught breaking into computers in 1991.  The CIA is
presumably restoring its earlier web pages, which included spy-agency press
releases, speeches, and other publicly available data, including CIA's
World Fact Book — all of course unclassified.  [I just checked again before
putting this issue out.  The http address is still not working.  The altered
web-site content is reportedly at www.skeeve.net/cia .  I did not check it.]

On the same day that the CIA shut down its home page, the Justice Department
reopened its home page (http://www.uswdoj.gov), which had been hacked into
the "Department of Injustice" in August.  [Source: *San Francisco
Chronicle*, 20 Sep 1996, A12]

  [It will be interesting to see how these reminders (PANIX too) that our
  infrastructure is rather weak will play out.  RISKS readers have of
  course recognized the risks for many years, but Government policies have
  not consistently reflected them.  The flurry of activity might just
  possibly encourage a change in attitudes toward security, encryption,
  strong authentication, etc., and generally a ratcheting up of the integrity
  of the infrastructure — although I am not holding my breath.  However, the
  administration is reportedly considering a possible partial relaxation of
  export controls.  Stay tuned.  PGN]


Cracker Bill Passes Senate (Edupage, 24 September 1996)

Edupage Editors <educom@elanor.oit.unc.edu>
Tue, 24 Sep 1996 22:17:37 -0400 (EDT)
A bill (S 982) that would make it easier to prosecute computer crimes passed
the Senate last Friday, but its companion bill in the House (HR 4095) is not
scheduled for any action.  The National Information Infrastructure Act of
1996, sponsored by Senator Patrick Leahy (D-Vt.), would explicitly outlaw:
interstate or foreign theft of information by computer; blackmail and
threats against computer systems and networks; and unauthorized use of
computer systems.  Leahy says a Carnegie Mellon University report found that
more than 12,000 computers were attacked in more than 2,400 incidents in
1995.  The Computer Systems Policy Project reports that U.S. companies lost
somewhere between $2- and $4-billion last year due to security breaches in
computer systems.  (BNA Daily Report for Executives 20 Sep 1996, A35;
Edupage, 24 September 1996)


AOL Resumes Junk E-Mail Block, Settles Class Action Suit (24 Sep 1996)

Edupage Editors <educom@elanor.oit.unc.edu>
Tue, 24 Sep 1996 22:17:37 -0400 (EDT)
America Online has received permission from a federal appeals court in
Philadelphia to resume its practice of blocking junk e-mail messages sent to
its subscribers.  Cyber Promotions Inc. had filed for and received an
injunction earlier this month ordering AOL to end its practice of blocking
unsolicited messages to its members from companies that specialize in "junk
e-mail" for promotional purposes.  A related lawsuit is scheduled to go to
trial in November.  In a separate case, a judge in San Francisco tentatively
approved a settlement to a class action suit brought by subscribers who
claimed they were improperly charged for fractions of minutes that they
didn't use.  The settlement calls for refunds of $2.95 for each $300 in
charges to former members.  AOL's total payout could add up to $700,000,
$200,000 more than was agreed to in the preliminary settlement.  (*Wall
Street Journal*, 23 Sep 1996; Edupage, 24 Sep 1996)


Massachusetts welfare fraud investigators fired: tax-record misuse

Saul Tannenbaum <stannenb@emerald.tufts.edu>
Tue, 24 Sep 1996 20:23:28 -0400 (EDT)
From *The Boston Globe*, Friday, 20 Sep 1996:
Gov. William F. Weld's aides yesterday fired two state welfare-fraud
investigators who allegedly browsed the confidential tax records of some of
Boston's most beloved sports heroes.  [...]  The pair left ``electronic
fingerprints'' after calling up the records of Larry Bird, Ray Bourque and
Drew Bledsoe, along with those of two of the investigators' former bosses.
It is unclear why they did it, or whether they snooped through anyone else's
files.  [...]  Last week, when they discovered the breach of computer
security by someone at the Bureau of Special Investigation, Department of
Revenue officials revoked the bureau's access to private tax records. That
access has yet to be restored.

Saul Tannenbaum, Tufts University Computing and Communications Services
stannenb@emerald.tufts.edu  http://www.tufts.edu/~stannenb


Heart monitor software

Jim Garrison <jhg@mpd.tandem.com>
Tue, 24 Sep 1996 11:34:37 -0500
My wife works as a physical therapist in a local hospital, which has several
sub-disciplines of PT including cardiac rehab.  The cardiac rehab department
has an exercise area in which patients can use stationary bicycles while
connected to heart monitoring equipment, under the supervision of a nurse.
The following "interesting" (from a RISKS perspective) incident occurred
last week.

First, some background.  According to my wife, the monitors have CRTs and
display a standard EKG trace in a box, along with a large numeric heart-rate
indicator.  (Note: there are several different types of EKG devices, the
most sensitive and discriminating being what is known as a "twelve-lead" EKG.
The monitor in use here is less sensitive, has fewer leads, and is intended
for monitoring as opposed to diagnostic work).

When connecting a patient to the monitor, part of the procedure involves
setting a target heart rate.  If the programmed rate is exceeded during
exercise, the EKG trace turns red, but there is no audible alarm.  There was
at one time an audible alarm (on previous monitoring equipment) but it was
considered more of a nuisance than a benefit, since it is normal to exceed
the target by a few beats per minute for short periods, and this is not
considered dangerous.

The monitoring software also has an option that can double the
amount of EKG trace displayed by compressing the time axis. This
operation mode is almost never used.

Now for the interesting part.  A patient was attached to the monitor and
began exercising.  Somehow, unbeknownst to the nurse, the display was
switched into compressed mode.  When she glanced at the display, she saw
what looked like a very high heart rate based on the spacing of the EKG
peaks.  She immediately went into "emergency mode", and set off a chain of
events that led to starting an IV, calling the patient's doctor and
preparing to administer a strong heart-rate-reducing drug.  The doctor
requested a 12-lead EKG for confirmation, at which point the error was
discovered.  Luckily, no drug had yet been administered. In declaring the
emergency, the nurse had to miss or ignore a number of cues:

1) the numeric display still showed the correct heart rate (75)

2) the display had not changed color

3) the patient showed no signs of distress, remained totally
   calm, and repeatedly asserted he felt nothing out of the ordinary.

While this is a fairly clear case of operator error, it's interesting to
consider what the software designers could have done to make the error less
likely:

1) Implement a progressive alarm system (a rough code sketch follows this list):

   < 10 bpm over target     Yellow display, no alarm
     10-20 bpm over target  Yellow display, single beep every ten seconds
   > 20 bpm over target     Red display, continuous alarm

2) Provide a mode indication on the display when in
   compressed mode (maybe change the color of the EKG trace)
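
A rough C sketch of the first suggestion's graded threshold check
(illustrative only: the thresholds are those in the table above, but the
function and alarm levels are hypothetical, not taken from any real monitor
software):

  typedef enum { ALARM_NONE, ALARM_YELLOW, ALARM_YELLOW_BEEP, ALARM_RED }
      alarm_level;

  /* Map how far the measured rate is over the target rate onto a graded
     alarm level. */
  alarm_level classify_rate(int heart_rate, int target_rate)
  {
      int over = heart_rate - target_rate;

      if (over <= 0)
          return ALARM_NONE;         /* at or below target */
      else if (over < 10)
          return ALARM_YELLOW;       /* yellow display, no alarm */
      else if (over <= 20)
          return ALARM_YELLOW_BEEP;  /* yellow display, beep every 10 s */
      else
          return ALARM_RED;          /* red display, continuous alarm */
  }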

The first suggestion sounds good, but also might lead to complacency on the
operator's part and excessive dependence on the alarm system itself, with
more drastic consequences when IT fails.  Sigh! Sometimes it seems like you
just can't win.

The second suggestion just highlights the fact that man-tool interface
"modality" is something people are not used to dealing with, as RISKS
readers are well aware.  (Cite five examples of tools in existence before
1950 with more than one interface mode; how about before 1900?)

Jim Garrison


Automated toll collection test fails

"George C. Kaplan" <gckaplan@cea.Berkeley.EDU>
Mon, 23 Sep 1996 22:01:47 -0700
An article in the 23 Sept. 1996 *San Francisco Chronicle* describes how
Caltrans (California Dept. of Transportation) tested an automated toll
collection system on the Carquinez Bridge, hoping to be able to use the
system on all ten toll bridges in the state.

The test failed because the error rate was too high: Caltrans wants a 99.95%
success rate (5 errors for every 10,000 tolls), but the system could do no
better than 99.1%.  Apparently the system had trouble recognizing different
types of [multiwheel] vehicles in order to charge the correct toll.

The article didn't explain how the system is supposed to distinguish,
say, a horse trailer from a big rig.  A given vehicle may be in different
toll classifications depending on what kind of trailer it's towing, so
there must be some cues other than the encoding in the box carried on
the vehicle.

There was no discussion of correct toll amounts charged to the wrong person,
or what drivers would have to do to correct incorrect bills.  Nor was there
any mention of privacy issues (such as government tracking of individuals'
movements) that have appeared in previous RISKS discussions of automated
toll collection.

George C. Kaplan  gckaplan@cea.berkeley.edu  510-643-5651


Warning! NT 4.0 utility wipes system configuration

Graystreak <wex@tinbergen.media.mit.edu>
Mon, 23 Sep 1996 18:54:51 -0400
Forwarded-by: Logan Sanders <lsanders@chromatic.com>

NT users beware! Retail copies of both the Workstation and Server versions
of Windows NT 4.0 shipped with an undocumented system-wiping utility. The
file Rollback.exe erases key components of the system registry, disabling
the operating system.

Microsoft Corp. officials say that once the file has been executed, the
changes cannot be undone and require a complete reinstallation of the
operating system. At least one incident of accidental erasure has occurred
and Microsoft is mulling over how to inform customers of the problem.

This undocumented feature could do the most damage to NT 4.0 Server users
because it erases critical security and user-account information. Without
an up-to-date backup, network administrators will have to recreate all of
the users' account and password profiles.  Microsoft this week sent out an
E-mail warning to its channel partners.  It stated that after running the
utility "the next thing the customer knows, they are staring at the set-up
screen and are completely down."

Rollback.exe was designed to allow OEMs to test NT with their hardware and
software configurations, and then return systems to their pre-installation
state. The file is located in the support\deptools\I386\ directory of the
NT CD-ROM and is not installed on the system by default. But the lack of
any online documentation or escape route once the program has begun has put
curious users at risk.

Microsoft officials say that more than 150,000 copies of NT Server 4.0 have
been sold since its release in late July.  Microsoft has posted an entry in
its online Knowledgebase, but has not determined how it will notify
customers and OEMs.


Re: An unlosable casino game (RISKS-18.48)

Hal Lockhart <hal@platsol.com>
Wed, 25 Sep 1996 12:22:42 -0400
> just click on "BACK" ... and you undo your loss!

This reminds me of a bug I have seen in many computer-based gambling games:
the failure to check for negative bets.  I always try this and it frequently
works.  You just make a negative bet and lose on purpose and the game
subtracts your bet from your winnings!
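
A minimal sketch of the flaw and of the missing check (in C; the function
and variable names are hypothetical, not from any actual game):

  /* Settle a bet.  With bet = -100 and a deliberate loss, the naive
     "winnings -= bet" ADDS 100 to the balance; the range check below is
     the validation these games omit. */
  int settle_bet(long *winnings, long bet, int player_won)
  {
      if (bet <= 0)
          return -1;                /* reject zero or negative bets */
      if (player_won)
          *winnings += bet;
      else
          *winnings -= bet;
      return 0;
  }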

Harold W. Lockhart Jr., Platinum Solutions Inc., 8 New England Executive Park
Burlington, MA 01803 USA  (617)229-4980 X1202 hal@platsol.com


FTC gets involved in P-trax debate

Bear R Giles <bear@indra.com>
Tue, 24 Sep 1996 14:14:07 -0600 (MDT)
According to Reuters, on Monday the Federal Trade Commission recommended
Congress tighten consumer confidentiality laws to help stop credit fraud and
identity fraud.  The body of the article clearly referred to the current
uproar over the Lexis-Nexis database, although not by name.

The article said that in a letter to Sen. Richard Bryan, D-Nev., the FTC
stated that "fraud concerns outweighed the limited legitimate uses of this
information for locating individuals."  (quoting the article, not
necessarily the letter.)

The letter further recommended that the Fair Credit Reporting Act be amended
to require a legal release before a person's maiden name, Social Security
number, prior addresses and date of birth can be released.  I didn't see a
reference to phone numbers, despite the perception by many people that
"unlisted" phone numbers aren't readily available from such sources.

On an unrelated note, a brief item in the Saturday business section said
that a three-judge appellate court overturned the lower court's injunction
against AOL's blocking of spam from CyberPromotions.  It's probably not a
coincidence that the spam I received from them (non-AOL) on Monday
highlights their new "block-proof, flame-proof" software.

Bear Giles  bear@indra.com


Re: Lexis-Nexis P-Trak (RISKS-18.44)

Robert Ellis Smith <0005101719@mcimail.com>
Tue, 24 Sep 96 11:51 EST
The furor over the Lexis-Nexis database (which at first made anybody's Social
Security number available to any stranger subscribing to the service) has
caused a 180-degree turnaround in Congress, where anti-consumer amendments
to the Fair Credit Reporting Act have been close to passage for the past
four years.  Congressional offices are getting lots of angry messages about
Lexis-Nexis, because people realize that the source of the data is "header
information" or "above-the-line information" at the top of our credit
reports.  The Federal Trade Commission over the past seven years has allowed
this identifying information to be sold by credit bureaus without any
protections of the Fair Credit Reporting Act (among other things, notice to
the consumer).

Now, Congressional Republicans are actually thinking about protecting header
information and reversing the FTC.  They are thinking twice too about voting
for anti-consumer amendments to the FCRA.  People who are upset about Lexis-
Nexis can have a real impact this week if they promptly convey their outrage
to Members of Congress, especially their Senators and notably Sen. Richard
H. Bryan, Democrat of Nevada, who might be supportive, fax 202/224-1867, and
Sen. Alfonse D'Amato, Republican of New York and chair of the Banking
Committee, who is probably not supportive, fax 202/224-5871.

Do it this week.

  [P.S. If Americans - and most users of the Internet - were more upset
  about the demands for photo ID in order to board an airplane than about
  the Lexis-Nexis data base, the people in Washington would take note and
  reverse this pernicious invasion of privacy.  Write the Federal Aviation
  Administration, the Department of Transportation, Members of Congress, and
  the news media.]

Robert Ellis Smith  Privacy Journal, Providence RI  401/274-7861
5101719@mcimail.com

  [I meant to insert a note earlier regarding the *pretense* that SSNs are
  hidden in P-Trak.  If you have a CD-ROM version of the database, it is
  a very simple task to extract the SSNs.  So much for "The SSNs are not
  accessible."  PGN]


Re: Cracker Attack Paralyzes PANIX (Edupage, 12 Sep 1996)

Stephen Tihor <TIHOR@ACFcluster.NYU.EDU>
Sun, 22 Sep 1996 23:14:00 -0400 (EDT)
In general, the PANIX attacks simply mark the point in the life of the
Internet where zero-cost, zero-security stops being a reasonable marketing
point.  If the net is to survive, then it, or the part that plans to become
useful, must add enough authentication and auditability to be able to trace
bad actions back to the actors.

Once that is possible, then normal, time-tested social mechanisms can be
employed.  The international nature of the Internet means that
telecommunication standards, the web of bilateral agreements, and the law of
the sea probably bracket the techniques that will work.


The RISKS of using "personal" info in authentication

Roger Moar <rmoar@apertos0.csc.UVic.CA>
Tue, 24 Sep 1996 15:41:53 -0700 (PDT)
I was looking for some financial information, and came across the Barron's
Online WWW site during my search. I tried to enter an area that is
restricted to members, and was asked for my username and password. As I
don't have an account, I clicked on OK, and was allowed to "FIND my
password". I put in a silly username, and was greeted with:

"Hello John Doe. When you registered we asked you for your favorite
color. Please enter your favorite color:"

(The names and passwords here are obviously fictitious.) Naturally, I
couldn't resist, and before I reached the end of the rainbow, I received:

  "Congratulations! Hello John Doe. Your password is 3246297684".

I think the RISKS from this system are obvious. Repeated guesses of
usernames brought up requests for the mother's maiden name, date of birth,
or favorite color. Additionally, the usernames are easily guessed; any
proper name seems to be accepted as a username.

It's a nice way to give away information, but I'm not sure I would
trust these people with my financial security...

Roger Moar — rmoar@csr.uvic.ca | http://apertos0.csc.uvic.ca/~rmoar


More ATM Risks

Roger Altena <roger@mincom.com>
Wed, 25 Sep 1996 08:37:15 +1000
In my earlier days I was one of a team of ATM programmers.  We had a machine
in the office, and we tested the software by holding on to cards and
envelopes, putting rubbish in the money bin, not taking receipts, and every
other thing we could think of.  It was a terrific job!

The operating system was specifically written for the ATM.  It had
instructions such as "get notes from hopper" and "present notes".  If an
instruction was used, the software automatically generated a mandatory error
routine, which forced us to think of the correct action to take in each
situation.  It presented an error code that could take dozens of values for
each instruction, such as "less than the requested number of notes were
presented", "some notes not taken", "all notes not taken", etc.

These days the software in ATMs seems to be written using generic languages,
and running on generic PCs. The risk is that by reducing costs, we take on
the responsibility for thinking up each possible error ourselves, rather
than leaving that job to a specialist who knows the hardware and its
capabilities.

We also rely on a system where the hardware consists of an ATM built by one
company, a PC built by a second, and interface hardware between the ATM and
the PC from a third.  The PC operating system is then written by a fourth
company, the ATM-controlling software by a fifth, and so it goes on.
Needless to say, the risks for misunderstanding and omission are multiplied
manyfold.

Of course, this goes both ways.  The hardware-specific software we used took
up to 24 hours to generate, so the final software was full of patches
applied through a hooked-in monitor.  With modern PCs, a wide range of
software development tools would greatly assist in producing stable and
well-structured code.

Roger Altena  roger@mincom.com


Re: When is -32768 != -32767-1 ? (RISKS-18.48)

Bear Giles <bear@indra.com>
Tue, 24 Sep 1996 03:41:01 -0600 (MDT)
Numerous people have written me directly to point out that C defines
integers as a sequence of digits, and then uses unary negation for literal
negative constants.

I wish to point out that both the Gnu C and HP/UX (the only compilers I've
used for the past four years until the past few months) define

  #define   SHRT_MAX    32767
  #define   SHRT_MIN    (-32768)

Many compilers are fairly intelligent about how they handle literal
constants and expressions; perhaps this is how some (but not all) compilers
can accept -32768 directly.
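
For illustration, a small program (hypothetical; the commented behaviour
assumes an implementation with 16-bit int) showing why -32768 is not a
single constant in C:

  #include <stdio.h>

  int main(void)
  {
      /* There are no negative integer constants in C: -32768 is the
         unary minus operator applied to the constant 32768.  With
         16-bit int, 32768 does not fit in int and is typed as long, so
         -32768 is a long, while (-32767-1) is an int with the same
         value.  On such a machine the two sizes below differ; with
         32-bit int they are the same. */
      printf("sizeof -32768     = %lu\n", (unsigned long)sizeof(-32768));
      printf("sizeof (-32767-1) = %lu\n", (unsigned long)sizeof(-32767-1));
      return 0;
  }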

Bear Giles bear@indra.com

  [A few comments follow.  Similar overlapping comments were received
  from many of you, including
     Dik.Winter@cwi.nl (Dik T. Winter).
     Andy Newman <andy@research.canon.com.au>,
     source@netcom.com (David Harmon),
     thorinn@diku.dk (Lars Henrik Mathiesen),
     "Kevin F. Quinn" <kfq@wormhole.compd.com>.
     Jon Reeves <reeves@zk3.dec.com>, and
  perhaps others unread whose subject line was merely "RISKS-18.48".
  In addition, I excerpt starkly from some of the following replies.  PGN]


When is -32768 != -32767-1 ? (Giles, RISKS-18.48)

Sidney Markowitz <sidney@research.apple.com>
Mon, 23 Sep 1996 12:55:28 -0700
 [...] The problem is not with the Borland C compiler, which performs
according to spec, but with Gnu Flex, which is generating C code that does
not conform to the standard.  Or perhaps the problem (and RISK) is with the
type of thinking that expects people to see something like "(-32767-1)" in a
large standards document and instantly understand all of the nuances of the
standards committee's careful (and opaque) wording.

 — sidney markowitz <sidney@research.apple.com>


Re: When is -32768 != -32767-1 ? (Giles, RISKS-18.48)

Peter Jeremy <jeremyp@gsms01.alcatel.com.au>
Tue, 24 Sep 1996 08:39:08 +1000 (EST)
>Still, this situation is better than I would face when using a personal copy
>of Microsoft Visual C++ 1.5.  It limits SHRT_MIN to -32767.

This is a bit more dubious.  Unless Microsoft have done something strange to
two's-complement arithmetic, I'd say the definition was wrong, since a short
can contain a value less than SHRT_MIN.


Re: When is -32768 != -32767-1 ? (Giles, Risks-18.48)

Mark Brader <msb@sq.com>
Wed, 25 Sep 96 02:16:45 EDT
[...] People who find C a RISKy language will like to cite this quirk as
evidence.  On the other hand, the absence of negative numerical constants
has its advantages — for instance, it frees the programmer from ever having
to wonder whether things like -1 and -(1) might behave differently in some
contexts.

[...] So here's another RISK — a compiler where a warning *isn't* a warning.
Instead it's 1/N of a fatal error, for some particular value of N.
In my opinion that's a serious bug in the compiler.

Mark Brader, msb@sq.com, SoftQuad Inc., Toronto


When is -32768 != -32767-1 ? (Giles, RISKS-18.48)

Henry G. Baker <hbaker@netcom.com>
Tue, 24 Sep 1996 07:32:37 -0700 (PDT)
So much for the 'science' part of 'computer science'...

There is a trivial and elegant solution to the problem of the asymmetry
of 2's complement integers and how to input and convert them.

Low, James R.  "A Short Note on Scanning Signed Integers".  ACM Sigplan
Notices 14, 1 (Jan. 1979), 55-56.

Briefly, instead of keeping the number as a _positive_ integer, you
keep it as a _negative_ integer (which has a greater range), and then
convert back if it is positive!  In other words, "err on the Low side".

Pseudocode from Low's paper:

RESULT := 0;
while there are more digits, do
  RESULT := RESULT * 10 - current_digit;
if sign is positive, then RESULT := - RESULT
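
Rendered as a self-contained C helper (a sketch only, not code from Low's
paper; it assumes two's-complement arithmetic, where |INT_MIN| > INT_MAX):

  #include <limits.h>

  /* Accumulate the value as a NON-POSITIVE number, then flip the sign
     at the end: this way INT_MIN can be scanned even though +|INT_MIN|
     is not representable.  Returns 0 on success, -1 on overflow or
     malformed input. */
  int scan_int(const char *s, int *out)
  {
      int negative = 0;
      int result = 0;                  /* stays <= 0 while scanning */
      int digit;

      if (*s == '-')      { negative = 1; s++; }
      else if (*s == '+') { s++; }
      if (*s < '0' || *s > '9')
          return -1;                   /* no digits at all */

      for (; *s >= '0' && *s <= '9'; s++) {
          digit = *s - '0';
          if (result < -(INT_MAX / 10))
              return -1;               /* result * 10 would overflow */
          result *= 10;
          if (result < INT_MIN + digit)
              return -1;               /* subtracting digit would overflow */
          result -= digit;
      }
      if (!negative) {
          if (result == INT_MIN)
              return -1;               /* +|INT_MIN| does not fit */
          result = -result;
      }
      *out = result;
      return 0;
  }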

While we're on the subject, no one should be allowed near a numeric
conversion routine until (s)he has read the following two papers:

Clinger, William D.  "How to Read Floating Point Numbers Accurately".  ACM
Sigplan'90 Conference on Programming Language Design and Implementation, ACM
Sigplan Notices 25, 6 (June 1990), 92-101.

Steele, Jr., Guy L., and White, Jon L.  "How to Print Floating-Point Numbers
Accurately".  ACM Sigplan'90 Conference on Programming Language Design and
Implementation, ACM Sigplan Notices 25, 6 (June 1990), 112-123.

Henry Baker  www/ftp directory: ftp.netcom.com:/pub/hb/hbaker/home.html


Re: When is -32768 != -32767-1 ? (Bear R Giles)

Erling Kristiansen <erling@wm.estec.esa.nl>
Wed, 25 Sep 1996 09:03:52 +0200 (MET DST)
This reminds me of a problem I had more than 10 years ago, using
FORTRAN IV on an HP 1000. The comparison

   IF (I .EQ. J)

results in FALSE when I and J are both -32768.  This is because the HP
1000 has no COMPARE instruction, so, by necessity, the comparison is
done by SUBTRACTING one number from the other.  Subtracting -32768 from
-32768 (in 2's complement arithmetic) yields -32768, and an
overflow condition which is not tested for by the generated code.

I was using this for comparing bit-patterns, and one of the patterns I
was comparing against was Octal 100000 (= decimal -32768).

This confused me for a while!


FWISC96 San Jose, CA

Mich Kabay <75300.3232@CompuServe.COM>
24 Sep 96 12:25:14 EDT
NCSA is hosting its 2nd Firewall, Web & Internet Security
Conference on Sept 30th and Oct 1st at the Red Lion Hotel, 2050
Gateway Place, San Jose, CA 95110.  The exhibit hall is free
and features most of the major developers of commercial
firewall products.  There will also be free vendor technical
presentations open to exhibit hall visitors.

Details about the conference can be obtained by sending e-mail
to fwcon96west@ncsa.com or by visiting the NCSA web site at
www.ncsa.com.

M. E. Kabay, Ph.D / Director of Education
National Computer Security Association
