
The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 17 Issue 5

Friday 7 April 1995


o The risks of flying pigs
Jose Reynaldo Setti
o Same Old Song: More calendar problems
Chuck Weinstock
o The Risks of believing in Lawyers
jacky mallett
o Re: More on German Train Problems
Donald Mullis
o Re: RISKS of non-standard interfaces
Matthias Urlichs
o Photo ATMs
Harold Asmis
o Re: Errors in patent databases
Jerry Leichter
o Risks of tightly-packed telephone-number space
Jeff Grigg
o RISKS of Digital Analogy [SATAN]
Bart Massey
o SATAN, burglaries, and handcuffs
Matt Bishop

The risks of flying pigs

Jose Reynaldo Setti <>
Fri, 7 Apr 1995 10:47:20 PDT

The Toronto Globe and Mail reports that a commercial jet carrying 72 pigs and 300 human passengers had to make an emergency landing after having its fire alarms triggered by excessive levels of methane, ammonia and body heat in the cargo hold where the pigs were traveling (you surely did not expect that the pigs were traveling economy class). Apparently, excessive flatulence, urine, droppings and the heat generated by the bodies of the pigs caused the automated fire-extinguishing system to flood the cargo hold with halon, killing 15 of the very valuable hogs. The risks of flying pigs are evident.

Jose' Reynaldo Setti <>, Visiting University of Waterloo (Canada) from University of Sao Paulo (Brazil)

Same Old Song: More calendar problems

Chuck Weinstock <weinstoc@SEI.CMU.EDU>
Fri, 07 Apr 95 13:18:51 EDT

For my personal E-mail, etc., I use the services of an Internet provider. I noticed today that the systems were still on Standard Time. I sent a message to the operators and received the following in return.

Due to an obscure bug in the system libraries, for this particular switch to daylight savings time only, the system calculated when it should change to DST incorrectly. (It thinks it does not start until this coming Sunday.) Because the bug will not occur in other years, and because we would have to rebuild all the system programs which depend on this system library, which would take longer than a week, we decided to just wait out the week until the clock returned to sanity.

If this is causing you any specific problems, please let us know.

I'm sure this sort of thing is well documented in RISKS ... so much so I continue to be surprised that I'm surprised that it continues to happen.

The Risks of believing in Lawyers

"jacky (j.) mallett" <>
Fri, 7 Apr 1995 09:06:00 -0400

Atlas Computer Systems has fired the advertising consultant who suggested that posting an advertisement to all the Usenet newsgroups was an effective way of advertising a 1-gigabyte hard disc drive. Atlas suffered several days of mail-bombing as well as repeated calls to its 1-800 number from irritated Usenet readers. The New Scientist quotes Atlas's vice-president Matt Nye as saying that the consultant suggested spamming the newsgroups after reading Canter and Siegel's infamous publication "How to make a fortune on the Information Superhighway".

It's nice to know there is some justice in the world.

-- Jacky Mallett

Re: More on German Train Problems (Weber-Wulff, RISKS-17.02)

Donald Mullis <>
Fri, 07 Apr 1995 15:50:12 PDT

: The defensive programmer will assume that all allocations will fail,
: and structure the software to handle such a case.

Furthermore, the software's ability to recover from the failure of any particular allocation attempt must be tested if we're to have any confidence in its working. It is not enough merely to bear in mind the possibility of allocation failure while writing the client code.

Is it feasible to test all possible allocation failure modes? Forcing the allocation routine to always fail would be pointless, since attempts to allocate storage are likely to be contingent on the success of earlier attempts. Alternatively, one could maintain a counter N of program runs, and coerce the allocator to fail on its N'th call. This is a computational effort proportional to the program run time multiplied by the number of memory allocations in each run. This exhaustive method would exercise all possible failure modes for a given input, but is likely to be intolerably time-consuming for real programs.

So we retreat to heuristic approaches. One such involves examining the sequence of stored program counters on the stack when the allocation routine is called. A checksum of the program counters can then be computed and compared against a list of those already seen. If no match is found, insert the checksum into the list and return failure; if a match is found, allocate the storage and return success. Thus, upon the first appearance of a particular sequence of program counters on the stack, the allocator will return failure. This has the desirable effect of exercising allocation error recovery code that might appear anywhere in the call chain. However, if that error recovery code contains a bug that only manifests itself when the program variables assume some particular state, the bug may not be found. Consider the following slightly contrived example in C:

    #include <stdlib.h>

    typedef int Bool;
    #define TRUE  1
    #define FALSE 0

    void *p1;
    void *p2;

    Bool allocate_p1_and_p2(void)
    {
        int np1bytes;

        /* Retry ever-smaller sizes until an allocation succeeds. */
        for ( np1bytes = 1024; (p1 = malloc( np1bytes)) == NULL; np1bytes >>= 1)
            ;
        p2 = malloc( 1);
        if ( p1 == NULL && p2 == NULL)      /* never satisfied */
            return TRUE;                    /* XXX: should be FALSE */
        else if ( p1 == NULL && p2 != NULL) {
            free( p2);
            return FALSE;
        }
        else if ( p1 != NULL && p2 == NULL) {
            free( p1);
            return FALSE;
        }
        return TRUE;
    }

One can imagine an allocator that fails pseudo-randomly according to some probability distribution, but again, the more failures, the longer the run time, with no guarantee of finding all possible bugs in the recovery code.

Re: RISKS of non-standard interfaces (Schroeppel, RISKS-17.02)

Matthias Urlichs < >
Tue, 4 Apr 95 15:39 MET DST

> In the real world, systems and customers make various adjustments when faced
> with long lines [...]

In the real world, people also make various adjustments when faced with short lines, the most prominent of which is to use the service more, causing the lines to get longer very fast. (NB: Note that people with immediate heart problems are unlikely to come back later. :-( )

This is shown in analyses of real-world commuter traffic patterns. People take the shortest route to work available, so if there is congestion and you build a new bypass, within a few weeks more people will use the new road until it's just as congested as before. The only cure would be to eliminate every point where congestion can form, which is of course impossible in the real world.

The better way to reduce congestion is to offer an alternate service. For commuters, you use railroads (or tell them to stay at home and commute electronically ;-) — for people with heart problems, the sensible solution would probably be a fast track that gets people with life-threatening problems ahead of those waiting to get their warts burned off. Don't ask me, though — the real RISK is to jump to untenable conclusions ("halve the response time by standardizing on one defibrillator") based on incomplete data.

Matthias Urlichs, Schleiermacherstraße 12, 90491 Nürnberg (Germany)

Photo ATMs (Re: Insecurity over ATM security, RISKS-17.03)

Harold Asmis < >
Fri, 7 Apr 1995 12:29:47 -0400

This reminds me of an incident that happened about 2 years ago.

In Toronto, we used to have these wonderful wall-mounted ATMs in shopping malls. They had high, wall-mounted, membrane keyboards, and were located where lots of people could "hang out".

A generally shorter, thinner than average, female co-worker told us that she had used such a machine to enter a fairly complex transaction, removing money from a secondary account. An hour or so later, while shopping for clothes, her purse was stolen. Within minutes, large amounts of money were removed from her accounts.

It was fairly obvious that someone had watched and memorized her entire key sequence, then either followed her or sent someone to steal her purse at an unguarded moment.

Luckily for her, it was shown that her PIN was not obvious and was not written down anywhere. The bank fought furiously, but she had family and friends with a lot of influence (they threatened to withdraw all their money), and she got her money back.

Although the bank denied everything, it was interesting to note that within months all such machines were being ripped out of shopping areas and replaced with "shrouded" keyboards. Also, "hidden" cameras are now appearing everywhere.

Harold Asmis, Ontario Hydro Nuclear
Tel. (416) 592 7379 Fax (416) 592 5322

Re: Errors in patent databases (Gray, RISKS-17.01)

Jerry Leichter <>
Sun, 2 Apr 95 11:45:27 EDT

In RISKS-17.01, John Gray summarizes a report from New Scientist about problems in searching patent databases, including in particular problems arising from typos and spelling errors in the databases.

Plus ça change, plus c'est la même chose. In roughly 1980, I did some work on a spelling checker program. As part of this, I dug through the literature. I found an article - source and details long forgotten - that reported on an empirical study of some kind of textual database. The study found that a significant portion of the entries in the database contained spelling or other recording errors, many of them in the indexed portions of the data. The conclusion was that queries based on exact matches stood a significant chance of failing to find entries they should have found. I don't recall the domain of the database involved, but as I recall missing even one entry that should have matched was considered to be a problem. As a result, they recommended that all queries use some kind of "loose match" algorithm. (I don't recall if they suggested one.)

At least fifteen years later, we appear not to have learned this lesson.

This is, by the way, an example of a significant limitation of many computer-based indexing systems: They make it difficult to use the *human* ability to do smart approximate matching. A couple of years ago, I heard a fragment of an interview with an author of a book that sounded interesting. I was only able to get the author's last name, and an understanding of the meaning of the title - not the actual title. When I later tried to use the on-line catalogue at the local library, I found it very difficult to locate the book. The catalogue was tree-structured: I had to first choose a particular author - and there were a good number of them with the same last name - then view titles by that author, a few at a time. I gave up after a little while and located a paper copy of "Books in Print", where a few seconds of scanning "by eye" located the book. Computer interfaces work well when you use them to do what the designers had in mind, but when it comes to true flexibility, more traditional designs can often win big, exactly because they are better at "getting out of the way" and letting the pattern matcher in our heads do the work.


Risks of tightly-packed telephone-number space

Jeff Grigg < >
Thu, 6 Apr 95 19:42:20 CDT

Here in Saint Louis we're running low on telephone numbers. With the rapid rise in cellular service, many new phone numbers have been assigned in this area code (314), leading to something of a shortage.

This causes (at least) two problems:

  1. more wrong numbers when people transpose digits
  2. rapid reassignment of numbers when service is disconnected

#1: wrong numbers are more likely to successfully connect — to the wrong person

The first problem became REAL noticeable to me when some company in Australia tried to FAX documents to a decorating company in my exchange several times a night over several nights — AT ABOUT 2:00 AM, Saint Louis time. They had the last 2 digits of the phone number transposed. At no time did they try the number by hand or turn on / listen to their FAX speaker. (You'd think that after several unsuccessful FAX attempts in a row to a new number, they'd try it with a voice phone!)

This is really a risk of having voice and data communications over the same network.

The fix: I hooked up a FAX machine to my home phone and received the FAX one night. By reading the FAX I could tell who sent it and who they were trying to send it to. (The correct FAX number, without digits transposed, was printed on the FAX they sent!) I forwarded the FAX to the intended recipient, and called them, asking them to talk to their Australian customer. But their telephone operator was playing stupid that day, and kept asking what business relationship I had that their Australian customer would want to FAX documents through me. Telling her about the transposed digits in the phone number didn't help. Eventually I gave up and called the Australian company directly. They understood the problem immediately. (...and we wonder why we lose business to foreigners — who have brains and use them. ;-)

#2: disconnected numbers are more likely to connect — with the wrong person

After moving in, I was given a number that had obviously just been released by someone moving out of my exchange. For the first few months, I'd get several calls a week asking for the other person. I called directory assistance and found his new number. (He was still living in Saint Louis.) When I get calls for him, I give them his new number. There's nothing the telephone company can do to fix this, given the shortage of phone numbers.

It's amazing: After 2 years, I still get 2 or 3 calls a month from sales people asking for that other person. He must be on all the best "telephone sucker lists." The sales people almost always give up as soon as they find I'm not who they're looking for.

RISKS of Digital Analogy [SATAN]

Bart Massey <>
Fri, 7 Apr 1995 12:01:55 -0700

(Re: A possible "solution" to Internet SATAN: Handcuffs — RISKS DIGEST 17.04)
> By close analogy, SATAN's parents' attitude appears to be that it's
> perfectly OK, perhaps even admirable, to go from house to house ...

The anonymous author extends the analogy to argue that those who probe systems gratuitously should be punished even if they cause no harm.

I think that this quote illustrates well the RISKS associated with what Usenet readers once referred to as "analogy wars" (did Tom Duff coin this phrase?). The risk comes in two parts:

  1. It is easy to confuse an analogy with an isomorphism, and behave accordingly.
  2. It is easy to take advantage of (1) to win an argument by choosing an analogy which, if treated as an isomorphism, leads to a desired conclusion.

In this particular case, consider the following scenarios:
  1. I decide to implicate X in a crime, so I hack a copy of SATAN so that it looks like my security probes are coming from X.
  2. I probe randomhost.subdomain.cs, and the DNS resolver decides that this must have been in Czechoslovakia.

What are some analogous situations in "burglary world"?
  1. I disguise myself as X, and then deliberately let myself be caught burgling a house.
  2. I decide to try to break into my own house to test its security, but accidentally try to break into another house which looks like mine.

Neither of these "burglary world" situations seems too likely, a result of subtle problems in the analogy: in scenario 1, it would be difficult to disguise oneself effectively enough to achieve the desired result; in scenario 2, I am unlikely to be confused about which house is mine. (I am also unlikely to try to test a home's security by attempting a break-in.)

The "burglary world" [system == home, network == world, port == front door, security == lock, intruder == burglar, legal system == legal system] analogy is very popular on the net, but it really doesn't seem to me to work very well. I find the final identity, in particular, very disturbing. I personally think that SATAN's authors (*not* parents :-) had the attitude that "it's perfectly OK to send network packets to a computer system in order to see how it responds" — which seems to me like a much less controversial statement.

Bart Massey

SATAN, burglaries, and handcuffs (anonymous in RISKS-17.04)

Matt Bishop <>
Fri, 07 Apr 1995 09:49:03 -0700

The SATAN tool's proper use is very different: it's for checking YOUR OWN systems. I doubt that the unnamed correspondent, or anyone else, would object to its use in that capacity.

The argument that SATAN should not have been released because it can probe systems not under the prober's control strikes me as odd. The analogy of housebreaking given above overlooks the fact that attempted burglary is itself illegal in most jurisdictions I know of; that is what the intruder would be charged with in the analogy above. Does this mean we should ban all tools that could be used to help someone check the safety of a house, since those tools can also be used to find the weak points in another's house?

Perhaps a simpler conclusion to draw from the analogy is that there is no tool or device that is completely beneficial. Everything can be abused, and SATAN is no exception. The response should not be to focus on the release of the tool, but to focus on educating people in the customary codes of behavior and chastising those who abuse the tools — anyway, that's my opinion. The anonymous poster makes this same point later on, but its importance is vitiated by the characterization of the morality of the authors of SATAN (a characterization with which I disagree, by the way).

Matt Bishop
[BTW, RISKS received too many responses to the original message to include here. Many were supportive of SATAN as a sincere effort to improve security. Several messages took offense to the anonymous author's choice of analogy. Others were seriously annoyed at the mention of the National Rifle Association. In these respects, your moderator regrets apparently being less than careful in his (im)moderation. Sorry! PGN]
