One day, after I logged in to my CMS account here, I discovered that new mail was waiting for me in my reader. The lengthy message was prefaced by the heading: "From: Mailer@<machine>: Your message could not be sent... Reason: Address unknown..." Upon scanning this returned letter, I discovered that it had not been written by me at all, and that the intended recipient and sender were thousands of miles away, apparently the unfortunate victims of a random mailer screw-up. The first sentence of that letter, though, I will always remember: "My dearest Janice: At last, we have a method of non-verbal communication which is completely private..."
From two stories by Dan Morain in the 'Los Angeles Times' on Tuesday, Nov. 29 and Thursday, Dec. 1:

The California Lottery will fine GTECH Corp. $208,500 for a weekend computer crash that left two-thirds of the Lotto terminals in Southern California unable to accept wagers. All 4,375 terminals in Southern California stopped working for 14 minutes in the peak betting period Saturday night; two-thirds of them remained down for the rest of the night.

A newly installed telecommunications program for the main Southern California Lotto computer malfunctioned, and the problem was exacerbated by a GTECH operator who subsequently installed the wrong back-up program. The new program, designed to improve system reliability, has been removed for testing. "There's little doubt that the error was caused by GTECH software, compounded by GTECH operator error," said a senior vice president for the company.

The state's contract with GTECH allows it to charge the company $4,000 for each minute that the system is not working and $1,000 a minute when it is unacceptably slow. Lottery officials say that in the last year the computer system has been inoperable or unacceptably slow for 779 minutes, or 0.2% of the time.
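The contract penalty described above reduces to a simple formula; the following is a minimal sketch with a hypothetical function name (the articles do not give the breakdown behind the $208,500 figure, so this is an illustration of the stated per-minute rates, not a reconstruction of the actual fine):

```python
def gtech_penalty(down_minutes: float, slow_minutes: float) -> float:
    """Penalty per the state's contract as described in the article:
    $4,000 for each minute the system is not working, plus
    $1,000 for each minute it is unacceptably slow.
    (Hypothetical name; a sketch, not the actual contract computation.)"""
    return down_minutes * 4000 + slow_minutes * 1000

# The 14-minute Saturday-night outage alone would come to $56,000
# at the downtime rate, before counting the rest of the night.
```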
Re: the quote from "Optical Information Systems Update," Dec 1, 1988, p.8. ... two-way transmission provides complete document control and security because the forms never leave the customer[']s office. ... Of course, if one is concerned about the security of the *information*, that is a different matter.
From: email@example.com.Berkeley.EDU (David Phillip Oster)
>Is it fair to also stamp the tickets with the time of issue, so if the
>distance traveled divided by the time elapsed is greater than the average
>speed limit the toll taker can hand you a speeding ticket at the same time?
>An appropriate computer would help the toll taker in this task.

Alas, as a Massachusetts police officer pointed out in an interview, you have to catch someone *in the act* of speeding to get them for it. Probably something to do with that annoying Bill of Rights...
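The check Oster proposes is just an average-speed comparison; a minimal sketch, with hypothetical names and an assumed 55 mph limit:

```python
def exceeds_average_limit(entry_hour: float, exit_hour: float,
                          distance_miles: float,
                          limit_mph: float = 55.0) -> bool:
    """True if the average speed between toll plazas exceeds the limit.
    Note this only establishes *average* speed; as the follow-up
    observes, that is not the same as catching a driver in the act."""
    elapsed = exit_hour - entry_hour
    if elapsed <= 0:
        raise ValueError("exit time must be after entry time")
    return distance_miles / elapsed > limit_mph
```

For instance, covering 60 miles of turnpike in under an hour would flag the ticket, while the same distance in two hours would not.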
Vendors should provide proper tools for security

In RISKS 7.84, Brandon S. Allbery (allbery@ncoast.UUCP) explains that "Vendors have some blame, but their [naive] customers and [ignorant] salespeople have even more." This thesis is based on his observation and experience that a great many small-scale customers have no inclination to incur the overhead of a 'secure' system. From this, and from the context of the article as a whole, I infer further that Mr. Allbery feels vendors are thus catering to a lowest common denominator and, perhaps in keeping with the spirit of Unix, leaving the details and deficiencies to those with specific requirements.

I agree that this is most likely what has happened, even though it is not a publicly advertised tenet of any product developer. However, I feel that the vendors' laxity cannot be excused in the least. In fact, as the professionals with an understanding of the implications of security, or the lack thereof, it is incumbent on them to produce a secure product which is still easy to install, maintain, and use. Proper tools could reduce the confusion and inconvenience that drive so many customers to take short-cuts. They would also enhance the product in all market areas.

In the wake of the Internet Worm we have seen claims that UNIX is intrinsically an insecure system and that this fact casts a pall on UNIX's current rise in popularity. (I personally think UNIX casts a pall on computing as a whole, but that is another issue. :-) ) However, I still maintain that proper maintenance tools will go a long way toward producing secure computer networks.

I have used several varieties of UNIX, and the vendors are always very quick to advertise their value-added features and embellishments. However, how often, when porting or re-writing an operating system, do vendors take the opportunity to fix glaring bugs and deficiencies? "Compatibility!" you cry? What bug cannot be made an option left to the user's discretion? ("-B switch for historical reasons.")

I hope this current wave of concern will encourage the vendors to re-think their development strategies. Bug fixes are not that difficult. It is time for the Unix operating system developers to doff their hacker's capes and stop reveling in the vagaries of Unix.

Keith Hanlan, Bell-Northern Research
The network worm (sometimes called a virus) affair raises issues that are very important to our field. Both the BITNET Board of Trustees and the CSNET Executive Committee have been struck by the fact that many public comments on the event have contained statements such as "We learned from it," "We will make sure technically it will not happen again," or "He did us a favor by showing...," unaccompanied by expressions of ethical concern.

We have succeeded as a profession technically in creating facilities — the BITNET, CSNET and other components of the national research network — which are now critical to the conduct of science and engineering in our nation's academic, industrial, and government research laboratories. Further, this technology has spread within our nation's commercial research and development organizations and even into their manufacturing and marketing.

Just as medical malpractice can have a serious effect on an individual's health, one of the costs of our success is that we are now in a position where misuse of our national and private computer networks can have as serious an effect on the nation's economic, defense, and social health. Yet while almost every medical college has at least one course on medical ethics and insists on the observance of ethical guidelines during practice, computer scientists seem to avoid such non-scientific issues.

The worm "experiment" caused a major disruption in the research community. Among other points of attack, the worm exploited a trapdoor that had been distributed as a software "feature". Many hours of talent were wasted finding and curing the problems raised by this "game". Many additional hours were lost when researchers were unable to access supercomputers and mail systems due to system overload and network shutdown.

We condemn the perpetration of such "experiments", "games", or "features" by workers in our field, be they students, faculty, researchers, or providers.
We are especially worried about widespread tendencies to justify, ignore, or perpetuate such breaches. We must behave as do our fellow scientists who have organized around comparable issues to enforce strong ethical practices in the conduct of experiments. We propose to join with the relevant professional societies and the national research networks to form a Joint Ethics Committee charged with examining existing statements of professional ethics and modifying them as necessary in order to create a strong statement of networking ethics and recommendations for appropriate enforcement procedures.
Here is an extract of an interesting comment sent to BUG-LAN@SUVM by magill@ENIAC.SEAS.UPENN.EDU (William Magill at Univ of Pa.) "..the reason that security policy procedures are important is an issue of LIABILITY." "The recent Internet worm was a case where KNOWN security holes were exploited. While what was done 'wasn't nice', it was indefensible from a point of view of liability. Put another way, had data been compromised, the fact that known security holes were not 'plugged' would have rendered the University/Hospital defenseless in a liability case."
[RISKS-7.84] referred to the common practice among the semi-literate of trusting to God that "crackers" will not invade or damage their new computer systems. As a native of God's Own Country, I must object to this use of the term "cracker" to refer to computer vandals and burglars. I suspect that our neighbours to the north (also known as crackers) would also object. Dr. T. Andrews, Systems, CompuData, Inc. DeLand
With initial caps, "Cracker" (as used in Florida or Georgia) is a proper noun, as opposed to "cracker" (in the sense of a malevolent hacker). In spoken English, however, the subtlety is certainly lost. But we do have a problem: we desperately need a convenient term like "cracker", because the nonpejorative primary meaning of "hacker" needs to be defended vigorously against misuse by the press and others. Perhaps we could try "jacker" (or " 'jacker", short for hijacker) for someone who breaks into computer systems and subverts them. How about "snacker" for someone who is a nonmalicious but exploratory hacker? When Bob Morris (the elder) was visiting Berkeley from Bell Labs for the year (around 1967?), he might have been classified as a snacker: he seemed to nibble at the edges of the Berkeley time-sharing system more than anyone else. In fact, whenever he walked into the terminal pool room, others would log out — because the system tended to crash more often when Bob was logged in. (He stumbled onto quite a few hitherto undetected bugs.) [Joe Btfsplk at Berkeley?]