Experienced computer users know that they must back up their data regularly to ensure that the inevitable hardware/software failures and operator errors do not cost them months of work and considerable stress. In most mainframe environments, users are supported by well-designed backup systems--including off-site storage of tapes. With the first wave of microcomputers, people found that facilities for backing up their work were inadequate: they consumed too much time and were hard to organize. Consequently, few microcomputer users can recover all of their work from within the previous 24 hours. The majority of users would lose years of work if their site were destroyed or seriously damaged. In fact, most people consider themselves "lucky" if they can recover even a small portion of their work. [I should add that I know of one heavily-used VAX that gets backed up quarterly at best.]

Unfortunately, we are in the process of introducing more and more professionals to computers. We tell them that their work will be faster, more efficient, and possibly even better. From a recent survey of my department (English), I would estimate that about 90% of "them" believe us. So, we are about to equip these people with workstations and will teach them to develop their books on these machines. Unfortunately, no one has mentioned backup at all so far, in spite of the fact that these machines are rumored to eat files and directories.

Even if we assume that professors will be admonished to back up their files regularly, we cannot be so naive as to assume that they will do so if it takes more than a few minutes. Since a complete backup of a 10 megabyte hard disk on an IBM XT can take a half-hour, I am sure that backing up a 40 megabyte hard disk on a workstation will require more time (and diskettes) than the majority of our scholars will invest. Now, one of these people is going to lose a book, or most of a book. And s/he is not going to be happy.
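[The time objection suggests incremental backups: after one full pass, each run archives only what changed since the previous run, which for a scholar's daily edits is seconds of work rather than a half-hour. Here is a minimal sketch for a Unix-like system, using `find -newer` against a timestamp file; the directory names, the stamp-file location, and the flat `cp` destination are illustrative assumptions, not a description of any particular backup product.]

```shell
#!/bin/sh
set -e

# Illustrative stand-ins (assumptions, not real paths):
WORK=$(mktemp -d)            # the scholar's document tree
DEST=$(mktemp -d)            # the backup medium (diskette, tape mount)
STAMP="$DEST/.last-backup"   # timestamp of the previous backup run

printf 'chapter one\n' > "$WORK/book.txt"

backup() {
    if [ -f "$STAMP" ]; then
        # Incremental pass: copy only files modified since the stamp.
        find "$WORK" -type f -newer "$STAMP" -exec cp {} "$DEST" \;
    else
        # First pass: no stamp yet, so copy everything.
        find "$WORK" -type f -exec cp {} "$DEST" \;
    fi
    touch "$STAMP"           # record when this backup finished
}

backup                       # full pass: copies book.txt
sleep 1                      # ensure the next file's mtime beats the stamp
printf 'chapter two\n' > "$WORK/notes.txt"
backup                       # incremental pass: copies only notes.txt

ls "$DEST"
```

[The point of the sketch is the stamp file: the cost of each run is proportional to what changed, not to the size of the disk, so a daily habit becomes plausible for non-technical users.]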
In fact, I think we can be sure that new users will never want to see a computer again, and colleagues may be scared off as well. In addition, someone is going to be held accountable. Here is a brief tally of the risks:

  1) loss of work by the professor
  2) loss of interest in computing by the professor and some colleagues
  3) loss of confidence in the departmental consultant (me)
  4) loss of confidence in the team heading the project

There may be others, and (1) may actually be much more severe than a simple loss of work. A delay of a couple of months in developing a manuscript could cost a young professor tenure, for example (given the seasonal nature of academia, a two-month delay in submission could cause a six-month delay in acceptance, or could make one's work obsolete because of another publication).

I would like to hear from others who have faced these problems. Horror stories, preventive strategies, references to theoretical articles--all would be useful. I suppose there may be legal considerations as well?

--Jim Coombs, Brown University  JAZBO@BROWNVM
Acknowledge-To: <JAZBO@BROWNVM>
ARE WORD-PROCESSING AND ELECTRONIC MAIL HELPING TO PROLIFERATE BAD WRITING?

Before word processors and electronic mail existed, important letters or documents were usually handwritten and hand-corrected, often in several drafts, before being typed and mailed. The typing of the letter represented a finalizing and codifying process which encouraged well-thought-out communication. Care needed to be taken, since a single error could necessitate re-typing the entire letter or document.

There is a hidden risk in the new media, in that they have enabled us to bypass the correction and finalizing phases of letter writing, often resulting in quick and efficient dissemination of poorly planned, sloppy and confusing prose. In technical communications, where complex and potentially important ideas are exchanged, clarity of expression is obligatory. Nevertheless, I could cite many examples (some from recent RISKS, which I will not include to avoid unfairly embarrassing the authors) where bad writing has rendered sentences unintelligible and thoughts and ideas obscure. We tend to be very quick to correct each other on points of technical accuracy, but very slow to correct, or even recognize, inaccuracy of expression in our own or others' writing.

While I do not advocate abandoning the ASCII keyboard for quill and parchment, I do encourage readers of RISKS to take the time to proof and revise any of their writing meant to convey important technical information. Re-read your work, and have others examine it for clarity, absence of jargon, and general comprehensibility before you send or submit it to anyone. Remember that word processors and email facilities are only tools, and that the burden of effective communication still rests upon those who use them.

Bruce A. Sesnovich          mcnc!rti-sel!dg_rtp!sesnovich
Data General Corp.          firstname.lastname@example.org
Westboro, MA

"The rest is silence, musically speaking"

[This message gets a HEARTY ENDORSEMENT from the RISKS COORDINATOR.
I am horrified at some of the messages that I get. I do reject some solely on the grounds of general incoherence. (I stated initially that I would not tamper with messages, but occasionally I do fix a horrible "mispelling".) Being an inveterate punster, I am attuned to ambiguities; however, I notice that most people do not notice them (the ambiguities, not the people). Bruce's message is relevant to RISKS. Just as ambiguities in program specifications can cause serious risks, so can ambiguities in discussions. Much of the lay understanding of systems and computers--particularly for something like Star Wars--is based on sloppy reasoning, misrepresentation, misunderstanding, and so on. If we can't take some care in writing what we think we meant to say, then it may not be worth writing--or reading. PGN]
Many words have appeared here and in the press on topics such as SDI, Chernobyl, and other matters. At least in this forum, we should be careful of what we say, and of what we think others mean when they say something. To quote my favorite source, The American Heritage Dictionary of the English Language:

deceit - Misrepresentation; deception. A stratagem; trick; wile.

deceitful - Given to cheating or deceiving. Misleading, deceptive.

deceive - To delude; mislead. _Archaic:_ To catch by guile; ensnare. Synonyms: deceive, betray, mislead, beguile, delude, dupe, hoodwink, bamboozle, outwit, double-cross. These verbs mean to victimize persons, for the most part by underhand means.

error - An act, assertion, or belief that unintentionally deviates from what is correct, right, or true. The condition of having incorrect or false knowledge. A mistake. The difference between a computed or measured value and a correct value. Synonyms: error, mistake, oversight. These nouns refer to what is not in accordance with truth, accuracy, right, or propriety. Error is clearly preferable to indicate belief in untruth or departure from what is morally or ethically right or proper. Mistake often implies misunderstanding, misinterpretation, and resultant poor judgement... Oversight refers to an omission or a faulty act that results from... lack of attention.

lie - A false statement or piece of information deliberately presented as being true; a falsehood. Anything meant to deceive or give a wrong impression. To present false information with the intent of deceiving. To convey a false impression. To put in a specific condition through deceit.

mislead - To lead or guide in the wrong direction. To lead into error or wrongdoing in action or thought; influence badly; deceive. See synonyms at deceive. Misleading, deceptive, delusive. Misleading is the most nonspecific... it makes no clear implication regarding intent. Deceptive applies... to surface appearance, and may imply deliberate misrepresentation. Delusive stresses calculated misrepresentation or sham.

mistake - An error or fault. A misconception or misunderstanding. To understand wrongly; misinterpret. To recognize or identify incorrectly. Wrong or incorrect in opinion, understanding, or perception. Based on error; wrong... See synonyms at _error_.

I have condensed the definitions and discussions somewhat. The point is that a person who believes something, however erroneously, and espouses and publicly supports that belief, is *not* lying.

These are complex times. There are many matters about which reasonable persons, even reasonable scientists, may differ. There is no point in saying that a person lied when that person was doing the best work possible based on the knowledge and belief available at the time. Saying so significantly interferes with rational discussion: it not only hampers cooperative searches for the truth, it nearly eliminates any chance that the truth, when found, will be accepted.
From the Thursday, May 1st issue of the Vancouver Sun (Vancouver, British Columbia):

  Copyright laws apply to software: court

Waterloo, Ont. - Canada's 50-year-old copyright laws, created to protect artistic works such as music and literature, also cover computer programs, the Federal Court of Canada has ruled in a decision believed to set an international precedent. Although the verdict can be appealed, it is thought to be the first case anywhere in which a legal dispute over rights to software has gone to trial. Similar cases in Britain, Australia and the U.S. have concluded with pre-trial injunctions against software pirates.

In a decision this week, Justice Barbara Reed ruled in favour of Apple Computer Inc. Apple lawyer Alfred Schorr said the company cited the copyright law in suing "a very large number of defendants" involved in assembling and selling computers that were virtually identical to the Apple II. A central issue was whether programs encoded electronically on silicon chips are simply pieces of hardware or in fact represent intellectual property that should be viewed as "literary works", Schorr said. The defendants are prohibited from assembling and offering for sale computers or component parts that infringe on the two basic operating programs used in the Apple II.
> [From Jeff Siegal] > It is stated in the original column that Dr. Doering's observation > _was_ made when he watched the videotape, not months later, as Mr. > Moore claims. I did not see the original article and the time element was not clear from the excerpt. Thank you for clarifying this. I withdraw the comment in question. /mjm