Date: 29 Feb 88 14:21:36 EST
From: Charles.Fineman@H.GP.CS.CMU.EDU
Subject: Leap year madness
To: BBoard.Maintainer@A.CS.CMU.EDU
Attention: unix-forum bboard

All you folks who created (or extended) accounts using ADM today and used the default expiration date will most likely find that you cannot use those accounts. The default expiration date is one year from the creation (extension) date, which, in today's case, is a nonexistent date. Hence, the date interpreter chokes on it and it looks to ADM as if your account has expired. To resolve this, all you have to do is change the expiration date.

Charlie Fineman

P.S. Now that's what I call a PHASE OF THE MOON error!!!
Out of interest, I spent some time thinking about how many design mistakes are exposed by this example. I find the following:

1. Two different representations/algorithms for dates (the 'date interpreter', whatever that is, and the 'account creator') with different handling for unusual cases.

2. At least one representation allows illegal dates to be expressed, i.e. the set of dates is not closed under the operation of addition (perhaps both allow this; not clear).

3. The treatment of an illegal date *in the future* as an expired date, i.e. one in the past.
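As a purely illustrative sketch (the structure and function names below are invented for the example; the ADM code itself is not available), the failure mode is easy to reproduce: adding a year to a date field by field can yield 29 February of a non-leap year, and if the consumer of that date maps anything it cannot parse onto "expired", a date one year in the future suddenly looks like one in the past. A defensive version keeps the set of dates closed under the operation by checking the result:

    /* expire.c -- hypothetical sketch of the bug and one possible repair */
    #include <stdio.h>

    struct date { int year, month, day; };

    static int is_leap(int y)
    {
        return (y % 4 == 0 && y % 100 != 0) || y % 400 == 0;
    }

    static int days_in_month(int y, int m)
    {
        static const int len[] = { 31,28,31,30,31,30,31,31,30,31,30,31 };
        return (m == 2 && is_leap(y)) ? 29 : len[m - 1];
    }

    /* Naive version: what the account creator appears to have done. */
    struct date add_one_year_naive(struct date d)
    {
        d.year += 1;          /* 29 Feb 1988 becomes "29 Feb 1989", which does not exist */
        return d;
    }

    /* Defensive version: never produce a day that the month does not have. */
    struct date add_one_year(struct date d)
    {
        d.year += 1;
        if (d.day > days_in_month(d.year, d.month))
            d.day = days_in_month(d.year, d.month);     /* clamp to 28 Feb */
        return d;
    }

    int main(void)
    {
        struct date created = { 1988, 2, 29 };
        struct date naive   = add_one_year_naive(created);  /* illegal date */
        struct date fixed   = add_one_year(created);         /* 28 Feb 1989 */
        printf("naive: %04d-%02d-%02d  fixed: %04d-%02d-%02d\n",
               naive.year, naive.month, naive.day,
               fixed.year, fixed.month, fixed.day);
        return 0;
    }

Whether the right repair is to clamp to 28 February or to roll over to 1 March is a policy choice; the point is that the producer and the consumer of the date should agree on one representation in which every value is legal.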
All right now — how many people reading this *haven't yet realized* that their watches have to be set back 1 day, because they went from February 28 directly to March 1? Mark Brader, SoftQuad Inc., Toronto, utzoo!sq!msb, firstname.lastname@example.org
Just over a year ago I reported in RISKS-4.27 on the news stories here in the UK about a discriminatory computerized student selection system. This has hit the headlines again, now that the Commission for Racial Equality has issued a report on the affair. Since the original posting attracted some interest, I thought that the RISKS readership would like to see the attached news story, from The Guardian of 25 February 1988 (reprinted without permission), since it indicates how officialdom has, at last, reacted.

COMPUTER PROGRAMMED IN PREJUDICE
Andrew Veitch on how a don built racial and sexual bias into selection methods for a south London medical school

The next college found breaking the race laws by discriminating against black people will be prosecuted, senior Commission for Racial Equality officials warned yesterday. After publication of the commission's report on race and sex discrimination at the St. George's medical school, south London, a senior source said: "We will make an example of the next one."

The decision by the Education Secretary, Mr. Kenneth Baker, to instruct universities and polytechnics to monitor the numbers of non-Caucasian students is seen as a half measure. The commission's officials say they need to know who is rejected, and why. For that, they need a race question on university application forms.

St George's was caught, officials admit, only because the attitudes of its selectors in years gone by were enshrined in a computer program: that program deliberately downgraded non-Caucasians and women. Few, if any, other colleges operate computerized selection programmes, so discrimination will be far harder, if not impossible, to prove.

Three-quarters of St George's 2,500 applicants a year are rejected by the academic assessors without being interviewed. About 70 per cent of those who get interviews are offered places. So the first weeding-out is crucial. It is also time-consuming, which is why Dr Geoffrey Franglen, a former vice-dean of St George's and himself an assessor, set out to develop a program which, in his words, would "mimic the behaviour of the human assessors." The result, by 1980, was a program which matched the assessors' decisions in 90-95 per cent of cases.

The confidential report of the medical school's internal inquiry into the affair, a copy of which has been obtained by the Guardian, shows how the program worked, and who knew about it. Candidates were classified as Caucasian or non-Caucasian on the basis of their names, or photographs if they were to hand. They were also classified by sex. Being non-Caucasian, and/or a woman, resulted in a lower grade on the interview scale: simply having a non-European name could take 15 points off an applicant's score. Sex had less effect: on average, being female took no more than three points off the score. That was enough, the Commission found in its investigation, to deprive 60 candidates a year of the interviews for which they should have qualified.

The working of Dr Franglen's program was considered by an internal working party in 1982 and again in 1985. The senior academics who constituted those working parties should have known - and probably did know - that race and sex were used as factors in selecting candidates, says the St George's inquiry team, which is headed by a solicitor, Mr Peter Gerrard. In fact, since the program mimicked the previous human assessors, it is probable that discrimination occurred before the program was introduced, the report says.
Mr William Evans, the admissions officer, told the inquiry that he became aware that the program discriminated against women and had a "bias against non-Caucasians" in 1984. He had told the then academic registrar, Mr Jon Bursey. Mr Bursey said the information should be kept confidential. He was particularly concerned lest one of the consultants who took an interest in racial affairs, Dr Joe Collier, should hear about it. Mr Bursey left without mentioning it to his successor, Dr Gareth Jones.

All went quiet until 1985, when a second working party considered the program. Dr Franglen was asked to describe its workings. He justified the weighting it gave against non-Caucasians and women, and gave the working party the impression that it had only a marginal effect on who was selected. Nevertheless, the working party recommended that the program be simplified and rewritten. The school's academic board accepted this recommendation, but nothing was done about it. The inquiry report specifically blames the then dean, Dr Richard West, for "failing to ensure the task was carried out."

By March 1986, Dr Jones, the academic registrar, was aware that the program discriminated on grounds of race and sex. He did not take the matter to the dean, he said, because he thought the dean already knew about it. In November 1986, Dr Collier discovered, by accident, that the program was weighted. He wrote to the dean. Dr West asked Mr Evans to run a few cases through the program. When he saw the effect, he immediately stopped its use.
This started with a very strange bug: some C graphics software of mine would unexpectedly shift the plot origin now and then while plotting. Eventually it was discovered that the problem occurred whenever FORTRAN formatted I/O was used. Finally it turned out that both our graphics software library and the system FORTRAN I/O runtime library use a global variable called "pc". In the graphics routines it is a structure pointer; in the FORTRAN routines it is an integer.

Now, I had always thought that you can only actually declare a global variable in one place... everywhere else it should be an external. Otherwise how can you know something is amiss when you link together 2 different libraries that might happen to clash in their choice of global variable names? Silly me... it turns out that UNIX linkers indeed WILL allow you to declare something in more than one place, and indeed will then happily assign them to the same memory location, even if they are of completely incompatible types. And if you don't happen to have the source code for one of the libraries that gets linked in, such as the FORTRAN runtime library, THERE REALLY IS NO WAY YOU CAN KNOW AHEAD OF TIME what variable names might get overlaid in this way...

It makes me wonder how often this is happening and I DON'T catch it, because the bugs it causes are not so "graphic". This seems to me to be a very serious "RISK" of using the UNIX linker. Now I wonder if they also used my favorite variable names, "ii", "jj", and "kk"...?
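A minimal sketch of the clash (the file names, the structure, and the routines are invented for illustration; the real libraries are of course much larger): two separately compiled files each contain an uninitialized file-scope definition of the same name, and a traditional UNIX linker treats both as "common" symbols and quietly assigns them the same address, even though the types disagree.

    /* --- graphics.c : stand-in for the graphics library ------------------ */
    struct pen_state { int x_origin, y_origin; };
    struct pen_state *pc;            /* definition #1: a structure pointer   */

    void set_origin(struct pen_state *p)
    {
        pc = p;                      /* graphics code uses pc as a pointer   */
    }

    /* --- fortio.c : stand-in for the FORTRAN I/O runtime ----------------- */
    int pc;                          /* definition #2: an integer            */

    void formatted_io(void)
    {
        pc = 1;                      /* silently clobbers the pointer above  */
    }

Linked together (for instance with cc graphics.c fortio.c libgraphics.a), neither the compiler nor the linker complains, and the integer store corrupts the graphics library's pointer. Writing "extern" in every file except the one that really defines the variable exposes the problem, and an initialized definition in two files usually does produce a "multiply defined" error at link time; some compilers also offer an option to disable the "common" treatment so that duplicate definitions are rejected outright.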
>[As has been noted frequently in RISKS, (1) probabilities are irrelevant
>when it is YOUR life that is lost;...

This is true, but beside the point I was making. The writer who warned of the risks of homing devices for finding stolen cars was clearly concerned about public-policy considerations (and there is no risk of injury with the homing device); the warning about the back-seat driver device could be about public policy or individual prudence. My comment about slippery-slope arguments concerns public policy, and I should have made that clear. In that context, probabilities of risk are quite relevant. They are, in fact, also relevant to personal decisions that involve risk to life or limb. The bracketed comment above refers to life that is LOST. What we are concerned about here is life and other values that are RISKED. And I see no way to assess risks without appealing to probabilities.
I have a friend who believes firmly that all probabilities are 50/50: either things happen or they don't...

Basically, the first argument was that picking some event E and saying that because p(E)>0 we must worry (W) about E leads to:

    W = p(E)

This is the slippery slope argument: all p is equal, thus all W must be equally considered. The next argument, by PGN, says that there is a cost C which must be considered, so we get:

    W = C*p(E)

and postulate there is some risk threshold T above which W seems worthy of concern. The slippery slope argument remains possible: if we assume p to be constant for all E, then the C is irrelevant. In fact, the slippery slopist is forever inflating C in the listener's mind (or relies upon a preconception of high C).

Worse, there is a reversibility factor R (one major feature which seems to distinguish humans from other animals is the former's ability to carefully reverse its behavior). Thus we might reformulate with something like:

    W = C*p(E) - C*p(R)

or, the Worry of a Risk is the probability of one's fate corrected for the cost, less the probability of reversing that event, also corrected for the original cost.

But human behavior is not quite so simple! We must factor in PS(t), which of course is Pain and Suffering as a function of Time, and adjust this for yet another cost, call it $. We thus arrive at the following equation:

    W = C*p(E) - C*p(R) + $*PS(t)

This allows us to distinguish the $1000 phone bill which is cleared up in one (hopefully inexpensive) call to their office from one which takes many calls. One could perhaps argue that these are two different events and therefore should simply be factored into the first term (C*p(E)); that is, the probability and cost of an easily solved phone bill problem vs a difficult one should be distinguished.

I am quite certain that for many readers the risk of encountering such an argument while casually perusing this digest has already exceeded their threshold for suffering, and reversibility is not possible, so I will leave it at that.

-Barry Shein, Boston University
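For what it is worth, the final formula can be written out as a tiny program. The numbers below are invented purely to illustrate the phone-bill comparison; nothing here comes from the posting itself.

    /* worry.c -- toy evaluation of W = C*p(E) - C*p(R) + $*PS(t) */
    #include <stdio.h>

    double worry(double cost, double p_event, double p_reverse,
                 double dollars, double pain_suffering)
    {
        return cost * p_event - cost * p_reverse + dollars * pain_suffering;
    }

    int main(void)
    {
        /* Hypothetical $1000 bogus phone bill, fixed with one easy call... */
        double easy = worry(1000.0, 0.01, 0.009, 5.0, 0.5);
        /* ...versus the same bill when reversal takes many painful calls.  */
        double hard = worry(1000.0, 0.01, 0.002, 5.0, 10.0);
        printf("easy case W = %.2f, hard case W = %.2f\n", easy, hard);
        return 0;
    }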
From: Matt Bishop <bishop%bear.dartmouth.edu@RELAY.CS.NET>

> Anyone who's seen a teenager struggle to multiply 314 and 512 by hand, then
> give up and reach for a calculator, knows just what I mean.

Well...yes and no. There are any number of skills which our ancestors possessed (and HAD to possess to survive) in which I have little or no interest. The definition of "basic skills" changes over time. I would think multiplication was still something everyone should know (if for no other reason than that it helps build the notions you need for learning more complicated things). Driving (in the sense of guiding a horse pulling a wagon) is no longer a "basic skill" — do you care?

It hardly seems that an occasionally heard collision warning is going to allow us to lose the ability to avoid running into things. I'd be interested in knowing if the FAA has any research on the number of accidents avoided because of stall warnings and ground-approach warnings, as opposed to the number happening because the relied-upon warning failed to happen.

scott preece
gould/csd - urbana
uucp: ihnp4!uiucdcs!ccvaxa!preece
> such as by making it impossible to delete, rename or amend files ...
> Does anyone know of software which would provide a simple solution to
> this problem?
> Tom Patterson, Department of Applied Mathematics & Theoretical Physics

There is a program called PC-LOCK (*NOT* the PD version) which is made by Johnson Computer Systems. We have installed it on some hard disks here and have had no problems at all. You have to boot from drive "C" and enter the proper password to gain access. There are 5 possible passwords you can set/use: 1 administrator password and 4 user passwords. If you try to boot from drive "A", you cannot access drive "C". Norton Adv, PC-Tools 4.11, Explorer and Ultra-Utilities all return the phrase "Invalid drive...". I'm not sure exactly what it does, but when you run FDISK, it shows the drive as being a non-DOS disk. Perhaps it moves the FATs somewhere else and redirects DOS with its .SYS file. You can also turn off CTRL-BRK permanently, which will allow you to use your favorite menu programs!!

Here is the address which was supplied with the docs:

PC-LOCK, Johnson Computer Systems, 20 Dinwiddie Place, Newport News VA 23602

It has been *extremely* effective in stopping people from "borrowing" the CAD programs placed on the drive.

James

NOTE1: If the program(s) being used allow you to SHELL to DOS, you will need to examine the file, look for SHELL=C:\COMMAND.COM, and remove it.
NOTE2: All standard disclaimers apply.
This is summarized from a recent discussion in a non-Usenet conference of the Portal system.

Several subscribers who use PC Pursuit to access Portal reported that they got a message "CONNECT FROM ..." when dialing Telenet, followed by someone at the other node simulating a login sequence, so that subscribers would supply their name and password. It appears that the problem is known to the GTE Telenet folks, and that they are working on plugging the security hole, but that they don't like to talk about it. The problem is by no means limited to PC Pursuit: users of GTE's TeleMail system and other services which ride on Telenet are also said to be vulnerable.

It appears that GTE management is permitting its concern for the public image of its network to increase the risk to its customers from this fundamentally technical problem: besides plugging the leak, they should get the word out to every customer, so as to reduce the risk in the meantime. The CYA response does nothing for my confidence in Telenet. It is claimed that this is the work of people who want only to explore and map Telenet, and have no interest in doing anything harmful with the information which they acquire, but I doubt that any comp.risks reader wants to trust the benevolence of such crackers.

The insecurity of the UUCP mail network and Usenet is notorious (forged articles etc.), but we sometimes make the mistake of assuming that commercial networks are technically and administratively immune to such problems (other than those inherent in users' tendencies to pick guessable passwords, of course). This problem with Telenet is a reminder that centrally managed commercial networks can be just as vulnerable as the voluntary, anarchic world of UUCP.

In the particular case in point, anyone who gets that "CONNECT FROM" message on Telenet should immediately log off: most of all, don't type your password. Also, if your password starts to echo when it should be blind, disconnect immediately. If you use a robot, such as a logon script for a personal computer comm program, to access anything through Telenet late at night when the rates are low and you are asleep, make sure that the robot can recognize and respond to the CONNECT FROM condition (a small defensive sketch appears after the moderator's comment below). If your robot cannot protect you from this condition, DON'T USE IT FOR UNATTENDED LOGON THROUGH TELENET.

The risks of a network in which a dial-up node can force a direct connection with another dial-up node, without the explicit agreement of the second node, appear so obvious as to make me wonder how the Telenet folks could have made such a design decision. Surely if node 1 asks to be connected to node 2, node 2 should get a dialog asking whether or not it wants to accept the connection.

Comp.risks readers can perform a public service by notifying computer-naive potential victims, such as company executives using TeleMail, about the problem. The risk of mailing, e.g., unencrypted corporate business plans to a masquerading recipient is clear.

Christopher Jewell | email@example.com | sun!cup.portal.com!chrisj

[This is an old war-horse that recurs every now and then, and is often thought of as a joke — even though it can often be easily perpetrated. It is the reason for the notion of the TRUSTED PATH in the National Computer Security Center's ORANGE BOOK set of criteria for trusted systems. Authentication is needed in BOTH directions — the system would like some assurance that you are whom you claim to be, and you would like the same about the system itself.
Once again, we need to remind our less experienced readers that the security/privacy/integrity issues are not easy, despite various press reports that (1) there is no problem, and (2) even if there were a problem, it would be easy to fix. Solutions range from not sharing anything to running a simple checking program. But don't forget that we are dealing with people (on both sides of the fence), and that substantially changes the nature of the problems. PGN]
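Picking up the advice above about unattended logon robots: here is a minimal, purely illustrative sketch. The prompt string, the function and the simulated session are assumptions made up for the example, not part of any real comm program or of Telenet itself. The idea is simply that a robot should refuse to transmit credentials unless it has seen the host's normal prompt, and should hang up the moment it sees the "CONNECT FROM" indication described above.

    /* login_guard.c -- illustrative sketch, not a real logon script */
    #include <stdio.h>
    #include <string.h>

    /* Decide what to do with one line received after dialing Telenet. */
    enum action { WAIT, SEND_CREDENTIALS, HANG_UP };

    enum action classify(const char *line, const char *expected_prompt)
    {
        if (strstr(line, "CONNECT FROM") != NULL)
            return HANG_UP;          /* another dial-up node reached us: possible masquerade */
        if (strstr(line, expected_prompt) != NULL)
            return SEND_CREDENTIALS; /* looks like the genuine host prompt */
        return WAIT;                 /* keep reading */
    }

    int main(void)
    {
        /* A simulated session standing in for lines read from the modem. */
        const char *session[] = {
            "TELENET",
            "CONNECT FROM ...",      /* the attack described above */
            "login:",
        };
        int i;

        for (i = 0; i < 3; i++) {
            switch (classify(session[i], "login:")) {
            case HANG_UP:
                printf("line %d: masquerade suspected, hanging up\n", i);
                return 1;
            case SEND_CREDENTIALS:
                printf("line %d: prompt recognized, ok to log in\n", i);
                return 0;
            case WAIT:
                break;
            }
        }
        return 1;   /* never saw the expected prompt: give up */
    }

A real robot would of course read from the modem rather than from a table, but the check itself is the point: credentials go out only on the genuine prompt, never on anything that looks like a node-to-node connection.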