The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 6 Issue 03

Tuesday, 5 January 1988

Contents

o Ham radios and non-ionizing radiation
Eric Townsend
o Date formats
Geoff Lane
o Risks of Not Using Social Security Numbers
Bruce Baker
o Source code not a defense
TMPLee
Chris Torek
William Smith
Tom Lane
Don Chiasson
Jeffrey R Kell
o Unshar program
Brent L. Woods
o Info on RISKS (comp.risks)

Ham radios and non-ionizing radiation

eric townsend <flatline!erict@uunet.UU.NET>
4 Jan 88 03:37:47 GMT
  Amateur Radios Deadly?  Operators' cancer deaths evaluated

  TACOMA, Wash. (AP) -- Amateur radio operators in two states appear to die at
  abnormally high rates from several forms of cancer, suggesting a possible
  link between cancer and electromagnetic fields, according to data collected
  by a state epidemiologist.  Others cautioned that evidence has been
  inconsistent and that other factors may be involved.

  Dr. Samuel Milham Jr. of the Washington Department of Social and Health
  Services studied the deaths of 2,485 Washington and California ham operators
  between 1979 and 1984.  He reported in the American Journal of Epidemiology
  that 29 leukemia deaths would be expected in a group of people that size,
  but he found 36 deaths.  Statistically, he expected to find 72 lymphatic
  and blood-forming organ cancers, but found 89.  And he expected to find 67.6
  deaths from prostate cancer, but found 78.  The study "indicates that
  amateur radio operator licensees in Washington state and California state
  have significant excess mortality due to acute myeloid leukemia, multiple
  myeloma and perhaps certain types of malignant lymphoma," Milham reported.

  University of Colorado and University of North Carolina studies also have
  found unusually high levels of leukemia among children who live near power
  lines, he said.

  Dr. Noreen Harris, a Tacoma-Pierce County Health Department epidemiologist,
  questioned the data: "People living near power lines may be poor and other
  (cancer-causing) things may be in their environment," she noted.

Some notes and questions I have:

1.  I remember reading in Omni or some other pseudo-science mag last
    year an article about the ill-effects of low-level non-ionizing
    radiation produced by things like 110VAC wires running through
    homes.  The individuals performing the study were being lauded by
    most other 'serious' scientists.  Anybody else recall this?
2.  I feel Dr. Harris's remarks were very weak, especially since she's
    questioning someone else's not-so-accurate-data. "People living
    near power lines may be poor.."  We *all* live near power lines,
    that's how the stuff gets to our house! =:->.
3.  I realise that ham radio gear is not always shielded properly, etc,
    but how safe are we hackers from the stuff our 'puters put out?  I
    sat in front of a Commodore 64 and a TRS-80 Model I, Lv II for a total
    of 8 years, before, during, and after puberty. (TRS-80 at 9 years old!)
    What are the effects of high-level non-ionizing rad. on someone in
    the developmental stages of life, I ask.

J. Eric Townsend ->uunet!nuchat!flatline!erict smail:511Parker#2,Hstn,Tx,77007


Date formats

"ZZASSGL" <ZZASSGL@CMS.UMRCC.AC.UK>
Tue, 05 Jan 88 10:00:02 GMT
Happy New Year to All - Except those program designers whose systems print
dates in forms such as 5/1/88. Now as far as I'm concerned this translates
to 5th January 1988, but then I live in England.  In North America I believe
that it would be the 1st May 1988.  The problems start when I have to use
programs designed in America on a computer situated in the UK - especially
during the first few days of each month when dates such as 5/6/88 occur!

If we must make a resolution for the new year, let's all promise to specify
the name of the month rather than its ordinal in all our programs.

Geoff Lane
UMRCC
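The ambiguity described above can be sketched in a few lines (a modern Python illustration by the editor; the date value and format codes are just examples):

```python
from datetime import date

d = date(1988, 1, 5)

# Ambiguous: a UK reader sees 5 January; a US reader sees May 1.
ambiguous = d.strftime("%d/%m/%y")   # "05/01/88"

# Unambiguous alternatives: spell the month out, or use ISO 8601
# (year-month-day), which also sorts correctly as plain text.
spelled = d.strftime("%d %B %Y")     # e.g. "05 January 1988"
iso = d.isoformat()                  # "1988-01-05"

print(ambiguous, iso)
```

Either alternative removes the day/month guessing game entirely.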


Risks of Not Using Social Security Numbers

Bruce Baker <BNBaker@kl.sri.com>
Tue 5 Jan 88 14:46:53-PST
The items about social security numbers reminded me of a series of computer
and administrative problems that arose at Boston College in the early 70s
when it was decided that students would no longer be identified by social
security numbers (nor by any other number!).  

Of course, all sorts of batch accounting and record keeping programs  depended
on a student number for processing.  So, a unique number was assigned to each
student unbeknownst to him/her.  Moreover, a mapping program was necessary to
relate the "secret" number to the social security number of students who had
enrolled before the ban.  When problems arose, it was tempting to let a student
know his/her number so that it would not happen again.  I believe they finally
decided to let all students know their numbers and that they began placing the
numbers on student IDs, because too many problems arose.  And, of course, many
students did not want to memorize another number and would have preferred the
old system.

MORAL:  Social security numbers as general-purpose identification numbers may
be less painful than the alternatives.


As long as I am delving into the fuzzy past, here are two more items that 
perhaps deserve to be in the RISKS history book.  Please excuse me if I do not
have perfect recall.


Subject:  Risks of Computers Obeying Newton's Laws

Around the mid-sixties, the Air Force ordered a Honeywell computer for delivery
to Rhein Main Air Force Base.  As I recall, it was about a million dollar 
computer.  When it arrived in the middle of the night at Rhein Main, no 
Honeywell people nor supply officers were on hand to oversee the unloading.
The computer was supposedly tied down to one of those material handling flatbed
vehicles that has a series of rollers on its surface.  You guessed it!  As the
driver turned to enter a hangar, the computer kept going straight ahead.

I heard that Honeywell was secretly happy because they did not expect to sell
many of these computers.  Now they had doubled their sales.

MORAL:  Computers are subject to the same laws of physics as other types of
cargo.


Subject:  Risks of Not Employing Configuration Management for Computer Software

Another one from the mid-sixties.  ---  A command and control system was 
developed by GE (I believe) for use at Ramstein Air Force Base.  The system 
deployed tactical aircraft during alerts.  However, the controllers in the
control center trusted their own judgments more than they trusted the system.

Nonetheless, over several years, various people tinkered with the hardware and
software and then rotated to other assignments.  GE techreps were also cut back
drastically during that period when the military did not wish to become 
dependent on contractor personnel in an operational environment.  Configuration
management documentation of changes was nonexistent.  A new commander decided
to use the system and so the first problem was to determine what they had.
Logically, they asked GE.  From what I understand, the GE proposal to 
inventory, analyze, and document the configuration was over $1 million.  Some
thought that GE took advantage of the situation but ......

MORAL:  One-of-a-kind systems require the same principles of configuration
management as systems that are produced in the thousands.


Source code not a defense

<TMPLee@DOCKMASTER.ARPA>
Mon, 4 Jan 88 22:26 EST
Regarding the comment in Risks 6.2 about being safe from viruses if one has
the source code -- I might remind people to re-read Ken Thompson's paper
[Turing award lecture, Reflections on Trusting Trust, CACM 27, 8, August
1984] wherein the concept of an invisible virus was proposed -- the actual
virus was (to be) buried in the object code of the C compiler for Unix; its
object was that IF it were compiling the source code of the login module it
would insert a little piece of code that allowed its creator always to log
on (the War Games "backdoor"); IF it were compiling the source code of the C
compiler itself it would merely copy itself at the appropriate place.  In
both cases there was no sign of the virus in the source code nor presumably
in the listing generated by the compiler; I don't know Unix much, but one
could also hypothesize the virus being clever enough to recognize
when it was compiling whatever standard debuggers and decompilers come with
the system, so as to insert in them code that made them protect (somehow mask a
user from seeing) the pieces of the virus in the object code if those tools
were used to look at object code.  Here a user could inspect the entire
source code of the system (or so he thought) and not find anything; if the
initial virus went out in very early versions of the compiler there would be
little chance of a user finding any uncontaminated ones with which to
compile the source code he was given.

(I stand neutral on whether such a virus was actually created and
released on the world; I don't know and the folklore has it both ways.
But that's not the point.)

    [Please be prepared for a LOT OF OVERLAP in the next few messages.
    Since this is such a popular topic, I'm not going to try to edit.
    Just omit the rest if you're fed up with this topic.  On the other
    hand, some very important points are being made, and the repetition
    may be in order to counteract some of the more simplistic views.  PGN]


Source code vs. attacks

Chris Torek <chris@mimsy.umd.edu>
Tue, 5 Jan 88 09:43:16 EST
"guthery%asc@sdr.slb.com" claims

>... there is no protection in trying programs out with 
>write-only harddisks or with privileges turned off.

Perhaps not.  It is, however, easy to show that if *no* state is
retained between the execution of one program and the execution of
another, the former program cannot affect the latter.  (Take away
its tape and a turing machine can no longer compute.)  This is a
very expensive solution, and infeasible for most people.

  [Another plug for Ken Thompson omitted...]

>There are NO good reasons why software vendors shouldn't give you
>the source code of any program they sell you.

(I daresay this depends on one's definition of a `good reason'....)

>The reason they don't currently is because you could see what a mess
>the program really is.

No doubt that is one reason.  Having in times of need disassembled
various programs back to source, I will agree that many are poorly
written.  I doubt that is the only, or even the main, reason most
vendors are unwilling to distribute sources.  (It is rather fun,
actually, to call a vendor and say: `Will you still not sell source?
Very well.  By the way, there is a bug in your leap year code.
Also, you left out a ``#'' in the startup routine where . . . .')
But this is all beside the point.  (Ah, yes, the *point*:)

>As long as we [are] willing to accept programs from software suppliers
>without the source code we, irresponsibly in my view, accept undue
>risk and invite disaster.

What, then, are we to do?  Form a software users' union?  (I am
only half joking.)  I would very much appreciate receiving source
code to the binaries I must run.  The vendors remain unwilling to
sell the code, and we do not have the time to write the software
ourselves.  We have no alternate suppliers who will sell source.
The only remaining option seems to be not to run the code at all.

In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 7690)
Domain: chris@mimsy.umd.edu Path:   uunet!mimsy!chris


Knowing Source Code is not Sufficient

William Smith <wsmith@b.cs.uiuc.edu>
Tue, 5 Jan 88 15:26:19 CST
>             IF YOU CAN'T READ IT, DON'T RUN IT

Unfortunately, this is not sufficient if the vendor of your software is not
trustworthy.  Ken Thompson's Turing Award Lecture in 1983 [CACM, Aug. 1984]
described how bugs not in the source code can end up in the executable.
Even if you compile every program given you, something must assemble or
compile the compiler.  Something must assemble that, etc., etc.  Unless you
are willing to bootstrap your software from the raw bits using source code
that you trust as an assistant during the bootstrap, there still may be
trojan horses.

From the lecture: "No amount of source-level verification or scrutiny will 
protect from using untrusted code.... A well-installed microcode bug
will be almost impossible to detect."

When you buy a tool such as an automobile, you do not ask to see all of the 
engineering drawings and analyses to decide that the car is safe.  An 
amount of trust is necessary when using any technology.  Computers are 
general purpose tools and as such can hide many different faults.  If the 
source of the hardware or software is trustworthy, there should be fewer faults
and fewer still malicious faults.  The relative ease with which a single
employee can insert hidden bugs demonstrates that care should be taken
in determining who is trustworthy.

Bill Smith, pur-ee!uiucdcs!wsmith, wsmith@a.cs.uiuc.edu


Re: Source Code is Counter to Viruses & Trojan Horses

<Tom.Lane@zog.cs.cmu.edu>
Tuesday, 5 January 1988 11:13:58 EST
In reply to guthery%asc@sdr.slb.com, who writes in RISKS 6.2:
>There are NO good reasons why software vendors shouldn't give you the source
>code of any program they sell you.

On the contrary, there are several good reasons.  Some of them have to do
with commercial advantage, i.e., not having one's work ripped off.  If Mr.
Guthery believes that this is not a legitimate concern, he obviously does
not make his living by selling software.

There is also a good technical reason: VERSION CONTROL, for purposes of
customer support.  Tech support is difficult and time-consuming enough when
one knows exactly what software the customer is running.  Shipping source
code is an open invitation to the customer to tweak the software to suit his
purposes --- but he will still expect the vendor to support that software,
answer questions about its behavior, track down bugs (possibly induced by
customer changes), etc.  The RISK introduced by source code distribution is
that program changes will be made by customers who don't fully understand
the program; we all know what that leads to.


On the original topic, Mr. Guthery's main argument was that source code
distribution would allow customers to inspect for trojan horses.  I don't
believe this; in large programs it is not difficult to hide trojan horse
code well enough to defeat even careful inspection.  Besides, he can't
seriously propose that no one ever run a program that they haven't
personally (or even corporately) studied; no one would ever get any useful
work done.  (Have you personally checked over every line in your operating
system lately?)

Moreover, source code distribution means that more people have a chance to
diddle the program!  Even if the original author is reliable, what about all
the people at the user's site?  Access to source code makes it *much* easier
to create a trojan horse version of a program.  Another way to put this is:
even if you've seen the source code, how do you know it matches the bits
you're executing today?

I don't know the solution to trojan horse attacks, but source code
distribution is not it.
                tom lane

ARPA: lane@ZOG.CS.CMU.EDU
UUCP: 

Source Code is *not* Counter to Viruses & Trojan Horses

05 Jan 88 09:59:11 PST (Tue)
I would like to comment on the assumption that having source will
protect you from Trojan Horses.  While this is frequently true, a
recent Turing Award Lecture has pointed out that it's not in general
true, because of the compiler bootstrapping problem.  The case made is
that a compiler can be written which detects attempts to recompile the
compiler and inserts code which detects attempts to compile the login
program and inserts code in that which allows bogus logins, as well as
replicating the code which modifies the compiler binary.  The
system is then shipped with the binary of the trojan horse compiler
and the source for the valid compiler.  Even when you completely
rebuild the system from sources you still get the compiler and login
program with the trojan horse.  Nothing short of dissassembly of the
original compiler or using an outside compiler will work, and using an
outside compiler usually isn't feasible.

At some point you have to trust somebody.


Viruses and sources

Don Chiasson <G.CHIASSON@DREA-XX.ARPA>
Tue, 5 Jan 88 17:21:31 AST
 >From: "guthery%asc@sdr.slb.com" <GUTHERY%ASC%sdr.slb.com@RELAY.CS.NET>
 >Subject: Source Code is Counter to Viruses & Trojan Horses
 >.. there is no protection in trying programs out with write-only harddisks 
 > or with privileges turned off.  Doing this only sets the hook deeper.  
     Running a program with write protection and restricted privileges
does give limited protection which is better than no protection. 
 > .. anytime you run a program whose complete workings you do not ... 
 > understand you are  ... at risk.
     Agreed.  But very few people completely understand any program.
 > One ... way to counter viruses and trojan horses is to insist on getting
 > the source code ... IF YOU CAN'T READ IT, DON'T RUN IT
     True, if you read it.  Reading and understanding source code for a non
trivial program is very difficult.  Don't forget that you would also have
to read the source code for the compiler, linking loader and run time
libraries.  I haven't the time. 
 > There are NO good reasons why software vendors shouldn't give you the source
 > code of any program they sell you.  The reason they don't currently is 
 > because you could see what a mess the program really is.  ...
     There are lots of good reasons for not giving source code.  One is
that it is easier to break protection of programs if source code is
available.  Another is cost: source code is more expensive to distribute
than binaries, especially when required documentation is included.  It
might also be necessary to supply compilers, etc.  (Also with source code.)
For example, DEC has written a lot of programs in BLISS which is a product
(translation: you pay for BLISS).  There is a major RISK to the company
that the user will "improve" the product.  If these "improvements" add
bugs, whose fault is it and how easy is it to prove?  Vendors also worry
that giving source code will make the job of pirates much easier.  When
vendors do supply source code, they are often reluctant and charge heavily
for it.
 > In 999 cases out of 1,000 they don't know everything the program does 
     Do you think you will do better than the supplier?  
 > ... think of all the execute only software you run ... [,] all the
 > companies from whom you purchased this software ...[and] all the
 > pressure you put on them for bug fixes, new features, and lower prices.
 > Think about the translation of these pressures into pressures on
 > programmers.  Suppose one of these programmers decides to get .. even.
     Sure, this is a risk.  But who do you trust? If you do all the checking
yourself you may not have time to do anything else.  Delegate the job to
someone else at your organization? Do you have the extra people? How do you
know to trust them? Managing source code is a major task.  A vendor will
normally have quality controls in place.  If you buy software, there are
lots of other copies of the program running elsewhere and bugs (including
viruses, trojan horses) are more likely to be found.  In certain cases such
as banks or defence applications it may be necessary to do source checks to
verify the code, but doing so is very expensive and for most users not
worth the cost.  Finally, it is much easier to create (better!) viruses,
etc if source code is available than if not.
     We may be talking from different directions: I am a user, perhaps you
are a hacker.  If that is so, then our approaches to protection will be
different.  My feeling is that if I don't know what it is at some level of
confidence, I won't run it. 
     You will never stop a dedicated crook: all you can do is make his/her
job harder based on an assessment of the risk vs the cost of protection.  I
feel the cost of source checking is very high.  Any protection system,
computer or otherwise, will only guard against people who are basically
honest, or lazy, or of limited competence, or with limited time.  The
majority of people fall under one or more of these categories.  Limited
measures will cut out the vast majority of threats.
                    Don


Christmas virus plus

Jeffrey R Kell <JEFF%UTCVM.BITNET@CUNYVM.CUNY.EDU>
Tue, 05 Jan 88 08:44:54 EDT
Risks 6.2 contained the two comments about the Christmas virus:
   ---
>From: "guthery%asc@sdr.slb.com" <GUTHERY%ASC%sdr.slb.com@RELAY.CS.NET>
>             IF YOU CAN'T READ IT, DON'T RUN IT
   ---
>From: bryce%hoser.Berkeley.EDU@ucbvax.Berkeley.EDU (Bryce Nesbitt)
>I'm surprised nobody has mentioned this:  Around here we don't "execute"
>shar files to unpack them.  Instead there is a handy little utility called
>"unshar".  I use a version on both Unix and my Amiga microcomputer.
   ---

The problem is compounded on IBM VM/CMS systems (where CHRISTMA EXEC took
its toll) by an often overlooked "feature" of the standard IBM "receive"
command.  Files such as EXECs are usually sent in a special encoded form
called NETDATA format.  The "receive" command is smart enough to determine
the format of the file and decode it appropriately, as is the "peek" command
used to browse a file before receiving it.  BUT... the NETDATA encoding also
allows for multiple files to be combined into one NETDATA stream.  The file
appears with only the attributes of the first file in the stream, and only
the first file appears when "peeked".  When the unsuspecting victim performs
the "receive", the remaining files are ALSO received with REPLACE IMPLIED!

Building such a "nested" NETDATA deck is not common knowledge, but can be
done using the undocumented internal module used by sendfile/receive.  The
now infamous CHRISTMA EXEC could just as easily have contained a PROFILE EXEC
behind it that would format your A-disk the next time you logged on.  Thus
even if you did read the source code for CHRISTMA EXEC and trashed it upon
discovery of its function, your next logon would result in erasure of your
entire A-disk (and also any evidence of what caused it to occur).

There is a semi-public-domain overlay for RECEIVE available on any Bitnet
NETSERV server which detects multiple datasets in a NETDATA stream.  Any
concerned IBM CMS user out there should investigate this utility.


Unshar program (was: Viral VAXination [Risks 6.2])

Brent L. Woods <ahh@j.cc.purdue.edu>
Tue, 5 Jan 88 9:14:35 EST
In Risks 6.2 bryce@hoser.Berkeley.EDU (Bryce Nesbitt) writes:

>I'm surprised nobody has mentioned this:  Around here we don't "execute"
>shar files to unpack them...

     This probably should have been mentioned earlier, as I'm sure it's
of interest to quite a few people.  I can't speak for either the
comp.sources.unix or comp.sources.misc archives (though, as a side note,
I couldn't find any unshar programs in the comp.sources.unix archive
that is maintained here at Purdue), but there *is* an unshar program in
the comp.sources.amiga archives.  I'm not absolutely certain, but I
believe that the version we have is the one that Bryce was writing about
above.

     If anyone might want a copy of this program source code (in C),
it's available via anonymous ftp from j.cc.purdue.edu in the amiga
source archives (the directory it's in is news/comp/sources/amiga/volume1,
and the filename is unshar.c.Z).  It's written with portability in mind,
so it should compile and run under a variety of systems, but we've only
tested it under UNIX and on the Amiga so far.  Also, the file in the
archives is compressed (UNIX "compress" utility), so ftp should be set
to "binary" mode to ensure a correct transfer.
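The principle behind unshar -- unpack a shar file by parsing it, never by executing it -- can be sketched briefly.  Real shar archives vary widely (sed pipelines, checksums, different delimiters), so this toy, which is not the Purdue program, handles only the simplest cat-with-here-document form:

```python
import re

def unshar(text: str) -> dict:
    """Extract "cat << 'EOF' > name ... EOF" blocks from a shar
    archive WITHOUT handing the file to a shell.  Returns a
    {filename: contents} mapping for each here-document found."""
    files = {}
    pattern = re.compile(
        r"cat\s*<<\s*'?(\w+)'?\s*>\s*(\S+)\n(.*?)\n\1\n",
        re.DOTALL)
    for match in pattern.finditer(text):
        delim, name, body = match.groups()
        files[name] = body
    return files

shar = """#!/bin/sh
# This is a shell archive.
cat << 'SHAR_EOF' > hello.txt
Hello, world.
SHAR_EOF
exit 0
"""

print(unshar(shar))  # {'hello.txt': 'Hello, world.'}
```

Because the archive is only ever pattern-matched, a trojaned trailing command like `exit 0` (or something far worse) is simply ignored rather than run.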

Brent Woods, Co-Moderator, comp.{sources,binaries}.amiga

USENET:   ...!j.cc.purdue.edu!ahh       ARPANET:  ahh@j.cc.purdue.edu
BITNET:   PODUM@PURCCVM
