The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 24 Issue 54

Friday 19 January 2007

Contents

The problem of abstractions, or the lack thereof
Paul Robinson
There's more to worry about than math libraries
Paul Robinson
Mars Global Surveyor failure due to human error?
Al Stangenberger
Unexpected changes
Andrew Koenig
Cell phone in man's pocket sets him on fire
Mark Brader
Excel Date Bug
Al Macintyre
Travel to US to need all 10 fingerprints, credit and e-mail checks
Peter Mellor
Another insecure login
Paul D. Smith
Electronic flash, capacitors, and nerdy adolescents
Daniel P.B. Smith
Re: Digital cameras converted to weapons
Sidney Markowitz
Mark Brader
EVT 2007: Electronic Voting Technology workshop
David Wagner
Info on RISKS (comp.risks)

The problem of abstractions, or the lack thereof

<Paul Robinson <paul@paul-robinson.us>>
Tue, 09 Jan 2007 23:33:53 -0500

I believe one of our biggest problems in the development of computer
programs is the lack of adequate abstractions for describing software in a
way that reduces the grunt work needed to develop applications. We have
not-very-good tools and poor languages for describing what we want to
accomplish. Plus, the people developing software are often not well
trained, the concepts are difficult, and the customer often doesn't even
know what he wants, or how to articulate it.

And even if he knows, and can say what he wants, he may not know how to
bring the developers around to understanding his vision. (Every writer who
has ever sold a book to Hollywood and seen the resulting movie complains
about how the screenwriter and director "botched" the translation and
"bastardized" his or her story, so the idea of "concept" not equaling
"implementation" isn't new.)

The point to be made as far as software is concerned is that the higher
the level of abstraction, the greater the productivity a person is capable
of.  A programmer in a language like Cobol is going to be two to three
times as productive as one in assembly language.  A software designer
using a system like, say, Ruby on Rails might be able to do ten times as
much work as someone writing the same application in, say, C or C++.  I am
using this only as an example; I do not know whether something like Ruby
on Rails actually gives a full order-of-magnitude increase in
productivity.  But I wouldn't be surprised.

The only problem is that if you take abstraction to its ultimate conclusion,
you get a language like APL which, except for some niche applications,
failed dismally because it became too difficult to work with. But you could
do some amazing things with it.  The "big thing" in APL was to write a "one
liner," an attempt to write a complete working application in one line of
code.  And I have seen some unbelievable stuff done with it.  Things that
conceivably might take dozens or hundreds of lines of code in a lesser
programming language really could be done in *one line* of APL.  I've seen
it done.  It kind of convinced me I'd never be able to do that kind of
work; I don't have the background.
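
To give a feel for the idea in a more familiar notation - this is Python,
not APL, and only a rough analogue - here is a near-one-liner that prints
the ten most common words in a file, a task that could take a page of code
in a lesser language:

  # The ten most common whitespace-separated words in input.txt:
  from collections import Counter
  print(Counter(open("input.txt").read().split()).most_common(10))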

But to have that kind of capability requires a really powerful programming
language, and/or excellent subroutine libraries to support it. As well as
good people to write code using that language. It is often said of APL that
it is a "write only" language in that some people would develop cute tricks
but if you ever had to do maintenance it would be simpler and easier to
start over.

But if you are even just competent in APL, the level of abstraction of the
language is so high that your productivity will easily be at least an
order of magnitude - maybe even two - higher than that of someone of the
same capability working in any other programming language. And a real
"superstar" could do things that might not be possible in anything else,
even given considerably more time.

Some people have said similar things about Lisp and the increase in
productivity they have seen from it as well.  And I think the point is
well taken: better tools that increase abstraction increase the
productivity of programmers the same way - and for the same reason - that
power tools increase the productivity of carpenters.  They allow them to
do more in the same amount of time.

Our minds determine the tools we have. And given that the tools we have
consist, as I have said, of not-very-good code editors, poor programming
languages, and inadequate run-time support, it is not surprising that we
have such problems in the development of software.  But people are
thinking about ways of improving it.

There is a guy named Louis Savain who has been doing some work from time
to time on his idea: the creation of software modules using the concept of
a hardware abstraction, limiting the number of interactions and
cross-actions by limiting how side effects can occur. He was of the
opinion that he had a great system for reducing errors and increasing the
productivity and reliability of software. I pointed out to him that what
he has is an *idea*. Unless and until he actually has something
implemented, all he has is a promise, which may be useful but is at this
time unproven.

Basically he wants to create for software what the transistor and the
printed circuit did for electronics: to allow the fabrication of
complicated components of extremely high reliability by limiting how one
software module can communicate with another to rigidly defined
interfaces. If I understand his proposals, what he wants to do is develop
software visually in terms of components, and "wire" the components
together. Hardware has rigidly defined interactions, and race conditions
can't occur because asynchronous operations are not possible.  Except for
the bootstrap of the machine, he would eliminate all software as we know
it; even the operating system would work the same way, as modules with
defined inputs and defined outputs and explicit interconnections between
them. He may well be right on this point.  Personally I believe he has
some wonderful ideas, and if he isn't right he's very close. But right
now, that's all he has: ideas.
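
As a rough sketch of what "wired" components might look like - my reading
of the general idea, in Python; this is not Savain's actual design - each
module communicates only through explicitly declared connections:

  class Component:
      """A module that talks to others only through declared wiring."""
      def __init__(self, func):
          self.func = func       # the module's behavior
          self.outputs = []      # downstream components, wired explicitly
      def wire_to(self, other):
          self.outputs.append(other)
      def receive(self, value):
          result = self.func(value)
          for component in self.outputs:
              component.receive(result)

  # Explicit wiring; no shared state, no hidden side channels:
  doubler = Component(lambda x: x * 2)
  add_one = Component(lambda x: x + 1)
  show = Component(print)
  doubler.wire_to(add_one)
  add_one.wire_to(show)
  doubler.receive(5)   # prints 11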

Almost two years ago I wrote an article for comp.risks (Vol. 23 No. 73,
Item 5, 20 February 2005, "In the Matter of Component Architecture"), in
which I said that one of the things we need is to make software packages
more component-based, as opposed to mostly custom development from scratch
almost every time.

The example I gave compared buying a kitchen makeover with a maintenance
change to a software package. With a kitchen, you can generally buy
off-the-shelf components and get an almost exact estimate of the
complexity, the time to finish, and the cost. And everything will fit
together and work right the first time. What we generally have in the
software industry is a situation where the contractor will mill the trees,
plane the lumber, and forge the fasteners from ore he digs up and smelts
himself, building everything from scratch - which is why he can't give you
a firm estimate of delivery or cost, or even guarantee that the thing will
work once it is constructed.

In short, we get away with the great kind of racket in our business that no
one would tolerate from a Taiwanese manufacturer of toasters!

I got a number of comments explaining why, in their authors' opinions, I
was wrong.  Most of the commenters not only didn't get the analogy; they
didn't realize that most of their comments reinforced my points.

One said the costs were different. Well, let's see. Suppose you have a
"bet the company" software package - say, the program a bank runs to
reconcile its checking accounts - which, counting the original purchase or
development and all the subsequent modifications, has cost the bank
upwards of five million dollars over the last twenty years. Spending, say,
US$100,000 to upgrade that system is the equivalent of spending $10,000 on
a $500,000 house.  It's in line with other real-world examples. Or perhaps
it shows we're spending a lot less money for what we are able to
accomplish: you can certainly get a fairly good selection of cabinets at
Home Depot or Lowe's for $10,000, but it's not a lot of money relative to
the capital costs previously expended.

And that's what it comes down to: money. Software can do some really
valuable things for relatively little money compared with the other costs
involved. Automation lets one scale up the capacity of an operation
enormously. Put up a bank building and offer savings accounts and you can
handle a few dozen customers per day - which is what banks did perhaps 150
years ago. Offer checking accounts and each customer can generate dozens
of transactions a month, with maybe one person to do the bookkeeping. And
that's about as far as you can go without technology. Once the telephone
comes along you can do a little more. Even if you offer multiple branches,
service is still limited to office hours. As technology improves you can
do more things, like handle checks automatically - more accurately, and
without having to keep hiring more bookkeepers.  Offer ATM access and
customers can get to their money 24 hours a day.  Network the ATMs and now
your customers can get to their money from a million locations. Offer
Touch-Tone access and they can check their balances and transfer funds
"from over 650,000,000 convenient locations." But unless you want to hire
thousands of people around the clock to handle the transactions, you need
computer software to do it. So that $5,000,000 software program your bank
started with to run its checking accounts now handles inquiries by
telephone and Internet, withdrawals at local and networked ATMs,
check-card transactions over a credit network, and automated deposits from
customers' employers.

Do you think it would be possible to do all that, in reasonable amounts of
time, with human beings in the loop? Not only can computers do all these
things, they do them for less money than the people would cost. So
software provides tremendous benefits relative to the cost of its
production. And if it can be mass-marketed it can be extremely lucrative.

As it stands now, things are "good enough" to get by, and even with all
the failures we have, there is so much value for so little cost that we
keep stumbling along. Improvements could radically increase the value we
squeeze out of software, which is why there is so much interest; but there
is so much value already that even with today's primitive tools and
capabilities, we can do some spectacular things.

The lessons of history teach us - if they teach us anything - that
nobody learns the lessons that history teaches us


There's more to worry about than math libraries

<Paul Robinson <paul@paul-robinson.us>>
Wed, 10 Jan 2007 18:35:47 -0500

Some people here have posted articles regarding errors in the math
libraries of some systems.  But there is another problem: in some cases,
there *are* no math libraries, so you might not even know that something
is wrong, and even if you noticed, you might not know how to correct it.

Now, the following comments on the content of compilers all apply only to
open-source compilers, since closed-source ones generally don't even let
you see what's going on in the run-time code modules.

I read the source code of compilers both as a learning exercise and to
understand how compilers work.  (I've written about four (small) compilers
and one (tiny) operating system, and I'm currently working on a translator
good enough to handle Cobol, so that I can implement something which I
think has not been done before: a self-hosting Cobol compiler.  A
self-hosting compiler is one written in the same language it compiles.)

One of the things that has bothered me in all of the compilers I've seen,
even the ones designed to support multiple targets, is their use of the
built-in mathematical functions of the current 586-class microprocessor.
This means that when one wants, say, a square root, instead of the
run-time library computing an approximation in a loop, the compiler
generates inline instructions to put a value into a floating-point
register, issue the square-root machine instruction, and retrieve the
result from a register.  (Or, in some cases, that is the entire square
root subroutine: get the value, pass it to the square-root machine
instruction, and return the result.)  I'm not picking on square root
specifically; the same goes for the transcendental functions such as sine,
cosine, and tangent.

Okay, I do understand that it is done to provide a speed advantage, because,
obviously, a hardware-performed operation is going to be considerably faster
than any software implementation.  But it does cause two problems: first, it
doesn't provide a reference implementation for the particular mathematical
function other than use of the hardware.  Second, the implementation is
non-portable.  Even though the compiler is often written for multiple
machines or should be easy to port, there is nothing available to use for
that mathematical function if the compiler is to be implemented on another
machine.
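
For illustration, a reference implementation need not be elaborate.  Here
is a minimal sketch - in Python, rather than the assembly or C a real
run-time library would use - of the classic approximation loop (Newton's
method) that a portable square-root routine might contain:

  def sqrt_approx(x, rel_tol=1e-12):
      """Approximate sqrt(x) by Newton's method: repeatedly average
      the current guess with x divided by the guess."""
      if x < 0:
          raise ValueError("square root of a negative number")
      if x == 0:
          return 0.0
      guess = x if x >= 1 else 1.0
      while True:
          better = 0.5 * (guess + x / guess)
          if abs(better - guess) <= rel_tol * better:
              return better
          guess = better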

It also means that if someone develops a better algorithm for a function
that is generated inline, then instead of simply substituting a new
routine, one has to find every place the inlining is done and change it
back to what it should have been in the first place: a procedure call.

This also causes, for lack of a better term, a "loss of institutional
memory": since nobody sees code that performs the mathematical function,
nobody knows how it works, and there is little for someone to study in
order to develop improvements.

What I would really like to see - and have seen in some better compilers -
is the function implemented, even if crudely, with the implementation
still present but disabled in favor of the equivalent but faster machine
instruction.  Then, especially where the function is implemented in the
high-level language, it is possible to provide the functionality when
re-implementing the compiler on another system.
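
A minimal sketch of that pattern, continuing in Python and assuming a
hypothetical per-target build flag: the fast machine operation is used
where the target provides one, but the portable reference version above
stays in the source, ready to be re-enabled on a new machine:

  import math

  HAVE_HARDWARE_SQRT = True   # hypothetical flag, set per target at build

  def lib_sqrt(x):
      # Use the fast machine operation where available (math.sqrt
      # stands in for the hardware instruction here); otherwise fall
      # back to the portable reference loop above.
      if HAVE_HARDWARE_SQRT:
          return math.sqrt(x)
      return sqrt_approx(x)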


Mars Global Surveyor failure due to human error?

<Al Stangenberger <forags@nature.berkeley.edu>>
Thu, 11 Jan 2007 09:37:52 -0800

NASA is investigating whether incorrect software commands may have doomed
the Mars Global Surveyor spacecraft, which abruptly fell silent in June 2006
after a decade of meticulously mapping the Red Planet.  This is just one
of several theories that may explain the probe's failure.  NASA announced
the formation of an internal review board to investigate why the Global
Surveyor lost contact with controllers during a routine adjustment of its
solar array.  [Source: *San Francisco Examiner*, PGN-ed]
http://www.examiner.com/a-501893~Human_Error_May_Have_Doomed_Mars_Probe.html
  [Also noted by Walter Schilling:]
  http://www.spaceref.com/news/viewnews.html?id=1185.


Unexpected changes

<"Andrew Koenig" <ark@acm.org>>
Thu, 4 Jan 2007 13:20:19 -0500

Today I logged into my Vanguard account (in case you're unfamiliar with
it, Vanguard is one of the larger financial companies in the USA, offering
mutual funds, brokerage, and cash-management services).  I was greeted by
the following message:

  The Accounts & Activity area of our site is currently reflecting brokerage
  account holding information from Tuesday, January 2, 2007, instead of
  Wednesday, January 3, 2007. We apologize for any inconvenience and are
  working to correct this situation as soon as possible.

I am guessing that their software was unable to cope with the markets being
closed on January 2 for President Ford's funeral.  Indeed, last night they
displayed the balance of one of my mutual funds, held through their
brokerage service, as zero dollars.  They had the right number of shares,
but the price was $0.00 per share.  I am guessing that their software
expected to see a price quote because it thought the markets were open,
but didn't get one, so it substituted zero.
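
If that guess is right, the failure mode is a familiar one: treating "no
data" as zero.  A tiny hypothetical sketch in Python (certainly not
Vanguard's actual code):

  def position_value(shares, quote):
      # Buggy pattern: a missing quote silently becomes a $0.00 price.
      return shares * (quote if quote is not None else 0.0)

  def position_value_safer(shares, quote, last_close):
      # Safer: fall back to the last known closing price (and, in real
      # code, flag the displayed value as stale).
      return shares * (quote if quote is not None else last_close)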

Usually Vanguard is pretty good about systems stuff, but no one's perfect.
I imagine that their programmers are pretty busy right now...


Cell phone in man's pocket sets him on fire

<msb@vex.net (Mark Brader)>
Tue, 16 Jan 2007 13:41:58 -0500 (EST)

(Search Google News for "luis picaso".)


Excel Date Bug

<Al Mac <macwheel99@sigecom.net>>
Wed, 10 Jan 2007 11:42:35 -0600

Here is a link to an article explaining what it labels as a well-known bug
in Excel: it does not do leap-year math correctly.

http://www.itjungle.com/fhg/fhg011007-story01.html
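
For context - whether or not it is the specific bug the article describes
- the best-known Excel leap-year problem is that Excel treats 1900 as a
leap year and accepts the nonexistent date 29 Feb 1900 (reportedly for
Lotus 1-2-3 compatibility).  The actual Gregorian rule, as a short Python
sketch:

  def is_leap_year(year):
      # Divisible by 4, except century years, unless divisible by 400.
      return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

  print(is_leap_year(1900))   # False - 1900 was not a leap year
  print(is_leap_year(2000))   # True  - 2000 was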


Travel to US to need all 10 fingerprints, credit and e-mail checks

<MellorPeter@aol.com>
Sun, 7 Jan 2007 18:22:25 EST

This was reported today in the UK Sunday newspaper The Observer and in AOL
News (and probably in lots of other places).  See:
http://observer.guardian.co.uk/world/story/0,,1984496,00.html
http://news.aol.co.uk/us-to-hold-britons-fingerprints/article/20070107151109990002

The headlines mention "Britons", but the measures apply to all EU countries,
Japan, Australia and New Zealand.

The scheme will be tried out in 10 major airports from this summer and is
planned to be in use in all airports and seaports by the end of 2008.

The main points are that, for travelers entering the US:
- all 10 fingerprints will be taken, which is compatible with the FBI
  database (only two prints are currently taken);
- the prints will be retained indefinitely and with no restriction on their
  international use or sharing with other agencies;
- inspection of credit card and email accounts (already practised) will be
  strengthened.

With regard to the last point, the Observer article states:

"Britons already have their credit card details and email accounts inspected
by the American authorities following a deal between the EU and the
Department of Homeland Security. Now passengers face having all their credit
card transactions traced when using one to book a flight. And travelers
giving an email address to an airline will be open to having all messages
they send and receive from that address scrutinised.

The demands were disclosed in 'undertakings' given by the Department of
Homeland Security to the EU and published by the Department for Transport
after a request under the freedom of information legislation."

It is not made clear exactly what will be inspected: only transactions or
messages after the date of booking, or any previous ones as well?
Will the DHS be concerned about my credit card or e-mail account if I buy a
ticket from a travel agent for cash?  This could be the end of on-line
booking of flights to the US.  (Of course, no international terrorist would
dream of using a forged card or setting up an e-mail account under a
pseudonym.)

The 'invasion of privacy' issues are pretty obvious and civil liberties
groups are predictably jumping up and down.

Peter Mellor;   Mobile: 07914 045072;   email: MellorPeter@aol.com
Telephone and Fax: +44 (0)20 8459 7669


Another insecure login

<"Paul D.Smith" <paul_d_smith@hotmail.com>>
Wed, 3 Jan 2007 10:29:59 -0000

Readers might like to beware of http://www.genesreunited.co.uk.  This is a
legitimate, and very useful, genealogy website that is sadly let down by its
login, which uses an insecure HTTP page to transmit user account name and
password in the clear.

I have contacted their support team, pointing out the security
implications, but - surprise - they don't seem to understand the problem,
and I've failed to get through to anyone who will acknowledge that a
problem exists.

Since this site requires a small payment before all services are
available, I do wonder what comeback I have should someone sniff my
account details and then trash the family-tree information carefully
stored there - or, worse, start doing something that attracts the
attention of "those who watch the web" (OK, so I'm being a little
paranoid, but I program computers, so it's in my nature!).

FOLLOW-UP Date: Wed, 3 Jan 2007 17:21:21 -0000

I just noticed the following message on the "My Details" page - the one
where the name and password may be altered...

"From this page you can manage your details on Genes Reunited. This page is
password protected - your login and contact details cannot be seen by anyone
else."

Using Wireshark I quickly confirmed that this is nonsense: all passwords
are sent in the clear.  Naughtily, the page also has a padlock next to
"This page is password protected" to confuse the unwary.
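
For readers who want a quick check of their own before a password leaves
the browser, a first approximation is to look at the scheme of the login
form's action URL.  A crude sketch in Python - the URL is a placeholder,
the regex is no substitute for a real HTML parser, and a relative action
inherits the page's own scheme; sniffing with Wireshark remains the
definitive test:

  import re
  import urllib.request

  page_url = "http://www.example.com/login"   # placeholder URL
  html = urllib.request.urlopen(page_url).read().decode("utf-8", "replace")

  # Crude scan for form action attributes:
  for action in re.findall(r'<form[^>]*\baction="([^"]*)"', html, re.I):
      secure = action.startswith("https:")
      print(action or "(relative: inherits page scheme)",
            "- uses HTTPS" if secure else "- NOT HTTPS")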


Electronic flash, capacitors, and nerdy adolescents

<"Daniel P. B. Smith" <usenet2006@dpbsmith.com>>
Sat, 23 Dec 2006 06:46:19 -0500

My reaction to Mark Brader's story, about teenagers making electric shock
weapons from disposable digital cameras, was "I'm surprised that it took so
long."

By the late 1950s, electronic flash units were popular among
not-even-very-serious amateur photographers. The little camera shop in my
town sold many models. By the 1960s, relatively compact units that snapped
into the flash shoe atop most cameras were common. And, yes, consumer
photo magazines like Popular Photography warned of the dangers of the
capacitors inside. These flash units were much larger and more powerful
than the tiny ones built into cameras today, and I suspect that under some
conditions they could well have been lethal.

And the long-drawn-out suspense, as the squeal of the transformer rose in
pitch and the ready light began to flash faster and faster, certainly gave
an impression of immense power being concentrated - like the slow climb at
the start of a rollercoaster ride.

In those days a friend of mine and I were doing a lot of the sorts of things
_nerdy_ adolescent males do--breaking open radios and transistors and radio
_tubes_ and almost anything we could break open. To see what was inside. And
repurpose and play with the parts.  We also liked to toss charged capacitors
at each other, hoping the victim would try to catch them. And we liked to
shock each other with the hand-crank generators you could at the time get
from surplus electronics houses.

Either, (a) I had enough sense of self-preservation not to try anything
funny with my electronic flash unit, or (b) I wasn't sure how I was going to
explain to my parents why I had switched back to using flashbulbs, so I
never actually made anything weapon-like out of my electronic flash.  But
with regard to the possibility, teenagers have had the means, motive, and
opportunity for half a century.

(_Hypothetically,_ there might have been the _potential_ to do creative
things with flashBULBS, too. Cheap little packages of pyrotechnic materials,
designed to ignite reliably and instantaneously from a small amount of
electricity...)


Re: Digital cameras converted to weapons (RISKS-24.52)

<Sidney Markowitz <sidney@sidney.com>>
Sat, 23 Dec 2006 12:52:07 +1300

The article on converting a camera to a "taser" incorrectly talked about
disposable digital cameras, which do not currently exist. In fact, the
conversion is done using any inexpensive disposable camera with a flash
unit. The xenon flash bulb requires a 1000 to 10000 volt trigger pulse to
fire, supplied at low current from a typically 300 volt capacitor discharged
through a trigger coil. Replace the bulb with a couple of wires and you can
use it to give someone a painful shock just by taking a picture while
touching them with the wires. It is not nearly as powerful as one from a
real stun gun such as a taser, which delivers on the order of 100,000 volts
or more.
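
For a rough sense of scale, the energy stored in the main capacitor is
E = C * V^2 / 2.  Assuming a capacitance of around 120 microfarads - my
assumption of a typical disposable-camera value; the 300 volts is from the
description above - the arithmetic in Python:

  C = 120e-6              # farads (assumed typical value)
  V = 300.0               # volts (from the description above)
  print(0.5 * C * V**2)   # ~5.4 joules stored

How much of that actually reaches a victim depends on contact and skin
resistance.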

The news that you can make a painful device from a cheap camera may be
shocking to some, but it is not much different from - and is less
dangerous than - many homemade weapons that teens have been known to make
from common materials such as bicycle inner tubes or car radio aerials. A
real risk is that someone may not take the potential dangers of the shock
seriously. It may usually be a painful but harmless prank, but with the
wrong victim or in the wrong circumstances it could cause death.


Re: Digital cameras converted to weapons (Brader, RISKS-24.52)

<msb@vex.net (Mark Brader)>
Fri, 22 Dec 2006 11:20:21 -0500 (EST)

I wrote:
> weapons ... produced by teenagers from disposable digital cameras!

However, I should not have said "digital" there (nor in the subject line).
Nor should I have confused Brentwood with Brentford a couple of issues back.
Sorry about that (slaps self!), and thanks to Chris Drewe and Ed Davies for
catching my slips.

[This is what I get for trying to be helpful instead of just sending the
URLs and letting PGN do the PGNing!  Sigh.]

  [Nevertheless, it should be obvious that PGN appreciates people who take
  the effort to abstract/summarize and comment rather than just sending the
  bare URL or the entire copyrighted text.  PGN]


EVT 2007: Electronic Voting Technology workshop

<David Wagner <daw@cs.berkeley.edu>>
Thu, 18 Jan 2007 12:26:35 -0800 (PST)

The 2007 USENIX/ACCURATE Electronic Voting Technology workshop (EVT 2007)
will be on 6 Aug 2007.  The call for papers is available here:
  http://www.usenix.org/events/evt07/cfp/
Paper submissions are due 22 Apr 2007.  Please encourage your colleagues,
students, etc. to send us their best papers.

  [The papers from the first workshop, EVT 2006, in Vancouver, are online:
    http://usenix.rutgers.edu/library/06evt/tech/index.html
  It was an outstanding workshop.  Awareness of the risks of electronic
  voting systems has grown enormously since then.
  I hope that some of you will be able to submit papers to EVT 2007.  PGN]
