Wednesday, November 30, 2011

Disposable Programs

In many IT installations today, the number one problem is program maintenance.
Although the total problem is far from simple, there are a number of relatively simple ideas that can be applied immediately to furnish "prompt relief." One such idea is the disposable program.

The idea of disposable programs is not new. Every programmer has written code that was to be used once and then thrown away—code such as:

1. First-cut subroutines, as for simple, quick formatting of output.

2. One-time reports.

3. Test drivers.

4. Research programs to probe some peculiar feature of the programming language, operating system, database, or other "black box."

5. Engineering assist programs, to help diagnose a particular hardware malfunction.

If you consider these five examples relative to your own experience, you will notice two categories:

KEPT: first-cut routines and one-time reports, and

DISPOSED: test drivers, research programs, and hardware testers.

That is, though all are thought of as single-use programs, the KEPT routines tend to be held, somewhere, "just in case." Only the DISPOSED programs are actually discarded, whether they should be or not.

Can you recall an instance when you wished you had actually retained a discarded program?

And can you recall cases of KEPT programs you devoutly wish you had destroyed when you had the chance? These are the programs you see and curse almost every day, as their user phones, pleading for "just one little change."

Perhaps we could immediately begin improving the maintenance situation by applying two simple rules about "one-time" programs:

1. If you are about to throw it away, keep it.

2. If you are about to keep it, throw it away.

Unfortunately, applying these two rules together creates an infinite recursion. All programmers would be instantly paralyzed. There must be a better way. (Or do you believe that instant paralysis of all programmers would be of great benefit to the human species?)
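The paralysis is easy to demonstrate. Here is a minimal Python sketch (the function names are mine, purely illustrative): each rule, taken literally, hands the decision straight to the other rule, so neither ever terminates.

```python
def about_to_throw_away(program):
    # Rule 1: if you are about to throw it away, keep it.
    return about_to_keep(program)

def about_to_keep(program):
    # Rule 2: if you are about to keep it, throw it away.
    return about_to_throw_away(program)

# Applying either rule invokes the other, forever; Python eventually
# gives up with a RecursionError, which is as close to "instant
# paralysis of all programmers" as an interpreter can get.
try:
    about_to_keep("one-time report")
except RecursionError:
    print("paralyzed")
```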

Consider the examples once again; you'll notice that the underlying principle seems to be:

If the programmer is responsible for the decision, the program is discarded; but if the user is responsible the program is kept.

But why not just keep all programs, for all time? There are many reasons why a program brought out of hibernation could incur costs:

1. The hardware environment has changed.

2. The system software environment has changed.

3. The size or format of the data has changed.

4. The human environment has changed.

5. Some part of the program or its supporting material has been lost or damaged.

So it does cost to rerun an "unchanged" program, and the longer the period of hibernation, the greater the cost. But you already knew this—we all know this. Then why, oh why, do we keep tumbling into the same trap? And how do we get out, or stay out, of the trap?

Well, we'll watch for readers' ideas on these questions, and next blog entry, I'll give a few ideas of my own.

Thursday, November 24, 2011

Who is Right, and What is to Be Done About It? (2)

Today, I give the rest of the story as it actually happened, then consider some of the astute comments given already.

What the Consultant Did
The consultant was astonished by the programmer's response: "That's not an error. Actually, the formula was in error, so I corrected it. The formula I programmed is correct, whereas the original formula was simply wrong."

The consultant understood, as the programmer did not, that a program error occurs when the program does not do what the customer wanted, not when it fails to do what the programmer thinks the customer should have wanted. That was the programmer's first mistake.

The programmer's second mistake was not understanding who his customer was. He seemed to think that the French were the customer, but the actual customer was the consultant.

(NOTE: If the actual customer had been the French, the programmer's action was still wrong, because of his first mistake. If a programmer thinks his customer has asked for the wrong thing, he could, politely, bring this thought to the customer's attention. So, if the French had been the customer, the programmer's third mistake was not bringing his thought to them. And even if the customer had been correctly identified, the programmer's fourth mistake was being rude and arrogant. That's simply not the way to get your point across, especially if your point is that your customer has been wrong.)

What the Consultant Said
"You didn't understand your assignment," the consultant said. "We're trying to simulate the precise formula used in France so we can compare it to the formula used in other countries. It's not a question of right or wrong, but merely of matching the existing French formula."

"Well," the programmer replied, "anyone with half a brain and a smattering of knowledge of inventory theory can see immediately that the French formula cannot possibly be correct, so what's the sense of programming it? Tell them to use my formula, if they want to improve their inventory management."

The management consultant decided to try another approach with the recalcitrant programmer. "That's a good idea. If you're right, I'm sure they'll really appreciate getting a better formula. In the meantime, it will help them to accept the new formula if they can see how it compares with their original one on this data, so I'd appreciate having their formula programmed as soon as you can manage."

"You don't seem to understand," the programmer insisted, "Why should I waste my valuable time on a formula I know is wrong? Just show them my formula and they'll understand."

At this point, the management consultant gave up on the programmer and got himself another one. The French formula was programmed and found to give the claimed results, which were, in fact, superior in many circumstances to the approaches used in other countries. It turns out that the programmer's fifth mistake was overestimating the "correctness" of "inventory theory."

Readers' Comments
A number of readers correctly (I believe) said they would try talking with the programmer. In the actual case, the consultant tried this, but learned that the programmer was not going to listen. Perhaps this was the consultant's fault in the way he tried to talk to the programmer, but in any case, if your employee (and the programmer was, of course, working for the consultant) won't talk with you about a situation, then you have to get rid of that employee. So, attempting to talk is a good approach, in that it gives you essential information even if the programmer refuses to talk.

Other readers warned the consultant to consider his own role carefully, and to consider myriad possible interpretations of what's going on. This is always good advice for a consultant.

Several readers correctly identified one or more of the programmer's mistakes (above). Clearly, someone needs to educate the programmer about what his job is, and how to do it, but evidently the consultant lacked the skill to accomplish that. So, again, the consultant needs to consider his own role, at least for the future. In a similar situation, for example, he might take more care in choosing the programmer and/or making the programmer's task much clearer from the outset.

Those who advised the consultant to get "on the ground" with the inventory application were also on a productive track. In this case, the consultant was in the USA, and meekly accepted the refusal to pay him to travel to France and study the French approach first hand. If the consultant knew what he needed but didn't insist on having it, he made a major consulting mistake. If you insist, but your client won't supply it (time, or access, or money, or whatever), then the consultant should simply resign from that assignment.

Using specific examples rather than simply theory—what a good idea. Both the consultant and the programmer were probably "intuitives" in the MBTI sense, so they kept the discussion on the level of theory, which often misses some crucial data. In this case, if the French formula actually worked in practice, that would have thrown an entirely different light on the discussion.

Next Steps
All in all, the case example seems to have done its job—namely, stimulating an outpouring of darn good advice about consulting and programming.

Now that you have the "whole story," what further observations would you like to make as comments?


Tuesday, November 22, 2011

Who is Right, and What is to Be Done About It?

A management consultant, whose client was an international manufacturer, was asked to evaluate an inventory management procedure that the client had used with stunning success in their French operations. As part of his study, he wanted to compare the performance of the French procedure with procedures used in other locations, using historical data from several countries. A programmer with a strong management science background was given the job of programming the simulation of the French procedure.

When the consultant received the results he could not reconcile them with the figures supplied by the French company. After extensive checking he initiated a series of long telephone calls to France, suggesting that perhaps the procedure had not actually performed as well as they had claimed. The French management took offense at the implication of incompetence.

The French manager complained to the manager who had hired the consultant. Tempers mounted and international relations were strained to the breaking point.

By sheer chance, someone examined the programmer's simulation program and noticed that one term was missing and a second term was negative rather than positive. These findings led to a full technical review of the formula as translated. The review showed that the programmer's formula did not match the formula supplied by the French.

The consultant, much relieved, took the program back to the programmer and showed him the error. "That's not an error," the programmer protested. "Actually, the formula was in error, so I corrected it. The formula I programmed is correct, whereas the original formula was simply wrong."

That's the end of Part 1.

Note to Readers

Now, for you readers, the question is this:

"If you were the consultant, how would you handle this situation going forward?"

If I receive a few comments, I'll publish the rest of the story—what actually happened.

And please note: I don't accept anonymous comments. They're automatically rejected. By all means, use a pseudonym, but don't waste your effort trying to post anonymous comments.

Thursday, November 10, 2011

Iterative Development: Some History

As an old guy who's been around computing since 1950, I'm often asked about the early history of computing. I appreciate efforts to capture some of our history, and try to contribute when my aging memory doesn't play tricks on me.

Back in 2003, Craig Larman and Victor R. Basili compiled an interesting article, Iterative and Incremental Development: A Brief History. I made several contributions to their history, but they did much, much more. Here's a small sample of what I told them:

We were doing incremental development as early as 1957, in Los Angeles, under the direction of Bernie Dimsdale [at IBM's Service Bureau Corporation]. He was a colleague of John von Neumann, so perhaps he learned it there, or assumed it as totally natural. I do remember Herb Jacobs (primarily, though we all participated) developing a large simulation for Motorola, where the technique used was, as far as I can tell, indistinguishable from XP.

When much of the same team was reassembled in Washington, DC in 1958 to develop Project Mercury, we had our own machine and the new Share Operating System, whose symbolic modification and assembly allowed us to build the system incrementally, which we did, with great success. Project Mercury was the seed bed out of which grew the IBM Federal Systems Division. Thus, that division started with a history and tradition of incremental development.

All of us, as far as I can remember, thought waterfalling of a huge project was rather stupid, or at least ignorant of the realities… I think what the waterfall description did for us was make us realize that we were doing something else, something unnamed except for "software development."


Larman and Basili's article has a whole lot more to say, and as far as I know, is an accurate history. I strongly recommend that all in our profession give it a good read. We should all know these things about ourselves.