Saturday, April 15, 2017

Is it challenging to be a writer?

I'm often asked, "Is it challenging to be a writer?"

Knowing how many books and short items I've published, would you be surprised if I answered "No"?

Unlike many people, I would definitely answer "No." For me, NOT being a writer would be challenging.

Here’s another way to think about the “challenge” of writing:

Is it challenging to be a talker?

YES, for some people, but NO for most. You have something to say, so you open your mouth and words come out attempting to convey that “something.”

Now, the same frame for writing, at least as it is to me:

I have something to say, so I sit down by myself and words come out attempting to convey that “something.”

I suppose if I had nothing to convey, then writing would be a challenge. So far that hasn't happened.


If you're about to feel challenged, try reading Weinberg on Writing: The Fieldstone Method

This book has given thousands of writers creative and useful “things to do” when they’re about to feel challenged.

For one thing, the book is full of exercises that can get you going. In the simplest of all the exercises, all you do is take out your writing instrument (computer, pen, pencil, typewriter, ...) and write the word, "The ". Now you've started writing, so all you need to do is continue. Try it!

Sunday, April 02, 2017

Complexity: Why We Need General Systems Thinking


It isn’t what we don’t know that gives us trouble, it’s what we know that ain’t so. - Will Rogers

The first step to knowledge is the confession of ignorance. We know far, far less about our world than most of us care to confess. Yet confess we must, for the evidences of our ignorance are beginning to mount, and their scale is too large to be ignored!

If it had been possible to photograph the earth from a satellite 150 or 200 years ago, one of the conspicuous features of the planet would have been a belt of green extending 10 degrees or more north and south of the Equator. This green zone was the wet evergreen tropical forest, more commonly known as the tropical rain forest. Two centuries ago it stretched almost unbroken over the lowlands of the humid Tropics of Central and South America, Africa, Southeast Asia and the islands of Indonesia.

... the tropical rain forest is one of the most ancient ecosystems ... it has existed continuously since the Cretaceous period, which ended more than 60 million years ago. Today, however, the rain forest, like most other natural ecosystems, is rapidly changing. ... It is likely that, by the end of this century, very little will remain. - Karl Deutsch

This account may be taken as typical of hundreds filling our books, journals, and newspapers. Will the change be for good or evil? Of that, we can say nothing—that is precisely the problem. The problem is not change itself, for change is ubiquitous. Neither is the problem in the man-made origin of the change, for it is in the nature of man to change his environment. Man’s reordering of the face of the globe will cease only when man himself ceases.

The ancient history of our planet is brimful of stories of those who have ceased to exist, and many of these stories carry the same plot: Those who live by the sword, die by the sword. The very source of success, when carried past a reasonable point, carries the poison of death. In man, success comes from the power that knowledge gives to alter the environment. The problem is to bring that power under control.

In ages past, the knowledge came very slowly, and one man in his life was not likely to see much change other than that wrought by nature. The controlled incorporation of arsenic into copper to make bronze took several thousand years to develop; the substitution of tin for the more dangerous arsenic took another thousand or two. In our modern age, laboratories turn out an alloy a day, or more, with properties made to order. The alloying of metals led to the rise and fall of civilizations, but the changes were too slow to be appreciated. A truer blade meant victory over the invaders, but changes were local and slow enough to be absorbed by a million tiny adjustments without destroying the species. With an alloy a day, we can no longer be sure.

Science and engineering have been the catalysts for the unprecedented speed and magnitude of change. The physicist shows us how to harness the power of the nucleus; the chemist shows us how to increase the quantity of our food; the geneticist shows us how to improve the quality of our children. But science and engineering have been unable to keep pace with the second-order effects produced by their first-order victories. The excess heat from the nuclear generator alters the spawning pattern of fish, and, before adjustments can be made, other species have produced irreversible changes in the ecology of the river and its borders. The pesticide eliminates one insect only to the advantage of others that may be worse, or the herbicide clears the rain forest for farming, but the resulting soil changes make the land less productive than it was before. And of what we are doing to our progeny, we still have only ghastly hints.

Some have said the general systems movement was born out of the failures of science, but it would be more accurate to say the general systems approach is needed because science has been such a success. Science and technology have colonized the planet, and nothing in our lives is untouched. In this changing, they have revealed a complexity with which they are not prepared to deal. The general systems movement has taken up the task of helping scientists unravel complexity, technologists to master it, and others to learn to live with it.

In this book, we begin the task of introducing general systems thinking to those audiences. Because general systems is a child of science, we shall start by examining science from a general systems point of view. Thus prepared, we shall try to give an overview of what the general systems approach is, in relation to science. Then we begin the task in earnest by devoting ourselves to many questions of observation and experiment in a much wider context. 

And then, having laboriously purged our minds and hearts of “things we know that ain’t so,” we shall be ready to map out our future general systems tasks, tasks whose elaboration lies beyond the scope of this small book.

[Thus begins the classic, An Introduction to General Systems Thinking]

Sunday, March 12, 2017

Classifying Errors

I received an email the other day from Giorgio Valoti in Italy, but when I wrote a response, it bounced back with "recipient unknown." It may have been a transient error, but it made me think that others besides Giorgio might be interested in discussing the issue of classifying errors, so I'll put my answer here and hope Giorgio will see it.

Here's the letter: 

Dear Mr. Weinberg,

My problem is that I’m looking for a good way — maybe a standard, more likely a set of guidelines — to classify and put some kind of label on software defects.

Is there such a thing? Does it even make sense trying to classify software defects?

And here's my reply:

Hello, Giorgio

It can certainly make sense to classify errors/defects, but there are many ways to classify, depending on what you're trying to accomplish. So, that's where you start, by answering "What's my purpose in classifying?"

For instance, here are a few ways my clients have classified errors, and for what purposes:


- cost to fix: to estimate future costs

- costs to customers: to estimate impact on product sales, or product market penetration

- place of origin in the development cycle: to decide where to concentrate quality efforts

- type of activity that led to the error: to improve the training of developers

- type of activity that led to detecting the error: to improve the training of testers

- number of places that had to be fixed to correct the error: to estimate the quality of the design

- and so on and on

I hope this helps ... and thanks for asking.

--------------end of letter-----------
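To make the letter's list concrete, here's a minimal sketch of what such a classification might look like in code. It's only my own illustration in Python; the class, field, and phase names are assumptions, and any real scheme would be shaped by your own purpose, as the letter says.

```python
from dataclasses import dataclass
from enum import Enum
from collections import Counter

class Origin(Enum):
    """Assumed development-cycle phases where an error might originate."""
    REQUIREMENTS = "requirements"
    DESIGN = "design"
    CODING = "coding"
    TESTING = "testing"

@dataclass
class DefectRecord:
    """One classified defect; each field serves one purpose from the list above."""
    description: str
    origin: Origin                   # where to concentrate quality efforts
    found_by_activity: str = ""      # to improve the training of testers
    caused_by_activity: str = ""     # to improve the training of developers
    cost_to_fix: float = 0.0         # to estimate future costs
    cost_to_customers: float = 0.0   # to estimate impact on sales or market penetration
    places_changed: int = 1          # a rough proxy for the quality of the design

def tally_by_origin(defects):
    """Summarize where defects come from, to guide where quality effort goes."""
    return Counter(d.origin for d in defects)
```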

As the letter says, there are numerous ways to classify errors, so I think my readers would love to read about some other ways from other readers. Care to comment?

Monday, February 20, 2017

How Long Can I Remain a [Ruby, Java, C++, Python, …] Programmer?

Several respondents to an earlier post have asked me about the future prospects for workers in one programming language or another. Here's my best answer.

As others have said, "I can predict anything but the future." But others have also said that the only things we know about the future are what we know from the past. Therefore, you might get some idea of your future as a [Ruby …] programmer from the answers to a recent Quora question, "What were some jobs which existed 50 years ago but have largely disappeared today?"

It was great fun reading all these answers, many of which described jobs I held back then. I go back a bit more than 50 years, though, so I have a few more to add. The most obvious omission was the iceman. In the 1930s, we had an icebox (not a refrigerator, but an actual box that held a block of ice). The iceman’s horse-drawn wagon would come around regularly and be surrounded by us kids, hoping for the free shards of ice produced when he cut blocks down to fit our iceboxes.

Another job only briefly mentioned was typesetting. I never held that job, but I was trained for manual typesetting for a semester in high school. At least I know where terms like upper-case and lower-case come from.

Someone also mentioned keypunch operator, a task (not a job) that was often done by prisoners who were literally chained to their machines. Not mentioned, however, were key verifier operators. Few people today have ever seen a verifier, let alone know what one was.

Even before my time, there were jobs that disappeared, but which I read about—for instance, in a nineteenth-century book about jobs for women. The final two chapters of the book were about a couple of sure-fire women’s jobs for the future (1900 was then the future).

The first chapter was about telegraph operators. The chapter “proved” that there was a great future for women because they could operate a telegraph key at least as fast as men (and the telephone had yet to be invented).

The second chapter was about picture tinters. There was, of course, no color photography, and it wasn’t really even conceived of. Women were supposedly much better at coloring photos because of their “artistic bent” and their more delicate hands. Though there are a few photo tinters still around today for special jobs, it’s not a career with a great future.

By the way, one future job for women that wasn't even mentioned in the book was typist (or amanuensis), in spite of the then-recent, exciting invention of the typewriter. Other sources explained that women would never be typists because everyone knew that women were not good with machines.

It's fun to think about these forgotten jobs, but they're also a source of important knowledge, or perhaps even wisdom. Job disappearance is not some new phenomenon caused by computers. It's always gone on throughout history. True, some jobs lasted a long time, so long that they were passed down from generation to generation, even becoming family names, such as Smith, Turner, Eisenhower, Baker, and Miller. (See, for example, <surnames.behindthename.com> for hundreds of examples.)

Some of those jobs still exist, though often modified by new technology. Do you still recognize Fuller, Chandler, or Ackerman? And many others have largely disappeared, remaining only in some special niche, like photo tinters. Do you know anybody named Armbruster who still makes crossbows? Well, you probably know a few Coopers, but how many of them still make barrels?

No job is guaranteed. Nothing entitles you to hold the same job for life, let alone pass the job down to your children. So, for example, if you think of yourself as a "[Ruby…] Programmer," perhaps you'd better prepare yourself for a future with a more general job description, such as "programmer" or "problem-solver."

In fact, there's a whole lot of people out there who think (or hope) the job of "programmer" will disappear one of these days. Some of them have been building apps that would "eliminate programmers" ever since the time of COBOL. I've mocked these overblown efforts for half a century, but history has tried to teach me to be a bit more humble. Whether or not they succeed in your lifetime, you might want to hedge your bets and keep learning additional skills. Perhaps in your lifetime we'll still need problem-solvers and leaders long after we've forgotten the need for Chamberlains and Stringers.


Monday, February 13, 2017

Should I learn C++ or Python?

When I first saw this question on Quora, there were already 47 answers, pretty much all of them wrong. But the number of different answers tells you something: choice of programming language is more of a religious question than a technical one. The fact is that if you want to be a professional programmer, you should learn both—and at the same time.

When we teach programming, we always teach at least two languages at the same time, in parallel. Assignments must be done in both (or more) languages and submitted along with a short essay on why the solutions are different and why they are the same. That’s the way to develop some wisdom and maturity in the coding part of your professional work.

Some of the respondents asserted that programming languages are tools. If that’s an appropriate metaphor, then how would you answer this question of a wannabe carpenter:

"Should I learn saws or screwdrivers?"

Do you think someone could be a top-flight carpenter knowing only one?

So, stay out of this quasi-religious controversy, which can never be settled. Instead, spend your valuable time learning as many different programming languages as possible, at least 5 or 6. You won’t necessarily use all of them, but knowing their different approaches will put you far above those dullards who say:

“I only know Language X, but I still think it’s the best language in the world.”




Sunday, February 05, 2017

Fuzz Testing and Fuzz History

In 2016 I added a paragraph to the Wikipedia page on "fuzz testing." Later, the paragraph was edited out because it "lacked reference." The editor, however, suggested that I blog the paragraph and then use the blog as a reference, so the paragraph could be included. So, here's the paragraph:

(Personal recollection from Gerald M. Weinberg) We didn't call it fuzzing back in the 1950s, but it was our standard practice to test programs by inputting decks of punch cards taken from the trash. We also used decks of random number punch cards. We weren't networked in those days, so we weren't much worried about security, but our random/trash decks often turned up undesirable behavior. Every programmer I knew (and there weren't many of us back then, so I knew a great proportion of them) used the trash-deck technique.
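The trash-deck idea is easy to reproduce today. Here's a minimal sketch, purely my own illustration: random bytes stand in for a card pulled from the trash, and `program_under_test` is a hypothetical stand-in for whatever code you want to exercise.

```python
import random

def fuzz(program_under_test, trials=1000, max_len=80, seed=None):
    """Feed random "cards" of bytes to a program and record any crashes."""
    rng = random.Random(seed)
    failures = []
    for _ in range(trials):
        # One random "card": arbitrary bytes of arbitrary length, like a line from a trash deck.
        card = bytes(rng.randrange(256) for _ in range(rng.randrange(max_len + 1)))
        try:
            program_under_test(card)
        except Exception as exc:
            # Any unhandled exception is the modern analogue of "undesirable behavior."
            failures.append((card, exc))
    return failures
```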

The subject of software testing has many myths and distortions. This story of fuzz testing has several morals:

1. This type of testing was so common that it had no name. Apparently, it was given the name "fuzz testing" around 1988, and the namers were thus given credit in the Wikipedia article for "inventing" the technique.

2. This is just one example of how "history" is created after the fact by human beings, and what they write becomes "facts." That's why I believe there are no such things as "facts"—not in the sense of "truths."

3. In any case, this is one example of why we ought to be wary of labeling "inventors" of various techniques and technologies. For instance, Gutenberg is often labeled the "inventor" of moveable type, though moveable type existed and was widely used long before Gutenberg. Gutenberg used this idea in ways that hadn't been employed before. That was his "invention," and a worthy one it was, but if we're to understand the way technology develops, we have to be more precise in our definition of what was invented and by whom.

Finally, I have no idea who "invented" fuzz testing. It certainly wasn't me.

NOTE: If someone would like to update the fuzz testing article on Wikipedia, they're welcome to reference this blog post.

Wednesday, January 25, 2017

What is the right reason to leave a job?

As a consultant, I frequently leave jobs. I also help many people decide whether or not to leave their jobs. I have learned there is no one "right" reason for leaving, but I've accumulated a list of many "good" reasons for leaving. I'll give some examples:

In my career, I have left jobs when 

- the job I was hired to do was finished.

- the job I was hired to do could not be finished.

- the job I was hired to do would be finished just fine without me.

- I was not able to do the job I was hired to do.

- the job I was hired to do wasn't worth doing.

- I was no longer learning new things (that's my most frequent reason for leaving).

- they told me that my pay was going to be "temporarily" delayed.

- they asked me to do something illegal or unethical.

You may notice that I never leave just because someone is going to pay me more money. If I was hired on to do a job, I feel committed to see that the job is finished, or going to be finished, or will never be finished. Only when my commitment is fulfilled am I ready to move on to bigger things. I don't think it's a good idea to leave behind me a trail of broken commitments.

Another good reason for leaving is not one I've experienced yet, but it's when they ask you to do something dangerous to your life or health. Very few jobs are worth dying for.


And here's a useful principle when leaving: If possible, don't quit until you have the next job set up. Why? Because it's much easier to get a new job when you already have a job. Employers tend to be suspicious of unemployed people.