My two jobs have nearly opposite ways of handling language. Writing uses the Humpty Dumpty rule that "when I use a word it means just what I choose it to mean." Computer programming uses the principle that a word means exactly what it is defined to mean, with no wiggle room at all.
Does this mean that I'm walking around hitting myself over the head whenever I use a metaphor or suggesting that something more poetic than the word "while" belongs in loop constructions? Not really. Well, not often. Okay. It's been known to happen.
The thing is that there is no real contradiction as long as I remember what I'm doing. Computers don't handle metaphors well, and readers aren't in favor of loops; few of them would want to read a novel written in Objective-C. Although. Hmm. Let's try a little Shakespeare.
for (NSUInteger i = 0; i < [soldiers count]; i++)
    [self bid:[soldiers[i] shoot]];
Nope, not gonna work.
Of my two professions, computer programming is considered the day job, writing the artistic pursuit. Yet of the two, writing is the one that is universally taught (more or less).
The value of teaching writing is sufficiently obvious and well known that I don't feel the need to delve into it, at least not yet. Everyone needs to know how to write at least a little, even if few become writers. But almost no one except programmers learns to program. That's a shame, because the manner in which programming approaches language is valuable in far more circumstances than people think, despite the poor quality of the last line of Hamlet given above.
People today have a lot of exposure to computers. We use them in all aspects of our lives, yet very few of us actually have any sense of how computers work or why they can do what they can and can't do what they can't. Most of the computer-based classes given to children and adults only teach them to be better computer users.
By the way, user is not a compliment in the lexicon of computer programming.
Acquiring a sense of what's really going on inside our computers, cell phones, cable boxes, etc. is worthwhile for anyone, and I encourage it. But what I've been finding lately is that there is a separate value in the practice of computer programming. To write a program it is necessary to be exact in the words and meanings one chooses.
When talking to another human being, one relies on that person's mind to put together what one is saying into a coherent, meaningful whole. Many people rely so much on this that they don't make any effort to make what they are saying coherent, and I'm not just talking about the weirdos in Congress and the media. But even when people try to be coherent, they still assume that the people they are talking to will fill in the blanks with their own experience and understanding. This reliance on talking to an audience of thinking beings is vital to all arts, including writing and speaking.
But computers are not intelligent. Computers are idiot machines. They do not think; they only follow instructions. And the instructions they follow have to be exactly phrased, precisely stated; otherwise they do nothing. Computers cannot follow vague instructions. Any program that seems to be anticipating your actions (such as filling in the rest of a word you are typing) is not really guessing what you mean. The program uses the methods programmed into it to offer possibilities, but the computer does not really form any idea of what you mean. It only creates the illusion of doing so. A lot of people did a huge amount of programming work in order to make a computer seem to be a very bad guesser.
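To make that illusion concrete, here's a minimal sketch (in Python, with an invented word list) of the sort of mechanical rule that can sit behind a word completer. Real completers use far larger dictionaries plus frequency data, but the principle is the same: a rule applied to a list, with no understanding anywhere.

```python
# A toy word completer. It "guesses" the rest of what you're typing
# purely by prefix matching against a fixed vocabulary -- no model of
# meaning, no intent, just string comparison.
# (This vocabulary is made up for illustration.)

VOCABULARY = ["while", "whither", "whisper", "metaphor", "semicolon"]

def complete(prefix):
    """Return every known word that starts with the given prefix."""
    return [word for word in VOCABULARY if word.startswith(prefix)]

if __name__ == "__main__":
    print(complete("whi"))   # the machine offers possibilities...
    print(complete("meta"))  # ...without any idea what they mean
```

Type "whi" and it dutifully offers "while", "whither", and "whisper"; it has no notion that one of them might be what you meant.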
What's really going on inside these and all other programs is the carrying out of a list of very precise instructions placed in a very specific order and then debugged within an inch of their function calls.
This process requires that the programmer learn to use words that have very exact, intractable meanings and purposes. Words that do not flow into each other in meaning the way words do in everyday speech. Words that not only cannot change in meaning but must be subjected to a very careful syntax of usage where a change of position can be the difference between meaningful and meaningless and where the loss of a single semicolon can cause a computer to scream out its confusion in a blood-red stream of error statements dribbling chaos and madness into the hearts of programmers everywhere.
... until they find where the semicolon they left out belongs, and they rejoice without having to mop up any more blood. Until the next error crops up.
This may sound like a living hell populated by obsessive-compulsive pedants who howl for the blood of anyone who dares do things even a little differently from established form.
But there's another side to programming, a freedom that arises from the fact that computers, along with being idiots, are also lacking in free will. For while programmers are constrained by the fact that they cannot make a mistake in the instructions they give, they know that, barring an irritatingly inconvenient bolt of lightning or the blue screen of death, those instructions will be obeyed. This means that if we craft those instructions carefully, these brainless silicon servants sitting on our desktops can be compelled to do mind-bogglingly impressive feats. And because they do not think, feel, or decide, it's perfectly all right to force them to do these things.
You know, one of the problems with writing science fiction is that one can't pass up a situation like the above without thinking about the computers rebelling. But I'm not worrying about that, because when the chips are down, the computers can't work.
Sorry about that one. Moving on.
In any case, the description of computers as machines that work according to exact principles and need to be precisely controlled applies only to computers, doesn't it?
No, because there are other unintelligent things that, if treated with care and exactness, can be coerced into giving up their secrets and transforming, according to prescribed principles, into things we have great use for. That's what science and technology are all about: analysis leading to understanding leading to application.
The more careful and precise scientists and engineers are with their terms and with their use of those terms, the more precise they can be in the operations they perform on them, and the more carefully they can craft understanding, and objects out of that understanding.
But teaching this need for care and precision is hard. If you've ever watched high school students in a lab running through an experiment, you've seen people cutting corners and trying to just get the thing over with without making too much of a mess. And if you've seen unmotivated people doing math, you've seen exercises in carelessness and carelessness in exercises.
In short, although math and science require care and precision, not everyone learning them learns those lessons. But the precision necessary for learning math and science can be instilled by teaching computer programming, because the act of programming has immediate feedback -- the program doesn't work -- as opposed to delayed feedback (bad grades). Furthermore, programming can produce useful and fun virtual toys which kids of all ages seem to like. There's a real joy in making your own playthings which few people experience in chemistry labs (I said few people. You know who you are. Now put down the sal ammoniac).
So programming is useful in science and math training. But what about this dichotomy I started with? What about the loss of metaphor and slipperiness of meaning that would come from such an intense focus on computer programming? Wouldn't that stunt the artistic side of language?
Ah, but there's a thing that happens when a program is done. It is given to users to use. And thereby comes the return to sloppy artistic language, for it's not enough to force the computer to do what you want, you also have to create a means for someone else to use it. If you're a major software conglomerate controlling 90% of the world's personal computers you can simply try to force everyone to do what you want the way you want it (bwah-hah-hah, lightning strike, "server's fried again"). But if you're anyone else you need to make a program that is usable and flexible, that "talks" to the users. Therein lies the communication that comes from user interfaces, the intermediated speech between programmer and user, the "here's a tool, here's how to use it, have fun" that is implicit in every program created and handed over.
And there too lies the need for thought and human attention on the part of programmers, if they want users to use their software rather than scream in rages of their own. For while programmers know computers to be idiots, users who are forced to deal with programs that are not designed for users think the computers are evil spawns of the pit sent forth to torment them personally.
There is a benefit in most walks of life to being able to use both the careful and the carefree uses of language, of precision where it is needful and expansiveness where it is revelatory. The best tool I know for the latter is learning to write, and the best for the former is learning to program.
Oh dear, that's where I started.
<obligatory ending computer joke>
I hope we're not caught in an infinite loop.
</obligatory ending computer joke>