Rants on software, computing, and any other topic I feel like.

Monday, March 28, 2005

Humans are not Computers

I read an article today about a program that is supposed to turn English into code. Many folks thought this was a great thing because it would push people to be more precise in their use of English when describing what a program does. This is wrong, and a complete misunderstanding of language, both computer and human.

Here's a news flash, just because PHP and Swahili are both called "languages" doesn't mean that they really have much to do with each other. The reason that they are both called languages is because they both express something to something else. They're completely different beasts. Computer languages are for talking to computers. Human languages are for talking to humans.

Computer programmers are fairly antisocial, and so don't have a lot of experience actually talking to real people. They spend a lot of time expressing themselves to the little boxes on their desks. These boxes are not human beings. Humans don't work like computers. They don't do what you tell them to do most of the time. They won't always like you. They might just up and leave if you don't treat them right. And the biggest thing of all, they have brains. No really, your computer doesn't have a brain and that annoying marketing guy does.

This means that no matter how many times you snicker or roll your eyes behind their back about their lack of knowledge of Excel macro programming, humans are smarter than computers. They might actually be smarter than you in some area other than computers. For example, they might know more about something important, like how to make money, or why you should take a shower, or how to talk to girls.

Back to the main point, which is that there is and should be a difference between human and computer languages. Forcing people to be more precise in their use of human languages only forces them to think like a computer. Why should they think like a computer? They're not computers, they're people, and people are smarter than computers.

Your job as a computer programmer is as a translator between human language and computer language. Anyone familiar with translation of human languages knows that it takes some brains to do it. You have to be familiar with the idea of context, idioms, etc. of both languages in order to translate well. To be a very good translator, you also have to be familiar with the cultures that you're dealing with. Something a computer will likely never do.

The same goes for computer programming. Computer programming isn't simply about translating the human spec to code word for word. You must interpret the human language by using your brain, and convert it to computer language. A spec written by a human is meant to be read and understood by a human. The spec writer isn't there to do your job. That *is* your job, translation. Programmers who don't understand this concept should be beaten and have the words "Spec Freak" tattooed on their forehead. If you are a "Spec Freak" and no one knows about it yet, then take this opportunity to stop. People might actually start to like you. They might not run away as you approach the water cooler. And if you combine it with regular showering, then you might even be able to start talking to girls.

I'm going to tell you a story about UML. If you're not familiar with it, it is a way of describing the design of code using pretty pictures so that PHB's can understand it. It actually turns out that when used properly, it can be an effective tool to communicate to other humans about the design of your code. It's a human language.

The problem came about when some lazy "Spec Freak" was annoyed that after he spent all this time creating a pretty UML picture, he then had to translate the picture into code. He thought, "I should write a program to do this for me." He thus sealed his status as a "Spec Freak". He then found that there were ambiguities in his UML that couldn't be translated into code. He found that there were things in his code that he couldn't translate into UML. So he "fixed" UML to be more precise, and more useless. UML is a human language. Human language has a lot of ambiguity, because humans are smarter than computers, and can deal with it. Adding precision and getting rid of ambiguity in human language turns it into computer language, and thus gets rid of all its usefulness as a way to communicate with other humans.

Humans aren't computers and computers aren't human. Humans communicate well with humans. Computers communicate well with computers. Programmers are needed to translate between the two. No computer will ever be able to do it. At least not until strong AI gets invented.
Hey Mike, this is Stuart. Hope you don't mind me popping in with my 2 cents.

Reading through this post brings up several good points. I do have a question for you though.

During your stay at the U, did you have a chance to take NLP (Natural Language Processing)?

Fairly amazing class that just happens to discuss several of the concepts you bring up here.

Back when computers were first being developed, scientists and engineers figured that computers would be able to do the things that humans do, only better: reading through text, speech recognition, machine translation, etc. They never considered a computer processing 50 billion human gene sequences. After all, why would they think about those sorts of things? Nothing like computers had ever existed before.

Here we are today, and computers are fantastic when it comes to doing the easy things humans do (simple arithmetic). A computer can sit all day crunching numbers and be as happy as a computer can be. But if you give a computer a newspaper article and ask it "what was this article about?" or something even more specific (and requiring more inference), most computers will stare at you in that funny way computers stare at people.

Anyway, the point is, computers are getting better at doing these types of tasks. We are a long, long way off from being able to give a computer a design and having it spit out source code. Look at programming languages like Ruby. It still requires you, the human, to enter in what you are trying to do. But the code is starting to look more and more like English text.
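To show what I mean by Ruby reading like English, here's a tiny made-up sketch (the Invoice struct and the data are hypothetical, just for illustration, not anything from the post):

```ruby
require "date"

# Hypothetical Invoice record, purely for illustration.
Invoice = Struct.new(:id, :due_date)

invoices = [
  Invoice.new(1, Date.new(2005, 1, 15)),
  Invoice.new(2, Date.new(2099, 1, 1)),
]

# Reads almost like prose: "select the invoices whose due date is before today".
overdue = invoices.select { |invoice| invoice.due_date < Date.today }

# "3 times, puts hello" -- again, nearly English.
3.times { puts "hello" }
```

It reads like English, but notice it's still precise computer language underneath; the English-ness is a surface resemblance, which is exactly Mike's point about the two kinds of language staying distinct.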

As for programmers being antisocial... not much can be done here until they make a true weight-loss pizza and beer. :)
I can give an example of this. A few months ago one of the tech blogs I read had a post about AppleScript. He starts out talking about a specific problem he had run into, but farther down (in a section titled "Interpolation on the Failed Experiment That Is AppleScript’s English-Like Syntax") he talks about AppleScript in general and why making computer languages more like English is a Bad Thing.