Feb 08

Blog repaired

Finally! After months of not being able to update this blog, I have overhauled the software on my site so that I can start posting again. I don’t know what happened last year, but the Movable Type software on my server got screwed up so badly that even completely reinstalling it didn’t fix the problems. The blog still looked normal, but when I logged into the Movable Type console, parts of it were scrambled and nonfunctional.

I knew that I would have to back up my files and then delete Movable Type completely, wiping the slate clean. Then I could install MT from scratch, restore my old blog posts from the backups, and hopefully go on from there. All I needed was the spare time to work on it. Unfortunately, I was involved in plays continually from August through December of last year. And since then, other things have demanded my attention.

But today, I finally was able to spend a whole day backing up and obliterating my old MT installation, installing the latest version, and then restoring my old data. Everything seems to be working now.
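
For anyone who has to attempt the same repair, the procedure itself is simple: archive everything first, then wipe, reinstall, and restore. Here is a rough sketch of the backup-and-wipe step in Python. The paths are invented for illustration and will differ on your server, and you should also export your entries from the Movable Type console before deleting anything.

```python
import shutil
import time
from pathlib import Path

# Hypothetical paths; adjust for your own hosting setup.
MT_DIR = Path("/home/example/cgi-bin/mt")   # the Movable Type installation
DATA_DIR = Path("/home/example/mt-data")    # the blog's database files
BACKUP_DIR = Path("/home/example/backups")

def back_up_and_wipe():
    """Archive the old MT installation and its data, then remove the installation."""
    stamp = time.strftime("%Y%m%d")
    BACKUP_DIR.mkdir(exist_ok=True)
    # Tar up both the code and the data before touching anything.
    for src in (MT_DIR, DATA_DIR):
        archive = BACKUP_DIR / f"{src.name}-{stamp}"
        shutil.make_archive(str(archive), "gztar",
                            root_dir=src.parent, base_dir=src.name)
    # Only wipe the installation once both archives exist.
    shutil.rmtree(MT_DIR)

back_up_and_wipe()
```

After that, it is a matter of unpacking the new MT release in place of the old one and bringing the saved entries back in through its import mechanism.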

I hope to get back into the habit of regular blogging in the days and weeks to come, and to give this site a makeover. It looks dusty and neglected. Time to blow the cobwebs out of here.

Jun 17

Male programmers considered harmful

A couple of weeks ago, the Wall Street Journal published an article about the practice of documenting computer programs by including comments in your code. More specifically, it’s about how male programmers are arrogant jerks who refuse to do this, while female programmers are “considerate of those who will use the code later.” The article — written by a woman, and citing only one source, a female executive at Ingres — is dripping with sexist bigotry and condescension, including a jaw-dropping statement that “there’s a big need to fix testosterone-fueled code at Ingres.”

Judging by that article, you would think that including comments in your code is something that female programmers invented, and that males do only if their female superiors nag and browbeat them.

Of course that’s totally false. When I took programming courses in the 1970s, my instructors were all male, and every one of them insisted that we document our code properly with clear and readable comments. If we failed to do so, we lost points on each assignment. Our programming textbooks all emphasized the importance of documenting your code. Every one of those textbooks was written by men.
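
To make the point concrete, here is the sort of thing those instructors demanded. The example is my own invention, written in Python rather than anything we used back then: a trivial routine, but one whose comments state its purpose, its contract, and its one non-obvious decision.

```python
import string

def mean_word_length(text, skip_punctuation=True):
    """Return the average word length in `text`, or 0.0 if it has no words.

    Words are whatever str.split() produces. If skip_punctuation is
    true, leading and trailing punctuation is not counted toward a
    word's length, so "end." counts as 3 characters rather than 4.
    """
    words = text.split()
    if not words:
        # Return a float on empty input instead of raising
        # ZeroDivisionError; callers can rely on the return type.
        return 0.0
    if skip_punctuation:
        words = [w.strip(string.punctuation) for w in words]
    return sum(len(w) for w in words) / len(words)
```

Nothing about that style is gendered; it was simply what the course required, and what every maintenance programmer since has been grateful for.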

So what we have here is nothing less than historical revisionism. Proper commenting of code was invented and championed by men in the earliest days of programming, but now Rebecca Buckman and Emma McGrattan want to rewrite history so they can claim that programming was a realm of “testosterone-fueled” barbarism until the women showed up and explained to us, in words of one syllable, how to do it properly.

This article claims that even today, men deliberately obfuscate their code “to show how clever they are.” I think that any programmer who actually did this would be asked to stop, and if he continued, he would be disciplined and eventually fired.

I also think that any male vice-president of engineering who expressed scorn and contempt for female programmers, and who was quoted by the Wall Street Journal as saying that “there’s a big need to fix estrogen-fueled code”, would be instantly fired.
Source: Dr. Helen

May 05

Prophecies: video phones

It’s become a cliché to joke about how, even though it’s now the 21st century, we still don’t have jetpacks or flying cars. But it’s funny how some of the other futuristic concepts that we first encountered in science fiction have become realities, sneaking up on us so gradually that we didn’t even notice when they arrived.

Video phones, for example. As a grade-school pupil in the 1960s, I learned that “Picturephones” had already been invented, and in fact they were demonstrated at the 1964 World’s Fair. But they never actually showed up in people’s homes. The explanation I always heard was that nobody wanted them, but that was only part of the truth: in the 1960s, the technology for making video phone calls would have been terribly expensive, and few people would have found the benefits worth the cost. Whatever the reason, video phones remained in the realm of science fiction for the next several decades. (Remember Heywood Floyd’s video phone call to his daughter from a space station in the film 2001: A Space Odyssey?)

But a few years ago, it dawned on me that this technology wasn’t science fiction anymore. I was at a holiday gathering of my wife’s family in Charleston, SC, which was attended by almost everyone descended from her parents (four generations in all, comprising several dozen people). The notable exception was one of my wife’s nieces, who had married a Navy guy and was therefore living in Hawaii. She and her husband and children couldn’t attend the gathering in person, but they were able to participate in real time over the Internet. The computers at both ends had inexpensive video cameras (“webcams”) connected to them, and AOL had recently added videoconferencing capability to its instant-messaging client. Setting up a two-way video connection between South Carolina and Hawaii proved to be quite simple, and the conversation went on for several hours, with family members in both locations taking turns in front of the computer.

I don’t think anyone really thought of this as a “phone call”, probably because the participants weren’t talking into a handset. But it was indistinguishable from the Picturephone future we were promised in the 1960s, except that it cost nothing beyond the AOL subscription fee the family was already paying. And since they were using AOL over a dial-up connection, it really was a phone call. Video phones had arrived with no fanfare at all.

Apr 15

Sunlight and vitamin D

A concerned reader writes to the science section of the New York Times to ask: “Am I still getting vitamin D when I’m outside on a gray, cloudy day?” The answer from the Times explains that your skin needs exposure to ultraviolet-B rays in order to synthesize vitamin D. Unfortunately, this is the same ultraviolet-B that causes sunburn and skin damage. Finding the optimal exposure time is complicated, especially when the amount of UV-B energy is affected by factors such as cloud cover and latitude.

To strike a balance between useful exposure and protection, the N.I.H. recommends an initial exposure of 10 to 15 minutes, followed by application of a sunscreen with an S.P.F. of at least 15. The institutes say this much exposure, at least two times a week, is usually sufficient to provide adequate vitamin D, though some researchers suggest it may not be enough. At the earth’s northern latitudes for much of the year, and at the midlatitudes in winter, the sun does not stay far enough above the horizon (45 degrees) for the angle of the sun’s rays to guarantee an efficient ultraviolet-B bath.
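
That 45-degree threshold is easy to check for yourself. Here is a back-of-the-envelope sketch in Python, using a standard approximation for the sun's declination, that computes how high the noon sun gets at a given latitude on a given day of the year (the sample figures are mine, not the Times's):

```python
import math

def noon_sun_elevation(latitude_deg, day_of_year):
    """Approximate elevation of the sun above the horizon, in degrees,
    at solar noon. Uses the common cosine approximation for solar
    declination, accurate to within a degree or so."""
    declination = -23.44 * math.cos(math.radians((360.0 / 365.0) * (day_of_year + 10)))
    return 90.0 - abs(latitude_deg - declination)

# At 40 degrees north (roughly the latitude of New York City), the
# noon sun clears 45 degrees only from roughly mid-March to early October:
for label, day in [("Jan 1", 1), ("Mar 1", 60), ("May 1", 121),
                   ("Jul 1", 182), ("Nov 1", 305)]:
    print(label, round(noon_sun_elevation(40.0, day), 1))  # 27.0, 41.6, 64.8, 73.1, 34.7
```

Move up another ten degrees of latitude and that window shrinks by roughly a month at each end, which is all the Times's caveat about latitude and season amounts to.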

So even if you follow the NIH recommendations to the letter, the resulting UV-B exposure still may be too little or too much? Sorry, but I’m not going to waste my time on a process as inconvenient and unreliable as this.

Fortunately, I don’t have to. Vitamin D is available in pill form in any grocery store. Yes, your skin can synthesize it, but it doesn’t have to. The pills are inexpensive and convenient; why not use them? You get exactly the right dosage every time (regardless of cloud cover or latitude) and there’s no risk of sunburn or skin damage.

There are plenty of good reasons to go outside and let the sun shine on you, but nobody should feel obligated to do so in order to get enough vitamin D. It simply isn’t necessary.

Apr 10

Unnatural selection

Perhaps I’m missing something, but the following strikes me as a profoundly stupid question:

Why do women long outlive their fertility?

Human ovaries tend to shut down by age 50 or even younger, yet women commonly live on healthily for decades. This flies in the face of evolutionary theory that losing fertility should be the end of the line, because once breeding stops, evolution can no longer select for genes that promote survival.

Women don’t outlive their fertility in their natural environment. The life expectancy of primitive humans averages between 20 and 35 years. And the women fare worse than the men, because a quarter of them die in childbirth.

We non-primitive humans live a great deal longer because of modern medical care, the entire purpose of which is to interfere with natural selection. Nature has plenty of mechanisms for eliminating women from the world long before they reach menopause, but we do everything in our power to prevent those mechanisms from operating.

Saying that this “flies in the face of evolutionary theory” is an indication of staggering cluelessness. Evolutionary theory describes how evolution works in a natural setting. Of course it fails when you try to apply it to a technological society with advanced medical care. Next you’ll be telling me that space travel flies in the face of gravitational theory because space probes go up instead of down.

Women live long enough for their reproductive systems to shut down for the same reason that both men and women live long enough for our teeth to start crumbling and have to be repaired or replaced. In our original environment (the savannahs of Africa), human bodies only had to last for 20 to 35 years. Beyond that point, it didn’t matter what systems might fail; we were never going to live that long anyway.

But now we’ve changed the rules. We routinely keep our bodies running for three or four times as long as they were originally designed to operate. Of course some parts stop working! It is unnatural for humans to live as long as we do. We are interfering with human evolution on a massive scale.

What really baffles me is that the people asking this stupid question are evolutionary biologists, and the article quoting them is in Scientific American. Why do expert scientists and science journalists have so much trouble seeing such an obvious explanation?

Apr 10

Asymmetric warfare

This is a public service announcement directed at the world’s five-year-old children. I am aware that if you decide to attack me, I can be overwhelmed by a sufficient number of you. However, I will not go down without a fight. And according to this test, I am capable of taking almost two dozen of you with me:

23

Bear this in mind as you make your plans.

Mar 22

Dial S for Skynet

When I wrote a few days ago about how Arthur C. Clarke predicted the World Wide Web, I was not aware that he had actually inspired its creation. But since then, I have learned from multiple sources that Tim Berners-Lee cites Clarke’s 1964 short story “Dial F for Frankenstein” as a major inspiration for his invention of the Web.

I had read that story before (in fact, I just reread it; it’s only five pages long), but it had never occurred to me that it might have anything to do with the Web. It describes how the activation of new satellites unites the world’s telephone networks into a single global system that is as complex as a human brain. This global network becomes conscious, with dire consequences for humanity.

“Dial F for Frankenstein” does strike me as a prediction (or, possibly, an inspiration) of something that came decades later. But, with all due respect to Sir Tim, I don’t think it’s the Web. Anyone who has seen Terminator 2: Judgment Day will know exactly what I mean.

Mar 19

The Big Three

After learning of Arthur C. Clarke’s death, Bruce Webster wrote: “He was the last of the Big Three — Isaac Asimov, Clarke, and Robert Heinlein — to pass away, and we shall not see their like again.” He’s right, but not in the sense that today’s science fiction writers are inferior. No, the differences are qualitative. Asimov, Clarke, and Heinlein all started their writing careers during the Golden Age of Science Fiction, the period during the late 1930s and early 1940s when legendary editor John W. Campbell was remaking the field into something more than a category of swashbuckling adventure stories. Campbell insisted on the use of real science, logical plots, and rational aliens in his stories: “Write me a creature that thinks as well as a man, or better than a man, but not like a man.”

Campbell’s role as the leading editor of science fiction waned after the 1940s, and he died in 1971 — but he continued to exert a profound influence over the field through the authors whose careers and writing styles he had shaped, especially the Big Three. Only now, with the death of Arthur C. Clarke, does the Golden Age really come to an end.

Bruce Webster is right in another sense; the Big Three will not be replaced. Writing in 1990, Isaac Asimov laid that notion to rest:

Now that Heinlein has died and Clarke and I are increasingly decrepit, one is bound to ask, “Who will be the next Big Three?” The answer, I’m afraid, is that no one will ever be. In the early days, when the Big Three were chosen by general consent, the number of science fiction writers was small and it was easy to choose the outstanding examples. Nowadays, however, the number of science fiction writers, and even of good science fiction writers, is so great that it is simply impossible to pick three writers that everyone will agree on.

And because the field is so much larger than it was in the 1940s, no small group of leading authors can dominate it the way Asimov, Heinlein, and Clarke did during the Golden Age. It’s the end of an era.

Mar 19

Prophecies: the Web

Arthur C. Clarke has died in Sri Lanka at the age of 90. To commemorate Sir Arthur’s passing, I’d like to highlight yet another of his predictions that has come true: the World Wide Web.

In his 1976 novel Imperial Earth, Clarke describes a device called the home communications console, or Comsole for short. Its elements are quite familiar: “the blank gray screen, the alphanumeric keyboard, the camera lens and speaker grille.” But what does it do? In a diary entry, main character Duncan Makenzie describes how his daily routine begins with the Comsole: “I dial the Comsole for any messages that have arrived during the night — usually there are half a dozen. . . . Then I set the news abstractor to print out anything that’s happened in my area of interest, and scan the result.”

But that’s just a small part of the Comsole’s capabilities. A later passage provides an overview:

Duncan walked to the Comsole, and the screen became alive as his finger brushed the ON pad. Now it was a miracle beyond the dreams of any poet, a charmed magic casement, opening on all seas, all lands. Through this window could flow everything that Man had ever learned about his universe, and every work of art he had saved from the dominion of Time. All the libraries and museums that had ever existed could be funneled through this screen and the millions like it scattered over the face of Earth. Even the least sensitive of men could be overwhelmed by the thought that one could operate a Comsole for a thousand lifetimes — and barely sample the knowledge stored within the memory banks that lay triplicated in their widely separated caverns, more securely guarded than any gold.

On this particular occasion, Duncan is merely trying to locate an old friend who is somewhere on Earth, so he uses the online directory to look up her contact information and then uses the Comsole to call her and have a sound-and-video conversation.

We take all of this for granted today, but when I first read Imperial Earth 32 years ago, the first home computers — the kind you had to build from a kit — had just come on the market. The idea that personal computers would connect to a global information network and provide instant audiovisual access to everything and everyone was stunning, and utterly beyond anything I had previously read. Clarke even predicted when it would happen: “The home communications console — or Comsole — had reached its technological plateau in the early twenty-first century.” We haven’t reached that plateau yet, but the Web is only a little more than a decade old, and almost all of what Clarke described is already available.

Imperial Earth begins with an incident from Duncan Makenzie’s youth, as a ten-year-old living in an underground city on Titan, the largest moon of Saturn. Attempting to use his Comsole to call his grandmother, he enters the wrong number and ends up connecting to something else:

There was no ringing tone, and no picture. . . . Duncan guessed that he had been switched into an audio-only channel, or had reached a station where the camera was disconnected. In any case, this certainly wasn’t Grandma’s number, and he reached to break the connection.
Then he noticed the sound. At first, he thought that someone was breathing quietly into the microphone at the far end, but he quickly realized his mistake. There was a random, inhuman quality about this gentle susurration; it lacked any regular rhythm, and there were long intervals of complete silence. . . . He was listening to the voice of the wind, as it sighed and whispered across the lifeless landscape a hundred meters above his head. . . .
Somewhere — perhaps in an abandoned construction project or experimental station — a live microphone had been left in circuit, exposed to the freezing, poisonous winds of the world above. It was not likely to remain undetected for long; sooner or later it would be discovered and disconnected.

When I read those words in 1976, the idea of using a computer terminal connected to a global information network to listen to the sounds of wind in the atmosphere of Titan was pure science fiction. But less than thirty years later, I did exactly that. Sir Arthur was right on the money.