Ben recently made me aware of this article about earplugs that protect the hearing of soldiers:
British troops getting ready to deploy to Afghanistan are being issued with electronic sound-cancelling earplugs designed to let them hear what they need to — orders, conversations, enemy footfalls — but prevent hearing damage caused by explosions, gunfire and so on.
Science fiction author Larry Niven predicted this technology in 1975. From his story “The Borderland of Sol”:
“Earplugs,” said Ausfaller, holding up a handful of soft plastic cylinders.
We inserted them. Ausfaller said, “Can you hear me?”
“Sure.” “Yah.” They didn’t block our hearing at all.
“Transmitter and hearing aid with sonic padding between. If you are blasted with sound, as by an explosion or a sonic stunner, the hearing aid will stop transmitting. If you go suddenly deaf, you will know you are under attack.”
The real-world earplugs are slightly better than Niven’s; instead of cutting off all sound, they can simply reduce it to a safe level. (So you can hear the explosion without suffering hearing damage from it.) But apart from that detail, Niven’s description was spot on.
Finally! After months of not being able to update this blog, I have overhauled the software on my site so that I can start posting again. I don’t know what happened last year, but the Movable Type software on my server got screwed up so badly that even completely reinstalling it didn’t fix the problems. The blog still looked normal, but when I logged into the Movable Type console, parts of it were scrambled and nonfunctional.
I knew that I would have to back up my files and then delete Movable Type completely, wiping the slate clean. Then I could install MT from scratch, restore my old blog posts from the backups, and hopefully go on from there. All I needed was the spare time to work on it. Unfortunately, I was involved in plays continually from August through December of last year. And since then, other things have demanded my attention.
But today, I finally was able to spend a whole day backing up and obliterating my old MT installation, installing the latest version, and then restoring my old data. Everything seems to be working now.
I hope to get back into the habit of regular blogging in the days and weeks to come, and to give this site a makeover. It looks dusty and neglected. Time to blow the cobwebs out of here.
A couple of weeks ago, the Wall Street Journal published an article about the practice of documenting computer programs by including comments in your code. More specifically, it’s about how male programmers are arrogant jerks who refuse to do this, while female programmers are “considerate of those who will use the code later.” The article — written by a woman, and citing only one source, a female executive at Ingres — is dripping with sexist bigotry and condescension, including a jaw-dropping statement that “there’s a big need to fix testosterone-fueled code at Ingres.”
Judging by that article, you would think that including comments in your code is something that female programmers invented, and that males do only if their female superiors nag and browbeat them.
Of course that’s totally false. When I took programming courses in the 1970s, my instructors were all male, and every one of them insisted that we document our code properly with clear and readable comments. If we failed to do so, we lost points on each assignment. Our programming textbooks all emphasized the importance of documenting your code. Every one of those textbooks was written by men.
So what we have here is nothing less than historical revisionism. Proper commenting of code was invented and championed by men in the earliest days of programming, but now Rebecca Buckman and Emma McGrattan want to rewrite history so they can claim that programming was a realm of “testosterone-fueled” barbarism until the women showed up and explained to us, in words of one syllable, how to do it properly.
This article claims that even today, men deliberately obfuscate their code “to show how clever they are.” I think that any programmer who actually did this would be asked to stop, and if he continued, he would be disciplined and eventually fired.
I also think that any male vice-president of engineering who expressed scorn and contempt for female programmers, and who was quoted by the Wall Street Journal as saying that “there’s a big need to fix estrogen-fueled code”, would be instantly fired.
Source: Dr. Helen
Sending text messages costs too much. Compared to what, you ask? Well, a scientist at the University of Leicester points out that texting costs four times as much per megabyte as downloading data from the Hubble Space Telescope.
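The per-megabyte figure is easy to reproduce with back-of-the-envelope arithmetic. A minimal sketch, assuming the standard 140-byte SMS payload (160 seven-bit characters) and a price of 5 pence per message, which is an illustrative assumption rather than a figure from the article:

```python
# How much does a megabyte of text messages cost?
SMS_PAYLOAD_BYTES = 140      # maximum payload of one SMS (160 seven-bit chars)
PRICE_PER_SMS_PENCE = 5      # assumed price per message, for illustration

BYTES_PER_MEGABYTE = 1024 * 1024

# Number of full messages needed to carry one megabyte of text.
messages_per_mb = BYTES_PER_MEGABYTE / SMS_PAYLOAD_BYTES

# Total cost of that megabyte, converted from pence to pounds.
cost_per_mb_pounds = messages_per_mb * PRICE_PER_SMS_PENCE / 100

print(f"{messages_per_mb:.0f} messages per megabyte")
print(f"about £{cost_per_mb_pounds:.2f} per megabyte")
```

At 5p per message this works out to roughly £374 per megabyte, which makes it easy to believe that text messages cost several times more per megabyte than transmitting data from an orbiting telescope.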
It’s become a cliche to joke about how, even though it’s now the 21st century, we still don’t have jetpacks or flying cars. But it’s funny how some of the other futuristic concepts that we first encountered in science fiction have become realities, and they sneaked up on us so gradually that we didn’t even notice when they arrived.
Video phones, for example. As a grade-school pupil in the 1960s, I learned that “Picturephones” had already been invented, and in fact they were demonstrated at the 1964 World’s Fair. But they never actually showed up in people’s homes. The explanation I always heard was that nobody wanted them, but that was only part of the truth: in the 1960s, the technology for making video phone calls would have been terribly expensive, and few people would have found the benefits worth the cost. Whatever the reason, video phones remained in the realm of science fiction for the next several decades. (Remember Heywood Floyd’s video phone call to his daughter from a space station in the film 2001: A Space Odyssey?)
But a few years ago, it dawned on me that this technology wasn’t science fiction anymore. I was at a holiday gathering of my wife’s family in Charleston, SC, which was attended by almost everyone descended from her parents (four generations in all, comprising several dozen people). The notable exception was one of my wife’s nieces, who had married a Navy guy and was, as a result, in Hawaii. She and her husband and children were not able to attend the gathering in person, but they were able to participate in real time over the Internet. The computers at both ends had inexpensive video cameras (“webcams”) connected to them, and AOL had recently added videoconferencing capability to its instant-messaging client. Setting up a two-way video connection between South Carolina and Hawaii proved to be quite simple, and the conversation went on for several hours, with family members in both locations taking turns in front of the computer.
I don’t think anyone really thought of this as a “phone call”, probably because the participants weren’t talking into a handset. But in fact, this was indistinguishable from the Picturephone future that we were promised in the 1960s, except that it didn’t cost anything in addition to the AOL subscription fee the family was already paying. And, in fact, they were using AOL over a dial-up connection, so it really was a phone call. Video phones had arrived with no fanfare at all.
A concerned reader writes to the science section of the New York Times to ask: “Am I still getting vitamin D when I’m outside on a gray, cloudy day?” The answer from the Times explains that your skin needs exposure to ultraviolet-B rays in order to synthesize vitamin D. Unfortunately, this is the same ultraviolet-B that causes sunburn and skin damage. Finding the optimal exposure time is complicated, especially when the amount of UV-B energy is affected by factors such as cloud cover and latitude.
To strike a balance between useful exposure and protection, the N.I.H. recommends an initial exposure of 10 to 15 minutes, followed by application of a sunscreen with an S.P.F. of at least 15. The institutes say this much exposure, at least two times a week, is usually sufficient to provide adequate vitamin D, though some researchers suggest it may not be enough. At the earth’s northern latitudes for much of the year, and at the midlatitudes in winter, the sun does not stay far enough above the horizon (45 degrees) for the angle of the sun’s rays to guarantee an efficient ultraviolet-B bath.
So even if you follow the NIH recommendations to the letter, the resulting UV-B exposure still may be too little or too much? Sorry, but I’m not going to waste my time on a process as inconvenient and unreliable as this.
Fortunately, I don’t have to. Vitamin D is available in pill form in any grocery store. Yes, your skin can synthesize it, but it doesn’t have to. The pills are inexpensive and convenient; why not use them? You get exactly the right dosage every time (regardless of cloud cover or latitude) and there’s no risk of sunburn or skin damage.
There are plenty of good reasons to go outside and let the sun shine on you, but nobody should feel obligated to do so in order to get enough vitamin D. It simply isn’t necessary.
Perhaps I’m missing something, but the following strikes me as a profoundly stupid question:
Why do women long outlive their fertility?
Human ovaries tend to shut down by age 50 or even younger, yet women commonly live on healthily for decades. This flies in the face of evolutionary theory that losing fertility should be the end of the line, because once breeding stops, evolution can no longer select for genes that promote survival.
Women don’t outlive their fertility in their natural environment. The life expectancy of primitive humans averages between 20 and 35 years. And the women fare worse than the men, because a quarter of them die in childbirth.
We non-primitive humans live a great deal longer because of modern medical care, the entire purpose of which is to interfere with natural selection. Nature has plenty of mechanisms for eliminating women from the world long before they reach menopause, but we do everything in our power to prevent those mechanisms from operating.
Saying that this “flies in the face of evolutionary theory” is an indication of staggering cluelessness. Evolutionary theory describes how evolution works in a natural setting. Of course it fails when you try to apply it to a technological society with advanced medical care. Next you’ll be telling me that space travel flies in the face of gravitational theory because space probes go up instead of down.
Women live long enough for their reproductive systems to shut down for the same reason that both men and women live long enough for our teeth to start crumbling and have to be repaired or replaced. In our original environment (the savannahs of Africa), human bodies only had to last for 20 to 35 years. Beyond that point, it didn’t matter what systems might fail; we were never going to live that long anyway.
But now we’ve changed the rules. We routinely keep our bodies running for three or four times as long as they were originally designed to operate. Of course some parts stop working! It is unnatural for humans to live as long as we do. We are interfering with human evolution on a massive scale.
What really baffles me is that the people asking this stupid question are evolutionary biologists, and the article quoting them is in Scientific American. Why do expert scientists and science journalists have so much trouble seeing such an obvious explanation?
This is a public service announcement directed at the world’s five-year-old children. I am aware that if you decide to attack me, I can be overwhelmed by a sufficient number of you. However, I will not go down without a fight. And according to this test, I am capable of taking almost two dozen of you with me: 23, to be exact.
Bear this in mind as you make your plans.
When I wrote a few days ago about how Arthur C. Clarke predicted the World Wide Web, I was not aware that he had actually inspired its creation. But since then, I have learned from multiple sources that Tim Berners-Lee cites Clarke’s 1964 short story “Dial F for Frankenstein” as a major inspiration for his invention of the Web.
I have read that story before (in fact, I just reread it; it’s only five pages long), and it has never occurred to me that it might have anything to do with the Web. It describes how the activation of new satellites unites the world’s telephone networks into a single global system that is as complex as a human brain. This global network becomes conscious, with dire consequences for humanity.
“Dial F for Frankenstein” does strike me as a prediction (or, possibly, an inspiration) of something that came decades later. But, with all due respect to Sir Tim, I don’t think it’s the Web. Anyone who has seen Terminator 2: Judgment Day will know exactly what I mean.
After learning of Arthur C. Clarke’s death, Bruce Webster wrote: “He was the last of the Big Three — Isaac Asimov, Clarke, and Robert Heinlein — to pass away, and we shall not see their like again.” He’s right, but not in the sense that today’s science fiction writers are inferior. No, the differences are qualitative. Asimov, Clarke, and Heinlein all started their writing careers during the Golden Age of Science Fiction, the period during the late 1930s and early 1940s when legendary editor John W. Campbell was remaking the field into something more than a category of swashbuckling adventure stories. Campbell insisted on the use of real science, logical plots, and rational aliens in the stories he published: “Write me a creature that thinks as well as a man, or better than a man, but not like a man.”
Campbell’s role as the leading editor of science fiction waned after the 1940s, and he died in 1971 — but he continued to exert a profound influence over the field through the authors whose careers and writing styles he had shaped, especially the Big Three. Only now, with the death of Arthur C. Clarke, does the Golden Age really come to an end.
Bruce Webster is right in another sense; the Big Three will not be replaced. Writing in 1990, Isaac Asimov laid that notion to rest:
Now that Heinlein has died and Clarke and I are increasingly decrepit, one is bound to ask, “Who will be the next Big Three?” The answer, I’m afraid, is that no one will ever be. In the early days, when the Big Three were chosen by general consent, the number of science fiction writers was small and it was easy to choose the outstanding examples. Nowadays, however, the number of science fiction writers, and even of good science fiction writers, is so great that it is simply impossible to pick three writers that everyone will agree on.
And because the field is so much larger than it was in the 1940s, no small group of leading authors can dominate it the way Asimov, Heinlein, and Clarke did during the Golden Age. It’s the end of an era.