Doonesbury and abortion

Since a number of newspapers are refusing to run the Doonesbury comic strips which address abortion this week, I am following in the footsteps of Jerry Coyne and posting the strips.

(Apparently, posting the strips like this deprives the artist of syndication money. Or so it goes. Keeping in step with Jerry, I am also posting the link to the main website for the comic so that the artist will get “click credit”.)

(Click to enlarge.)

Thought of the day

If Christianity was such a valuable influence on science, why was science dead for a millennium and a half after Christianity took hold?

~Jerry Coyne

Well, that was interesting

I recently praised PZ for finally looking to shed his extra pounds. Aside from his lack of health being disrespectful to his own body, he runs the risk of leaving his loved ones behind too early – and for no good reason. I stand by that praise, and even though I am fully aware that it comes across as if I am just trying to insult a fat guy, I do genuinely mean it.

Unfortunately, PZ doesn’t see it that way. In the comment section of the post that inspired what I wrote, things took an ugly turn. I presented my argument that it is wrong to not try to be healthy. The first reaction – and it is always the first reaction – is to say I think it is wrong to be fat. I don’t. The issue is with putting forth an honest effort to be healthy. The results are not important, morally speaking. And just as I did in my post about obesity, I allowed for a huge swath of caveats. Some people have conditions which prevent them from putting forth the same effort as others. Other people work long hours and have to take care of children at the end of the day. Still other people have limited access to healthy food. It would be unreasonable to expect everyone to be able to put forth the same effort. That doesn’t mean, though, that it is magically impossible for people to attempt all they can – there is almost always a better choice available on the grocery store shelves – but I fully acknowledge that it can be difficult. I always have.

This, unfortunately, led to an extended discussion on poor people and food stamps. Apparently I hate them all because I don’t want to subsidize lobsters. The truth is, welfare funds are a limited resource. If we allow people to spend money on expensive items, they will have less for what they actually need. I saw this firsthand while working at a grocery store in high school. People would regularly use the last of their food stamps on $30-70 worth of lobster. I can think of far better ways to use those funds.

The “counter” (if you can call it that) to this argument is that people get X amount of dollars and so it doesn’t matter how they spend them. (Oh, and I don’t think poor people deserve nice things, apparently.) This is a patently stupid argument. If I am given $200 a month for food and I need $300, $100 is coming from my pocket. However, if I buy luxury items, that eats into what I have been given. That means I will get one nice meal, but that might add another $30 to what comes out of my wallet. This does not help the poor; it allows some people to abuse their funds, forcing them to stay on welfare longer. That is, nobody is going to get back on their feet by spending money on things they don’t need at the expense of things they do need.

And, of course, this all means I must be a Ron Paul-loving, Reagan-blowing libertarian Republican. Right. No, my position is a utilitarian one. Welfare funds are not unlimited. It doesn’t make sense to allow them to be used to buy anything under the Sun. In fact, PZ and co. are in the minority in what they think. Most states restrict the use of food stamps on some items (such as expensive energy drinks), and certainly no state allows the funds to be used for restaurants. And the states are right to do so. But what’s really interesting is that PZ and everyone else, once we get past the government intervention, kick into uber-libertarian mode. “Who are YOU to say what people should buy?!” Right. I’m the libertarian here.

The end result has been a ban. I suppose I’m okay with this. After all, throughout my Maloney troubles, PZ never responded to a single email or request for help. He, of course, is not required to do so, but let’s not be coy: I have disagreed with his rampant sexism in recent months, so he has no interest in helping me fight junk science. Emotion overrides logic here; if I had never commented on his site, he would have been the first to help me. Besides that, the majority of his posts have nothing to do with atheism anymore. Sure, he has those spammy “Why I am an atheist” posts that have given my scrolling finger a good workout, but he mostly writes about feminism and stupid Internet polls. Overall I have still enjoyed him, but that is happening less and less each day. Without the sort of writing he was doing a few years ago, I don’t care for much of what he has to say anymore. I’ll stick with Jerry Coyne, Friendly Atheist, and Starts With A Bang! for my big-name bloggers.

At any rate, I hope PZ does manage to lose the excess fat he has. He’s older and has had health issues, so I don’t expect fantastic results, but I’m happy he is at least trying. After all, that’s all that matters.

Context matters in language

I know the title to this post is wildly obvious, but for some bizarre reason it bears repeating. People do not seem to understand that the power any given word may have is premised on the context in which it is presented. A white Southerner in 1845 who calls someone a nigger is doing so for some awfully racist reasons. Herman Cain saying “niggerhead” had nothing racist about it (nor would it if a white person dared to say it on TV). The same idea goes for any given word, including “retard”, “faggot”, “wetback”, or even words which are often considered politically correct. For instance, “Jerry Coyne is a Jew” has no bigoted meaning behind it, at least in the majority of contexts in which it may be said. However, “I think the used car salesman really Jewed me on my purchase” is entirely different because it appeals to stereotypes about Jews screwing people over monetarily.

I wish more people could understand this. EDIT: Not that I’m advocating for the use of any of these words. While context does matter, sometimes it is too difficult to divorce a word from its historical context without being very specific.

(And context certainly matters behind this one word.)

3.4 billion year old fossils recently discovered

Martin Brasier of Oxford and his team have just described some very ancient fossils discovered in rocks dated to 3.4 billion years of age. The article is unfortunately behind a paywall, so I am unable to review it, but Jerry Coyne lays out the evidence neatly:

As the authors note, “determining the biogenicity [biological origin] of putative Archaean microfossils is notoriously difficult.” How do we know that these things are real remnants of bacteria and not just inclusions or artifacts? There are several independent lines of evidence, none conclusive but together building a very solid case:

  • They look like cells, being cell-shaped, cell-sized, and forming chains of spheroids that look like chains of both well-established fossil bacteria and modern bacteria. Some can even be seen “dividing” or expelling their contents after cell damage (see figure above).
  • The variation in size of the bodies is small—smaller than you’d expect if they were abiological inclusions. A uniformity of size, however, is expected if they’re all members of one living species.
  • The cell “walls” of the microfossils, too, are of uniform thickness, unlike that of artifacts like silica grains coated with carbon.
  • The geochemistry of the bacteria and surrounding rock supports the idea that these are true organisms. This involves not only the isotopic nature of the carbon, but the presence of nitrogen, a crucial biomarker, within the cell walls.

This reminds me very much of Margulis’ endosymbiotic theory and its multiple lines of evidence. Of course, that idea was a cornerstone achievement of 20th century biology whereas this recent discovery pushes the known origins of life back about 200 million years, but that does not mean it isn’t important. Brasier’s find goes to demonstrate the nature of early bacteria. Many of the features in these fossils show that ancient bacteria operated in ways similar to certain bacteria today – if you find something that works, keep it up. The find also demonstrates that life can arise quite quickly. Earth is only 4.54 billion years old and probably was not cool enough to be hospitable to life for its first half billion years. Given this 3.4 billion year date plus the fact that these fossils are fairly complex, it is not a stretch to say life dates back even further, probably to quite soon after it was possible on Earth.

Finally, I have to echo Coyne’s curiosity that this was not published in Nature or Science, but rather Nature Geoscience. It’s still a prestigious enough journal, but the paper is important enough that it deserves a higher-profile publication. I’m sure the authors attempted to reach the best venue possible, so who knows what the deal is.

The decline of religion

There is a post making the blogging rounds about the decline of religion and rise of non-belief amongst the younger generations. It has some interesting facts:

  • The number of secular student groups is growing rapidly.
  • The more that people stand up and are vocal about their unbelief, the more it encourages others to do the same. As [Adam] Lee notes, “psychological experiments [find] that it’s much easier to resist peer pressure if you have even one other person standing with you. Student activists like the ones I’ve mentioned are no longer just scattered voices in the crowd; they’re the leading edge of a wave.”
  • Atheism increases with each new generation in America.

There are links embedded within that writing. Go to the original link to see them.

The fact is, more and more people are declaring their lack of religion or even outright atheism as the years march on and younger generations come of age. This has been a distinct trend since the end of WWII: each generation of young people has had more nonbelievers than the previous generation of young people. Currently we have 25-30% of people in their 20s declaring they have no religion, a number four times higher than in any other period.

The originator of this blogging meme, Adam Lee, has a good idea why we’re seeing this decline in religious affiliation:

I’d love to say that we atheists did it all ourselves; I’d love to be able to say that our dazzling wit and slashing rhetorical attacks are persuading people to abandon organized religion in droves. But the truth is that the churches’ wounds are largely self-inflicted. By obstinately clinging to prejudices that the rest of society is moving beyond, they’re in the process of making themselves irrelevant. In fact, there are indications that it’s a vicious circle: as churches become less tolerant and more conservative, their younger and more progressive members depart, which makes their average membership still more conservative, which accelerates the progressive exodus still further, and so on.

I am more willing to give some of the credit to the Gnu Atheists. It isn’t that we’ve turned so many people to atheism – these numbers primarily reflect a lack of religious affiliation, not atheism – but modern atheists have helped to create an environment where it is okay to criticize religion more openly. Part of that has been due to the writings of people like Richard Dawkins and PZ Myers, but an even bigger part has to do with the rise of the Internet. Atheists don’t tend to get together very easily. We have no unifying philosophy or normative claims, so it makes things difficult. But with the Internet, it’s a matter of a simple click to a website. This has given us more of a voice, and it has made people realize there are more of us than they thought. That not only gets people thinking – I remember as a kid at a Catholic school being astonished to hear atheists even existed – but it brings more people out of the atheist closet. After all, nothing attracts a crowd like a crowd of people.

Is Maine the dumbest state in the Union?

It would seem so based upon this map (via Jerry Coyne).

The dubious honor is based upon 2010 SAT scores by state (including Washington D.C.). Maine ranks dead last with a combined mean score of 1389. In contrast, the top performing state, Iowa, has a combined mean score of 1798. In fact, the traditionally dumbest state, Mississippi, comes in at number 18 with a score of 1666. It would seem Maine has really gone down the tubes over the past few years.

Or not.

Maine, as far as I know, is the only state which requires students to take the SATs. Some other states require the ACT instead, and many more trend toward it as an alternative to the SAT. As a result, Iowa’s participation rate is a paltry 3% (the same as Mississippi’s). In fact, 19 of the top 20 states are at 10% participation or under (Colorado, ranked number 13, is at 18%). Maine, by contrast, has a 92% participation rate. (For the remaining 8%, I suspect the ACT is allowed as an alternative, some students just don’t bother, exceptions are made for certain circumstances, etc.; in 2007, the participation rate in the state was 100%.) The result is that over 15,000 Maine students took the test whether they cared or not; only 1,100 students took it in Iowa – and I bet most of them cared. In fact, take a look at the reports by state. Of the students in Maine taking the test (who responded), 32% were in the highest tenth of their class. In Iowa, it was 64%. In Maine, 24% of the test-takers came from the bottom three-fifths of their class. In Iowa? 4%.

So in short, no, Maine is not the dumbest state. Virtually every Maine student, college-bound or not, shows up in these SAT statistics, and that makes state-by-state comparison pointless. Iowa and most of the other states suffer from selection bias: only their most college-bound students take the test. In fact, Massachusetts is the closest state to Maine in participation and still only reaches 86%. Besides, in various other rankings, Maine students consistently rank well above average. By these rankings, the state is 5th overall.
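The selection effect is easy to see with invented numbers. Here is a minimal sketch (all figures are made up for illustration, not the actual College Board data): draw two groups from the exact same score distribution, one where nearly everyone takes the test and one where only the top few percent do, and compare the means.

```python
# Hypothetical illustration of selection bias in state SAT means.
# All numbers are invented; this is not real state data.
import random

random.seed(1)

def student_scores(n):
    # SAT-like combined scores on the 600-2400 scale,
    # identical distribution for every "state".
    return [min(2400, max(600, int(random.gauss(1500, 250)))) for _ in range(n)]

population = student_scores(10_000)

# "High-participation state": essentially everyone sits for the test.
everyone = population

# "Low-participation state": only the most college-bound ~3% take it.
top_slice = sorted(population, reverse=True)[: int(0.03 * len(population))]

def mean(xs):
    return sum(xs) / len(xs)

print(f"~92% participation mean: {mean(everyone):.0f}")
print(f" ~3% participation mean: {mean(top_slice):.0f}")
# The self-selected 3% sample scores hundreds of points higher even
# though both groups came from the identical underlying population.
```

The gap between the two printed means comes entirely from who shows up, not from any difference in the students themselves – which is exactly why ranking states by raw SAT means is meaningless when participation rates differ this much.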

Are humans “more advanced” than other organisms?

I recently had a discussion with a friend where he asserted that he was more advanced than, say, a plant. By the common connotations that come with the word “advanced”, we would have to agree that his statement was true. But it raises an interesting question: What do we really mean when we say we’re advanced?

To put the discussion in a proper framework, think in evolutionary terms. That means we can’t compare Albert Einstein to Sarah Palin and say, “Why, yes, he was more intellectually advanced than she is.” Of course that is true, but if we’re going to discuss evolution and what it means to be advanced, we’re necessarily comparing species, not individuals. That is what makes my friend’s initial comparison to plants a reasonable starting point.

But it is only a starting point. Because what are we comparing exactly? In terms of brain development, yes, he beats that plant handily. But what about in terms of the ability to photosynthesize? Well, the plant just scored a knockout in that round. Clearly there is a difficulty in making useful (and, in my view for this discussion, any) comparisons between species. Maybe we need to find a species that is closer in relation to humans. (It certainly would help for it to have a brain in the first place.) The animal I chose for that discussion, and the one I am choosing for this one, is the skunk. Jerry Coyne is the inspiration.

It does not always [evolutionarily] pay to be smarter, either. For some years I had a pet skunk, who was lovable but dim. I mentioned this to my vet, who put me in my place: “Stupid? Hell, he’s perfectly adapted for being a skunk!” Intelligence comes with a cost: you need to produce and to carry that extra brain matter, and to crank up your metabolism to support it. And sometimes this cost exceeds the genetic payoff. A smarter skunk might not be a fitter skunk.

A skunk is far better adapted to life as a skunk than any human ever could be. All the things it takes to be a skunk? A skunk has them nailed down pretty well. The counter to this point was to say that if humans decided to destroy all skunks, we could. True enough. But does that make us more advanced? Of course our intelligence allows us to wipe out many other species, but the whole point of bringing up a skunk and its adaptation is to say that a comparison of intelligence is not valid for purposes of this discussion in the first place! (As always, you know I mean it when I use the lazy-man’s exclamation point.)

When we choose to compare intelligence in order to define what it means to be advanced, we have two massive assumptions going on. First, we’re assuming that intelligence is better than toothiness or having sharp claws or any other characteristic we see in nature. This assumption is untenable because some environments might call for all or any of those characteristics over intelligence. To put things in perspective, try thinking on an evolutionary timescale. So far I have only been comparing humans to other extant organisms (plants and skunks). But what if we go back 100 million years? 200 million? 500? 600? Any human put into an ancient enough environment would die. We know this because the right foods would not be available or because there would be no proper shelter or because the atmosphere would be poisonous or because our immune systems would not be evolved to cope with the bacteria and viruses present at the time or because…and so on. The assumption that intelligence is better than anything else is clearly wrong once we recognize that evolution and the ability of a species to survive depends largely upon environment.

The second assumption in this whole discussion is that we can even say something in evolution is “advanced”. We can say more complex or better suited to a particular environment, but “advanced”? That implies evolution is on the march towards some goal, to some end. That is not true. Science has demonstrated this again and again by showing what a contingent process evolution is. Take the Lenski experiments, for example. (I’m rather disappointed I never wrote about them here.) Richard Lenski and his researchers followed several lineages of E. coli for 20 years (in fact, they’re still following them). They would freeze samples every 500 generations so they could go back and re-run the tape of evolution should some fundamental change occur. And, eventually, such change did occur. Some E. coli became able to metabolize citrate, a compound present in their growth medium, after nearly 30,000 generations. Lenski et al. unfroze the old generations to see just what enabled the bacteria to obtain their newfound skill. As it turned out, they had to go back many thousands of generations; it wasn’t just one mutation, but at least three. The first two were effectively random. But they were necessary in order to get to the third mutation – the one that opened up the new food source for the colonies. And in the re-running of the tape, not all lineages re-evolved the new mutations. The chance involved in the process was too great for the outcome to be inevitable; evolution is contingent.

So my answer to the question, Are humans “more advanced” than other organisms?, is to say it is an inappropriate question in the first place. There are several things we should not be assuming:

  • Intelligence is the best trait (whether to this point or in terms of possibility)
  • Evolution is goal oriented
  • The ability to destroy other species and still survive is a mark of advancement

I mentioned the argument for point three, but I have yet to address it. This one is pretty straightforward, I think: We may be able to destroy many species, but that really only applies to large organisms. The vast majority of life is microbial. Since we would never be able to destroy it all (or even a minute fraction of it), does that mean it is more advanced than we are? What about all the bacteria we need to keep us alive? We certainly could not destroy all the mitochondria of the world and still survive.

Evolution is a contingent process that has no march towards any end. It is about the ability to survive. Our genes are interested in propagating themselves and that is why we are here. Life may mostly (though not entirely) be more complex since it first sprang forth nearly 4 billion years ago, but it always depends upon its environment – and that makes some characteristics more valuable than others. Sometimes.

Thought of the day

Moral improvements have nearly always come from secular considerations, and drag the churches along in their wake.

~Jerry Coyne

Expected distortions

Michael Behe recently had a paper published in The Quarterly Review of Biology, a non-creationist journal. Here is Jerry Coyne’s conclusion:

Behe has provided a useful survey of mutations that cause adaptation in short-term lab experiments on microbes (note that at least one of these—Rich Lenski’s study— extends over several decades). But his conclusions may be misleading when you extend them to bacterial or viral evolution in nature, and are certainly misleading if you extend them to eukaryotes (organisms with complex cells), for several reasons:

Go to Professor Coyne’s site for the whole review.

It’s all fair enough and no one is really up in arms about Behe’s paper itself. But isn’t it interesting how quickly the creationist intelligent design crowd started distorting the facts?

Over at the intelligent-design site Uncommon Descent, the ever befuddled Denyse O’Leary has already glommed onto the review I wrote yesterday of Michael Behe’s new paper. And, exactly as I predicted, she distorts Behe’s conclusions:

So, not only must the long, slow process of Darwinian evolution create every exotic form of life in the blink of a geological eye, but it must do so by losing or modifying what a life form already has.

In other words, she’s extended Behe’s conclusions, based on viral and bacterial evolution in the lab, to evolution of “every exotic form of life” on the planet. This is exactly what one cannot do with Behe’s conclusions.

It really isn’t a surprise that this happened; Creationists are always distorting scientific papers – and specifically so they can prop up their religious beliefs. I’m just impressed with the utter accuracy of Professor Coyne’s prediction.

This distortion is hardly news, of course—I’m completely confident that Behe not only expected it, but approves of it—but I feel compelled to highlight it once again. Luskin’s three distortions, which correspond to the three caveats attached to Behe’s results:

1. Luskin doesn’t mention that Behe’s analysis concentrated only on short-term laboratory studies of adaptation in bacteria and viruses.

2. Luskin also doesn’t mention that these experiments deliberately excluded an important way that bacteria and viruses gain new genetic elements in nature: through horizontal uptake of DNA from other organisms. This kind of uptake was prohibited by the design of the experiments.

3. Luskin implies that Behe’s conclusions extend to all species, including eukaryotes, even though we know that members of this group (and even some bacteria) can gain new genetic elements and information via gene duplication and divergence. And we know that this has happened repeatedly and pervasively in the course of evolution.

About an hour ago I finished up my last assignment for this semester, and man, it’s always a relief when that special moment arrives. But after reading this creationist intelligent design proponent garbage, I’m already getting antsy to go back and continue with my legitimate education.