Monday, February 28, 2011

Wait, why shouldn't the children call the emperor a "fatty"?

The endless accommodationism rumble has gotten a bit of a shot in the arm lately with a post by Jean Kazez in which she basically comes right out and says that science/religion compatibility should never be discussed in the public square, regardless of tone -- it is just not understandable by the unwashed masses. I hadn't read the post, because, well, I don't really care. The money quote was pretty shocking, but whatever, I'm ever so sick of rehashing this fanciful idea that making an idea more visible somehow makes it less popular.

I had heard that Kazez had also said something within the same thread of posts where she extended the Emperor's New Clothes analogy to try to present the accommodationist position. I finally read the relevant passage when it was quoted over at Metamagician, and I quote it at length here:
The emperor marches along the parade route stark naked [ignore the green underwear in the picture], and the adults ooh and ahh about his finery. One brave girl speaks up and says, naively "The emperor has no clothes!" Good for her! Hurray!


Now we have the sequel: "The Emperor's Gnu Clothes." Other kids were impressed with the brave girl. They started saying the same thing--"The emperor has no clothes! The emperor has no clothes!" Soon just saying he had no clothes lost its appeal. They shouted louder and louder, and called the emperor a fatty and laughed uproariously.

Some of the adults said: "Children. You're right he's naked. The brave girl was perfectly right to say so. But you've gotten carried away. It's time to think this through. Maybe the emperor actually enjoys being naked. Maybe he really doesn't know he's naked, and he can't figure it out when you're yelling at him. Maybe when he looks at you, your clothes look ridiculous to him, too! Control yourselves, think about how you're communicating!"

This made the children very, very angry. They wanted to believe they were just like that first brave girl. They didn't want to see themselves as rude and insulting. So the children went after the adults who had chided them, and called them names, and derided the whole idea of Communicative Restraint and Politeness, which they called crap for short.

Perhaps the analogy is a bit tortured here, but I'm never one to shy away from analogy waterboarding, and in any case I think Kazez makes her point quite clearly with this story.

My question is: What's the problem?

Let's strap this analogy onto The Rack and take a step back here. Who is the emperor? In the original story, the emperor is a pompous dictator who is so full of himself that he is actually taken in by a pair of weavers who fraudulently promise him a set of clothes that are "invisible to those unfit for their positions, stupid, or incompetent." He's rich, powerful, and so egotistical that he actually commissions a parade specifically for the purpose of making all of his subjects look stupid because they can't see his new clothes. Furthermore, at the end of the story, while the emperor is certainly embarrassed, he does not appear to have lost one iota of power. In fact, as the procession goes on, the closing line of the English translation I am looking at states that "his noblemen held high the train that wasn't there at all." The emperor is so feared and powerful that even after he is revealed in front of all his subjects as being a hopeless fool, he still commands so much power that rich guys continue to go along with the farce.

So now with all of that in mind, Kazez wants to, what, make us feel sorry for the emperor that some kids are calling him a "fatty"? Srsly?!? Please, I'll trade places with the emperor any day of the week. I get to have unrestricted power over an entire kingdom, I get noblemen who will publicly play along with whatever absurd fantasy I ask, I get to be rich beyond my wildest dreams... and the only price I pay is I have to put up with some schoolkids snickering at me and calling me a "fatty"? Bring it on!!

And could there be a better analogy to the special deference afforded to religion in American society? This is so familiar... Public policy is all too often shaped by religious dogma; any American politician of any stature has to at least pay lip service to faith (and preferably Christian faith); there's a designated National Day of Prayer; people who profess to be highly religious are automatically presumed to be better, more moral people; criticism of religion in the public square will get you branded as "intolerant," or worse... and then amidst all of this, if a couple of bloggers occasionally make unkind and overly broad sweeping generalizations about Christians, the cry goes out, "ZOMG Christians are being persecuted in America!"

Please, persecute me like a Christian. And mock me like an emperor. That would be awesome.

Friday, February 25, 2011

2010 GSS first to show more Americans in favor of same-sex marriage than opposed

The 2010 edition of the General Social Survey shows for the first time more Americans supporting same-sex marriage (46%) than opposing it (40%). Not that majority rule ought to determine who gets civil rights anyway, but can we finally look forward to an end to all those faux-democratic arguments against same-sex marriage? Can we stop hearing about the myth of "activist judges"?

(h/t to Sherkat)

Wednesday, February 23, 2011

Three Rules for Facebook

In order of ascending importance:

Rule One. Your profile pic should really have your face in it somewhere, preferably featured prominently.

This is not a big deal, and probably more of my friends break this rule than follow it. It's just a pet peeve of mine. I get a friend request, and I'm like, "Wait, is this the 'Joe' that I met at a party the other week, whose last name I don't know? Or is it just some random guy who spams friend requests in the hopes of getting his friend count up? Gee, I don't know, because your profile pic is a shot of your dog or your kid or something. And I haven't met your dog or your kid. Hell, I don't even know if you have a dog or a kid!"

For things like e-mail, blogging, chat, by all means, use whatever you want for your profile pic (I do). It's just that part of the point of Facebook is as a means of locating old friends, acquaintances, etc., and if you don't have your face in your profile pic, it makes it that much harder.

Rule two. Don't post anything too controversial to your wall. Save that shit for blogs, forums, etc.

I've already discussed this one, so I'll just add one thing: If you were at a Thanksgiving dinner with the in-laws, you wouldn't spout off about religion or politics (or politically-charged topics like abortion, alternative health claims, etc.), would you? I thought so. So why would you say it, not just in front of your in-laws, but in front of your in-laws and everyone else you've ever friggin' met? Maybe some of them think your opinyuhn are dum.

Rule three. For the love of God, don't directly criticize someone on their wall. This is actually why I had to unfriend a friend of mine, because she broke rule #2 and I felt I could no longer obey rule #3 as a result. Like, really, you are going to call someone out and tell them they are wrong and a crappy person in front of all of their friends and family and acquaintances? The danger of an argument getting out of hand is bad enough online even when the communication is private or semi-private. Now you are going to criticize someone publicly? Are you stupid or something??

I would also add the Hehir Corollary to Rule Three: Especially don't criticize someone on their wall while they are grieving. Double especially don't criticize them in regards to the manner in which they choose to grieve. On their wall.

Yes, this actually happened, and it set in motion a chain of events that has caused me to lose my primary creative outlet and only regular independent social activity. Goddamn you Mark...

Tuesday, February 22, 2011

Adaptationists, neutralists, frequentists, and Bayesians -- oh my!

There's a bit of a minor dust-up taking place at Sandwalk, Larry Moran's blog, over adaptationism vs. neutralism in regards to evolutionary theory. There are some really great minds engaging in this discussion -- even Richard Dawkins has weighed in! -- so really I'm sure there's nothing a lay person such as myself can meaningfully contribute. But hey, if you thought I was going to go that route, you have significantly underestimated my ego.

A few months ago Bjørn Østman and I were having a discussion about the degree to which selection-for must be thought of as an abstraction rather than a reality, and I presented the following (admittedly somewhat over-the-top) example:

Let's say in the year 5673 BC, one peacock was saved from certain death when his tail got caught in some branches, and that slowed him down enough that an impending avalanche that would otherwise have hit him now missed him -- whereas his brother with a slightly smaller tail continued unimpeded and was killed in the avalanche. Do we now have to include in our analysis a 10^-10% contribution to the selection of the tail by "stops peacocks from walking into the path of avalanches"? If not, why not? (Without invoking agency, that is)

Don't get too distracted by my use of the peacock's tail -- that just happened to be the trait we were talking about at the time. Anyway, I want to flesh out my thoughts on this somewhat in light of the present discussion regarding neutralism.

First, a brief digression on two major schools of thought in probability theory. Let's say I roll a die three times, and I get the sequence 1-3-5. After I have finished the experiment I say, "Now what are the odds I would get that sequence?" A frequentist answer would be 1 in 216, because, assuming a fair die, if I perform this experiment enough times, I should get the sequence 1-3-5 in approximately that fraction of trials. An extreme Bayesian answer would be 100% -- as long as you were asking the question after performing the experiment. After all, we know that's what I rolled; it can no longer be anything else. Given the state of information we have now, the odds are 100%.

The Bayesian interpretation is not very useful there, but it's quite useful in regards to things like cancer screening. And nobody is so extreme of a Bayesian that they would stubbornly argue for the position in the previous paragraph. Anyway, you can get piles of information on this by Googling, and probably most people who have read this far are already more or less familiar with the concept, so I will not expound on it further.
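For anyone who wants to see the frequentist half of that digression made concrete, here's a quick Python sketch (the trial count and seed are just illustrative choices of mine): it computes the exact probability of rolling the sequence 1-3-5 with a fair die, then checks it the frequentist way, by actually performing the three-roll experiment a couple hundred thousand times and counting.

```python
import random
from fractions import Fraction

# Exact probability of rolling the specific sequence 1-3-5
# with a fair six-sided die: (1/6)^3 = 1/216.
exact = Fraction(1, 6) ** 3
assert exact == Fraction(1, 216)

# The frequentist move: repeat the three-roll experiment many times
# and count how often the exact sequence 1-3-5 turns up.
random.seed(42)
trials = 200_000
hits = sum(
    1 for _ in range(trials)
    if [random.randint(1, 6) for _ in range(3)] == [1, 3, 5]
)
print(f"exact: {float(exact):.6f}, observed: {hits / trials:.6f}")
```

The observed frequency converges on 1/216 ≈ 0.0046 as the trial count grows, which is all the frequentist answer means. The extreme Bayesian, of course, doesn't need to run the loop at all.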

If we wanted to take the really extremely stubbornly uber-Bayesian interpretation of natural selection, we would have to argue that every single extant trait is adaptive. After all, the a posteriori probability of evolution having taken the path that it has is 100%. Given the information we have now, every trait which exists in the present has perfect inclusive fitness, and every trait which is no longer extant is perfectly unfit.

I am making this rather stubborn and perhaps overly philosophical point in order to drive home the idea that any useful description of selection must inherently be an abstraction. It's not just that we need to use an abstraction in order to analyze it in practice; even in principle, one cannot say anything meaningful about natural selection without taking a frequentist approach, abstracting out the actual environment in which a given organism lives, reproduces and dies, in favor of an idealized, "typical" environment analogous to an idealized fair die roll. Without taking this important step, the most meaningful thing we can say about evolution is, "What happened happened."

(As a brief aside, I think the primary mistake made by Fodor and Piattelli-Palmarini was their failure to take that next step from the stubborn philosophically pure position to recognizing that the whole enterprise can be handily salvaged by just drawing a few abstractions. In the spirit of full disclosure, I must admit I haven't read their book, but I've read critiques and defenses of it, and unless there's some amazing revelatory concept in the book, this seems to be the rather obvious problem with their entire argument.)

I think this already somewhat undermines the dichotomy of the selectionist/neutralist debate, since both are exposed as abstractions which are (again, by necessity) pretty far removed from the most literal reality of evolution. Because each organism (or each allele if you want to take the gene-centric view) exists in its own actual environment rather than being repeatedly tried in some ideal environment, to even pose the question of whether a particular trait is adaptive or not by necessity admits a sliver of teleology-esque reasoning into your model. By no means am I objecting to this! But it is worth observing.

Of course we can still ask which model is more useful, even if it turns out that the answer is "it varies" -- so there is meaningful discussion to be had. As a lay person, I'm not going to attempt to answer that. But what I will do is point out one problem with the intensely1 neutralist view espoused by folks like Larry Moran.

Consider the example of the rhinoceros' horn discussed in the comments at Sandwalk. Let's say for sake of argument that, in comparison to each other, the Indian rhino's one horn has no selective advantage over the African rhino's two, and vice-versa. Let's go one step further and assume that sexual selection doesn't even play a role, and that if you could magically give some fraction of Indian rhinos an extra horn, or take away one from some African rhinos, it wouldn't affect the afflicted rhinos' inclusive fitness one iota. Does that necessarily mean that "neutral" is the best way to describe the trait(s)? I'm not so sure.

This is a pretty unrealistic hypothetical, but run with me here for a moment: Let's say the common ancestor of the Indian and African rhinos had a sort of bumpy ridge where the nose is. Two sub-populations become separated, and (here's where the unrealistic part comes in) a single transposition event in one sub-population causes the ridge to grow into a single horn, and a single transposition event in the other causes the ridge to grow into two horns. In each sub-population, only one of those two possible mutations is available, and in both cases the transposition event is so unlikely that we would not expect it to occur again. Assume further that horns -- whether one or two -- have a tremendous selective advantage over the bumpy ridge.

How can one argue with a straight face that either trait is "neutral"? Both mutations are tremendously adaptive over and above the available alternatives in the gene pool. If by the time the two sub-populations are reunited they can no longer produce viable offspring, then we now have two species, each with a mutually exclusive trait that is no better than the other's, but both of which are clearly adaptive. Genetic drift doesn't even come into it, except in the trivial sense that all novel alleles have to arise via random mutation.

I am again being somewhat stubborn and over-the-top, I think, but the point I am making is that in the long term and in a large enough population, every single trait that goes to fixation is adaptive, in the sense that it is superior to the immediately available alternatives present in the gene pool. Now the manifestations of those immediately available alternatives are influenced by effects such as genetic drift, of course. And it's probably fair to argue that it is not uncommon to have multiple available alleles in a population which are (within the limits of our abstraction) no more adaptive than each other, and in those cases the neutralist account would indeed be the most sensible. What I am saying, though, is that just because two potential traits of an organism have no particular selective advantage over one another if both had been readily available in the same population at the same time, that doesn't at all invalidate a selectionist account.

This is all probably hopelessly redundant, covered territory in evolutionary biology, and as a layman I should probably just not blather on about it. But oh well, those are my thoughts on it.

1I struggled with the wording here... I initially had "extreme" rather than "intensely", but I felt that was unnecessarily -- and unintentionally! -- pejorative. Given our human penchant for promiscuous application of teleology, I think the "extreme" (not in a pejorative sense!) neutralist position is a very necessary part of the tapestry of evolutionary inquiry. Without neutralists like Moran, the adaptationist model could easily get completely out of control. Even great minds like Dawkins, who has been entertainingly referred to as an "uber-adaptationist", have at times seemed a bit too eager to prematurely embrace an adaptationist account. All deserved respect to Larry Moran here, by all means!

Monday, February 21, 2011

Facebook vs. Blogging

So I got a new phone, one of them Droids, and since it had this widget just sitting there saying, "Hey, why don't you connect this to Facebook! Are you too old or something? All the cool kids are doing it...", well, I decided to give it a shot. I'd had a Facebook account already but I only used it for contacting people or having them contact me -- so that we could then switch to e-mail or whatever. But now I'm giving it a shot.

It's actually sort of cool seeing my friends' little updates about what is going on in their lives and stuff. But I also had to unfriend someone within the first 24 hours so as to be able to keep her friendship (ironic, eh?). See, she posted an article from Joe Mercola, and it turns out when I think someone is factually incorrect, I am totally incapable of keeping my mouth shut... so better to just not see the updates.

Which leads me to my point -- I am thinking, if you are going to say something controversial, get a blog or something. Use Facebook for social stuff, not for political or other controversial stuff. I'm also thinking of the time that my wife and her uncle mixed it up on Facebook and sort of wound up on temporarily bad terms, because of some sort of political/religious discussion.

The thing with Facebook is that what you write goes out to all your friends. Not just the ones who think your scholarship on the Peloponnesian War is fascinating, but all of them -- so if you want to write about the Peloponnesian War, get a blog.

I dunno, just thinking about it...

Thursday, February 17, 2011

The two biggest flaws of the Star Wars prequel movies

I watched Star Wars Episodes I, II, and III recently, for only the second time. I've probably seen the other three, the original three, the good three, a hundred times or more (I watched them a lot as a kid).

When fans of the originals complain about how bad the prequels are, one response you hear sometimes is, "Look, the originals were just as bad, you just remember them fondly because you were a kid. They are all completely absurd space operas with lousy dialog and naive kidsy themes. And you complain about Jar Jar -- look at the freakin' Ewoks!" I've never felt this complaint was quite right, but I've also had some trouble refuting it in the past.

One thing I noticed on the first viewing, and this was while watching The Phantom Menace in the theater when it first came out, was that the CGI they used in the new ones just looks too "clean". One of the lovely things about the originals was that the highly detailed models Lucas and his team built just looked so gloriously dirty. They actually looked like something that had been traveling through deep space for a long time. In contrast, the ships and robots in The Phantom Menace just looked like run of the mill modern sci-fi. (I suppose one could justify this on the grounds that the Old Republic had better, more well-maintained stuff; but this reeks of Star Wars apologetics, and I'm not having it)

However, on the second viewing, I think this was a minor problem at best, and really a subset of a much bigger problem. So without further ado, here is where I think Lucas made two very simple missteps that severely tarnished the prequels and prevented them from having that fantastic charm of the originals.

So let's talk about Jar Jar. No discussion of what's wrong with Episodes I and II can be complete without discussing the much-maligned Senator Binks. As I stated before, some Jar Jar defenders/fanboy critics have compared Jar Jar to the Ewoks, saying both are equally comic, equally silly, had equally annoying voices, and that both are obviously designed to appeal to kids rather than adults. Fair enough.

And yet, the Ewoks don't piss me off nearly as much, and, though I realize this is hopelessly subjective and I have no real basis to assert this, I just can't make myself believe that the reason is because I grew up with the Ewoks. Flight of the Navigator seemed like a grand adventure to me as a kid, and was also one of my favorite movies, but today I see it for the corny trash that it is (though I do think some of the sound editing, particularly in the early scenes just before the protagonist discovers the spaceship, is masterful). Why did the Ewoks age so much better?

I think I know why: The Ewoks didn't speak English. In fact, they weren't even subtitled. Sure, when Wicket screeches "Beecha-wawa!" after almost being hit by a Stormtrooper's blaster, it's just as grating as anything Jar Jar says -- but it feels like an alien who just happens to have a high-pitched voice, because of the Ewoks' inscrutability: an inscrutability that would have been impossible to pull off if they had spoken English. In contrast, when Jar Jar says, "Meesa so scared!", it sounds like a damn cartoon character.

If you compare both trilogies, this is very consistent. Virtually every alien in Episodes I, II, and III speaks English, whereas in the originals most of them do not (and many of them are not subtitled). Hell, R2D2 and Chewbacca were main characters through all three movies, and neither of them ever said a word that was directly comprehensible to the audience. (As a brief side note, Ebert's review of The Empire Strikes Back notes -- correctly in my opinion -- that the film's biggest flaw is Chewbacca's incessant mournful cries. They work to express the intended emotion, but they just get so repetitive. However, that was a problem with execution, and a minor one at that. Just imagine how bad it would have been if Chewbacca had the same voice, but talked in pidgin English! Perish the thought...)

Think of the scene with Greedo in Episode IV. Nobody is going to call that scene kidsy -- Han Solo murders him with a gun concealed under the table, for chrissakes! But think about it: Greedo had a pretty funny-sounding voice, didn't he? Imagine if Greedo had that gurgly, almost trilly voice, but spoke English instead of Rodian. Granted, it wouldn't have quite been a disaster of Jar Jarian proportions, but it would have significantly undermined the tension in the scene. (Another brief digression: In the updated version released in the 90s, why oh why did they have to make Greedo fire first? It's so freakin' corny... From a realism perspective, how the hell would he miss at that range? And from a character development perspective, it sort of undermines the early establishment of Han Solo as a "scoundrel". Oy...)

In any case, having your aliens speak their alien language allows you to get away with a lot more, without having it come across as cartoonish. I'm sure Lucas felt that having the characters speak English improved pacing and made the movies more kid-friendly by obviating the need for subtitles -- but I was probably seven or so when I first saw the original trilogy, and I still was enchanted by it. Big mistake.

On a side note, I have no particular opinion on the racial aspects of the anti-Jar Jar criticism. I am certain it was not intentional. But I absolutely see how people perceive it that way -- though on the other hand, my wife does not see it at all. I'm not sure if it rises to the level of being racially insensitive or not. As I said, I have no strong opinion on that issue.

Moving on... The other big problem with the prequel trilogy, particularly Episode I, is more subtle, but possibly more pernicious. You see, one of the reasons the original trilogy worked so well despite a number of absolutely preposterous conceits (a freaking laser sword being the preferred weapon of the Jedi? When guns are available??? Uh huh...) is that the entire Star Wars universe exists in a vacuum. You can make yourself believe it's in "a galaxy far, far away", because there is a total absence of pop culture references. Those concepts which overlap with our world tend to have a historical mystique to them ("Imperial senate", "knights", etc.) which contributes to the other-worldly flavor rather than detracting from it.

Now think back to the pod racing scene in The Phantom Menace. Intended to be the most stunning set piece of the movie (they had a freaking video game based solely on that scene timed to coincide with the release of the film!), it was marred by the corny two-headed announcer(s). Out of context, there's nothing so bad about that gag. It was like something straight out of a Pixar movie, and you know what, all the Pixar movies are pretty good. I'm sure kids found the announcer(s) to be pretty funny, and while the jokes weren't exactly scintillating humor, the dialog was passable as it goes.

But the reason that gag "worked" is because it was a parody of the standard two-person commentator team used in most US sporting events, play-by-play and color analyst. It was a direct pop culture reference. So what the fuck? Does this "galaxy far, far away" get American broadcast television? From the future? No, of course not, it was just a bad choice on the part of the scriptwriters, but nonetheless it degraded the illusion of otherness that pervades the original films.

Though the two-headed announcer(s) were the most egregious example, such problems pervade Episode I, and are present to a lesser extent in Attack of the Clones as well. (I feel Revenge of the Sith finally broke this trend, which is one reason why many fans feel that is the best of the prequel trilogy) I mentioned earlier that the ships look too clean; they also look too much like modern conceptions of sci-fi. Compare Qui-Gon Jinn's ship in Episode I to the Millennium Falcon, both pictured below. If you had never seen any Star Wars movies, and I told you that one of these was thought-up by humans and another was built by aliens, what would your guess be? I thought so. Most of the other ships from the original trilogy are less alien than the good ol' Falcon, but they still don't look like modern fighter jets. (Which brings me to a minor realism niggle: All of the ships in the prequel trilogy are highly aerodynamic, which doesn't matter a damn in space. Many of the ships from the original trilogy are far less so.)

I think there are other examples that are more subtle, and I won't attempt to list all of them. I'd have to watch the movies again in order to catch many more anyway, and I don't expect to be doing that again until my eldest son is old enough to dig Star Wars. But I'm telling you, I think this sort of problem is what wrecked the prequel trilogy, or at least the first two episodes. For all the corny dialog and stilted acting and baldly implausible ideas in the original trilogy, it felt like something from out of this world -- which made suspension of disbelief that much easier, and made all the crap much easier to swallow. The flaws were partially obscured by a fantastic other-worldly mystique. The prequel trilogy mostly failed to cultivate this atmosphere, and as such all of its warts were laid bare.

In defense of blindly voting for one party

When it comes to legislators, particularly at the federal level, I would vote for an incompetent Democrat before I would vote for a well-qualified Republican. I would do this even if I agreed with more of the Republican's stated positions than the Democrat's. And I believe I am on firm rational footing with this.

In the United States -- as in most representative democracies, it turns out -- legislative voting is nearly always virtually party-line. I mean, hell, we even have an official position in each party, recognized by congressional regulations if I'm not mistaken, dedicated to making sure this remains the case!

And while individual candidates might have their own quirks, it is a fact that my positions and values are far more in line with the Democratic Party platform than they are with the GOP platform. I disagree with the Democrats on a number of issues, but geez, it's not even a close call here.

So in the hypothetical introduced in the first paragraph, if this highly-qualified Republican legislator gets elected, so what? Most of her votes will still be along party lines, and therefore will be for positions that I disagree with. And if the incompetent Democrat gets elected, most of his votes will be for positions I agree with, even if he himself is an idiot. Even those representatives who have the biggest reputation for being "mavericks" (God, how that word has become tainted) still vote with their party upwards of 90% of the time.

This argument is less applicable when it comes to executive branch positions, and doesn't hold at all for judicial positions. (Why the hell do we elect judges anyway? That just seems like a patently stupid idea to me... but I digress) But when it comes to legislators, you bet your sweet bippy I'm voting Democrat, no matter who the candidates are, and I make no apologies for it.

Sunday, February 13, 2011

Morality does not and CAN not come from God

I've been meaning to do this post for a while, so I'll have something to link to for reference, because this point comes up a lot and it's already settled philosophy, and has been for hundreds of years -- even the theologians agree on this one! This will be old hat for almost anyone reading this, but I want to get it down anyway.

Morality does not and CAN not come from God. The argument to show this is quite simple. Consider: What if God commanded you to eat babies? Would that make it moral? Three possible answers:

1) No it would not. Good for you. Hence, morality exists independent of God.

2) God would never command such a thing, because She is perfectly moral. Slightly trickier than the first, but this still proves that morality exists independent of God. The statement makes a prediction about what God would or would not say based on what is moral -- but if morality is defined by whatever God says it is, then this test is circular. The assertion becomes "God would never command such a thing, because She's never commanded such a thing." This is obviously unworkable. Therefore, in order to answer this way, one must posit a morality independent of God.

3) Yes it would. Well okay then. I would argue that anyone taking this position is fundamentally amoral, and I'd be rather frightened of them. Beyond that, many philosophers assert that morality cannot be synonymous with simple obedience, and that therefore this would not form a coherent morality either. I'm not sure if I think that part of the argument is bulletproof -- but I'm not worried about it, because as far as I'm concerned, as soon as somebody takes this position, I've won the debate.

Now, many people may still legitimately argue that God/belief in God helps them to be moral. I would almost always disagree, but this is a position that can be reasonably debated. It cannot be cleanly logically refuted; it must be debated on evidence. However, the position that morality does not exist without God is empty. It's dead. It's been defeated long long ago, and cannot be adequately defended.

Good, so now I can just link here instead of retyping this every time.

Saturday, February 12, 2011

How to Properly Cook a Steak

It turns out that cooking a goddamn good steak is really easy, but you need to know just a few important tips. Really, only three that are critically important, and a few more minor tips that help a little. None of them take very much time or effort (though the most important one requires you to spend sixty seconds with the steak a couple hours in advance) and none of them are anything that your average home cook should have trouble with -- and yet, so many people cook flavorless, overdone, rubbery steaks. There is no need. The perfect steak turns out to be ridiculously easy.

THE MOST IMPORTANT THING: Aggressively salt and pepper the steak at least one hour in advance, preferably two or three or more. Use kosher or sea salt, and use more than you think you should have to use. Do not skip this step. I don't care if you have high blood pressure and one too many milligrams of sodium might just kill you; if it's that much of a problem then just don't eat a steak.

Salting the steak in advance does two important things. Pundits will tell you that the salt first draws out moisture and then gets reabsorbed along with it, in a sort of dry brine that tenderizes the steak, and this is more or less true... but far more importantly, it allows salt to penetrate all the way through the meat, which means that you will actually be able to fucking taste it... and even better, because salt makes you salivate, it will make the entire steak seem juicier, even though the moisture content is unchanged.

Do this every time. I am not kidding. This is the #1 most important thing you can do to your steak. It is simple, it is easy, it costs nothing, it doesn't hinge on matters of personal preference, and it can make the difference between a bland hunk of cow versus a beautiful buttery bite of beef.

THE NEXT MOST IMPORTANT THING: Get your pan hot. Really hot. Like, as hot as you can get it. If you are doing it on the stovetop, this means using cast iron if you have it, or the heaviest pan you can find if not. Pick the biggest burner, and turn it on as high as it will go. Gas is best, of course. The pan should just be starting to smoke (with no fat added yet; see the next tip) by the time you are ready.

If using a propane grill, put an upside-down baking sheet and/or foil over the part of the grill you intend to cook the meat on; this will superheat it. Remove the sheet immediately before you put the steaks on to cook. Trust me, it's worth it. If using charcoal, buy yourself a chimney. Seriously, don't skimp on this. A charcoal chimney is a whole separate topic in itself, but it is indispensable. Pile the coals all in one corner as soon as they come out of the chimney, and cook your steak over that.

THE LAST CRITICAL TIP: This applies only if you are pan-frying your steak, as I prefer these days (though I will address grilling here too). Thou shalt use the following fats in thy pan: one tablespoon butter, one tablespoon vegetable oil. Not olive oil. A lot of cooks, myself included, like to use olive oil where many recipes call for vegetable oil, because it's healthier and more flavorful. Not here, folks. You want a really high smoke point, and all the flavor is coming from the butter anyway. (And dude, you are eating a pan-fried steak... you're worried about health?! Please...) Don't add the fat until the pan is just starting to smoke, as I mentioned in the previous tip. Then toss it in, swirl it around just until it coats the pan, and throw your steak in.

If you are using a grill, take a paper towel or a clean rag, splash some vegetable oil on it, grasp it with tongs and rub it on the part of the grill you will use to cook the steak right before you cook it. This is actually a good idea with almost any meat you cook on the grill. You can't really do the butter thing this way though (which is one reason I have started to prefer pan-fried steaks over grilled steaks).

THIS IS SO OBVIOUS IT DOESN'T COUNT AS A TIP: Cook your steak rare or medium-rare. Please. If you are going to cook it more than that, I say, Don't Ask, Don't Tell. Seriously, the way Biblical literalists feel about homosexuality, that's how I feel about well-done steak. It's a crime against nature. But anyway, I realize preferences differ, so the meat (har har) of this tip is how to get a good rare/medium-rare.

If your pan is as hot as I said it should be, this means about 2 minutes per side. If it is a particularly thick cut, like more than two inches, stand it up on its edge for just a bit, so that no pink shows when you put it on the plate (not only because it looks weird, but because surface bacteria are a potential safety concern if the exterior hasn't been cooked). In any case, make sure that the top and bottom have a nice crust -- which they should, if you got your pan hot enough and you used the butter-oil mixture I commanded you to use.

THE LESSER TIPS: All of these things matter if you want a really perfect steak, though they are less crucial. If you don't do the first three, you will have done violence to your steak for no good reason. If you follow them, your steak will taste like a damn steak, and that's worth doing. These remaining tips will help put it over the top.

Rewind all the way back to when you are buying the steak. Buy a thick cut. The minimum thickness depends on your equipment. I have a nice Lodge cast-iron skillet, and one of the burners on my (gas!) range gives an unconscionably intense flame (the dial labels it "POWER BOIL"), so I am able to get 3/4" cuts to cook up quite nicely. One to two inches is better, of course, and even thicker is even better. The thicker the cut, the more seared you can get the crust while leaving the inside nice and red.

The flip side of this tip is that if you are one of those "alternative steakstyle" types who likes it medium or above, buy yourself a thinner cut. I don't have much to say about this, as I consider it to be immoral in the extreme, but the point is that you want a certain amount of sear on the outside and then you want to take your steak out of the pan -- so buy the thickness of steak that will result in the insides being done to your liking at the point that the outside is done appropriately.

Now proceed slightly forward to when you are salting the steak -- depending on the cut, it's probably worthwhile to trim off any major external fat deposits while you're at it. If it's a filet mignon, probably not: any fat there will just turn buttery as it cooks. A strip steak, though, or a ribeye, or whatever, is worth trimming of what you can easily get to without hacking too much meat off. Of course the diner can always cut it off at the table, but I'd rather take care of it in the kitchen. It makes a nicer presentation, and it allows the diner to just chow down without having to work around any inedible parts.

I've heard it said that you should let the steak sit at room temperature for 20 or 30 minutes before you cook it. This makes good sense, as raising the surface temperature means that you can get a good sear faster than you would otherwise, which means the center can stay appropriately rare. Cook's Illustrated even has a method (which I have used with some success, though I consider it ultimately unnecessary) where you cook the steak in a low oven before pan-frying it, thereby warming it and -- according to them -- activating enzymes which help to further tenderize the steak. I do not doubt the validity of all this, but I will say that, while following the remaining tips in this guide, I have never had a steak come out too tough or with too much of a "gray zone" separating the crust from the nice red innards. I mention all this because, though I think it unnecessary, I find the science sound: if you are concerned about getting a proper crust, or about getting a sufficiently tender steak, by all means, try pre-warming it.

Finally -- and this should almost be one of the Critical Steps, because it makes a pretty big difference -- when the steak is done, you should let it rest for a few minutes before serving. But don't let it just rest on a plate! Take a slice of bread, and let it rest on that. Then, depending on how juicy your steak is and how hungry you are, either eat the beef-sopped bread yourself, or else feed it to the dogs. Why do this? Well, if you listened to me about getting your pan or grill really super hot, and about using a nice butter/oil mixture to fry it in, you should have a really nice crispy crust on both the top and bottom. You have to put one side or the other down, and, since you did such a good job keeping this steak nice and juicy, whichever side you face down is going to puddle in its own juices. If you do nothing to abate this, you will re-sog that side and obliterate that nice beautiful crust you just went to all this (not very much) trouble to create. Better to stand it on a slice of bread, which will soak up the drippings and allow the crust to stay intact.

So this post is a little longer than I meant it to be, because I wanted to cover all the bases, and explain the reasons why in addition to just issuing the commandments. To summarize, here is a simple recipe that will guarantee you a better tasting steak:

1) Buy a thick cut to get a good rare/medium-rare (assuming you aren't one of those pervs who likes it medium or worse, in which case buy a thin cut).
2) Unless it is a very expensive cut like filet mignon, trim any visible fat.
3) Aggressively salt and pepper your steak at least one hour in advance, preferably more. Seriously, if there's one lesson I could leave to my sons, it would be to season your fucking steak in advance. It's so easy, and will make your steak taste 200% better than if you hadn't done it. Even if you ignore every single other piece of advice in this post, season your fucking steak.
4) Get the pan really hot.
5) Once the pan just starts to smoke, add 1 Tbsp butter and 1 Tbsp vegetable oil.
6) Toss in the steak and cook 2 minutes per side, or just until a crust starts to form. If it is a particularly thick cut, use tongs to stand it up on edge for half a minute or so until all the external pink is gone.
7) Set each steak on top of a slice of bread and let rest five minutes-ish before serving.
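For the list-minded, the whole recipe compresses into a few lines of Python. All the numbers come straight from the steps above; the function names and condensed step wording are my own.

```python
# The seven steps above, as a minimal printable checklist.

def steak_checklist(thick_cut=True):
    """Return the recipe steps as a numbered list of strings."""
    steps = [
        "Buy a thick cut" if thick_cut else "Buy a thin cut",
        "Trim visible fat (unless it's an expensive cut like filet mignon)",
        "Aggressively salt and pepper, at least 1 hour in advance",
        "Get the pan really hot",
        "Once it just starts to smoke, add 1 Tbsp butter + 1 Tbsp vegetable oil",
        "Cook about 2 minutes per side, until a crust forms",
        "Rest on a slice of bread for about 5 minutes",
    ]
    return [f"{i}. {s}" for i, s in enumerate(steps, start=1)]

def total_sear_seconds(minutes_per_side=2, sides=2):
    """Active searing time, in seconds, at the 2-minutes-per-side rate."""
    return minutes_per_side * 60 * sides

for line in steak_checklist():
    print(line)
print(f"Active searing time: {total_sear_seconds()} seconds")
```

Four minutes of actual cooking. The salting is the only part that requires planning ahead.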

This is an absurdly easy recipe. You owe it to yourself to follow it, or at least to digest the information in it. Every time I eat a steak that has not been properly seasoned or has not been cooked at high enough temperature, I die a little inside. Please don't do that to me.

Tuesday, February 8, 2011

For 32 years I've been unbuttoning my pants the wrong way

How one buttons or unbuttons one's garments is not something most of us are particularly reflective about. I know I'm not. It's something you figure out when you are a wee person, and then it's just an unconscious action. Just like when I walk to the door, I don't think, "Okay, first the right foot, then the left, then the right," but instead just visualize myself walking to the door -- in the same way, you don't think about the hand motions when you operate a button; you just visualize the button being done or undone and muscle memory does the rest.

I think I discovered last night that I've been undoing the button on my pants in a rather silly way, which probably nobody else does. I'm sure I've been doing this as long as I can remember, and of course you never think about it.

Up until last night, the way I did it was that I grasped the fabric on the left side with my left hand (so far so good), and then with my right hand, instead of grabbing the button, I grabbed the fabric near the button. Then with my right hand I pushed downward and twisted back and to the left, while simultaneously pulling up and forwards with my left hand. This causes the button to turn somewhat, and if all goes well, it just pops out. (Bear in mind I never thought about the mechanics in nearly this level of detail -- or really, at all -- until last night. This is a post hoc reconstruction, based on simply paying attention to what my hands were doing when I let muscle memory do its thing.)

On reflection, this probably puts unnecessary stress on the threads holding the button in place (which would explain why I've had a few pairs of pants pop their buttons after surprisingly little use) and in any case is just fairly silly. But you'd be surprised how well it actually works -- well enough for me to get into my thirties without ever noticing I was doing it, or that it made no sense.

Apparently, though, if the buttonhole is too small, this does not work. I got a new pair of pants last week and I had been having the damnedest time getting them to unbutton. Last night, I was having a particularly difficult time (and I really had to pee, besides!) so I decided to try a different tack. I brought my right hand around the front, grasped the button directly with thumb and forefinger, and slid it through the hole.

"Oh, that was easy," I thought. Then: "Wait just a minute... I bet everybody in the world except me already does it this way!"

Yeah, being a human is strange sometimes.