Common sense and research design

Last week on the New York Times' "Well" blog, Tara Parker-Pope wrote about a study that appeared to show that a mother's diet can affect the sex of her child. Yes, the father's sperm determines the sex of a particular embryo, but the claim is that the mother's nutritional intake can affect how likely a given embryo is to go to term. In any case, the study is based on survey data in which mothers of boys report eating more around the time of conception than mothers of girls.

I can't really pass judgment on the study itself -- I haven't had time to read the thing -- but as someone who is pretty obsessed with (and professionally involved in) criticizing causal inferences drawn from observational studies, I found it pretty entertaining to read the comments. I admit I did not read all 409 of them. But on the whole they fell into five categories:

1. Credulous, prepared to integrate conclusions into own understanding of the world: "Interesting article…so maybe my eating all that 'crap and vitamins' will help me conceive a boy!!"

2. Generally dismissive: "Unmitigated rubbish! Another 'scientific study' that will be repudiated in two years."

3. Skeptical based on measurement error: "Surveying the diets of women who are 14 weeks pregnant and asking them to 'recall' what they had eaten earlier in pregnancy or preconception will not yield accurate data."

4. Skeptical based on unrealized observable implications: "Are there more daughters born to women in developing countries?" and "Seems to me that the obvious answer lies in genders of children born to diabetic mothers - whose bloodsugars are usually higher than the average nondiabetic woman."

5. Skeptical because of possibility of reverse causation: "Shouldn’t we be interested in the fact that the gender of the baby seems to be affecting the eating habits of the mother? That seems much more interesting to me."

For all I know, many of the perceptive comments in categories 4 and 5 came from professional statisticians, but my guess is that many of these people have never been involved in research in any serious way. In that sense I find it heartening to see so much careful public deliberation about research findings. My experience is that, while a sharp eye for research design can be taught and learned, most of the issues that occupy me and other members of what Jim Snyder affectionately calls the "identification Taliban" -- the statisticians and social scientists who maraud around academia trying to put burkas on those who would interpret a cross-country regression causally -- are quite simple and widely understood. It seems like the most dangerously misguided people are the ones with one semester of econometrics and a working knowledge of Stata. It's as if you lose the common sense your mother taught you once you learn how to run a regression. (Disclosure: I was certainly a danger to myself and others at that stage.)

I find that the mass-participation aspect of the web alternately exhilarates me (StumbleUpon!) and depresses me (inane, racist YouTube comments!); reading the comments on that NYT blog entry was one of the happier experiences.

Posted by Andy Eggers at 4:36 PM