Some applause, please, for Casey Mulligan. Mulligan has been a strong opponent of the Affordable Care Act and the expansion of Medicaid provided under the act. However, he used his column today to dispel a misunderstanding of a study of the health impact of increased Medicaid enrollment in Oregon.

The study was written up in an article in the New England Journal of Medicine, which noted that the study found no statistically significant impact of Medicaid enrollment on health outcomes. However, Mulligan makes the point that the study actually did find that the people enrolled in Medicaid had improved health by several important measures. While the improvements were not large enough to meet standard tests of statistical significance, this does not mean that they were not important. As Mulligan notes, given the limited number of people in the study and the relatively short time frame (two years), it was highly unlikely that the study could have found statistically significant gains in health outcomes.

Mulligan deserves credit for clarifying this point, especially when the implications seem to be directly at odds with his view of the policy. It would be great if debates on economic policy were always like this. 



I'm glad to see that I have people knowledgeable about statistics reading this blog. Since I guess I was too quick in my post and folks apparently did not read the Mulligan piece or the study, let me be a bit clearer. The study had very little statistical power: there were not enough people in it. As a result, relatively few participants had any specific condition, which made it almost impossible to find statistically significant results.

To see this point, suppose we chose 100 people at random for a study to determine whether drug X was effective in preventing heart attacks. We gave 50 people drug X and the other 50 a placebo. After a year, two people in the placebo group had a heart attack but only one person in the treatment group did. Okay, this is a nice result, but it is almost certainly not statistically significant. Since we had not selected people with heart conditions, and heart attacks are relatively infrequent in the population as a whole, it would have been almost impossible to get a statistically significant finding.
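For readers who want to check the arithmetic, here is a small Python sketch that runs Fisher's exact test on exactly this 2-by-2 table. The counts (one heart attack among 50 treated, two among 50 on placebo) come straight from the hypothetical above; the code itself is mine and has nothing to do with the Oregon study's actual methods.

```python
from math import comb

def fisher_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]].

    Rows are treatment/placebo; columns are event/no-event. Returns the
    p-value: the total probability, over all tables with the same margins,
    of outcomes no more likely than the one observed.
    """
    n1, n2 = a + b, c + d            # group sizes
    m = a + c                        # total number of events
    total = comb(n1 + n2, m)         # ways to distribute the events
    p_obs = comb(n1, a) * comb(n2, m - a) / total
    p_value = 0.0
    for k in range(max(0, m - n2), min(m, n1) + 1):
        p_k = comb(n1, k) * comb(n2, m - k) / total
        if p_k <= p_obs + 1e-12:     # include ties (tolerance for float noise)
            p_value += p_k
    return p_value

# The hypothetical trial: 1 heart attack among 50 treated,
# 2 heart attacks among 50 on placebo.
p = fisher_two_sided(1, 49, 2, 48)
print(f"p-value: {p:.3f}")
```

With only three heart attacks in total, every possible split of those events across the two groups is about as likely as the split we observed, so the two-sided p-value comes out at essentially 1.0, nowhere near the conventional 0.05 threshold, even though the treatment group had half as many heart attacks.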

That is the story of the Oregon study. It had some encouraging results. They were not statistically significant, but it would have been almost impossible given the design of the study to have statistically significant results. That was the point of Mulligan's piece -- and he is 100 percent right.
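To put a number on "almost impossible," here is one more sketch: a Monte Carlo power calculation. The event rates (4 percent in the placebo group, 2 percent in the treatment group) and the 50-per-arm group sizes are invented to mirror the heart-attack hypothetical, not taken from the Oregon data, and the significance test used is a standard two-proportion z-test, my choice for simplicity. The simulation asks: even if the drug genuinely halved the risk, how often would a trial this small clear the p < 0.05 bar?

```python
import random
from math import sqrt

def significant(x1, n1, x2, n2, z_crit=1.96):
    """Two-proportion z-test (pooled, normal approximation).

    Returns True if the difference in event rates is significant
    at roughly the 5 percent level (|z| > 1.96).
    """
    p_pool = (x1 + x2) / (n1 + n2)
    if p_pool in (0.0, 1.0):
        return False                 # no variation at all: nothing to test
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (x1 / n1 - x2 / n2) / se
    return abs(z) > z_crit

def estimate_power(p_treat, p_placebo, n, trials=10000, seed=1):
    """Fraction of simulated n-per-arm trials that reach significance."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x_t = sum(rng.random() < p_treat for _ in range(n))
        x_p = sum(rng.random() < p_placebo for _ in range(n))
        if significant(x_t, n, x_p, n):
            hits += 1
    return hits / trials

power = estimate_power(0.02, 0.04, 50)
print(f"estimated power: {power:.1%}")
```

Under these made-up numbers the estimated power comes out well under 20 percent: a trial this small detects a genuine risk-halving effect only a small fraction of the time. Failing to reject the null in such a study tells you very little, which is Mulligan's point about Oregon.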