A Small Rant About the Meaning of Significant vs. “Significant”

Jim Manzi has a long blog post today about the Oregon Medicaid study that got so much attention when it was released a couple of weeks ago. Along the way, I think he mischaracterizes my conclusions, but I’m going to skip that for now. Maybe I’ll get to it later. Instead, I want to make a very focused point about this paragraph of his:

When interpreting the physical health results of the Oregon Experiment, we either apply a cut-off of 95% significance to identify those effects which we will treat as relevant for decision-making, or we do not. If we do apply this cut-off…then we should agree with the authors’ conclusion that the experiment “showed that Medicaid coverage generated no significant improvements in measured physical health outcomes in the first 2 years.” If, on the other hand, we wish to consider non-statistically-significant effects, then we ought to conclude that the net effects were unattractive, mostly because coverage induced smoking, which more than offset the risk-adjusted physical health benefits provided by the incremental utilization of health services.

I agree that we should either use the traditional 95 percent confidence level or we shouldn’t, and if we do, we should apply it to all of the results of the Oregon study. The arguments for and against a firm 95 percent cutoff can get a little tricky, but in this case I’m willing to accept the cutoff, and I’m willing to use it consistently.

But here’s what I very much disagree with. Many of the results of the Oregon study failed to meet the 95 percent standard, and I think it’s wrong to describe this as showing that “Medicaid coverage generated no significant improvements in measured physical health outcomes in the first 2 years.”

To be clear: it’s fine for the authors of the study to describe it that way. They’re writing for fellow professionals in an academic journal. But when you’re writing for a lay audience, it’s seriously misleading. Most lay readers will interpret “significant” in its ordinary English sense, not as a term of art used by statisticians, and therefore conclude that the study positively demonstrated that there were no results large enough to care about.

But that’s not what the study showed. A better way of putting it is that the study “drew no conclusions about the impact of Medicaid on measured physical health outcomes in the first 2 years.” That’s it. No conclusions. If you’re going to insist on adhering to the 95 percent standard—which is fine with me—then that’s how you need to describe results that don’t meet it.
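
To make that concrete, here’s a toy illustration with made-up numbers (not the Oregon estimates): an effect that misses the 95 percent cutoff but whose confidence interval still includes improvements big enough to care about. That combination is exactly what “no conclusion” looks like.

```python
# Hypothetical numbers for illustration only; these are not the Oregon results.
from scipy import stats

estimate = 0.20    # made-up estimated improvement
std_error = 0.15   # made-up standard error

z = estimate / std_error
p_value = 2 * (1 - stats.norm.cdf(abs(z)))        # two-sided p-value
ci_low = estimate - 1.96 * std_error
ci_high = estimate + 1.96 * std_error

print(f"p-value: {p_value:.2f}")                  # about 0.18, so it misses the 95% bar
print(f"95% CI: ({ci_low:.2f}, {ci_high:.2f})")   # roughly (-0.09, 0.49)

# The interval includes zero, so the result is "not significant." But it also
# includes improvements well above zero. The data rule out neither possibility,
# which is why "drew no conclusions" is the honest plain-English summary.
```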

Next up is a discussion of why the study showed no statistically significant results. For now, I’ll just refer you back to this post. The short answer is: it was never in the cards. This study was almost foreordained not to find statistically significant results from the day it was conceived.
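
And to give a sense of what “never in the cards” means, here’s a rough power sketch with invented numbers rather than the actual Oregon design: when the plausible effect is modest relative to the sample size, a study will usually miss the 95 percent bar even when the effect is perfectly real.

```python
# Hypothetical power simulation with a made-up sample size and effect size,
# not the Oregon design. It estimates how often a study like this clears
# p < 0.05 when the true effect is real but modest.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_per_arm = 500       # invented enrollees per arm
true_effect = 0.1     # invented true improvement, in standard deviations
n_sims = 10_000

hits = 0
for _ in range(n_sims):
    control = rng.normal(0.0, 1.0, n_per_arm)
    treated = rng.normal(true_effect, 1.0, n_per_arm)
    _, p = stats.ttest_ind(treated, control)
    if p < 0.05:
        hits += 1

print(f"Power: {hits / n_sims:.0%}")   # roughly 35%

# With power like that, "no statistically significant result" is the expected
# outcome whether or not the coverage actually helps, which is the sense in
# which the finding was foreordained.
```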
