You’ve probably seen or heard about the study that says white Americans aged 45-54 are dying faster, along with the graph that shows a spike upwards. You may also have heard that the increase is directly related to drugs, alcohol, etc.
I had the study open on my desktop for a few days but put off reading past the abstract because the result seemed a bit far-fetched: that would be a lot of extra deaths. Turns out caution may have been the best response, and my point is that it often is.
The paper may not have done the math right: it didn’t account for the aging of the 45-54 group itself, whose average age appears to have risen from about 49.1 to 49.7 years over the period studied. Mortality at these ages rises roughly 8% per year of age, so take that 0.6-year increase in average age and multiply it by 8% per year, and you get about 4.8% more deaths expected from aging alone, with no change in anyone’s underlying risk. That eliminates most of the reported increase. Note this isn’t “statistics” in the sense of difficult calculation but really a logic error: they didn’t think to make this adjustment. You can see how only a few sources needed checking to figure this out in this post by statistics professor Andrew Gelman, “Correcting statistical biases in ‘Rising morbidity and mortality in midlife among white non-Hispanic Americans in the 21st century’: We need to adjust for the increase in average age of people in the 45-54 category,” on his blog Statistical Modeling, Causal Inference, and Social Science. As Andrew points out, you could say they found for this group a leveling off of life expectancy but not a decrease, and even that conclusion would need more thought.
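The adjustment is simple enough to check in a few lines. Here is a sketch using the figures above (the 8%-per-year-of-age growth in mortality is an approximation, not an exact rate):

```python
# Back-of-the-envelope check of the aging adjustment.
# Figures are the approximations discussed above, not exact values.
avg_age_start = 49.1  # average age within the 45-54 band, start of period
avg_age_end = 49.7    # average age within the 45-54 band, end of period
mortality_growth_per_year_of_age = 0.08  # ~8% higher death rate per year of age

extra_years = avg_age_end - avg_age_start  # the group is ~0.6 years older
expected_rise = extra_years * mortality_growth_per_year_of_age

# Roughly a 4.8% rise in the group's death rate from aging alone,
# before attributing anything to drugs, alcohol, or anything else.
print(f"Expected rise from aging alone: {expected_rise:.1%}")
```

Any observed increase in the raw death rate has to clear that baseline before it counts as evidence of people actually dying faster.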
Note you can see why stats people glommed on to this: as I noted, the effect is huge given the number of people involved, and that would be a lot of alcohol deaths! Most claimed big effects disappear once you understand the problem better.
On a related note, The Economist ran a piece about how earnings differences between men and women largely, but not entirely, disappear when you compare actual jobs rather than broad categories or even broader groupings of jobs. There may be more inequity in the kinds of jobs women hold than in whether they’re paid fairly relative to men in the same jobs. The same issue shows up on the other side of the political spectrum: the claim that government workers make more also falls apart when you look at actual jobs and actual skill levels instead of broad groups. Big effects tend to disappear when you look closer. I think this is a very useful thing to remember: whenever someone claims there’s a big difference, the odds are pretty good that person is a) simply wrong if you look more closely and/or b) manipulating you.
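The aggregation effect at work here can be shown with a toy example. The numbers below are entirely made up for illustration: within each job, both groups are paid identically, yet the broad averages still show a large gap because the groups hold different mixes of jobs.

```python
# Hypothetical data: within each job, pay is identical for both groups.
# Each tuple is (salary, headcount_group_a, headcount_group_b).
jobs = [
    (100_000, 80, 20),  # higher-paid job, held mostly by group A
    (60_000, 20, 80),   # lower-paid job, held mostly by group B
]

avg_a = sum(s * a for s, a, _ in jobs) / sum(a for _, a, _ in jobs)
avg_b = sum(s * b for s, _, b in jobs) / sum(b for _, _, b in jobs)

print(f"Group A average: {avg_a:,.0f}")  # 92,000
print(f"Group B average: {avg_b:,.0f}")  # 68,000
# A 24,000 aggregate "gap" despite identical pay for identical jobs:
# the entire difference lives in the job mix, not in the pay scale.
```

Comparing broad averages conflates job mix with pay; comparing within actual jobs separates the two questions, which is exactly the distinction the article draws.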