Adding Up the Commas

The new Johns Hopkins/Lancet study of deaths in Iraq caused rightie knees to jerk so fast I’ll bet a bunch of ’em are on crutches today.

David Brown of the Washington Post reports:

A team of American and Iraqi epidemiologists estimates that 655,000 more people have died in Iraq since coalition forces arrived in March 2003 than would have died if the invasion had not occurred.

The estimate, produced by interviewing residents during a random sampling of households throughout the country, is far higher than ones produced by other groups, including Iraq’s government.

It is more than 20 times the estimate of 30,000 civilian deaths that President Bush gave in a speech in December. It is more than 10 times the estimate of roughly 50,000 civilian deaths made by the British-based Iraq Body Count research group.

Wow, that’s a lot of commas. Will Bunch points out,

If the Hopkins survey is right, it could be the case that the last three years of mayhem in Iraq has claimed twice as many lives as died violently during the odious, 23-year regime of Saddam Hussein. Most experts looking at the Saddam years say that lives lost by internal repression and genocide against Kurds and Shia probably killed about 300,000 people.

You can’t blame the righties for being skeptical, however, because I suspect much of the news reporting about the study is sloppy. I’m making some assumptions here because I haven’t seen the study itself, but if it’s similar to an earlier study from 2004 by Les Roberts of Johns Hopkins, it did not just add up Iraqi civilians known to have been killed by violence, and I doubt the researchers claim to have completely separated “civilian” deaths from “combatant” deaths; in the middle of an insurgency/civil war, that would be pretty much impossible.

Instead (and I’m relying mostly on the Washington Post’s account of this), as I understand it, the study looks at mortality rates before and after the invasion and publishes the difference. The mortality rates include all Iraqis who died of anything, including malnutrition and disease. Historically, disease has caused more deaths among both soldiers and civilians in war than battle itself, for a lot of reasons. That’s less true now than it used to be, for soldiers. But if war destroys infrastructure that delivers safe water to a population, or damages hospitals, or runs off the doctors, or cuts off supplies of medicines, then a lot of people die from war who might not have died otherwise. And that’s what the Johns Hopkins study counts: people who died who would not have died otherwise.
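
If the method sounds abstract, the arithmetic behind it is simple: apply the before and after death rates to the same population over the same stretch of time, and subtract. Here’s a toy example; every number in it is invented purely to show the mechanics, not taken from the study.

```python
# Toy excess-mortality arithmetic. All figures here are invented for the
# example; none of them come from the Johns Hopkins study.

population = 100_000      # people in a hypothetical region
years = 2.0               # length of the post-invasion period being examined

baseline_rate = 5.0       # deaths per 1,000 people per year before the war
wartime_rate = 8.0        # deaths per 1,000 people per year after it began

expected = baseline_rate / 1_000 * population * years   # deaths expected anyway
observed = wartime_rate / 1_000 * population * years    # deaths at the wartime rate

excess = observed - expected
print(excess)   # 600.0 "excess" deaths: people who died who would not have died otherwise
```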

However, Johns Hopkins reports now that about 600,000 of the 655,000 deaths were from violence, which is startling.

Per Doug Ireland at CommonDreams, much of the rejection of Johns Hopkins’s earlier study came from people who assumed the study counted deaths from violence, or only counted civilians killed by coalition forces, without actually reading what the study said.

Guterman’s article dissects the U.S. mass media’s attempts to dismiss the study’s findings while European newspapers front-paged the story. The results of Guterman’s interviews with the “experts” American newspapers relied upon to discredit the Lancet study should cause red faces at some of our national dailies. For example: “The Washington Post, perhaps most damagingly to the study’s reputation, quoted Marc E. Garlasco, a senior military analyst at Human Rights Watch, as saying, ‘These numbers seem to be inflated.’ Mr. Garlasco says now that he had not read the paper at the time and calls his quote in the Post ‘really unfortunate.’ He says he told the reporter, ‘I haven’t read it. I haven’t seen it. I don’t know anything about it, so I shouldn’t comment on it.’ But, Mr. Garlasco continues, ‘like any good journalist, he got me to.’

“Mr. Garlasco says he misunderstood the reporter’s description of the paper’s results. He did not understand that the paper’s estimate includes deaths caused not only directly by violence but also by its offshoots: chaos leading to lack of sanitation and medical care.”

The article cited in the quote above, by Lila Guterman, is here. Writing for the Chronicle of Higher Education, Guterman documented that American news media blew off the earlier study because (1) they didn’t bother to read it and (2) they don’t understand how statistics and statistical sampling work. (I admit I am in the latter category myself, but then so is just about everybody else.) For example, Fred Kaplan of Slate (someone I link to from time to time) complained that the study’s wide range of possible deaths, 8,000 to 194,000, was not an estimate but a “dartboard.” Guterman explained that the researchers

… acknowledged that the true number of deaths could fall anywhere within a range of 8,000 to 194,000, a function of the researchers’ having extrapolated their survey to a country of 25 million.

But the statistics do point to a number in the middle of that range. And the raw numbers upon which the researchers’ extrapolation was based are undeniable: Since the invasion, the No. 1 cause of death among households surveyed was violence. The risk of death due to violence had increased 58-fold since before the war. And more than half of the people who had died from violence and its aftermath since the invasion began were women and children.
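
The “dartboard” complaint is easier to see through with a toy simulation than with jargon. The sketch below uses invented numbers; it doesn’t pretend to reproduce the study’s actual sampling frame (the 33 clusters and roughly 30 households per cluster come from descriptions of the 2004 survey; the household size and everything else is a guess). The point it illustrates: even though any single small cluster survey can land in a very wide range, the estimates pile up around the true figure, which is why “a number in the middle of that range” is the best guess and not a dart throw.

```python
# Toy illustration only: invented numbers, not the Lancet study's data.
# A "country" is split into neighborhoods (clusters) with wildly uneven
# death rates; each simulated survey visits a few dozen clusters, counts
# deaths among the people interviewed, and extrapolates to the whole country.
import random
import statistics

random.seed(1)

POPULATION = 25_000_000          # country size (the figure in the Guterman quote)
N_CLUSTERS = 1_000               # hypothetical sampling frame of neighborhoods
PEOPLE_PER_CLUSTER = POPULATION // N_CLUSTERS

# Violence is concentrated: each neighborhood gets its own death rate, drawn
# so that a few places are far deadlier than most. This is what makes any
# single small survey so noisy.
cluster_rates = [random.expovariate(1 / 0.004) for _ in range(N_CLUSTERS)]
true_total = sum(rate * PEOPLE_PER_CLUSTER for rate in cluster_rates)

def one_survey(n_clusters=33, people_interviewed=210):
    """Sample a handful of clusters (about 30 households of ~7 people each,
    household size being my guess), count deaths among those interviewed,
    and scale the average rate up to the national population."""
    sampled = random.sample(range(N_CLUSTERS), n_clusters)
    rates = []
    for c in sampled:
        deaths = sum(random.random() < cluster_rates[c]
                     for _ in range(people_interviewed))
        rates.append(deaths / people_interviewed)
    return statistics.mean(rates) * POPULATION

estimates = sorted(one_survey() for _ in range(2_000))
low, mid, high = estimates[50], statistics.median(estimates), estimates[-51]

print(f"true number of deaths:        {true_total:,.0f}")
print(f"typical survey estimate:      {mid:,.0f}")
print(f"95% of the estimates fall in: {low:,.0f} to {high:,.0f}")
```

Run it with different seeds and the range stays wide, but the middle barely moves.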

Because the initial reporting of the 2004 report was riddled with errors, many people to this day believe it was “debunked” by “experts,” when in fact the real experts who read the study praised it. But the real experts didn’t get quoted in American media. Back to Guterman:

Public-health professionals have uniformly praised the paper for its correct methods and notable results.

“Les has used, and consistently uses, the best possible methodology,” says Bradley A. Woodruff, a medical epidemiologist at the U.S. Centers for Disease Control and Prevention.

Indeed, the United Nations and the State Department have cited mortality numbers compiled by Mr. Roberts on previous conflicts as fact — and have acted on those results. …

… Mr. Roberts’s first survey in Congo, in 2000, estimated that 1.7 million people had died over 22 months of armed conflict. The response was dramatic. Within a month, the U.N. Security Council passed a resolution that all foreign armies must leave Congo, and later that year, the United Nations called for $140-million in aid to that country, more than doubling its previous annual request. Later, citing the study, the State Department announced a pledge of an additional $10-million for emergency programs in Congo.

(I recall that the Columbia Journalism Review also published a post-mortem of reporting on the 2004 report and concluded journalists screwed the story because they don’t understand statistics. However, this article is not online and I’m not sure what issue it was in — probably early 2005, but I don’t have it handy.)

A big reason the 2004 report was bashed was that The Lancet rushed to publish it before the 2004 election. (Contrary to rumor, the article did go through peer review before publication.) The VRWC Media Machine used that fact to bash the study as “political” and get it discredited (by people who either didn’t understand statistics or who hadn’t read the report, or both). And now they’re gearing up to “debunk” the new study the same way.

This time, however, the Washington Post is a little more careful about the experts it quotes. From today’s story by David Brown:

Ronald Waldman, an epidemiologist at Columbia University who worked at the Centers for Disease Control and Prevention for many years, called the survey method “tried and true,” and added that “this is the best estimate of mortality we have.”

This view was echoed by Sarah Leah Whitson, an official of Human Rights Watch in New York, who said, “We have no reason to question the findings or the accuracy” of the survey.

“I expect that people will be surprised by these figures,” she said. “I think it is very important that, rather than questioning them, people realize there is very, very little reliable data coming out of Iraq.”

Of those deaths, Brown reports,

A little more than 75 percent of the dead were men, with a greater male preponderance after the invasion. For violent post-invasion deaths, the male-to-female ratio was 10-to-1, with most victims between 15 and 44 years old.

Gunshot wounds caused 56 percent of violent deaths, with car bombs and other explosions causing 14 percent, according to the survey results. Of the violent deaths that occurred after the invasion, 31 percent were caused by coalition forces or airstrikes, the respondents said.

The percentage of Iraqis killed by coalition forces is declining, because Iraqis have stepped up and are killing each other at more robust rates.

Juan Cole comments (emphasis added):

This study is going to have a hard ride. In part it is because many of us in the information business are not statistically literate enough to judge the sampling techniques. Many will tend to dismiss the findings as implausible without a full appreciation of how low the margin of error is this time. Second, it is a projection, and all projections are subject to possible error, and journalists, being hardnosed people, are wary of them.

The New York Times report has already made a serious error, saying that deaths in the Saddam period were covered up. The families interviewed knew whether their loved ones were disappearing in 2001 and 2002 and had no reason to cover it up if they were. The survey established the baseline with a contemporary questionnaire. It wasn’t depending on Iraqi government statistics.

Another reason for the hard ride is that the Republican Party and a significant fraction of the business elite in this country is very invested in the Iraq War, and they will try to discredit the study. Can you imagine the profits being made by the military-industrial complex on all this? Do they really want the US public to know the truth about what the weapons they produce have done to Iraqis? When you see someone waxing cynical about the study, ask yourself: Does this person know what a chi square is? And, who does this person work for, really?

Then Anthony Cordesmann told AP that the timing and content of the study were political. But is he saying that 18,000 households from all over Iraq conspired to lie to Johns Hopkins University researchers for the purpose of defeating Republicans in US elections this November? Does that make any sense? And, if Cordesmann has evidence that the authors and editor set their timetable for completion and publication according to the US political calendar, he should provide it. If he cannot, he should retract.

Ironically enough, the same journalists who will question this study will accept without query the estimates for deaths in Darfur, e.g., which are generated by exactly the same techniques, and which are almost certainly not as solid.

A while back I was production editor of some scholarly scientific and sociology journals, and the damn things were riddled with chi squares and p-values and all manner of Greek letters, and I never did understand any of the statistical stuff. So, full disclosure, I’m not one to criticize ignorance of chi squares. But the people who do understand chi squares are saying the Johns Hopkins methodology is sound. Don’t let the righties tell you otherwise.

Update: Glenn Greenwald checks out some rightie sites and notes (sarcasm alert) that “Bush followers have become overnight expert statisticians.” But as Glenn explains in an update, these and other righties who dismiss the study out of hand “do not actually understand what the study is examining.”

They (and others of the above-linked Bush followers) seem to be laboring under the misunderstanding that the 650,000 death toll is the number of Iraqis who have died violent deaths since our invasion. That is not what the study is purporting to measure. The study is comparing the mortality rate of Iraqis during the time of our occupation (including deaths by any cause, such as disease, famine, or anything else) to the mortality rate prior to the occupation, and based on the increased post-invasion mortality rate (13.1 deaths per 1,000 persons post-invasion versus the pre-war 5.5 figure), calculates that more than 650,000 more Iraqis have died during the occupation than would have died during the same time frame in the absence of the invasion.
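
For what it’s worth, the arithmetic in Glenn’s summary checks out. Here’s a quick back-of-the-envelope version; the mortality rates are the ones quoted above, while the population figure and the length of the occupation are my own rough assumptions, not numbers taken from the study.

```python
# Sanity check of the headline figure using the rates quoted above.
# The population and the length of the occupation are rough assumptions.

population = 26_000_000        # approximate population of Iraq (my assumption)
years = 3.3                    # March 2003 to roughly mid-2006 (my assumption)

post_invasion_rate = 13.1      # deaths per 1,000 people per year (quoted above)
pre_war_rate = 5.5             # deaths per 1,000 people per year (quoted above)

excess = (post_invasion_rate - pre_war_rate) / 1_000 * population * years
print(f"{excess:,.0f}")        # about 652,000, right around the study's estimate
```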

Update update: Sam Rosenfeld at TAPPED links to two posts by Daniel Davies that support the methodology.

27 thoughts on “Adding Up the Commas”

  1. Pingback: The Heretik : Page 12

  2. All those crutch-assisted righties better get down on their knees, comma, and pray, comma, to the God they purport to worship, comma, that these, comma, hundreds of thousands, comma, of, oh yeah, comma, Iraqi deaths, comma, who we are there to ‘liberate’, comma, ARE, comma, oh, yeah, comma, OBVIOUSLY, comma, ‘as we suspected all along’, comma, BILL CLINTON’S FAULT ,,,,,,,,,,,,

    Take an ‘m’ out of Bush’s buzz word, comma, to identify his true condition…..

  3. If you have a little time to spare, you might want to listen to a story on the first Johns Hopkins study that appeared last fall on This American Life. It tries to make some sense for lay people of what the researchers were up to — and how hard they worked to stick to the established method, even when their lives were clearly at stake.

  4. Well, I am a medical epidemiologist. Here is the full article in all its glory/gory:

    http://www.thelancet.com/webfiles/images/journals/lancet/s0140673606694919.pdf

    It is a completely standard method used all the time… in public health (immunization coverage surveys), especially in developing countries where one does not have accurate census data or lists of persons. WHO, UNICEF, and the like have lots and lots of experience doing this sort of thing. It can of course be used in political polling too, though in the U.S. serious professional pollsters usually use multi-stage sampling methods.

    However, that does not mean one cannot have a selection bias built into the design. Just because the methods are valid and well tested in principle, does not mean that a specific use of it was done in a valid manner.

    However, these are the BEST scientific estimates. Needless to say, when Bush called it a guess, he was lying. They are far more valid than the ridiculous “methods” otherwise being used, which are just the deaths reported in newspapers. Got that? The official U.S. method is to count the obits. No chance of missing, say, 90-95% of the people that way.

  5. This is a perfect example of something that the right will never accept.

    It doesn’t matter how many experts say the method is accurate…they’ll always be able to trot out a hack or two that will tell them what they want to hear, and from then on, in their mind, everyone else is lying.

    We’ve seen it with evolution, global warming, basic history, what the Constitution says, and on and on and on….

  6. Incorrect that we ‘righties’ labor under a misapprehension of what the study is purporting to reflect – the study, which I have read, thank you very much, says that 601,000 of the 655,000 ‘excess’ deaths were by violent means.

    That’s 500 a day, and it’s clearly absurd…

  7. That’s 500 a day, and it’s clearly absurd…

    Absurd … because you say so? Yes, that’s persuasive.

    War is, as General Sherman said, hell.

  8. The mortality figures are tragic and appalling, but by the time we finish wrangling with the wingnuts over what they mean, we may well be at war with Iran — in a conflict with mortality figures possibly an order of magnitude higher (if we’re lucky) or more (if we’re not).

    Chris Hedges is worried. The former Middle East bureau chief for The New York Times says a U.S. carrier group is headed for the Straits of Hormuz and will be in place to strike Iran by the end of the month. “It may be a bluff. It may be a feint. It may be a simple show of American power. But I doubt it,” he writes, warning about the “strange, twilight mentality that now grips most of the civilian planners who are barreling us towards a crisis of epic proportions.” Will Iran be Bush’s October surprise? And if so, will we survive his madness? Will the world?

  9. So when does the war crimes trial start? Even IF the number were 8,000 (which would be funny if we were not talking about human lives), that beats bin Laden by roughly 5,000. There is plenty of outrage about the 3,000 bin Laden killed on 9/11… yet no finger pointing at the mass murderer among us? I suggest to righties that if they take issue with Bush being compared to Hitler, then their guy ought to stop trying to match Hitler’s record….. Shocking what our country has become… just shocking! No rightie better EVER tell me about how cruel Saddam was again, or holy crap are they gonna get an earful!

  10. A good source of info on the Lancet studies is Tim Lambert’s blog Deltoid. He’s an Aussie who I first started reading because of his explanations of how and where John Lott’s gun claims are bogus. He’s had many posts about the Lancet studies; you’d have to go there and search for them, like this. He’s also a good source for info on global warming denialists and the zombie DDT/malaria claims that ooze up from the rightwing’s abodes.

  11. That’s 500 a day, and it’s clearly absurd…

    The latest Iraqi Health Ministry figures count 89 violent deaths per day for Baghdad alone during September. Even before the war, the Health Ministry’s figures were considered a significant underestimate; there’s no reason to believe that it’s any more accurate today. Even if you accept those figures, 89 violent deaths per day extrapolates to about 5 per 1000 per year for Baghdad.

    But that’s just a monthly figure, which has varied over 2006. So let’s try another study. The UN estimated 14,000 violent deaths in Baghdad alone for the first half of 2006. That works out at about 4.5 per 1000 per year. And at 5 per 1000 per year, crudely extrapolated, you’re still up to 350+ per day. And Baghdad isn’t the most violent part of Iraq.

    Precisely what sort of violent death rate wouldn’t make you feel icky inside, Mark? Because it’s not ‘absurdity’ that you’re feeling, it’s the ickiness of numbers that you’re not prepared to think about in human terms. C’mon, fess up: give us a number of deaths per day that you don’t consider ‘absurd’, and we’ll show you how every single study exposes such gut nonsense as bullshit.

  12. Maha,

    There was one unfortunate distortion in the 100k Lancet report… they claimed that it was the “civilian toll”, but it was not… it was civilians and soldiers both. It was vastly different from Iraq Body Count, but that’s because IBC only counts civilians, and only those reported by the media. I think IBC was around 18k at the time, but it wouldn’t surprise me if 82k soldiers died in the initial invasion… they were being hit awfully hard to wipe out any resistance.

  13. There’s no sense in bickering about the number of innocent Iraqis who have been slaughtered as a result of Bush’s invasion of Iraq. Simply find the middle number of innocent victims in the disparity between the low-balled numbers from Bush’s camp and the actual numbers provided by the Lancet study. So, we add the 30,000 that Bush claims to the 655,000 that the Lancet claims, and then divide it by 2. 685,000 ÷ 2 = 342,500. That seems modest and reasonable, huh? Just over a third of a million, give or take a few wedding parties.

  14. From majikthise, an excellent analysis of the method, and this quote:

    “They report that for 92% of reported deaths, the respondents were able to produce a death certificate.”

    I am not familiar with the customs there. How is it that the death certificates are not being tallied? Who issues the certificates? Who do they report to?

    My opinion is that the study is valid and the results probably accurate. Suppose that’s correct. Then the implication (if the government issues death certificates) is a MASSIVE cover-up by the Iraqi administration to mask the extent of the chaos. Which seems to me a bigger story than the study.

    I would love for a major Ivy League university (with no agenda) to review the method and data and certify the results before the election. It won’t happen, but I would love it.

  15. Several things to remember about the ‘official’ counts from the morgues and the Iraqi newspapers: many Iraqis do not apply for death certificates out of fear that officials of the opposition, whether Shi’ite or Sunni, will then have their names and addresses; also, many Iraqi families follow Islamic law and bury their loved ones in plain wooden boxes before sunset on the same day they die; many of these deaths are never reported ‘officially’ at all. There are reports of people burying their children in their yards because they are afraid to go to the graveyards, since targeting funerals is a fairly common tactic for the death squads. The morgue counts tend to be the 30-50 bound, tortured and executed men found daily in Baghdad; many of these bodies will never be claimed, due to the above fears.

    Echidne of the Snakes is running a series of posts on basic statistics if you are interested in learning how to evaluate studies and polls in an informed manner. I’m sure she will have something helpful to say about interpreting this study as well.

  16. “many Iraqis do not apply for death certificates ” — make up your mind then, because the authors of this report claim that it must be accurate because they have the death certificates to back it up.

    Ignoring methodology: either the Iraqi claims are accurate because of the death certificates, and if so, where are the (400,000, 600,000, 900,000, you take your pick) certificates? Or there are no death certificates, and so something must be horribly wrong with the research.

    Suffice it to say that the extra deaths in question, which have gone completely unnoticed by the world media (most of it salivating to say something anti-Bush), come out at massively more than the German civilian death toll from the Allied bombing campaign in WW2.

  17. the authors of this report claim that it must be accurate because they have the death certificates to back it up.

    Are you sure? I skimmed through the report itself and found a couple of spots in which they said death certificates are not issued in many cases.

    Here, you can look it up yourself.

    I think you’re makin’ stuff up. Try again.

  18. Pingback: More Worlds » Archive » 655,000 deaths:the societal impact

  19. #16 and #17 contradict each other.

    Either the ratio of certs to deaths is low (unreported), or it is high (92%). But not both.

    If the Lancet study is correct, then passively counting certs is sufficient. The study didn’t explain this discrepancy.

    Reasons for study over-estimation:

    1. Substantial uncertainty in the pre-invasion mortality baseline. Could vary from 5 to 6.8, which would remove many of the “excess deaths.”

    2. Households within clusters were NOT randomly chosen. If they were self-selecting then the whole study is bogus.

    Yes, I have stat background. Yes, cluster sampling is a valid technique. No, the Hopkins study did not apply it well.

  20. Sceptic — you don’t seem to have read the study. It says clearly on page 2 that the researchers went through a number of random selection processes to select the households within clusters.

    Since you didn’t read the study, I can’t take your opinion seriously.

  21. Maha, you sound as if you understand neither statistics nor the study, and worse, you jumped to insult me simply because I pointed to flaws in the study’s methods. That makes me think you are unwilling to suspend a political agenda in order to consider the study objectively.

    (For the record, I oppose the Iraq war but simply find the study poorly done. It is sometimes important to divorce one’s agenda from the science.)

    Now for the study:

    You are correct that the study (usually) selected its household start point randomly — but from then on selected contiguous households. That means the 1800 households are NOT independent but highly correlated.

    Nor did you read the study well. Even the start point “random selection” had a flaw: The report admits, “Decisions on sampling sites were made by the field manager. The interview team were given the responsibility and authority to change to an alternate location if they perceived the level of insecurity or risk to be unacceptable.” This means even the starting points were not chosen by a fully random process but instead by human intervention, which is likely to carry bias.

    The study also states, “By confining the survey to a cluster of houses close to one another it was felt the benign purpose of the survey would spread quickly by word of mouth among households, thus lessening risk to interviewers.”

    Communication among households in a survey is even worse — it’s likely to increase the household-to-household correlation already present due to contiguity.

    I assert again that cluster sampling is a valid technique — but the Hopkins study applied it poorly.

    By the way, Pederson, who is experienced in such procedures and an author of the UN Iraq survey, has also expressed clear scepticism about the Hopkins study. (The Hopkins team praised him.)

  22. N.B. This point and others have been made by Apfelroth of the Albert Einstein College of Medicine, published by The Lancet after the first Hopkins study:

    “In their Article on mortality before and after the 2003 invasion of Iraq (Nov 20, p 1857), Les Roberts and colleagues use several questionable sampling techniques that should have been more thoroughly examined before publication.

    Although sampling of 988 households randomly selected from a list of all households in a country would be routinely acceptable for a survey, this was far from the method actually used–a point basically lost in the news releases such a report inevitably engenders. The survey actually only included 33 randomised selections, with 30 households interviewed surrounding each selected cluster point. Again, this technique would be adequate for rough estimates of variables expected to be fairly homogeneous within a geographic region, such as political opinion or even natural mortality, but it is wholly inadequate for variables (such as violent death) that can be expected to show extreme local variation within each geographic region. In such a situation, multiple random sample points are required within each geographic region, not one per 739 000 individuals.

    In my opinion, such a flaw by itself is fatal, and should have precluded publication in a peer-reviewed journal. However, the authors’ sampling technique is also questionable in other ways.”

  23. #19 and #20. Neither is exactly right. The study states, “Survey teams asked for death certificates in 545 (87%) reported deaths and these were present in 501 cases.”

    The issue is thus not so much presence of death certs as the absence of a reliable central registry.

    (Were certs issued at the high rate of 501/545, and were they reliably registered centrally, then simple passive surveillance of the central registry would have given the results sought without need for a survey.)

  24. Sceptic — if you’ve read all of my posts on the Lancet study, you’d know I don’t claim to know statistics from spinach. However, in my experience 90 percent of the time posts such as yours are from paid trolls. I remain sceptical that you are legit.

    I will allow your comments so far to stand, but since this post has scrolled off the blog front page and no one else is commenting, I’m turning off comments on this post.

Comments are closed.