Wednesday, September 05, 2007

Some Elementary Analytics for the Surge

by Tom Bozzo

We picked up a link a few days ago from an unlikely source, the Protein Wisdom Pub. In a post that, among other things, describes our quals based on the political content of the blogroll ("fairly left-leaning"), it's asserted that "statistics comparing 2006 with 2007" deployed by major figures of the left blogiverse such as Kevin Drum and Matthew of Large Media "[tell] us nothing about the success of the new US counter-insurgency strategy" because the surge "did not have any implementation before mid-February." The preferred analysis claims a downward turn in the civilian casualty trend line (eyeballed, or the equivalent, from seasonally unadjusted and otherwise tweaked data) since the start of the surge, and does so by comparing data from this summer with the winter and/or spring.

Now, I'm no Jim Hamilton, but I can say with reasonable authority that the claim that year-over-year comparisons of data from Iraq are necessarily uninformative is all wet. Indeed, such comparisons are a simple way of eliminating some seasonal effects from data. With a simple model of data generation (e.g., the seasonal effects are additive), you can verify by simple algebra that a year-over-year difference will eliminate the seasonal factor [*], whereas intra-year comparisons and comparisons between different periods in different years will tend to confound seasonal and trend effects.
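You can see the algebra work in a toy numerical check. The trend and seasonal values below are made up purely for illustration, not drawn from the Iraq data:

```python
# Toy series: an additive seasonal component s[m], a linear trend, no noise.
# The year-over-year difference y[t] - y[t-12] cancels s[m] exactly,
# leaving only the 12-month change in trend.

trend = [5.0 * t for t in range(24)]                    # hypothetical trend
seasonal = [10, -3, 4, 0, 8, 2, -5, -6, 1, 3, -7, -7]   # hypothetical monthly effects
y = [trend[t] + seasonal[t % 12] for t in range(24)]

yoy = [y[t] - y[t - 12] for t in range(12, 24)]
print(yoy)  # every entry is 60.0: the seasonal term has cancelled

# An intra-year comparison, by contrast, mixes trend and seasonality:
# y[7] - y[1] = (trend change) + (s[7] - s[1]), which confounds the two.
```

The same differencing applied between, say, a July and the preceding February would leave the term (s[July] - s[February]) in the comparison, which is exactly the confounding at issue.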

So, for the intra-year comparisons to be (relatively) valid as an indicator of trend reversal, it must be either that the seasonal effects are small, or that they just so happen to cancel out. Indeed, PW Pub's Karl argues that the effects are small, in part based on analysis at the econoblog Creative Destruction, which ran a seasonal adjustment model on the fatality data for the U.S. forces.
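The post doesn't say exactly which model Creative Destruction ran, so as a rough sketch of how a monthly seasonal adjustment can work, here is the classical ratio-to-moving-average decomposition; the multiplicative form, the function names, and the assumption that the series starts in January are all mine:

```python
# Hypothetical sketch of classical ratio-to-moving-average seasonal
# adjustment; not the actual model used at Creative Destruction.
# Input: a list of monthly values, assumed to start in January.

def seasonal_factors(y):
    """Multiplicative seasonal factors from a centered 12-month moving average."""
    n = len(y)
    ratios = [[] for _ in range(12)]
    for t in range(6, n - 6):
        # centered 12-month moving average: half-weights on the end months
        ma = (0.5 * y[t - 6] + sum(y[t - 5:t + 6]) + 0.5 * y[t + 6]) / 12.0
        ratios[t % 12].append(y[t] / ma)
    # average the observed ratios for each calendar month
    factors = [sum(r) / len(r) if r else 1.0 for r in ratios]
    # normalize so the twelve factors average to one
    mean = sum(factors) / 12.0
    return [f / mean for f in factors]

def adjust(y):
    """Divide each observation by its month's seasonal factor."""
    f = seasonal_factors(y)
    return [y[t] / f[t % 12] for t in range(len(y))]
```

On this kind of model, a July seasonal factor below one inflates the raw July count when adjusted, which is consistent with the direction of Creative Destruction's move from 79 actual to 101 adjusted deaths.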

The results at Creative Destruction, which yield a seasonally adjusted 101 U.S. military deaths for July 2007 from the actual 79, aren't actually helpful to the trend reversal argument. Apart from a peak in May, seasonally adjusted casualties suggest a plateau at a rate of around 100 deaths/month for the surge through July (the analysis was posted on August 2), which is very high for any extended period. As I'd noted previously, while there's a lot of variation in the data, it's clear without any special quantitative analysis that the summer months are never the annual peak.

However, there is a bit more to the argument out of Wingnuttia that perhaps merits some additional discussion. It's suggested that year-over-year comparisons confound an effect of the surge with that of an increase in violence against civilians which is dated to the February '06 bombing of the Golden Mosque in Samarra, even though the real takeoff in the data isn't until the fall of '06. The implication that civilian life in Iraq was just hunky-dory in 2004-2005 is questionable at best, though I wouldn't dispute a claim that the SNAFU has acquired more CF since. [**]

The argument is that if you think things are bad now, imagine how much worse they'd be but for the surge. That's not exactly a narrative of triumph, considering the difficulty of maintaining the surge, let alone any further escalation of a magnitude that might be thought to be able to restore the status quo ante.

Worse still, the supposed reversion in the civilian fatality trend is, itself, showing signs of reverting in the bad direction. The count, after dropping from 1,782 in May to 1,148 in June, increased sequentially in July and August, to 1,458 and 1,598, respectively. The latter, in particular, is at the level of the civilian casualty peak or plateau (using data that are adjusted as described, in part, here) from the pre-surge months.

So, really, there's sod all to show for the surge. Coalition military fatalities are high, Iraqi civilian fatalities are high, and by many other metrics Iraq remains an unholy mess. All this at a cost to the U.S. taxpayer of several tens of billions of dollars at an annual rate. Who wouldn't want more surge? [/sarcasm]

One last point concerns the adjustments to the civilian casualty data, which remove a few peaks from the raw data. In one case, the adjustment is ostensibly justified, as it eliminates the effects of a temporary change in the count methodology [***]. In another, the rationale is, to say the least, curious. "Engram" deletes the 965 deaths in August of '05 from the Al-A'imma bridge stampede because:
Those tragic deaths were clearly an aberration and should not be included in a graph that tries to assess trends in the level of violence in Iraq.
The reported cause of the stampede was a rumor that a suicide bomber was amid the crowd, which had been subjected to mortar attacks earlier in the day. So this is hardly a non-terrorism-related incident, even if the spread of the rumor wasn't itself an act of terrorism. The irony is that Engram is a member of the Althousian 9/11-changed-everything set:
Pre-9/11, I was a politically incurious liberal, but my curiosity increased substantially -- and my views changed considerably -- after 9/11.
By the same logic, if you're trying to assess trends in the level of violence in the U.S., the tragic deaths of 9/11 are clearly an aberration. So what the hell are we doing there?

[*] This assumes the seasonal component enters additively in the current and previous-year data. More sophisticated seasonal adjustment approaches allow for situations such as seasonal effects that vary over time.
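Written out, the algebra behind this footnote is just the standard additive decomposition (a textbook assumption, not anything specific to these data):

```latex
% Additive decomposition: trend \tau_t, seasonal effect s_{m(t)} for
% calendar month m(t), and an irregular term \varepsilon_t.
y_t = \tau_t + s_{m(t)} + \varepsilon_t
% Since m(t) = m(t-12), the year-over-year difference is
y_t - y_{t-12} = (\tau_t - \tau_{t-12}) + (\varepsilon_t - \varepsilon_{t-12})
% The seasonal term cancels. An intra-year difference between months
% m and m' instead leaves (s_m - s_{m'}) mixed in with the trend change.
```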

[**] The subsequent discussion ignores the significant obstacles to reliably measuring the consequences of the war for the civilian population; the data are an incomplete and unverified tabulation from news accounts.

[***] For some purposes, it actually can be better to have data that are wrong in a consistent way.

