Misleading with statistics
The headline on the AP story is breathless. “Army suicides highest in 26 years!”
That basic fact is true; Army suicides are up sharply, just as they spiked during the first Gulf War. The 2006 rate was 17.3 suicides per 100,000, nearly double the 2001 low of 9.1 per 100,000.
But a closer look at the numbers is in order before we start jumping to conclusions.
The 17.3 rate translates into 99 suicides out of a covered population of roughly 570,000 soldiers. So it's hardly an epidemic.
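The arithmetic is worth checking. A minimal Python sketch, using only the 99-suicide count and the 17.3 rate from the story (the function names are mine):

```python
def rate_per_100k(deaths, population):
    """Convert a raw death count into a rate per 100,000 people."""
    return deaths / population * 100_000

def implied_population(deaths, rate):
    """Invert the rate formula: how many people does a given count and rate imply?"""
    return deaths / rate * 100_000

# 99 suicides at a reported rate of 17.3 per 100,000 implies a
# covered population of roughly 572,000 soldiers.
print(round(implied_population(99, 17.3)))  # → 572254
```

Note that the implied population is noticeably larger than the active Army's round 500,000, which suggests the reported rate also covers activated reserves.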
And if you compare it to civilian suicide rates, it's even less of an issue. A pair of PDFs linked here produces the following table:
2004 CIVILIAN SUICIDE RATES (per 100,000 population)
Ages 15-24: 10.4
Wait a second, you say. That military rate of 17.3 is clearly much higher than the civilian rate for ages 15-24.
But look what happens when we break down the “age” category even further and combine it with gender:
Males, age 15-19: 12.65
Males, age 20-24: 20.84
You can see where I'm going here. Soldiers are mostly males in their early 20s. So a proper apples-to-apples comparison shows that the military suicide rate, despite being at a 26-year high, is still lower than the comparable civilian rate. All that despite combat stress, the strain of being part of a "stretched" military, and ready access to military-grade weaponry.
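The apples-to-apples point can be made concrete with a demographically weighted civilian benchmark. A sketch, using the two civilian male rates quoted above; the age mix of soldiers here is purely illustrative, not an actual Army figure:

```python
# Civilian male suicide rates quoted above (per 100,000, 2004)
civilian_rates = {"15-19": 12.65, "20-24": 20.84}

# HYPOTHETICAL age mix for a young, mostly-male force, chosen
# only to illustrate the weighting; the real distribution differs.
age_mix = {"15-19": 0.2, "20-24": 0.8}

# Expected civilian rate for a population with this age/gender profile
benchmark = sum(civilian_rates[age] * share for age, share in age_mix.items())
print(round(benchmark, 2))  # → 19.2
```

Even this rough benchmark comes out above the Army's 17.3, which is the post's point: match the demographics before comparing the rates.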
People are right to be concerned. The rate has doubled, after all. It's clearly a symptom of strain, and each one is a personal tragedy besides. The military should do what it can to reduce those numbers.
But let’s not overreact. The problem is small, and soldiers are still less likely to kill themselves than civilians are. This is more an example of shallow and innumerate reporting than it is a sign of serious problems in the military.