Thursday, February 28, 2008

USA! USA! We're #1!

In your face, China!


Record-High Ratio of Americans in Prison
By N.C. Aizenman
February 28, 2008 | The Washington Post

More than one in 100 adults in the United States is in jail or prison, an all-time high that is costing state governments nearly $50 billion a year, in addition to more than $5 billion spent by the federal government, according to a report released today.

With more than 2.3 million people behind bars at the start of 2008, the United States leads the world in both the number and the percentage of residents it incarcerates, leaving even far more populous China a distant second, noted the report by the nonpartisan Pew Center on the States.

The ballooning prison population is largely the result of tougher state and federal sentencing imposed since the mid-1980s. Minorities have been hit particularly hard: One in nine black men age 20 to 34 is behind bars. For black women age 35 to 39, the figure is one in 100, compared with one in 355 white women in the same age group.

While studies generally find that imprisoning more offenders reduces crime, the effect is influenced by changes in the unemployment rate, wages, the ratio of police officers to residents, and the share of young people in the population.

In addition, when it comes to preventing repeat offenses by nonviolent criminals -- who make up about half of the incarcerated population -- far less expensive alternative punishments, such as community supervision and mandatory drug counseling, may prove as effective as or more effective than jail time.

Florida, which nearly doubled its prison population over the past 15 years, has experienced a smaller drop in crime than New York, which, after a brief increase, reduced its number of inmates to below the 1993 level.

"There is no question that putting violent and chronic offenders behind bars lowers the crime rate and provides punishment that is well deserved," said Adam Gelb, director of the Pew Center's Public Safety Performance Project and one of the study's authors. "On the other hand, there are large numbers of people behind bars who could be supervised in the community safely and effectively at a much lower cost -- while also paying taxes, paying restitution to their victims, and paying child support."

About 91 percent of incarcerated adults are under state or local jurisdiction, and the report documents the tradeoffs state governments have faced as they have devoted ever larger shares of their budgets to house them. For instance, over the past two decades, state spending on corrections (adjusted for inflation) increased by 127 percent, while spending on higher education rose by 21 percent. For every dollar Virginia spends on higher education, it now spends about 60 cents on corrections. Maryland spends 74 cents on corrections per higher-education dollar.

Despite reaching its latest milestone, the nation's incarcerated population has actually been growing far more slowly since 2000 than during the 1990s, when the spate of harsher sentencing laws began to take effect. These included a 1986 federal law mandating prison terms for crack cocaine offenses that were up to eight times as long as for those involving powder cocaine. In the early 1990s, states across the nation adopted "three-strikes-you're-out" laws and curtailed the discretion parole boards have in deciding when to release an inmate. As a result, between 1990 and 2000, the prison population swelled by about 80 percent, increasing by as much as 86,000 per year.

By contrast, from 2007 to 2008, the prison population increased by 25,000 -- a 2 percent rise. Meanwhile, the Supreme Court has issued decisions giving judges more leeway under mandatory sentencing laws, and a number of states, including Texas, are seeking to reduce their incarcerated population by adopting alternative punishments.

"Some of these [measures] would have been unthinkable five years ago," noted Gelb. "But the bottom line is that states have to balance their budgets."

Kosovo, 'humanitarian hawks,' and Iraq

Kosovo and the Rise of the Humanitarian Hawks
Matthew Yglesias
February 21, 2008 | Prospect.org


With Kosovo's formal declaration of independence from Serbia on Monday, and the United States' decision to extend recognition to the planet's newest country, the time has come for a look back on the approximately 10 years of intense U.S. involvement in that conflict. Kosovo is a tiny, seemingly worthless patch of land lacking in all natural resources, but it plays a strangely large role in our foreign-policy debates. During arguments about the Iraq War, in particular, liberal hawks had a habit of wielding the poor Kosovar Albanians as a cudgel: If you supported Bill Clinton's 1999 bombing campaign, the argument went, then surely you could support a war against Saddam Hussein.

Then and now, many pro-Kosovo, anti-Iraq liberals could persuasively argue that various factors distinguish the two cases (Kenneth Roth's 2004 "War in Iraq: Not a Humanitarian Intervention" is my personal favorite). Still, the argument was never about a strict Kosovo-implies-Iraq logic. Rather, first Bosnia and then Kosovo provided the impetus for an intellectually influential humanitarian-hawk movement that advocated the use of military force to advance liberal values, and whose leaders, inspired by the success of Kosovo, saw Iraq as a way to continue the momentum built up in the Balkans.

Today, there are few left-of-center defenders of the Iraq War as it actually exists, but there continues to be considerable concern about an "Iraq Syndrome" overreaction to the chaos that has followed the invasion. Kosovo, in this scheme, is supposed to be the "good war" that serves as a reminder of the positive potential of military force. Thus, even as center-left figures agree that the unilateralism of the Bush era must come to an end, there's a desperate search to find some new mechanism -- perhaps a Global NATO or perhaps a Concert of Democracies -- that could authorize a war that, like Kosovo, is fought neither in self-defense nor in defense of an ally nor with the approval of the U.N. Security Council.

In that light, it's worth taking full measure of how modest our accomplishments in Kosovo have been. The declaration of independence marks not the fulfillment of NATO's objectives in Kosovo, but something more like NATO accepting the fact that those objectives will not be achieved. Rather than a rights-respecting democratic government for an autonomous province, we have a ramshackle state dependent on external support, dominated by a sectarian party, in which the country's Serb minority rejects the legitimacy of the government and refuses to acknowledge the country's newfound independence. Our successful effort to halt and then reverse the ethnic cleansing of Kosovar Albanians was merely followed by a substantial counter-cleansing of Serbs. Human Rights Watch's memorandum on the occasion of Kosovo independence is a bleak reminder of what success has looked like:

Violence against minorities has been a persistent feature of Kosovo's post-war history. Minority communities, including Albanian-speaking Ashkali, were the primary target of the March 2004 riots in Kosovo. Today, with much of Kosovo's Albanian and Serb population separated geographically, there are fewer incidents. But security incidents continue in the remaining ethnically mixed areas, including physical assaults, theft, and violent property-related disputes. Acts of vandalism against Orthodox churches and monasteries continue, damaging confidence and undermining community relations.

Now that Kosovo is formally separate from Serbia, it seems overwhelmingly likely that protecting the rights of the country's geographically concentrated Serb minority will require either the indefinite presence of international troops or a further round of secession in which Serb sections of Kosovo are carved out and allowed to re-integrate with Serbia. This still looks defensible compared to the alternative course of action of standing aside and letting Milosevic have his way with the province. But many hawks looked at Kosovo and saw not a boundary case for when the use of force might be legitimate, but a new baseline against which future interventions should be judged. If you were willing to use force against Milosevic, the thinking went, then why not Saddam? Why not Sudan? This line of thinking came to a bad end in Mesopotamia, but many hark back to the Balkans to try to make the case that Iraq should be considered an exception and not something that casts aspersions on the utility of unconstrained American power. In reality, Kosovo, though much less disastrous than Iraq, has, like Iraq, turned out to be more problematic than enthusiasts advertised and should, like Iraq, mostly inspire humility about what we can expect to achieve through force.

The truth, though disappointing from the point of view of journalism, is that the most promising humanitarian elements of foreign policy tend to be the boring ones. Timely and effective diplomacy can often avert humanitarian catastrophes before they break out at much lower cost than coercive force can end them once they've started. And the U.N.'s traditional peacekeeping operations, where parties to a conflict request third-party troops to help monitor and enforce a peace deal, have a solid track record of success but are perennially under-resourced by an indifferent United States. Greater commitment -- political, financial, and (when appropriate) military -- to these kinds of operations would bring much larger humanitarian benefits than would any hypothetical humanitarian wars.

Nevertheless, the debate in the United States remains oddly dominated by the specter of unilateral military coercion as a potential tool of humanitarianism, as if the only viable alternative to a callous indifference to the fate of foreigners is to have the Pentagon identify some "bad" foreigners and kill them. One doesn't need to regret the 1999 military operation itself to regret the ways in which hubristic overestimates of what's been achieved in the Balkans have fed into this mentality.

U.S. & coalition casualties in Iraq

It's very sad when you look at all our soldiers' faces, most of them in their 20s, and realize they should be alive right now. They didn't have to die.


Iraq and Coalition Casualties
CNN

There have been 4,278 coalition deaths -- 3,972 Americans, two Australians, 174 Britons, 13 Bulgarians, one Czech, seven Danes, two Dutch, two Estonians, one Fijian, one Hungarian, 33 Italians, one Kazakh, one Korean, three Latvians, 22 Poles, three Romanians, five Salvadorans, four Slovaks, 11 Spaniards, two Thais and 18 Ukrainians -- in the war in Iraq as of February 26, 2008, according to a CNN count. The deaths counted are those of the soldiers, Marines, airmen, sailors and Coast Guardsmen reported by their countries' governments; the count also includes seven employees of the U.S. Defense Department. At least 29,203 U.S. troops have been wounded in action, according to the Pentagon.

Monday, February 25, 2008

Iraq & Afghanistan to cost $3 TRILLION

By Joseph Stiglitz and Linda Bilmes
February 23, 2008 |  Times Online

The Bush Administration was wrong about the benefits of the war and it was wrong about the costs of the war. The president and his advisers expected a quick, inexpensive conflict. Instead, we have a war that is costing more than anyone could have imagined.

The cost of direct US military operations - not even including long-term costs such as taking care of wounded veterans - already exceeds the cost of the 12-year war in Vietnam and is more than double the cost of the Korean War.

And, even in the best case scenario, these costs are projected to be almost ten times the cost of the first Gulf War, almost a third more than the cost of the Vietnam War, and twice that of the First World War. The only war in our history which cost more was the Second World War, when 16.3 million U.S. troops fought in a campaign lasting four years, at a total cost (in 2007 dollars, after adjusting for inflation) of about $5 trillion (that's $5 million million, or £2.5 million million). With virtually the entire armed forces committed to fighting the Germans and Japanese, the cost per troop was less than $100,000 in 2007 dollars. By contrast, the Iraq war is costing upward of $400,000 per troop.

Most Americans have yet to feel these costs. The price in blood has been paid by our voluntary military and by hired contractors. The price in treasure has, in a sense, been financed entirely by borrowing. Taxes have not been raised to pay for it - in fact, taxes on the rich have actually fallen. Deficit spending gives the illusion that the laws of economics can be repealed, that we can have both guns and butter. But of course the laws are not repealed. The costs of the war are real even if they have been deferred, possibly to another generation.

On the eve of war, there were discussions of the likely costs. Larry Lindsey, President Bush's economic adviser and head of the National Economic Council, suggested that they might reach $200 billion. But this estimate was dismissed as "baloney" by the Defence Secretary, Donald Rumsfeld. His deputy, Paul Wolfowitz, suggested that postwar reconstruction could pay for itself through increased oil revenues. Mitch Daniels, the Office of Management and Budget director, and Secretary Rumsfeld estimated the costs in the range of $50 to $60 billion, a portion of which they believed would be financed by other countries. (Adjusting for inflation, in 2007 dollars, they were projecting costs of between $57 and $69 billion.) The tone of the entire administration was cavalier, as if the sums involved were minimal.

Even Lindsey, after noting that the war could cost $200 billion, went on to say: "The successful prosecution of the war would be good for the economy." In retrospect, Lindsey grossly underestimated both the costs of the war itself and the costs to the economy. Assuming that Congress approves the rest of the $200 billion war supplemental requested for fiscal year 2008, as this book goes to press Congress will have appropriated a total of over $845 billion for military operations, reconstruction, embassy costs, enhanced security at US bases, and foreign aid programmes in Iraq and Afghanistan.

As the fifth year of the war draws to a close, operating costs (spending on the war itself, what you might call "running expenses") for 2008 are projected to exceed $12.5 billion a month for Iraq alone, up from $4.4 billion in 2003, and with Afghanistan the total is $16 billion a month. Sixteen billion dollars is equal to the annual budget of the United Nations, or of all but 13 of the US states. Even so, it does not include the $500 billion we already spend per year on the regular expenses of the Defence Department. Nor does it include other hidden expenditures, such as intelligence gathering, or funds mixed in with the budgets of other departments.

Because there are so many costs that the Administration does not count, the total cost of the war is higher than the official number. For example, government officials frequently talk about the lives of our soldiers as priceless. But from a cost perspective, these "priceless" lives show up on the Pentagon ledger simply as $500,000 - the amount paid out to survivors in death benefits and life insurance. After the war began, these were increased from $12,240 to $100,000 (death benefit) and from $250,000 to $400,000 (life insurance). Even these increased amounts are a fraction of what the survivors might have received had these individuals lost their lives in a senseless automobile accident. In areas such as health and safety regulation, the US Government values a life of a young man at the peak of his future earnings capacity in excess of $7 million - far greater than the amount that the military pays in death benefits. Using this figure, the cost of the nearly 4,000 American troops killed in Iraq adds up to some $28 billion.

The costs to society are obviously far larger than the numbers that show up on the government's budget. Another example of hidden costs is the understating of US military casualties. The Defence Department's casualty statistics focus on casualties that result from hostile (combat) action - as determined by the military. Yet if a soldier is injured or dies in a night-time vehicle accident, this is officially dubbed "non combat related" - even though it may be too unsafe for soldiers to travel during daytime.

In fact, the Pentagon keeps two sets of books. The first is the official casualty list posted on the DOD website. The second, hard-to-find, set of data is available only on a different website and can be obtained under the Freedom of Information Act. This data shows that the total number of soldiers who have been wounded, injured, or suffered from disease is double the number wounded in combat. Some will argue that a percentage of these non-combat injuries might have happened even if the soldiers were not in Iraq. Our new research shows that the majority of these injuries and illnesses can be tied directly to service in the war.

From the unhealthy brew of emergency funding, multiple sets of books, and chronic underestimates of the resources required to prosecute the war, we have attempted to identify how much we have been spending - and how much we will, in the end, likely have to spend. The figure we arrive at is more than $3 trillion. Our calculations are based on conservative assumptions. They are conceptually simple, even if occasionally technically complicated. A $3 trillion figure for the total cost strikes us as judicious, and probably errs on the low side. Needless to say, this number represents the cost only to the United States. It does not reflect the enormous cost to the rest of the world, or to Iraq.

Thursday, February 21, 2008

GW Bush quote of the day

Yesterday President Bush said in Liberia:

"You know one thing I've learned, and I suspect the people of Liberia have learned, is it's easier to tear a country down than it is to rebuild a country."


Y'all got that? Destroying = easy. Rebuilding = hard. Go write that down someplace so you don't forget it!


Now, let's just hope our next President won't require 8 years and 2 wars to learn the same lesson....

Re: Michelle Obama's America - And Mine

I can't speak for all young Americans, and whether they're proud of their country. I know even less about Michelle Obama.

But I'm highly suspicious of Michelle Malkin's brand of pride as she describes it. It's troubling -- although typical of jingoists wrapping themselves in the flag -- that most of her points of pride in America relate to our military: Memorial Day, Veterans Day, Medal of Honor ceremonies, the Blue Angels, Pearl Harbor, and the U.S.S. Abraham. Sure, I'm proud of our military, I respect its difficult job, but I don't worship it. And I never forget that its real job is to kill people -- a necessary job at times, but always a regrettable one, not a blessed calling.

Malkin's is a pride that says, "Ignore or deny all your country's warts and defects; love it or leave it." Hers is a pride that she wears on her sleeve like a badge of honor. But advertising that you love your country, or even genuinely feeling it, is not an accomplishment. It is not even a virtue. If all you do is say and feel it, it's an empty emotion.

Indeed, if that feeling of pride compels you to deny real problems in your country, and make apologies for your country's faults, then it is an unhealthy emotion. It is exactly what your country doesn't need from you. It is like the mother who is the only one who can't see that her child, a real delinquent, isn't mommy's little angel. Her love for her child, without strings or conditions, is admirable, but it is actually harmful to her child, because such blind love enables continued bad behavior and impedes the child's development.

If you love your country like you love your child, then you want your country to do what's right and make you proud. You feel a responsibility to help it. You never renounce your country, because it's a part of you and who you are. By accident of birth, you have no choice in the matter, just as children don't choose their parents and vice versa.

Moreover, just as parents should not live vicariously through their children, so too citizens should not draw excessive pride from their country's accomplishments. Similarly, children should be proud of their parents and grandparents and where they came from, but they should not rest on their forebears' reputation and past accomplishments -- or worse, behave like privileged, spoiled brats because of their great inheritance. Children have a responsibility to find their own way and make their own reputation.

So, like Michelle Malkin, I'm proud of my country for things like Iwo Jima and George Bush's funding of anti-HIV programs in Africa. But those deeds -- and this is so often forgotten! -- are not the measure of me personally. Simply for being lucky enough to be born in America, I do not have the right to borrow against America's greatness and claim it makes me a better person, or a person worth looking up to. And my country's greatness certainly does not relieve me of my responsibility to be a concerned citizen who wants to improve his country -- not just so that America continues to be "great" (whatever that means to whomever) and envied, but so that it is actually a better place to live in than it is today. After all, the real measure of a parent's love is wanting what's best for his children and encouraging them to be their best; it is not covering up or turning a blind eye to his children's faults. That is why parenting often requires tough love and discipline.

Likewise, America is crying out right now for tough love from Americans who really love it. And yes, that love will require change.

The last thing America needs right now is more blind flag-wavers who have nothing to offer us except their sappy, jingoistic, more-patriotic-than-thou attitude. Those people -- even if they think they mean well -- are doing us absolutely no good at all.

Putin threatens the West?

Yeltsin's reforms! That's a real laugh.

I disagree with many points in this article by Martin Wolf of the Russophobic Financial Times. Michael McFaul and Anders Aslund, two Russia "experts" cited here, are particularly execrable. Russia's GDP declined so astronomically in the 90s precisely because it took the IMF's and the West's advice to privatize state enterprises before competition and rule of law were in place, liberalize prices (which led to immediate hyperinflation), and open up financial markets before real banks existed and financial controls were in place. All of this resulted in overnight loss of savings and purchasing power by the people, asset stripping by managers and corrupt politicians, and money laundering out of the country.

The Nobel economist Joseph Stiglitz wrote about this quite eloquently in Globalization and Its Discontents. In the future I will scan some relevant chapters from his book, because it is so instructive.

As for lack of democracy in Russia... We can't blame the West entirely, but the West -- and yes, Bill Clinton! -- certainly undermined the legitimacy of democracy in Russians' eyes by supporting corrupt "reformers" like Chubais and Yeltsin. Demokratia (democracy) became known as dermocratia (shitocracy) among ordinary Russians. America in particular was so frightened of a resurgent Communist Party that it was ready to throw money and support at anybody parroting the "free markets" and "democracy" shtick. The tragedy is that the Communist Party was and remains the only real party, by Western standards, in Russia. It could have been encouraged to transform into a modern Social Democratic party on a par with those in many other European democracies, but instead it was marginalized, undermined, and then ignored. The irony is that anybody who was anybody in Russia was a former party member, including Yeltsin.

Poland, which is cited here as a success story, initially followed "shock therapy," but then considerably slowed down the pace of reform, and concentrated on sequencing. As Stiglitz argues, timing and sequencing, not to mention avoiding cookie-cutter approaches advocated by the IMF and World Bank, are the real keys to successful reform.

Unfortunately, many Russians now hold a "conspiracy theory" view of the West's motives in the 90s; they think our real goal was to cripple and emasculate Russia under the guise of smiling friendship. I don't agree with this view, but I can certainly understand why they feel that way. Even more, I can understand why they're telling us now to take our sanctimonious advice about democracy and rule of law and shove it. (Our invading Iraq didn't do much for our moral authority either).

The next U.S. president should first make an apology to Russia for our, er, well-intentioned errors in the 90s. And he should reverse NATO expansion and halt the anti-missile systems on Russia's doorstep. If the next president did those three things, we'd find Russia -- especially ordinary Russian voters! -- much more receptive to friendly relations and our "expert" advice.

Wednesday, February 20, 2008

David Brooks on Obama: 'OCS' vs. 'MBS'

David Brooks' latest op-ed on Barack Obama is a disgusting example of how the mainstream media covers the campaigns, which has been described in loathsome detail by journalist Matt Taibbi. First the media build up Obama to god-like proportions, and then they reserve the right to knock him down again.

Brooks's consistent use of the general "they" to refer to Obama's alleged legions of over-frenzied, hope-addicted supporters is journalistic cover for: "I did no research, I talked to nobody, and I got no quotes for my story." By coining the name of a phenomenon ("Obama Comedown Syndrome") without offering any evidence to support it, he is substituting bald assertion for facts. What Brooks may in fact be describing is the media's own ennui: I wouldn't be surprised if they really were tired of talking about Obama as if he were JFK, Jesus Christ, and Muhammad Ali all wrapped up in one. God knows I am. But I don't blame Obama for it.

To be fair, Obama never claimed to be a miracle-worker; nor did he ask to be made into some super-human answer to everybody's political prayers. The mainstream media made him into that. And perhaps they'll decide to yank that distinction away from him in a matter of weeks or months. Who knows? That's their right as America's cynical king-makers.

In any case, it's a superficial distinction, and it's unfair to Obama himself and to the other candidates. Every candidate should be judged by the same standard. Building up Obama only to knock him down again is a cynical ploy by the media to drum up public interest & TV ratings, and to demonstrate their power over our elections.

So, in reply to Brooks and his pals, allow me to coin a term: MBS ("Media Bullshit Syndrome"). I'm seeing it all over the country.

Sunday, February 17, 2008

Obama as third party candidate?

I'm not in the U.S., so maybe I missed some pundit making this prediction, but... As far as I know, I'm the first to say this: Don't be surprised if Obama seriously considers running for President as an Independent if he wins the Democrats' popular delegate count, but loses the superdelegates to Hillary.

If he wins the primary delegates but loses the superdelegates, he'll have every justification to argue that the Democrats allowed Clinton-connected party insiders to trump the will of the party's base.

Besides being loved by Democrats, Obama has a lot of independent and even Republican voters' support. And I like Obama's chances running against two of conservatives' current worst enemies: Hillary and McCain. If Obama decided to run on his own, the traditional logic that a third party candidate saps votes from one or the other of the two parties would not hold: Obama would steal votes from both of them.



Clinton urges Superdelegates to support her, despite Obama's primary wins
By Peter Slevin and Jose Antonio Vargas

February 17, 2008 | Washington Post

URL: http://www.washingtonpost.com/wp-dyn/content/article/2008/02/16/AR2008021602657.html?hpid=topnews

Saturday, February 16, 2008

50 million refugees flee to Canada & Mexico

Next time, before we complain about those "lazy Iraqis" who can't make peace, pass laws, or get their act together, let's keep in mind just how badly we've messed up Iraq.

Imagine if 50 million of America's best and brightest had left their homes and their jobs, fleeing their towns or leaving for Canada and Mexico, and leaving it to the remaining 250 million Americans to make do.

About 10 percent of Americans 25 and older -- that's 19.4 million citizens -- have advanced degrees. Imagine if most of those 19.4 million Americans suddenly quit their jobs, abandoned their homes, and fled with their families. The next day, would Wall Street be able to function? Would most hospitals? Our schools and universities? The courts? The electrical power grids? Then imagine if some foreign country lectured and implored those Americans left behind to simply step in, take over, and get to work running things. Imagine the manpower shortages, the mayhem, the mistakes!



Imagine 50 Million American Refugees
Tom Engelhardt
February 11, 2008 | TheNation.com Blog


I'm an innumerate, but the figures on this -- the saddest story of our Iraq debacle -- are so large that even I can do the necessary computations. The population of the United States is now just over 300,000,000. The population of Iraq at the time of the U.S. invasion was perhaps in the 26-27 million range. Between March 2003 and today, a number of reputable sources place the total of Iraqis who have fled their homes -- those who have been displaced internally and those who have gone abroad -- at between 4.5 million and 5 million individuals. If you take that still staggering lower figure, approximately one in six Iraqis is either a refugee in another country or an internally displaced person.


Now, consider the equivalent in terms of the U.S. population. If Iraq had invaded the United States in March 2003 with similar results, in less than five years approximately 50 million Americans would have fled their homes, assumedly flooding across the Mexican and Canadian borders, desperately burdening weaker neighboring economies. It would be an unparalleled, even unimaginable, catastrophe. Consider, then, what we would think if, back in Baghdad, politicians and the media were hailing, or at least discussing positively, the "success" of the prime minister's recent "surge strategy" in the U.S., even though it had probably been instrumental in creating at least one out of every ten of those refugees, 5 million displaced Americans in all. Imagine what our reaction would be to such blithe barbarism.


Back in the real world, of course, what Michael Schwartz terms the "tsunami" of Iraqi refugees, the greatest refugee crisis on the planet, has received only modest attention in this country (which managed, in 2007, to accept but 1,608 Iraqi refugees out of all those millions -- a figure nonetheless up from 2006). As with so much else, the Bush administration takes no responsibility for the crisis, nor does it feel any need to respond to it at an appropriate level. Until now, to the best of my knowledge, no one has even put together a history of the monumental, horrific tale of human suffering that George W. Bush's war of choice and subsequent occupation unleashed, or fully considered what such a brain drain, such a loss of human capital, might actually mean for Iraq's future.


But Michael Schwartz, author of the upcoming book War Without End: The Iraq Debacle in Context, has just taken a first pass at the history of this crisis. "Iraq's Tidal Wave of Misery" is, in fact, a monumental effort, laying out the three great waves of Iraqi displacement and dispossession. The first of these came in 2003 with the American occupation's policies of massive de-Baathification of the Iraqi government, demobilization of the Iraqi military, and the shutting down of Iraq's state-owned industries (combined with the rise of a widespread business in kidnapping); the second came in 2004, when the U.S. military began to attack and invade insurgent strongholds, as it did the Sunni city of Falluja, using the full kinetic force of its massive firepower; the third came with the rise of a Sunni/Shia civil war and campaigns of ethnic cleansing, especially in the Iraqi capital, Baghdad (helped along by the U.S. "surge strategy").


Schwartz lays out the staggering, "tsunami"-level numbers involved, analyzes the disproportionate number of people with professional, managerial, or administrative backgrounds who fled the country ("... whereas less than 1% of Iraqis had a postgraduate education, nearly 10% of refugees in Syria had advanced degrees, including 4.5% with doctorates..."), gives a sense of the pain and deprivation inflicted, and above all suggests what it means for the future of a country like Iraq to have had such a "brain drain," such a largely irreversible loss of "human capital."


He concludes:


"From the vast out-migration and internal migrations of its desperate citizens comes damage to society as a whole that is almost impossible to estimate. The displacement of people carries with it the destruction of human capital. The destruction of human capital deprives Iraq of its most precious resource for repairing the damage of war and occupation, condemning it to further infrastructural decline. This tide of infrastructural decline is the surest guarantee of another wave of displacement, of future floods of refugees. As long as the United States keeps trying to pacify Iraq, it will create wave after wave of misery."

Hungry to know what to eat

Cute, but spot on. It is ironic that we "orthorexic" Americans are the most concerned with detailed nutritional labeling and all the latest food research, and yet we are the flabbiest, most diabetic, hypertensive and infarcted people in the developed world. Forget the health jargon; bring back the "Great-Granny Diet!"


I'm hungry for some certainty
By Ellen Goodman
February 15, 2008 | Washington Post 

I am sitting at the breakfast table taking my medicine. This drug is a cup of coffee formerly identified by its native and urban origins: Sumatra and Peet's. But now it has been declared good for what might eventually ail me, if what might ail me is Parkinson's disease or colon cancer. Coffee has also been praised as a prevention for diabetes in Minnesota and cursed as a risk for diabetics in North Carolina, but I am in Massachusetts.

On my place mat is a bowl of antioxidants formerly known as blueberries. These round little health capsules have been scientifically evaluated as a barrier against mental decline and cancer. Alas, they come from Chile, which is not good for my carbon footprint.

I am pondering an egg, which was once considered a suicidal act, death by cholesterol. Now it is praised for its carotenoids - lutein and zeaxanthin - essential for healthy eyes.

These healthy eyes are needed to read the newspaper stories in front of me full of the latest food health bulletins. The first dateline is New York, which has joined the crowd in banning those evil trans fats that were once our salvation against those devilish animal fats. Now the city has also decided that calories of every dish should be posted in chain restaurants.

The second dateline is Seattle, which has one-upped the East Coast. Its new law will not only list the calories but the carbohydrates, fats, and sodium lurking in the beurre blanc, crème fraîche, and Big Mac.

How did it come to this? How did eating become a science rather than an art? How did food become conflated with medicine? We now have shelves full of boxes with bragging rights promising better eating through chemistry. Meanwhile, our uncertainty is growing as quickly as our waistlines.

Imagine what our ancestors would have made of a book titled "In Defense of Food." They would never have believed that food needed a defense lawyer. But one of the leading indicators of the fix we are in is how quickly Michael Pollan's manifesto vaulted to the top of the best-seller list. There it sits, proof of the transformation of the land of plenty into the land of plenty of anxieties.

Pollan's previous book raised "The Omnivore's Dilemma" - what to eat. He masticated the meaning of four meals for people, the earth, and the agricultural industry. He single-handedly made "locavore" the word of the year for the New Oxford American Dictionary. Think global, eat local.

Now he solves the omnivore's dilemma with seven little words wrapped around a head of lettuce on the new book cover: "Eat Food. Not Too Much. Mostly Plants." (Not including the philodendron.) And the word that he's launching this year is "orthorexia," an unhealthy fixation with healthy eating.

"What other animal needs professional help in deciding what it should eat?" he asks, recognizing the absurdity of the need for his own advice. Two different forces got us here. The first is "nutritionism," the idea fostered by science that food is nothing more than the sum of its nutrients. The second and more pernicious force is the $36 billion food-marketing industry that turns food into "food-like substances."

Remember the French paradox: wine, cheese, and low weight? Well, the American paradox, Pollan writes, is "a notably unhealthy population preoccupied with nutrition and diet and the idea of eating healthily."

His tips for the land of the overweight orthorexics are rather charmingly simple. Among them:

> Avoid products made with ingredients you can't read or pronounce.
> Avoid products making health claims on the package.
> Yes, eat plants. (But not the sansevieria.)
> But the best of them is: don't eat anything your great-grandmother wouldn't recognize as food.

Frankly, I'm pretty sure my great-grandmother never saw an avocado. But I am all for moving from what conservatives grudgingly call the nanny state to the great-granny state.

Even as we speak, someone working to combine the eating disorder with the great American paradox must be writing the next best-seller: The Great-Granny Diet. [Molasses! Lots and lots of molasses! And butter. And lard. And cream. And bacon. Yum! – J] You read it here first. Meanwhile, the moguls of the agricultural-industrial complex will work up a great-granny product line. And we will soon see great-granny stickers on all the beleaguered fruit and vegetables that line the market walls.

In the meantime, I plan to begin eating at least one plant that my great-granny knew so well: the good old Theobroma cacao. Rich in flavanols, not to mention polyphenols, this is a known treatment for fatigue, coughs and anxieties - and maybe even orthorexia. What was it my great-granny called this plant? Oh yeah, chocolate.

Thursday, February 14, 2008

'Free market' makes health insurance suck

Why Insurers Suck and Five Ways to Make Them Better
By Ezra Klein
February 13, 2008 | Prospect.org

"The state's largest for-profit health insurer is asking California physicians to look for conditions it can use to cancel their new patients' medical coverage," says the first line of this article in The LA Times. What more even needs to be said?

Let's start from a basic proposition: In the current system, insurance companies add negative value, which is to say, they make health care worse, not better. Conservatives often complain that health insurance is not "insurance" in any real sense: it is not protection against unexpected costs, but insulation from largely predictable costs. We know we will need to purchase health care. We contract with health insurers to smooth those expenditures -- render them predictable and manageable over the course of years, rather than unexpected and crushing in the course of months. That's why we pay insurance premiums: so we can one day get chemotherapy, rather than simply paying for chemotherapy.

But "we" here is misleading. Not all of us make this deal with insurers. And among those of us who do make this deal, we make it in different ways, purchasing different levels of insulation, over different time periods. So the insurers, quite naturally, turn their attention to making deals with the most profitable among us, and avoiding deals, or finding ways to break contracts, with the least profitable. They are very innovative in their attempts to do this. But there's nothing good about those attempts. Competition among drug dealers does not aid the neighborhood, and currently, competition among insurers does not aid the ill. Indeed, their inattention to actual care is startling. America, for all its technological advancement, has among the lowest adoption of cost-saving, care-improving electronic records in the world. That is the fault, in part, of our insurers.

And here's why: It is actually counterproductive for insurers to compete on giving us the best care. It's not simply that they're not doing it, but given the structure of the marketplace, they shouldn't do it. Imagine insurer X creates the best damn diabetes protocols in the country. And they begin advertising this fact. What happens on Day Two? Well, they're flooded with individuals suffering from diabetes, or individuals who fear they will one day be suffering from diabetes. These people, in the current system, are a bad deal. Not only is it near impossible to insure them at a profit, but pooling their costs (which is what insurers do, after all) raises premiums for all the insurer's other customers. When the average customer of an insurer gets sicker, prices go up for all their customers. So the healthy folks contracting with that insurer quit the pool, and go find a cheaper deal, which forces the insurer to raise premiums again, driving out more healthy folks, which forces them to raise premiums again, which drives out more healthy folks, and so on. It's what we call an insurance death spiral, and it ends with the collapse of the insurer.
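The feedback loop Klein describes is the classic adverse-selection death spiral, and it can be sketched as a toy simulation. Every figure below (pool sizes, expected costs, the exit rate, the willingness-to-pay threshold) is invented purely for illustration; this is not an actuarial model:

```python
# Toy adverse-selection death spiral: the insurer prices at average cost,
# and healthy members leave whenever the premium exceeds what coverage is
# worth to them. All figures are invented for illustration.
healthy, sick = 900, 100                   # pool composition
cost_healthy, cost_sick = 1_000, 15_000    # expected annual claims per member
willingness = 2_000                        # max premium a healthy member pays

premiums = []
while healthy > 0:
    premium = (healthy * cost_healthy + sick * cost_sick) / (healthy + sick)
    premiums.append(round(premium))
    if premium <= willingness:
        break                              # pool is stable at this premium
    healthy = int(healthy * 0.7)           # 30% of healthy members exit

print(premiums[0], "->", premiums[-1])
```

Each pass through the loop, the departure of healthy members raises the average cost of the remaining pool, which raises the premium, which drives out still more healthy members, until the premium converges on the cost of covering only the sick.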

Given those incentives, insurers cannot compete to offer better care, because if they offered better care, all that would happen is they would attract worse deals. Which is why, in the current system, insurers make things worse. Tyler Cowen has a vision for how insurers could, in a more perfect world, compete to our benefit. And I don't necessarily disagree with it. But let's be clear on what's necessary:

1. Universality: Insurers cannot compete effectively unless everyone is in the pool. If the healthy can leave, insurers cannot compete to offer better care. They'll have to compete to attract the healthiest, which means offering the lowest costs, which means insuring the fewest sick people. The system has to be universal.

2. Community Rating: Insurers cannot be allowed, before offering insurance, to use demographic subslicing to cherrypick the market. That means no more preexisting histories, no complex formulas around age and income and race and region. They offer insurance to anyone who wants it for the exact same price. No exceptions.

3. Risk Adjustment: Merely having everyone in the system won't be enough, nor will forcing insurers to do away with their most delicate cherrypicking tools. Insurers will just become sophisticated at advertising on G4 Tech TV, and in snowboarding magazines, and in urban centers -- in places, in other words, where the young and the healthy gather. So atop the universal system, atop the community rating, you need risk adjustment, which means either that insurers are reimbursed more for taking on sicker patients or, my preferred method (and the one used in Germany), that insurers with particularly healthy pools pay into a central fund that redistributes to insurers with less healthy pools. At the end of the day, it has to be as profitable for an insurer to insure a sick person as a healthy one.
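The German-style risk adjustment Klein prefers amounts to a zero-sum transfer around the market-wide average cost. A minimal sketch, with invented figures and two hypothetical insurers "A" and "B":

```python
# Sketch of pooled risk equalization: insurers with sicker-than-average
# pools draw from a central fund; those with healthier pools pay in.
# All figures are invented for illustration.
insurers = {
    "A": {"members": 1_000, "cost_per_member": 2_000},   # healthy pool
    "B": {"members": 1_000, "cost_per_member": 6_000},   # sicker pool
}

total_cost = sum(i["members"] * i["cost_per_member"] for i in insurers.values())
total_members = sum(i["members"] for i in insurers.values())
average_cost = total_cost / total_members    # 4,000 per member here

# Transfer = (own expected cost - average cost) * members.
# Positive: the insurer draws from the fund; negative: it pays in.
transfers = {name: (i["cost_per_member"] - average_cost) * i["members"]
             for name, i in insurers.items()}

print(transfers)   # transfers always sum to zero
```

After the transfer, each insurer's net expected cost per member is the market average, so a sick enrollee is no worse a deal than a healthy one, which is exactly the incentive Klein is after.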

4. Information Transparency: It needs to be easy for individuals to compare insurers on plan comprehensiveness, price, outcomes, etc. That means we need a marketplace where folks can go to shop for insurers, and they need to have standardized comparisons, or non-partisan rating authorities, providing information they can use.

5. One Market: This is contained in the last point, but there needs to be a singular place, or set of them, where individuals can shop around for insurance. This is hard stuff to find, and harder yet to understand, and real effort needs to go into constructing an easily accessible marketplace that customers can effectively navigate.

There are probably more, but those are the major ones. It's not impossible to imagine a scenario in which insurers actually compete to offer better service, in which the marketplace really does work to the consumer's benefit. That could take a million different forms, from personalized care coordinators to electronic records to online access to your health information to negotiated discounts on gym memberships to a million things I haven't thought of. But none of them happen with any sort of frequency now because insurers operate in a perverse market in which their incentives are to make the system, and our care, worse.

Tuesday, February 12, 2008

U.S. Army's 'middle management' crisis

The Army's Other Crisis
Why the best and brightest young officers are leaving

By Andrew Tilghman
Washington Monthly


Matt Kapinos was born into the military, at a U.S. Army hospital outside Frankfurt, Germany. It was 1979, and his father was an Army officer, one of thousands of soldiers stationed along the plains of central Europe. Kapinos moved around a lot growing up—thirteen places in all, including upstate New York, Tennessee, Georgia, Kansas, and Korea. From his perspective, these locations all appeared pretty much the same. No matter where he lived, at 5 p.m. everyone paused as the American flag was lowered to the sound of a bugle. He attended schools run by the Defense Department, where many of the teachers were married to soldiers, and where military police chaperoned the school bus at times of heightened security. It wasn't until he was a high school junior that his family first lived "off post." His father, then a colonel, got a job at the Pentagon, and so the family moved to Springfield, Virginia. Unsurprisingly, by then Kapinos could imagine only one career for himself: he wanted to be an officer in the Army.

One spring afternoon in his senior year, Kapinos came home from track practice to find a FedEx envelope on the doorstep. It contained his acceptance to the military academy at West Point, the alma mater of great American generals going back to Ulysses S. Grant. Kapinos's father, who had also attended West Point, "tried to let me know what I was getting into, that you lose a little bit of control over your life and that the Army is not always fun and games," Kapinos recalled. "[But] my dad always pushed us to, you know, do something to contribute. I guess I wanted to do something that seeks glory, to do great things."

Kapinos thrived during his four years in New York's Hudson Valley. In particular, he loved learning the history of warfare, including twentieth-century counterinsurgencies—the French in Algeria, the British in Malaya, the Americans in Vietnam. As a cadet, he excelled in the military training program. He was one of only six graduating students to wear six bars on his lapel and earn the title of cadet regimental commander. He graduated near the top of his class, one of the Army's most talented recruits.

A few months before September 11, 2001, Kapinos began training to jump out of a plane with a rifle and a rucksack. By then, he was a platoon leader assigned to an elite unit of paratroopers in the 82nd Airborne Division at Fort Bragg, North Carolina. About forty enlisted men were placed under his command. In early 2003, they followed Kapinos aboard a C-130 plane bound for Khost, Afghanistan, a border town nestled below snow-capped peaks in a valley stretching east into Pakistan.

Kapinos was placed in charge of a "firebase," an abandoned Afghan home where he lived with his soldiers and patrolled local villages. At first, he loved the work. "I felt like this was what I'd always wanted to do," he told me. An air assault mission in the spring of 2003 was particularly exhilarating. He and his soldiers flew into a remote valley, streamed out of Black Hawk helicopters, and encircled the home of an insurgent leader who had been accused of killing a Red Cross worker. Rifles raised, they kicked in the doors and found the man, wearing a tan turban and a traditional cotton gown, and in possession of a stash of weapons and $10,000 in U.S. currency.

But from his reading of military history, Kapinos understood that fighting a counterinsurgency is about more than catching bad guys. He made an effort to build rapport with locals, even though no instructor had ever suggested that he do so. He requested medical supplies for local village leaders when no supplies had been provided. He told his soldiers to be cautious before using deadly force, and he scolded them for making derogatory remarks about the local Pashtun Muslims.

Kapinos returned to Fort Bragg in late 2003. His wife, Katherine—a smart University of Virginia graduate with career plans of her own—was relieved. They'd married the previous year and had hardly seen each other since. After a few months, they'd started to settle into married life. Then, at a holiday party for officers and their wives, a loose-lipped sergeant major revealed that the battalion was leaving for Iraq in two weeks. Matt and Katherine's first Christmas together was an anxious one.

Before boarding the plane for Iraq, Kapinos was promoted again. At the age of twenty-four, he was helping to lead a company of nearly 200 soldiers. In Iraq, he oversaw security at Camp Anaconda, one of the largest U.S. air bases in Iraq, home to tens of thousands of soldiers and contractors. From a high-tech command post, he monitored grainy video screens, spotting insurgents erecting mortar tubes and dispatching quick reaction units to kill or capture them.

Kapinos was accumulating lessons afforded few West Point graduates of recent generations—the chance to experience real war as a young lieutenant. Still, he was feeling frustrated. He worried that his superiors were slow to grasp the complex nature of counterinsurgency. In Afghanistan, he had suggested that instead of merely conducting nighttime raids, his men should camp in small villages to help local leaders root out insurgents and their sympathizers. His commanders repeatedly rejected the idea. In Iraq, he was full of similarly innovative proposals, but felt his commanders disregarded his input. "After a while, you just stop asking," he said.

Kapinos was questioning the Army's conventional wisdom at a time when it urgently needed independent thinkers. Indeed, as the Iraq and Afghanistan missions have floundered, the Army has begun to turn to unorthodox leaders who look beyond heavy artillery and tank battles. General David Petraeus is the best example of this; in the 1980s, while other ambitious career officers were stationed in Germany pointing tank brigades at the Fulda Gap, Petraeus was at Princeton studying counterinsurgencies and questioning military doctrine for his doctoral thesis, "The American Military and the Lessons of Vietnam: A Study of Military Influence and the Use of Force in the Post-Vietnam Era." Kapinos, who was similarly absorbed by both the practice of war and its more intellectual aspects, was rising swiftly through the ranks at the moment when the Army needed him most.

Kapinos, however, is no longer in the Army. Fifteen days after his initial five-year service agreement expired, he left military life entirely. When I met him, it was near the downtown campus of the Georgetown University Law Center, where he was taking a break from classes on corporate income tax law. Tall and fit, with close-cropped sandy brown hair and a green cable-knit sweater, he resembled both the lawyer he is preparing to be and the Army captain he once was. "I was a true believer at West Point. When Afghanistan kicked off, I don't want to say I bought the propaganda, but I wanted to change the world," he said. "I thought I was going to be a four-star general."

For several years now, we've been hearing alarming warnings about the strain that the Iraq War has placed on the military. Since the conflict began, around 40 percent of the Army and Marine Corps' large-scale equipment has been used, worn out, or destroyed. Last year, the Army had to grant waivers to nearly one in five recruits because they had criminal records. There are no more combat-ready brigades left on standby should a new conflict flare.

These problems are of vital concern, and are reasonably well understood in newsrooms and on Capitol Hill. But the top uniformed and civilian leaders at the Pentagon who think hardest about the future of the military have a more fundamental fear: young officers—people like Matt Kapinos—are leaving the Army at nearly their highest rates in decades. This is not a short-term problem, nor is it one that can simply be fixed with money. A private-sector company or another government agency can address a shortage of middle managers by hiring more middle managers. In the Army's rigid hierarchy, all officers start out at the bottom, as second lieutenants. A decline in officer retention, in other words, threatens both the Army's current missions in Iraq and Afghanistan, and its long-term institutional future. And though many senior Pentagon leaders are quite aware of the problem, there's only so much they can do to reverse the decline while the United States maintains large numbers of troops in Iraq.

In the last four years, the exodus of junior officers from the Army has accelerated. In 2003, around 8 percent of junior officers with between four and nine years of experience left for other careers. Last year, the attrition rate leapt to 13 percent. "A five percent change could potentially be a serious problem," said James Hosek, an expert in military retention at the RAND Corporation. Over the long term, this rate of attrition would halve the number of officers who reach their tenth year in uniform and intend to take senior leadership roles.

But the problem isn't one of numbers alone: the Army also appears to be losing its most gifted young officers. In 2005, internal Army memos started to warn of the "disproportionate loss of high-potential, high-performance junior leaders." West Point graduates are leaving at their highest rates since the 1970s (except for a few years in the early 1990s when the Army's goal was to reduce its size). Of the nearly 1,000 cadets from the class of 2002, 58 percent are no longer on active duty.

This means that there is less competition for promotions, and that less-able candidates are rising to the top. For years, Congress required the Army to promote only 70 to 80 percent of eligible officers. Under that law, the rank of major served as a useful funnel by which the Army separated out the bottom quarter of the senior officer corps. On September 14, 2001, President Bush suspended that requirement. Today, more than 98 percent of eligible captains are promoted to major. "If you breathe, you make lieutenant colonel these days," one retired colonel grumbled to me.

The dismay of senior leaders at this situation pierces through even the dry, bureaucratic language of Army memoranda. In an internal document distributed among senior commanders earlier this year, Colonel George Lockwood, the director of officer personnel management for the Army's Human Resources Command, wrote, "The Army is facing significant challenges in officer manning, now and in the immediate future." Lockwood was referring to an anticipated shortfall of about 3,000 captains and majors until at least 2013; he estimated that the Army already has only about half the senior captains that it needs. "Read the last line again, please," Lockwood wrote. "Our inventory of senior captains is only 51 percent of requirement." In response to this deficit, the Army is taking in twenty-two-year-olds as fast as it can. However, these recruits can't be expected to perform the jobs of officers who have six to eight years of experience. "New 2nd Lieutenants," Lockwood observed, "are no substitute for senior captains."

Even the pool from which the Army draws its future leaders is being diluted. Last year, the Army commissioned more officers as second lieutenants than it has since 1989, when the Pentagon was still planning for a cold war-era force nearly 50 percent larger than the current one. (The commissioning figures are partially a reflection of the Army's restructuring efforts since 2002, which created a greater number of smaller combat units and increased the need for junior officers.)

Those new officers, however, are not coming from the traditional sources of West Point and ROTC programs, which supply recruits fresh from college. Instead, they are coming from the Army's Officer Candidate School—mostly attended by soldiers plucked from the enlisted ranks, who probably entered the military straight from high school. The number of OCS graduates has more than tripled since the late 1990s, from about 400 a year to more than 1,500 a year. These soldiers may turn out to be good commissioned officers. But they are also needed in the noncommissioned officer (NCO) corps, the parallel structure of senior-level sergeants who form the Army's backbone, responsible for ensuring that orders are effectively carried out, rather than making policy or strategic decisions. Yet the Army is already several thousand sergeants short and has been reducing NCO promotion times in order to fill the gaps. Sending more soldiers who are NCOs, or NCO material, to Officer Candidate School is merely robbing Peter to pay Paul.

Iraq, in one way or another, is a driving force behind many officers' decision to leave. For some, there's a nagging bitterness that the war's burden is falling overwhelmingly on men and women in uniform while the rest of the country largely ignores it. While many officers don't oppose the war itself, returning repeatedly to serve in Iraq is a grueling way to live. One of the many reasons for this is that it corrodes their families; the divorce rate among Army officers has tripled since 2003. Internal surveys show that the percentage of officers who cite "amount of time separated from family" as a primary factor for leaving the Army has at least doubled since 2002, to more than 30 percent. And family is a factor even for officers who don't have one yet. One young soldier I met at Fort Bragg, North Carolina, said his primary problem with military life was the difficulty of finding a girlfriend while spending more than half his time in Iraq. As officers prepare for a third or even fourth deployment, a new wave of discontent is expected to wash over junior leaders. Studies show that one deployment actually improves retention, as soldiers draw satisfaction from using their skills in the real world. Second deployments often have no effect on retention. It's the third deployment that begins to burn out soldiers. And a fourth? There's no large-scale historical precedent for military planners to examine—yet.

Still, the roots of the phenomenon of officer discontent go far deeper than multiple deployments or the war in Iraq. Since the 1970s, societal and cultural shifts have created a tough environment for the Army to attract and keep bright young officers.

After the Vietnam War, as the Army started to make the transition to a volunteer force, officers left the service in droves. Morale was miserable, and discipline was lacking. In 1980, Ronald Reagan won the presidency promising to restore honor to the armed forces, and with an infusion of congressional funding officer retention improved during the rest of the decade. However, beginning in 1991, with the cold war over, Washington moved to reduce the size of the Army by about 40 percent. The Army in turn actively encouraged young officers to leave, and whittled down West Point and ROTC classes. The drawdown likely masked any mounting retention problems.

By the late 1990s, the Army had about 480,000 troops, and new complications had emerged. Without a major-league enemy like the Soviet Union, the Army felt less relevant. Other job opportunities were plentiful in a thriving economy. In 2000, nearly 15 percent of junior officers between their fourth and ninth year of service left the military, the highest rate since the 1970s; experts labeled this statistic a crisis. An upsurge of patriotism after September 11 briefly pushed retention rates back toward historic norms. But as the Iraq War has become increasingly unpopular, the cultural factors that underpinned the exodus of the 1990s are again driving officers out of the service.

At the most basic level, being in the Army is a government job. Baby boomers were once drawn to the officer corps by cushy benefits and generous pension packages. But since the 1990s, an Army career has seemed less attractive in comparison with the lucrative opportunities available to a young, educated overachiever in corporate America. (The income of an Army officer with a college degree and twenty years of experience currently tops out at about $90,000.)

Money isn't necessarily the main factor in a junior officer's decision to quit. But military officers are constantly made aware of better-paid opportunities. Corporate recruiters view a combat deployment to Iraq as a highly marketable qualification, and often spam officers' in-boxes with job possibilities. This fall, I attended a job fair in Philadelphia where I saw about fifty junior officers in their late twenties, dressed smartly in business suits. All the officers I met expected to receive several offers of midlevel management positions in sectors such as manufacturing or construction, with salaries starting at around $70,000 with the potential of reaching six figures within several years. The recruiters, in turn, were excited by the officers' leadership and stress management skills. "We're looking for leadership," one recruiter for a commercial real estate management firm explained. "We can teach them the rest."

Another cause of officer discontent is the geography of Army life. A military career has always involved a rural lifestyle, since sparsely populated places provide more room to test artillery and simulate warfare. These locations appealed to baby boomers, who came of age when many American urban centers were in decay, and Army garrison towns like Fayetteville, North Carolina, evoked the feeling of the small towns in which many officers had grown up. Today, numerous coastal American cities have been revitalized, and they attract the most educated and ambitious young men and women, many of whom grew up in suburbs. Meanwhile, Army towns like Killeen, Texas, or Watertown, New York, have devolved into impoverished, isolated outposts economically dependent on their military installations and notable mostly for a seedy proliferation of chain restaurants, pawnshops, and strip clubs.

Perhaps the most powerful new element affecting officers' willingness to stay in the Army is the shifting dynamic of marriage and the roles of men and women in the family. Even in the rather traditional realm of Army culture, fathers now expect to be more actively involved in raising their children, and women tend to be less deferential to their husband's career. Among baby boomers, officers' wives were usually homemakers. Today, however, many officers' wives are doctors or lawyers or have degrees in international affairs, and there are few opportunities for them in places like Kentucky or West Texas. Recently I met a former captain named Adam Ake, who had won a Rhodes scholarship after graduating first in his class from West Point in 1997. He spent seven years as a platoon leader in Korea, and wrote speeches for a three-star general at Fort Lewis in Washington State. Knowing he would be swept up into the Iraq deployment schedule, he reluctantly left active duty in 2004, due to the stress his service was placing on his family and his wife's career (she is an Army doctor at Walter Reed Hospital in Bethesda). "Something had to give," he said. He went to law school, and now clerks for a federal judge in Washington, D.C.

Over the past three months, I talked to numerous former officers around the country. What struck me most was their dissatisfaction with the way the Army leadership is managing the war, and the part that dissatisfaction played in their decisions to leave.

In Philadelphia, I met Zeke Austin, a twenty-eight-year-old former captain at Fort Hood, Texas, who left the Army after five years to look for a private-sector job. Austin first explained that he quit because his fiancée was finishing medical school and couldn't find a residency program in an Army town. Then, suddenly, he veered into a scathing critique of his commanders' preoccupation with institutional process. "Rather than focus on important stuff, they focus on PowerPoint slides. They'd have me up all night to make one slide a little prettier," he said. "After a while, you start to think, What am I doing over here?"

In Houston, I met an officer who had taken the rare step of leaving only eight years before he was due to retire. When I inquired why, he described a generation of senior leaders who gained experience in the relative calm of the 1980s, and seemed most comfortable in Iraq behind a desk. "What did these guys ever do? Go to Panama?" said the captain, who now makes more than $100,000 as a logistics manager for a petrochemical services company. "All they know how to do is train. So you're out in a firefight and they're complaining because you're not wearing eye protection. The colonel says 'Why don't you have your knee pads on?' and you're like, 'Shut the fuck up, I've got a guy bleeding over here.' That has a lot to do with it."

In Washington, I met Matt Kapinos and his longtime friend Jim Morin for lunch. Like Kapinos, Morin was a history major from West Point's class of 2001 and then served with the 82nd Airborne in Iraq and Afghanistan. He, too, left the Army for Georgetown's law school. Both men were frank, thoughtful, and occasionally sarcastic about their disillusionment with the Army; it was clear that they'd discussed the subject repeatedly before.

"You have a three-star general like John Vines come down to talk to us, and he says, 'Just go out there and shoot people,'" Kapinos said. "And you know that that is not how to fight an insurgency. Everyone who's ever read the most basic article on counterinsurgency knows that is not how you're going to win."

"Yeah," Morin agreed. "The general would come out and give these bellicose speeches, and every time he did that, I'd have to go back to my guys and say, 'What the general really meant to say was ...'"

Morin is a soldier-scholar type who frequently refers to military theorists in casual conversation. Like Kapinos, he always thought he would spend twenty-plus years in uniform. Morin became addicted to military history when he was twelve years old. When he was fifteen, he persuaded his parents to help him buy Civil War-era military dress and an antique musket worth several thousand dollars so he could participate in large-scale battle reenactments. At West Point, he took a special interest in counterinsurgency, writing his senior thesis on how the British successfully quelled the French-Canadian rebellion in Quebec. "If you go read Clausewitz and the other military writers," he explained, "[you learn that] war is politics by other means. You have to offer them an alternative better than the other guy. You have to fight a bullet war, but you also have to fight an economic development war, you have to fight a PR war, and you have to do it all at the same time. From what I saw, that just wasn't happening. I felt like we were keeping people safe so they could starve."

As a young lieutenant, Morin once drafted a memo for his commander proposing an elaborate program to help fund humanitarian and infrastructure projects, using integrated teams of infantrymen, civil affairs specialists, and civilian aid agencies. He didn't receive a response, and quickly stopped making such suggestions for fear of being perceived as "a wet-behind-the-ears second lieutenant." Now, he laughs about the incident as an example of his naïveté.

Of course, every generation of young officers is critical of their superiors. But the botched management in Iraq and a sense of squandered momentum in Afghanistan have intensified those feelings among today's young officers. It's one thing for young officers in the 1980s or '90s to stand around at a training facility at Fort Polk, Louisiana, complaining about the higher-ups; it's another when junior officers have to see soldiers under their command dying in missions they believe are strategically flawed or futile.

Like many young officers I met, Kapinos and Morin were particularly disturbed by the experience of a colonel named H. R. McMaster. McMaster earned a Silver Star in Operation Desert Storm. In 2005, he commanded a brigade of several thousand men in the northern Iraqi city of Tal Afar. He was lauded as the first upper-level commander to introduce progressive counterinsurgency strategies, rather than the traditional security-based mission that most other commanders were pursuing. He sought support from the entire population of Tal Afar. When his men released detainees, they asked them how they felt they had been treated (this was dubbed the "Ask the Customer Program"). The results were impressive. As the rest of Iraq deteriorated in 2006, Tal Afar was relatively calm, and President Bush touted it as a success. Despite these achievements, McMaster has been passed over twice for promotion to brigadier general. Kapinos concluded, "The junior officers see a guy who they worship—he's smart and successful—and they see him get the short end of the stick. If he doesn't make one star, if he doesn't go on to great things, if the cream stops rising at some point—then the good guys are going to say, 'What's the point?'"

The consequences of shedding thousands of bright, battle-tested young officers are likely to be grim. In the short term, experts worry that military units in Iraq and Afghanistan—which have performed impressively despite staggeringly bad senior leadership—will degrade in effectiveness.

Many in the military are mindful of what happened when the Army experienced a similar flight of top young talent near the end of the Vietnam War. Then, critical midlevel leadership positions were filled by soldiers with less experience and maturity. Poorly prepared leaders drove relations between the officer corps and the enlisted men who served under them to historic lows. The Army documented incidents of "fragging," when outraged enlisted men turned their weapons on officers who they felt were gratuitously or ineptly leading troops into danger. "We got more of our own people killed than the enemy killed because of insufficiently skilled soldiers and lousy leadership," said General Donn Starry, a retired four-star general who was a commander in Vietnam. After the war, the military was undisciplined and struggled with crime and drugs. Top generals described it as a "hollow force."

There is also concern about the medium term beyond Iraq and Afghanistan. Over the next five to ten years, experts foresee a high likelihood that the military will be drawn into humanitarian and counterinsurgency-style operations that require officers with foreign-language aptitude, cultural awareness, negotiating skills, and other specialized talents. Many of these skills are rarely, if ever, taught in formal Army training programs. Soldiers who have seen firsthand what works and what doesn't intuitively understand the need to be courteous but always ready to pull the trigger. Yet shifting from an Army culture that once revered ornery, pugnacious characters like General George Patton won't be easy. "If we think that our future wars are going to look a lot more like this one, we are losing a huge knowledge base," said Rachel Kleinfeld, a director at the Truman National Security Project in Washington. "And once they're gone, they're gone."

But the greatest concern is how the exodus of the best and brightest will affect the Army's long-term capacity to win wars, counter threats, and keep the peace. Today's lieutenants and captains are the pool from which three- and four-star generals will be chosen twenty years from now. If the sharpest minds aren't in that pool, we could wind up—to put it bluntly—with a senior leadership of dimwits.

Again, the Vietnam experience is instructive. After that war, the junior officers who did remain in the Army were promoted. During the 1970s and '80s, that generation of officers deliberately turned their backs on the study of counterinsurgency, believing they could simply avoid such conflicts in the future. Many of the Iraq War's generals came from that generation (think Tommy Franks). Among the thousands of their peers who left the Army after the war ended, was there a small handful of exceptional leaders who might have helped the military better prepare for a post-9/11 world? "The senior leadership of the Army and Marine Corps were slow to understand the nature of the Iraq War," said Andrew Bacevich, a professor at Boston University and a retired Army colonel who served in Vietnam as a junior officer and who lost a son in Iraq. "Was there a brain drain in the 1970s in the Army? Yes, there absolutely was. Had that brain drain not occurred, would the officer corps have been quicker and better at adjusting? It's impossible to say." Still, the military experts I spoke with all agreed that the attrition of junior officers will harm the quality of the officer corps over the long term. Critics of the Army leadership often note that the highest positions in the military at large—the chairman of the Joint Chiefs of Staff, the commander of CENTCOM, and the commander of Special Operations Forces—are all held by Navy officers, which seems odd at a time when ground forces are at the center of war operations.

The good news is that some leaders at the top understand the gravity of the situation. This October, Secretary of Defense Robert Gates made reference to officer retention in a speech: "There is a generation of junior and midlevel officers and NCOs who have been tested in battle like none other in decades. They have seen the complex, grueling face of war in the twenty-first century up close," he said. "These men and women need to be retained, and the best and brightest advanced to the point that they can use their experience to shape the institution to which they have given so much." A month later, General Petraeus was summoned from Baghdad to Washington to preside over a board that will select the next class of brigadier generals. This was an unusual move that signaled, according to the Washington Post, "the Army's commitment to encouraging innovation and rewarding skills beyond the battlefield."

The bad news is that an all-volunteer military has few tools at its disposal to staunch the loss of high-grade junior officers—especially if the war in Iraq continues much longer. The Army has set an aggressive goal of retaining 95 percent of company-grade officers (typically those in their first ten years of service). That would be a higher retention rate than the Army has managed since the cold war ended, and experts describe this target as completely unrealistic. Otherwise, the Army's solution has been to throw money at the problem. Pay is higher, bonuses more common, and institutional incentives doled out generously as the Army seeks to grow its ranks while fighting a war on two fronts. (The average cost of training and paying a soldier has risen 60 percent since 2000, from $75,000 to $120,000 in 2006.) The Army also offers to pay graduate school tuition or give a young officer the base of his choice in exchange for the promise of a few more years in uniform. It's too early to tell whether this will have any impact on officer retention rates, and expectations are mixed.

There are other ways the Army might ease pressure on officers' families. It could lengthen the time between combat deployments. It could do more to harness fresh ideas among young officers—for example, by pulling them out of the combat rotation and assigning them to help develop new training programs. Or it could allow them to take outside internships with civilian agencies, in order to gain expertise in economic development or civil administration. It could even allow them to serve in the reserves for several years before resuming full-time active duty. Still, there's a slight hitch to all these plans—the war in Iraq. "These guys are getting tired from being ridden hard, and we want to give them a break. But it's hard to give them a break, because we need to put them in the fight," said Dr. Leonard Wong, a retired colonel and a research professor at the U.S. Army War College.

Army officials say anything less than two years at home for every one year at war is unsustainable for soldiers in the long run. Yet the current schedule calls for fifteen months overseas followed by twelve months at home. For the past several years, officers who wanted a break from repeated deployments could seek the relative comfort of assignments at training posts or as staffers at the Pentagon. But with some soldiers now having endured three or four tours in Iraq, such refuges are disappearing. In November, General George Casey identified 37,000 soldiers—7.2 percent of the force—who have not been to a war zone since 2001 and have no legitimate (that is, medical) reason not to go. He told them to pack their bags.

When seasoned junior officers read that President Bush is negotiating a long-term occupation of Iraq, and that the Democratic presidential candidates are acquiescing to the notion of 50,000 or 100,000 troops being stationed there for five to ten years, they can foresee the future. They know that their Army life won't be like that of their parents' generation, when a foreign posting meant Germany in the 1970s, touring the Black Forest in a BMW with the kids. Rather, it means daily danger and the complexities of defusing a civil war. The family will be back home in a remote place like Fayetteville, North Carolina, wondering if Dad or Mom is going to return alive.

Civilian hawks in the government believe that the way to reduce the grueling pace of deployments while continuing to prosecute the war for "as long as it takes" is simply to increase the size of the force. Rudy Giuliani, for instance, has called for adding ten combat brigades. But who is going to lead these new forces if seasoned young officers continue leaving the Army in droves? Calls to expand the Army are empty rhetoric if the military brass and their civilian bosses fail to grapple with whether the services can recruit and retain junior leaders in both numbers and quality. The Army has struggled to meet the increase of 30,000 troops authorized since 2004. This year, new laws call for an additional increase of 65,000 during the next five years. But according to the Congressional Budget Office, if recruiting and retention do not improve from 2005 levels, the Army's end strength will actually decline.

Kapinos has been out of the Army for more than a year now. He lives with his wife in a small home in northern Arlington. He gets up early each morning and works out at a nearby gym, a lingering habit from his Army days. From the gym, he drives the same Honda Civic he bought while a cadet at West Point across the river to law school. He's friendly with his classmates, but many of them seem relatively immature to him.

On Fridays, when he doesn't have classes, Kapinos often plays golf in Virginia. He started taking lessons with his wife, who left her career as a schoolteacher after they moved to Virginia and now works for a private equity firm in Washington. They recently returned from a trip to Tuscany, where they celebrated their fifth wedding anniversary. Some weekends, they drive to Charlottesville for a football game at the University of Virginia, or they visit Kapinos's Army friends in North Carolina. Otherwise, they often stay in, taking turns cooking dinner and watching TV. "I like my life now. There's a certain predictability to it," Kapinos said. "It's totally different, because there is zero probability that I am going to get deployed next week to go fight."

This summer, he plans to take the bar exam. He already has a job lined up in the Washington office of a prominent international law firm, where he'll start as an associate in the energy and utilities practice. Occasionally, he looks back at his Army life with regret. "Every so often I kind of put on the rose-colored glasses and say, 'Man, that was awesome. We were doing all this great stuff.' But, you know, you're only thinking of the excitement, which is only 5 percent of what we did in the Army."

Kapinos will probably make a great Washington lawyer. But rarely does anyone suggest that we'll need more gifted, dedicated, and seasoned Beltway attorneys in the twenty-first century. When the government struggles with its most elemental challenges—identifying geostrategic goals and designing the tactical missions to achieve them—it turns in part to its four-star generals. The generals who will appear before Congress in twenty-five years are in the Army right now. They're junior officers, probably captains. And keeping them in uniform might be the Army's most important mission.

Andrew Tilghman, a former Iraq correspondent for Stars and Stripes, is a staff writer for the Marine Corps Times. He lives in Washington, D.C., and can be reached at tilghman.andrew@gmail.com.