What's wrong with America? Everyone has their own pet answer to that question -- especially in an election year -- but my nominee today is lack of accountability, especially among political pundits. To be specific: for high-profile public intellectuals, malfeasance of various sorts has virtually no professional consequences.
Consider first the discovery that CNN host Fareed Zakaria had plagiarized an article by the New Yorker's Jill Lepore for one of his Time columns. Both Time and CNN suspended Zakaria temporarily, but eventually concluded that it was an isolated incident and reinstated him.
To his credit, Zakaria (whom I've known for twenty years and regard as a friend) immediately owned up to his mistake and vowed to rethink the professional arrangements that led to his embarrassing blunder. That was the right response, but my larger point is that his error will have no consequences whatsoever for his future career trajectory. None. The whole incident might someday rate a short paragraph in his obituary, but that's about all.
The next example is my Harvard colleague Niall Ferguson's instantly infamous Newsweek cover story "Hit the Road, Barack," which purported to offer a comprehensive indictment of Obama's performance as president. Here the problem wasn't inadvertent plagiarism; it was blatant dishonesty. As a diverse flock of respected commentators quickly pointed out, Ferguson's factually challenged critique of Obama rested on an array of obvious misrepresentations and sleazy manipulations. Please don't take my word for it: just read James Fallows, Andrew Sullivan (here and here), Brad DeLong, Matthew O'Brien, and Joe Weisenthal. And that's just a partial list.
Unlike Zakaria, who promptly acknowledged his error and apologized, Ferguson responded by quickly doubling down on some of his original arguments. And he did so by selectively quoting a CBO report, deliberately omitting a key sentence that completely altered the meaning of the quotation. See Dylan Byers here.
Misrepresenting sources is normally a cardinal sin for a professional historian, even when writing in a popular venue. But is this likely to have any tangible consequences for Ferguson? Nah. Harvard won't do anything (and given the principle of academic freedom, it shouldn't). Neither will Newsweek, which is probably more worried about staying afloat for another year than it is about fact-checking its cover stories. In this sort of world, what incentive does Ferguson have to get things right?
One could argue that public intellectuals like Ferguson and Zakaria aren't really that important, and that their fates won't make much difference to the life of the nation. That might be true, but the absence of accountability goes far beyond them. Corporate CEOs mismanage companies and escape with lavish golden parachutes. The financial sector misbehaves for a decade and then gets bailed out. A former national security advisor helps lead the country into a disastrous war, gets promoted to Secretary of State, and later becomes one of the first female members of the Augusta National Golf Club. By this standard, Ferguson and Zakaria's sins are pretty small potatoes.
Nonetheless, it would be better for the United States if there were some tangible sanction for Zakaria's careless error and Ferguson's deliberate dishonesty. In business, making big mistakes hurts the bottom line. In war, getting the facts wrong gets people killed. But in politics and punditry, egregious and/or willful errors carry no penalty, provided their purveyors are sufficiently popular or aligned with well-heeled political interests. Just look at the unsinkable careers of the people who gave us the Iraq war, many of whom could return to power if Mitt Romney wins in November. Absence of accountability is at least part of the reason why our political life is governed not by logic and evidence, but by fact-free fairy tales. And when you base political decisions on flights of fancy, bad results are to be expected.
I had a relaxing vacation out on Fire Island, though of course I didn't get quite as much accomplished as I intended. But I did do a lot of reading, and I thought I'd pass a bit of what I learned on to all of you.
I started with Volume 4 of Robert Caro's monumental biography of Lyndon Johnson, which covers the period 1958-1964. In this period Johnson runs half-heartedly (and unsuccessfully) for the 1960 presidential nomination, accepts the vice-presidential nod, and then languishes miserably in a powerless position. He's mostly ignored (if not openly dissed) by Kennedy's inner circle, and thinks his political career is mostly over. But Kennedy's assassination in November 1963 suddenly places him in the Oval Office, and Caro offers a vivid description of how LBJ rises to the occasion, gets Kennedy's legislative program moving, and helps the country overcome a major national trauma.
The book is a great read, and Caro has few equals at sketching a character or describing how personalities operate within American institutions. He does have a weakness for stark contrasts and mano-a-mano confrontations (e.g., he makes much of the blood feud between LBJ and Bobby Kennedy, going back to the early 1950s), but such portraits are part of what makes the book difficult to put down.
But for me, a subtler message in the book (possibly overstated for dramatic effect) is that John F. Kennedy wasn't much of a president. He was smart, articulate, charming, and courageous (as his exploits in World War II revealed), and he often had sound political instincts. He had a knack for attracting talented acolytes and inspiring deep loyalty from them, and he knew how to use a gifted advisor/speechwriter like Ted Sorensen to great effect. But his record as a congressman and a senator was unremarkable, and Caro's account shows he didn't achieve much in his three years as president. The main elements of his legislative program were stalled in Congress, and his main foreign policy achievement was managing a crisis over Soviet missiles in Cuba that his own policies (e.g., the attempt to overthrow Castro and an unnecessary nuclear weapons build-up) had helped provoke. We obviously will never know what he might have achieved had he not been assassinated and had won a second term, but this book makes it clear that the post-assassination hagiography has little basis in fact.
My next selection was David Kang's "East Asia before the West," which I recommend to anyone with a shaky grasp of East Asian history. It's a slim book that focuses primarily on explaining the Sino-centric trade and tributary order that existed in Asia from roughly 1400 to 1900. Kang's emphasis is on interpreting this history and demonstrating how this order differed from the Westphalian model that has inspired most contemporary IR theory. In particular, he argues that relative power played a lesser role in relations between China and its principal neighbors (Korea, Japan, and Vietnam) than realist theories might suggest, and that status (defined largely in cultural terms) was in fact of critical importance. Instead of being competing billiard balls interacting on the basis of relative power, Kang depicts these societies as heavily (though not totally) shaped by Chinese cultural ideas (primarily Confucianism). Relations among them were governed by norms of deference that reflected not just power but also the degree to which other societies met Chinese cultural standards. He also depicts it as an unusually peaceful order -- at least with respect to state-to-state relations -- with the bulk of violence directed at rebels, bandits, or nomadic tribes rather than at other states.
Not surprisingly, I thought the book downplays the role of power somewhat. Given how much larger and stronger China was, it's not all that surprising that the lesser states didn't challenge it (and in the rare cases when they did, it didn't go well for them). But it is quite a thoughtful book, and well worth your time.
My last selection (apart from a few novels) was Fredrik Logevall's forthcoming book "Embers of War: The Fall of an Empire and the Making of America's Vietnam." It is a fascinating, beautifully written, and deeply depressing account of the First Indochina War (i.e., the war between France and the Vietnamese resistance led by Ho Chi Minh), with particular emphasis on the background role played by the United States. Many parts of this story have been told before, but Logevall's account provides much new detail and important new insights. Among other revelations, he shows that Dwight D. Eisenhower was far more hawkish on Vietnam than is sometimes claimed, and that the U.S. came closer to intervening during the siege of Dienbienphu than I had previously believed.
It is impossible to read the book without being struck by contemporary parallels, and without concluding that the U.S. foreign policy establishment has learned virtually nothing over the past sixty years. Although the French clearly knew more about Vietnamese society than their American counterparts did, officials in both governments were often embarrassingly ill-informed about the actual state of Vietnamese society and opinion. Back in Washington, key decisions were often made by people (such as Dean Acheson or John Foster Dulles) who had little knowledge of Asian history or society and who were inevitably distracted by problems elsewhere. And alleged experts like Senator Mike Mansfield (whose opinions were heeded because he had once taught classes in Asian history) were blinded by Cold War ideology and simplistic ideas like the "domino theory." Meanwhile, the American public was chronically misinformed about Asian events by publishers like Henry Luce of Time and Life, and by well-organized propaganda campaigns.
Logevall never makes explicit comparisons between the events he describes and more recent counterinsurgencies, but the parallels are quite remarkable. Like U.S. forces in Iraq and Afghanistan, French forces in Indochina faced enormous logistical difficulties and were frequently vulnerable to ambushes (including what we would now call "improvised explosive devices"). The occupying powers were allied with local elites who were feckless, unreliable, and corrupt, and neither the French nor the Americans ever had much leverage over their local clients. The French faced chronic manpower shortages, largely because the war was increasingly unpopular and French politicians could not institute a draft and deploy conscripts there. Instead, they had to rely on legionnaires, troops from their other colonies, or professional soldiers. Similarly, the Pentagon has always had trouble finding enough troops to run its occupations in Iraq and Afghanistan, and of course could never contemplate turning to a draft. The French thought that a heroic general (Jean de Lattre de Tassigny) would reverse their fortunes and produce a victory, just as U.S. leaders have occasionally pinned their hopes on the likes of David Petraeus or Stanley McChrystal. Both the French and the Americans tried to create local forces that could take over for them; neither effort succeeded to the extent necessary. Massive expenditures and much suffering were justified by baseless fears of falling dominoes, just as today U.S. pundits have somehow managed to turn impoverished Afghanistan into a "vital interest." Finally, Logevall shows that U.S. citizens had very little knowledge of what the United States was actually doing in Indochina -- especially in the period between the signing of the Geneva Accords and the escalation of direct U.S. involvement -- just as we are mostly kept in the dark about the full extent of our involvement in places like Yemen or Pakistan today.
All in all, a pleasant vacation, even if I spent a lot of it reading about unpleasant things and drawing depressing conclusions. Alas, that's an occupational hazard for people in this business, even when we're supposedly taking a break.
I am pleased to offer the following guest post by Nasser Rabbat of MIT:
Nasser Rabbat writes:
The euphoria sparked by the 2011 Arab uprisings has settled into realpolitik. The youth who initiated the protest movements split into myriad organizations or withdrew in despair. The Islamists, disciplined through decades of clandestine political action, took over in Tunisia and Libya, and are poised to wrest power from a recalcitrant army in Egypt. The secularists, assumed to be the natural allies of the West, are weak and divided. In Tunisia and Egypt, they garnered fewer votes in the elections than predicted. In Libya, they retreated from the National Transitional Council, leaving the Islamists to occupy its most powerful positions. In Syria, still struggling against a belligerent and criminal regime that is proving hard to nudge, the secularists in the opposition are constantly bickering, whereas the Islamists are organized and goal-oriented. Arab secularism, the events seem to suggest, is a spent force. The United States and other Western governments, claiming to be responding to the realities on the ground, are engaging the Islamic parties as the defining new paradigm of Arab politics.
Is this a new turn for the West? Did the West support the secularists before the revolutions? And has Arab secularism really become irrelevant? My answer to all three questions is an emphatic no. To begin with, the record of the West in the Arab world is patently not pro-secularist. Indeed, if we are to limit our assessment to the regimes that have been consistently backed by the U.S. in the last fifty years, we will find on the top of the list Saudi Arabia, Qatar, the UAE, Oman, and Morocco, all avowedly Islamic regimes, at least in their claims to legitimacy or their application of Islamic law. Conversely, some of the most ardent opponents of the U.S. have been the secular regimes of the Baath party in Syria and Iraq, though their secularism proved skin-deep and opportunistic. Moreover, when the United States decided to avenge the attacks of 9/11, perpetrated as they were by an extremist Islamist militancy, its most decisive act was to destroy the secular regime of Iraq. Eight years later, when the Americans finally withdrew from Iraq, they left behind not only a flagrantly sectarian regime, but also a political class composed largely of religious movements umbilically linked to the Islamic Republic of Iran.
Nor does history show much Western support for the budding secular tendencies in the early twentieth century, which coincided with the colonization of most of the Arab world. Pragmatism may explain why colonial powers, Britain and France in particular, preferred to deal with traditional leaders. They had political influence, economic clout, and a wide base of clients. That they adhered to conservative forms of piety added to their usefulness: They understood the mechanisms of religious authority and could manipulate them to appease potential popular unrest. The few Arab secularists, on the other hand, even though thoroughly westernized and belonging to the social elite, were seen as troublemakers. Having been profoundly influenced by the principles of the Enlightenment, they formulated strong demands for liberation, democratization, and modernization. Many clashed with the colonial authorities and paid a heavy price of imprisonment or exile.
Independence, when it finally came, fell smack at the height of the Cold War. The West, which was eventually reduced to the United States, was seeking to build alliances of nations committed to countering the Communist threat. Conservative regimes, such as those of Jordan and Saudi Arabia, were obviously the most promising allies. So the West supported them regardless of their religious agendas. When military regimes came to power in Syria, Egypt, and Iraq after the defeat of these countries in the first Arab-Israeli war of 1948, they first toyed with accepting Western tutelage. Their subsequent turning to the USSR as a patron more sympathetic to their national causes, however, did not translate into espousing communism or rejecting religion. Ungodly these military regimes certainly were, but they were not secular. They neither believed in nor practiced the separation of religion and politics. They in fact heavily relied on religious symbolism to frame the image of their one inspired despot and his family or clan. This was the case of Anwar al-Sadat after Camp David and his successor Hosni Mubarak, as well as Saddam Hussein, Muammar Qaddafi, Hafiz and Bashar al-Assad. Fundamentalism and its defiant social expressions actually grew under their watch, even if they had been relentlessly suppressing all Islamic political organizations, or any other political activism for that matter.
Secularists had no place in such a system. Those who dared to speak out against it found themselves dismissed from their jobs, jailed, or forced to leave their countries. Some, who persisted in their criticism of the dictators or of the rigid views of the growing Islamist extremists, like the journalists Salim al-Lawzi and Samir Kassir in Lebanon, Hidaya Sultan Al-Salem in Kuwait, Farag Foda in Egypt, and Mohammed Taha in Sudan, were assassinated. Others, unable to cobble together a political structure to unite them like the Islamists had, channeled their political activism into more intellectual and artistic pursuits. Secularism, already accused of elitism because of the social background of its proponents, became even more rarefied as it migrated either away from the pulse of the street and into the confines of academia and art or out of the country altogether.
The 2011 uprisings seemed at first to bring secularism back to the forefront as a vociferous political force. Fueled by a new breed of activists -- young, globally networked, and unbothered by considerations of class, religion, or gender -- the uprisings wielded the same principles that earlier Arab secularists had advocated. But like those earlier secularists, the youth did not translate their rallying cries into political parties able to compete for the post-revolutionary governments. Some movements, notably the 6th of April Movement in Egypt, simply declared after the fall of Mubarak's regime that they had no plans to become political parties, then lived to regret that impulsive decision. The prominent and reasonably popular candidate for the presidency in Egypt, Mohamed ElBaradei, withdrew from the race before it began, citing as a reason the reprehensible way politics was conducted by his detractors. The few attempts to register a secularist political presence in the elections in Tunisia and Egypt were swept aside by the eminently more organized Islamist parties and by their shrewd appeal to the basic religiosity of the people, especially the poor and the illiterate.
Arab secularism, however, remains on the street and online. Though outdone in the current rush to power by the Islamists, it still has the ability to reassert itself in the political arena, if not as the ruling party, at least as lawful opposition and guardian of the principles of civic freedoms. The culture of lawful opposition, long absent under the totalitarian regimes, needs to be reinserted into the political discourse. This is as important a function as good governance for the well-being of the nascent Arab democracies. To that end, the efforts of the discontented revolutionary youth and the seasoned secular intellectuals should be united under the umbrella of political parties. The West should help them by recognizing their crucial political role and by treating them as long-term partners, not just as recipients of training and aid.
In February 2011, after the victory of the Egyptian revolution in which they played no significant role, some of the most famous Islamic preachers gloated that the next government would be Islamic. Secularism, they contended, should be put to rest because it had reigned for fifty years and failed. But true secularism has never had a chance to rule in the modern Arab world, except perhaps in Tunisia under al-Habib Bourguiba (1957-87). Otherwise, religion was always enshrined in the fiat constitutions of all the Arab kingdoms and republics, even those that were ferociously hunting down Islamists. Moreover, Arab rulers who hid behind secular masks, whether they were civilian or military, never separated religion from their politics. Many enlisted docile forms of religion and compliant sheiks as parts of their arsenal of control. In that, they were following in the footsteps of a long tradition of inglorious religion-based rule in the Arab world, which did not really end until the abolition of the Ottoman Caliphate in 1924. It is thus more accurate to ask what Islamic rule of the kind imagined by the vocal Islamist organizations will bring that was not tried before during the long centuries of what they themselves believe was an Arab decline.
Nasser Rabbat is the Aga Khan Professor of the History of Islamic Architecture at MIT.
I'm off on vacation starting tomorrow and this time I intend to go cold turkey and not blog while I'm away. I really mean it this time. I've lined up a stellar group of guest bloggers to fill in on occasion, so keep checking this space to see what they've posted. I'll be back on July 9th -- you're all in charge while I'm gone.
As usual, I'll be spending my time back on the beach at Fire Island. My main goal is to do a lot of reading and thinking. I've got a few thesis chapters to read and a grant proposal to write, but mostly I'm looking forward to a hefty bag of beach reading. I'm going to start with the latest volume of Robert Caro's epic biography of Lyndon Johnson, and follow that with John Gaddis' recent biography of George Kennan. Then comes David Kang's East Asia before the West, which has been sitting on my desk for months. And if there's still time left, I'll probably get to Miko Peled's The General's Son: Journey of an Israeli in Palestine.
Man does not live by non-fiction alone, however, so evenings will be spent with some lighter fare. I'll pass on Fifty Shades of Grey, but I've still got to finish Joseph Kanon's Istanbul Passage before I start Alan Furst's Mission to Paris. And the beach house where I'm staying is filled with old Rex Stouts and other decomposing paperbacks, so I won't run out of brain candy while I'm there.
That ought to keep me out of trouble for ten days or so. Here's hoping that the next two weeks are an unusually dull period in world politics so that I won't be tempted to chime in. And I hope all of you get some time off too; even workaholics need to take a break and let their thoughts run down different pathways for a while.
I finished reading Peter Beinart's The Crisis of Zionism last week, and I enthusiastically recommend it to all of you. It is an excellent and important book, which is not to say I agree with everything in it.
Some commentators -- including Dylan Byers and Andrew Sullivan -- think "the conversation is over" and that Beinart failed to move the debate as much as he had hoped. I'm not so sure. It's impossible to tell how much long-term impact a book or an article will have in the first few months after it's published, and a lot depends on whether the trends Beinart describes are as powerful and enduring as he maintains. I think they are, which means that people will keep coming back to his arguments as events in the real world demonstrate that much of what he says is correct.
Beinart's central argument is straightforward and well-documented. First, he argues that Israel is evolving in an increasingly illiberal direction, largely due to its protracted occupation of the West Bank and its brutal treatment of its Palestinian subjects -- who by necessity must be denied political rights if the occupation is to endure. As both a committed liberal and proud Zionist, Beinart sees this as a tragic betrayal of Israel's founding ideals.
Second, Beinart shows how the "American Jewish Establishment" (i.e., organizations like AIPAC, the Anti-Defamation League, the American Jewish Committee, Conference of Presidents, etc.) has actively aided this process, both by making Israel the centerpiece of American Jewish identity and by pressuring U.S. politicians to back Israel no matter what it does. Unconditional U.S. support has allowed Israel to sustain a costly and dangerous colonial project while making it impossible for the United States to serve as an effective mediator in the long-running but failed "peace process."
Third, he believes this situation threatens both Jewish identity in America and long-term U.S. support for Israel, because younger American Jews lack an adequate grounding in Jewish traditions and values and are increasingly turned off by Israel's behavior. At best, they are becoming indifferent; at worst, they are becoming hostile to an Israel that they see as a betrayal, not a fulfillment, of Jewish aspirations. This is especially true of non-Orthodox Jews, who tend to embrace the universalist ideals of liberalism. And as others have noted, intermarriage and assimilation are likely to reinforce these tendencies over time.
In order to reconcile liberal values with the Zionist project and to help Israel escape a bleak future as an apartheid state, Beinart believes the United States -- and American Jewry -- must press Israel to change its policies and accept a two-state solution. He favors boycotting products produced in the West Bank, for example, and thinks the American Jewish establishment must abandon its unthinking deference to hardline Israeli leaders. He also believes that greater resources must be devoted to fostering Jewish traditions among younger American Jews. For this reason, he favors creating more full-time Jewish schools, supported by some form of public funding. He believes these steps will ameliorate the current tensions between liberalism and Zionism and ensure a bright future for Israel and American Jewry.
The book has some real strengths, and Beinart's willingness to confront a powerful set of shibboleths is admirable. It is gracefully written and an easy read, and it offers plenty of vivid anecdotes and illustrations to support its main arguments. Although Beinart is mindful of the Palestinians' own mistakes and crimes over the past century, he also does a brilliant job of debunking the catalogue of rationalizations that Israel's defenders have invented to defend forty-five years of occupation. In addition, his account of the Obama administration's humiliating failure at the hands of AIPAC et al. and the Netanyahu government is gripping as well as depressing. Among other things, it explodes the oft-repeated myth that the Israel lobby has lots of clout on Capitol Hill but little in the White House.
As one would expect, mainstream reviewers drawn from the ranks of Israel's defenders have been neither kind nor fair-minded in discussing the book. Because Beinart himself is an observant Jew whose affection for Israel is beyond question, he is largely protected from the accusations of anti-Semitism that are inevitably directed at anyone who criticizes Israeli policy or the lobby. But as Jerome Slater documents in his own review of the book, Beinart's most prominent critics simply do not address Beinart's actual arguments. Instead, they either misrepresent what he wrote or chase red herrings (such as his supposedly preachy "tone" or his personal motivations for writing the book). This approach is all too familiar to some of us: if you can't refute an author's facts or logic, changing the subject and impugning his or her motives is about all that's left.
Although I believe one can learn a great deal from The Crisis of Zionism, and think that it will be widely read over time, it has three problems worth noting. First, and most importantly, I think Beinart understates the tensions between liberalism and Zionism. At its core, liberalism privileges the individual and holds that all humans enjoy the same political rights regardless of ethnic, religious, or other characteristics. But Zionism, like all nationalisms, privileges a particular group over all others. Israel is hardly the only country where this tension exists, and Beinart is correct to say that an end to the occupation would reduce the contradictions between liberal values and Israeli practices. But that tension would not disappear even if two states were created, if only because Israel would still have a sizeable Arab minority that is almost certain to continue being treated as a group of second-class citizens. It is hard to see how Israel could remain an avowedly "Jewish" state while according all Israeli citizens equal rights and opportunities both de jure and de facto. Could an Israeli Arab ever become head of the IDF or Prime Minister in a "Jewish state"? The question answers itself.
Second, I think it is unfortunate that Beinart chose to direct his book almost entirely toward the American Jewish community. That is his privilege, and it's possible that the best way to get a smarter U.S. policy would be to convince American Jewry to embrace a different approach. Yet Beinart's focus also reinforces the idea that U.S. Middle East policy -- and especially its policy towards the Israeli-Palestinian conflict -- is a subject that is only of legitimate concern to Jewish-Americans (and Arab-Americans) and can only be legitimately discussed by these groups. In fact, U.S. Middle East policy affects all of us in countless ways and it ought to be a subject that anyone can discuss openly and calmly without inviting the usual accusations of bigotry or bias. I'm sure Beinart would agree, yet his book as written sends a subtly different message.
Third, Beinart's proposal to use public monies (such as school vouchers) to subsidize full-time Jewish schools strikes me as wrong-headed. I have no problem with any group setting up private schools that emphasize particular religious values. What bothers me is the idea that the rest of society ought to subsidize these private enterprises whose avowed purpose is to sustain a particular group's identity. I'd say the same thing, by the way, if a Catholic, Episcopal, Muslim, Sikh, Mormon, or Zoroastrian commentator were advocating similar public backing for schools catering to his or her group. Assimilation has been the key to ethnic tolerance here in the United States, and critical to our long-term success as a melting-pot society. Public education that brings students from different backgrounds together has been a key element in that process, and that's where public funds should go.
Despite these objections, The Crisis of Zionism is a thoughtful and courageous book from someone who cares deeply about the United States and Israel, as well as the Jewish people. To Beinart's credit, he's been willing to take a hard look at current trends and offer an impassioned warning about the dangers he sees looming.
For that reason alone, it deserves a wide audience and serious discussion -- which has not been the case up to now. The issues Beinart is wrestling with are not likely to go away, since it appears that a viable two-state solution is becoming less likely by the week, and maybe even impossible. It will be fascinating to see how Beinart's thinking evolves in the future, especially if the targets of his critique ignore his generally valuable advice.
Robert Kelley has done a series of interesting posts on his own blog (cross-posted to Duck of Minerva) exploring options for U.S. retrenchment and offering a template for thinking about U.S. alliance commitments. Consider what follows a set of variations on the theme he began.
Kelley asks: if U.S. leaders tried to pursue a policy of partial retrenchment, what alliance commitments might they choose to limit or terminate, and which allies would still be considered important? Framing the question this way acknowledges that there may be some reputational issues involved in downgrading a long-standing security partnership, even if its original strategic rationale has diminished or even disappeared. But what if we let our imaginations really run free and frame the puzzle a bit differently? What if we were starting from scratch, and doing a "zero-based" assessment of U.S. alliance options? If historical ties weren't an issue, what features would you look for in a strategic partner and how might America's future alliance portfolio differ from its current set of arrangements?
So here's my quick list of the qualities we ought to look for, notwithstanding some obvious tensions and tradeoffs between them. As you'd expect, I lean heavily on more-or-less realist considerations, and less on shared "values" or domestic political similarities.
1. Power: Up to a point, you want allies that are strong and capable so that they don't need a lot of protection from the United States and so they can make a real contribution to any necessary effort at collective defense. One of the reasons the US won the Cold War is that our alliance system contained a lot of wealthy and relatively powerful states, while the Soviet alliance system contained a lot of relatively weak and not-very-powerful clients. One can't take this logic too far (i.e., "concerts of great powers" usually don't work well because the strongest states are too worried about each other to be close allies), but on the whole, you'd prefer allies that can actually do something for you. (One might argue that this strengthens the case for NATO and the U.S.-Japan relationship, but not if these states continue to let their defense capabilities atrophy.)
2. Position: There are some allies who are valuable not because they have a lot of capabilities, but because they happen to sit in an especially valuable piece of real estate. Think Oman, or Singapore, for example, which sit right next to critical strategic waterways. If you define your interests in global terms, then you're going to need some allies in these places.
3. Political stability: On balance, you'd like to have allies whose governments are stable and legitimate, so that they can make effective decisions and so that you don't have to constantly worry that they might be overthrown. Unstable allies encourage adversaries to meddle in the hope of undermining them, and force you to spend a lot of time worrying about your allies' internal political health.
This is not necessarily an argument for democracy, by the way, because a democracy that rests on unstable coalitions or where there are sharp divisions about foreign policy can be a pretty troublesome partner. For that matter, there's no guarantee that public opinion will always support the alliance (which is why some Americans worry about how the Arab Spring may ultimately affect U.S. relations with some traditional Middle East partners). But on the whole, a stable and legitimate ally is preferable to an unstable or fragile one.
4. Popularity: An ally that has few conflicts with other states and a positive image in the world is less trouble than an ally that is unpopular or a pariah. The reason is obvious: if you join forces with a state that other countries resent or despise, you immediately pay a diplomatic price for your association and you may end up gaining more enemies than friends. Other things being equal, this is not smart. America's "special relationship" with Israel illustrates this problem perfectly, just as China pays a price for doing business with Sudan and Russia is losing prestige by continuing to support the Assad regime in Syria. This concern can be ignored if the price is not too high or if other benefits are large, but on the whole, you want to be friends with countries that have lots of other friends too.
5. Pliability: The ideal ally is also easily influenced: you'd like to have partners who will do what you want most of the time. In simple terms, you want allies whose interests are mostly compatible with your own (duh!). An ally that refuses to help when times are tough, that has to be constantly badgered into contributing its fair share, or that takes independent actions even when it knows that this will cause trouble for its partners, is more of a headache than an ally that usually does what you want. (This problem explains why U.S. relations with Pakistan are in such bad shape: both sides are deeply disappointed by what the other is doing.) No two states have identical interests, of course, and any alliance will exhibit occasional strains. But on the whole, you want allies that are genuinely working with you, instead of at cross-purposes.
6. Potential impact: Finally, sometimes states form an alliance simply because what happens to some other state could have an enormous effect on what happens to them. Neither Canada nor Mexico is a major military power, for example, and neither controls key strategic chokepoints. But their proximity to the United States means that what happens there could have an enormous impact on U.S. security. To take this one step further: if either country were ever to align with a power that was hostile to the United States, the potential impact on American security would be enormous. So the United States has a powerful interest in keeping Canada and Mexico close, despite their relative military weakness.
As I suggested above, there are some interesting tradeoffs between these different criteria (which is one reason why diplomacy and grand strategy can't be reduced to a simple checklist or cookbook). You want strong allies, for example, but the more capable an ally is, the less pliable it is likely to be. The United States generally likes allying with democracies, but democratic governments don't always act the way Washington wants (see under: Turkey). Ideally, you want allies that are popular and do not have a lot of enemies (since allies with many enemies are harder to protect), but if they have a few enemies they will be more interested in your help and more willing to defer to your wishes.
So if we were really doing a "zero-based" alliance portfolio, what implications might one draw from this set of criteria? If one could really start from scratch, I doubt we'd give security guarantees to Taiwan, even in a period when we're worrying about the Asian balance of power. It's too small, and will be increasingly difficult to protect over time. But most of America's other Asian allies would still be valuable, and we'd probably be courting them today even if we didn't already have strong ties. As noted, one could make a case for NATO as a limited security partnership, but I doubt we'd try to build an elaborate multilateral institution in 2012 if it didn't already exist.
The United States would still need allies to maintain a balance of power in the Persian Gulf, but it's not entirely clear we'd pick the same allies we currently have. I can make a reasonable case for a normal relationship with Israel (though not the current special relationship): it's a strong country in a critical region and it shares some values with the US (though that rationale is eroding rapidly). The problem is that unconditional support for Israel damages US standing in lots of other places. India is an easy case on realist grounds, though it's also a country that has significant internal problems and lots of troublesome neighbors, which means there's the danger of getting sucked into its problems. And as Stephen Kinzer argues in his book Reset, if we were starting from scratch, one can actually make a fairly good case for a closer strategic relationship with ... (drum roll) ... Iran.
Of course, nations don't get to wipe the slate clean and start from scratch (at least not very often), and so my speculations today are, well, somewhat unrealistic. Nonetheless, it is worth asking this sort of question from time to time, if only to force ourselves to think about possibilities that seem totally at odds with present circumstances. After all, just because things are one way today doesn't mean they have to stay that way forever. And they won't.
Remember the Golden Rule? "Do unto others as you would have them do unto you." It's not normally regarded as a cardinal rule of foreign policy; in that realm, "an eye for an eye" seems closer to the norm. But lately I've been thinking that Americans ought to reflect a bit more on the long-term costs of our willingness to do unto others in ways we would most definitely not want them to do unto us.
This past week, the New York Times has published two important articles on how the Obama administration is using American power in ways that remain poorly understood by most Americans. The first described Obama's targeted assassination policy against suspected terrorists, and the second described the U.S. cyber-warfare campaign against Iran. Reasonable people might disagree about the merits of both policies, but what I find troubling is the inevitable secrecy and deceit that is involved. It's not just that we are trying to fool our adversaries; the problem is that we end up fooling ourselves, too. As I've noted before, when our government is doing lots of hostile things in far-flung places around the world and the public doesn't know about them until long after the fact, then we have no way of understanding why the targets of U.S. power might be angry and hostile. As a result, we will tend to attribute their behavior to other, darker motivations.
Remember back in 2009, when Obama supposedly extended the "hand of friendship" to Iran? At the same time that he was making friendly video broadcasts, he was also escalating our cyber-war efforts against Iran. When Iran's Supreme Leader Ali Khamenei reacted coolly to Obama's initiative, saying: "We do not have any record of the new U.S. president. We are observing, watching, and judging. If you change, we will also change our behavior. If you do not change, we will be the same nation as 30 years ago," U.S. pundits immediately saw this as a "rebuff" of our supposedly sincere offer of friendship. With hindsight, of course, it's clear that Khamenei had every reason to be skeptical; and now, he has good grounds for viewing Obama as inherently untrustworthy. I'm no fan of the clerical regime, but the inherent contradictions in our approach made it virtually certain to fail. As it did.
We keep wondering: "Why do they hate us?" Well, maybe some people are mad because we are doing things that we would regard as unjustified and heinous acts of war if anyone dared to do them to us. I'm not really surprised that the U.S. is using its power so freely -- that is what great powers tend to do. I'm certainly not surprised that government officials prefer to keep quiet about it, or only leak information about their super-secret policies when they think they can gain some political advantage by doing so. But I also don't think Americans should be so surprised or so outraged when others are angered by actions that we would find equally objectionable if we were the victims instead of the perpetrators.
And if we keep doing unto others in this way, it's only a matter of time before someone does it unto us in return.
Since the end of the Cold War, U.S. foreign policy has been largely run by a coalition of neoconservatives and liberal internationalists. Both groups favor a highly activist foreign policy intended to spread democracy, defend human rights, prevent proliferation, and maintain American dominance, by force if necessary. Both groups are intensely hostile to so-called "rogue states," comfortable using American power to coerce or overthrow weaker powers, and convinced that America's power and political virtues entitle it to lead the world. The main difference between the two groups is that neoconservatives are hostile to international institutions like the United Nations (which they see as a constraint on America's freedom of action), whereas liberal interventionists believe these institutions can be an important adjunct to American power. Thus, liberal interventionists are just "kinder, gentler neocons," while neocons are just "liberal interventionists on steroids."
The liberal/neoconservative alliance is responsible for most of America's major military interventions of the past two decades, as well as other key initiatives like NATO expansion. By contrast, realists have been largely absent from the halls of power or the commanding heights of punditry. That situation got me wondering: What would U.S. foreign policy have been like had realists been running the show for the past two decades? It's obviously impossible to know for sure, but here's my Top Ten List of What Would Have Happened if Realists Had Been in Charge.
#1. No war in Iraq. This one is easy. Realists like Brent Scowcroft played key roles in the first Bush administration, which declined to "go to Baghdad" in 1991 because they understood what a costly quagmire it would be. Realists were in the forefront of opposition to the war in 2003, and our warnings look strikingly prescient, especially when compared to the neocons' confident pre-war forecasts. If realists had been in charge, more than 4,500 Americans would be alive today, more than 30,000 soldiers would not have been wounded, and the country would have saved more than a trillion dollars, which would come in handy these days. Hundreds of thousands of Iraqis would still be alive too, and the balance of power in the Gulf would be more compatible with U.S. interests.
#2: No "Global War on Terror." If realists had been in charge after 9/11, they would have launched a focused effort to destroy al Qaeda. Realists backed the war against the Taliban in Afghanistan, and a realist approach to the post-9/11 threat environment would have focused laser-like on al Qaeda and other terrorist groups that were a direct threat to the United States. But realists would have treated them like criminals rather than as "enemy combatants" and would not have identified all terrorist groups as enemies of the United States. And as noted above, realists would not have included "rogue states" like Iran, Iraq, and North Korea (the infamous "axis of evil") in the broader "war on terror." Needless to say, with realists in charge, the infamous 2002 National Security Strategy calling for preventive war would never have been written.
#3. Staying out of the nation-building business. A third difference follows from the first two. Realists understand that transforming foreign societies is a difficult, costly, and uncertain enterprise that rarely succeeds. It is especially hard to do in poor countries with deep internal divisions, no history of democracy, and a well-established aversion to foreign interference. By avoiding the long-term occupation of Iraq and Afghanistan, the United States would have had little need to invest in counter-insurgency or "nation-building," and could have focused instead on more serious strategic challenges. Which leads us to #4.
#4. A restrained strategy of "Offshore Balancing." Since the end of the Cold War, prominent realists have called for the United States to adopt a more restrained grand strategy that focuses on maintaining the balance of power in key areas (e.g., Europe, East Asia, and the Persian Gulf) but reduces America's global footprint and keeps the U.S. out of unnecessary trouble elsewhere. Such a strategy would also force U.S. allies to shoulder more of the burden and discourage them from either "free-riding" or "reckless driving" (i.e., adventurism encouraged by overconfidence in U.S. support). For instance, realists would never have adopted the Clinton administration's foolish strategy of "dual containment" in the Persian Gulf, or the Bush administration's even more reckless effort at "regional transformation." Instead, realists would have maintained a robust intervention capability but kept it offshore and over-the-horizon, bringing it to bear only when the balance of power broke down (as it did when Iraq invaded Kuwait in 1990). Had we followed this approach from 1992 onward, it is even possible that al Qaeda would never have gotten rolling in a big way or never tried to attack the United States directly.
#5. No NATO expansion. Realists weren't surprised when the United States decided to move NATO eastwards; it's typical of victorious great powers to try to press their advantage. But they were skeptical about the whole idea, fearing (correctly) that it would poison relations with Russia and that the U.S. was taking on commitments that it might not be willing to meet and that would make NATO increasingly unwieldy. A realist approach would have stuck with the "Partnership for Peace" initiative, a much smarter move that enabled many useful forms of security cooperation and kept the door open to a more constructive relationship with Russia. Over time, realists would have pressed Europe to take on the main burden of its own defense, fully aware that Europe faces no security problems at present that it cannot handle on its own.
#6: No Balkan adventures. If realists had been in charge, the United States and its allies would have taken a different approach to the Balkan war in the 1990s. The United States might have stayed out entirely -- as former Secretary of State James Baker seemed to want -- because its vital interests were not at stake. Or it might have pushed for a partition plan for Bosnia, as John Mearsheimer, Robert Pape, and Stephen Van Evera proposed here and here. What would not have happened was the Rube Goldberg effort to cobble together a multi-ethnic "liberal" democracy in Bosnia (an effort that has largely failed and is likely to unravel if outside forces ever withdraw) or the subsequent ill-conceived war in Kosovo (which inept U.S. diplomacy helped provoke). Reasonable people can disagree about whether the world is better off for the U.S. having intervened, but it's by no means clear that the results were worth the effort.
#7. A normal relationship with Israel. Realists have long been skeptical of the "special relationship" with Israel, and they would have worked to transform it into a normal relationship. The United States would have remained committed to helping Israel were its survival ever threatened, but instead of acting like "Israel's lawyer," Washington would have used its leverage to prevent Israel from endlessly expanding settlements in the Occupied Territories. An even-handed U.S. approach would have taken swift advantage of the opportunity created by the 1993 Oslo Accords, and might well have achieved the elusive two-state solution that U.S. presidents have long sought. At a minimum, realists could hardly have done worse than the various "un-realists" who've mismanaged this relationship for the past 20 years.
#8: A more sensible approach to nuclear weapons. Realists have long emphasized the defensive advantages conferred by nuclear weapons, and have opposed the excessively large nuclear arsenals built up during the Cold War. Realists appreciate the deterrent value of nuclear weapons and believe complete disarmament is impractical, but they would have been much bolder in reducing the U.S. arsenal and would have focused more attention on securing nuclear materials world-wide. At the same time, realists would have acknowledged the technological futility of strategic missile defense as well as its dubious strategic rationale (i.e., even if missile defenses worked perfectly, an adversary could always deliver a warhead to U.S. territory through covert means, thereby making it harder to know where it came from).
#9. No Libyan intervention. Realists (and some others) were skeptical of the wisdom of overthrowing the Qaddafi regime in Libya. This position wasn't based on any sympathy for Qaddafi or his supporters, but rather on a hard-headed calculation of the interests involved and the potential pitfalls. In particular, realists worried that Qaddafi's fall would lead to a prolonged power vacuum (it has), and that the groups we were supporting were unknown and unreliable. The intervention also set a bad precedent: Not only did the U.S. and its allies run roughshod over the Security Council resolution authorizing military action to protect civilians (but not regime change), but we were toppling an autocrat who had previously succumbed to Western pressure and given up his WMD programs. It's possible that Libya will settle down and become a success story for liberal interventionism, but the jury is still out.
#10. A growing focus on China. Realists focus mostly on power and believe that the anarchic structure of world politics encourages powerful states to compete with each other for security. Not necessarily because they want to, of course, but because powerful states cannot take each other's benevolent intentions for granted. Accordingly, realists are skeptical of the claim that Sino-American rivalry can be avoided by "engaging" China, by fostering tight economic ties, or by enmeshing Beijing in institutions designed and led primarily by the United States. Accordingly, realists would focus on strengthening security ties in Asia (while getting our Asian allies to pull their weight), and work to establish clearer "red lines" with China's leadership. Over time, making it harder for China to translate its economic wealth into military power will be in order as well. Realists don't seek a war with China or regard it as inevitable, but they believe that avoiding it is going to take a lot of careful attention to Asian security issues.
To be sure, both the Bush and Obama administrations have moved in this direction, as exemplified by the "strategic partnership" with India and the recent "pivot" to Asia. These shifts occurred in part because there were a few realists involved (e.g., former U.S. ambassador to India Robert Blackwill), and partly because the structural forces were impossible to ignore.
Not all realists would subscribe to every item on this list, of course, and one could add other items to it. For instance, if the EU member-states had been led by realists in recent decades, their ill-fated experiment with the Euro would never have been tried and Europe would be in much better economic shape today. Similarly, realists would have followed a different approach toward Iran, and would almost certainly have tried to follow up on earlier Iranian efforts to improve relations with a "grand bargain" that acknowledged Iran's right to nuclear enrichment but put stringent safeguards in place to discourage weaponization. (That seems to be where we are headed right now, but it remains to be seen if Washington and Tehran have the patience and political will to get there).
As noted above, realists may have been wrong about some of these items (e.g., the interventions in the Balkans and in Libya), and it's possible that U.S. leaders ultimately did the right thing in those cases on humanitarian (as opposed to strategic) grounds. I'll concede that possibility, but on the whole, I'd argue that both the United States and some key parts of the world would have been far better off if the United States had used its power in a more realistic fashion. It's too late to avoid the past mistakes, of course, but at least we can try to learn from them.
The big event at Harvard yesterday was "A Conversation with Henry Kissinger" at Sanders Theater. The event featured the 89-year-old statesman reflecting on his time at Harvard, his career in government, and the future relationship between the United States and China, along with several other topics. He was joined in the discussion by my colleagues Graham Allison (who moderated) and Joseph Nye, and by Jessica Blankshain, a graduate student from the Department of Government.
I won't try to summarize the whole conversation, but instead merely highlight a couple of moments that I found especially interesting. First, at one point Kissinger said he thought the best academic preparation for government service was training in philosophy, political theory, and history. In particular, he argued that training in political theory taught you how to think in a disciplined and rigorous manner, and knowledge of history was essential for grasping the broader political context in which decisions must be made. It was clear that he also sees a grounding in history as essential for understanding how different people see the world, and also for knowing something about the limits of the possible.
I found this observation intriguing because these subjects are not what schools of public policy typically emphasize, even though they are supposedly in the business of preparing students for careers in public service. The canonical curriculum in public policy emphasizes economics and statistics (i.e., regression analysis), sometimes combined with generic training in "public policy analysis" and political institutions. The Kennedy School (where I teach) does require MPP students to take one core course in ethics (which is grounded in political philosophy), but there's no required course in history and each year I feel my students know less and less about that important subject. Instead, they flock to courses on "leadership," as if this quality was something you can learn in a classroom in a semester or two. I would love to have asked Kissinger to elaborate on how aspiring public servants are being trained these days.
After Joe Nye asked him if there were any decisions he made that he wished he could do over (a question that Kissinger mostly evaded), he went on to reflect on how his thinking has changed over time. He noted that he has had lots of time to read and reflect since leaving government service, and he said there were many things about the world that he understood better now than when he was serving in government. He also said he was not as "self-confident" in some of his judgments as he had been when he was younger. But then he said he wasn't sure this greater wisdom would make him a better policymaker. The reason, he said, is that being a policymaker requires a powerful sense of self-confidence, precisely because so many decisions are not clear-cut -- they are 51/49 judgment calls. As he put it, "You don't get rewarded for your doubts." And in those circumstances, a little bit of bravado goes a long way; it might even be a job requirement.
It was entirely predictable, of course, that the event was briefly disrupted by a vocal protester who was quickly escorted from the room. One of the questions asked during the Q and A took a similar approach, reciting a list of Kissinger's alleged crimes and ending with the question "How do you sleep at night?" I understand where such questions come from, but I've also thought this tactic is a remarkably ineffective way to try to make a political point. Disrupting public gatherings is a form of free speech and I wouldn't try to ban it, but my experience is that it is almost always counterproductive. The reason is simple: When someone gets up and starts shouting accusations, it violates our innate sense of courtesy and almost always turns the crowd against the protester and toward the person they are attacking. I like spirited discourse as much as the next person, but I've found that a respectful, well-aimed, and devastating question usually opens more minds and does more damage than passionate denunciations do.
There's been a lot of needless hoopla over Obama's "open mic" comment at the Nuclear Security Summit, including an almost certainly ghost-written piece by Mitt Romney here at FP. Obama was overheard telling Russian President Dmitri Medvedev that he "would have more flexibility" to negotiate a deal on missile defense after the election, which is both correct and hardly a state secret. The flap illustrates the main point I was trying to make a few days ago, when I wrote about how the absurdly long U.S. election cycle was a major impediment to a more effective foreign policy. (It may also be an impediment to Romney's chances, because the longer the campaign goes on, the more opportunities he has for foot-in-mouth moments that expose his ignorance about foreign policy, including his silly comment about Russia being our major geopolitical rival).
In any case, the incident got me thinking about how much the arms control agenda has changed since the heyday of the Cold War. Back then, there was a serious constituency in the United States pushing nuclear arms control, which saw it as key to reducing the risk of nuclear war, managing the U.S.-Soviet relationship, and dampening the danger of international conflict more generally. Arms control was intended to save some money, preserve each side's second-strike deterrent capabilities, and help stabilize the political relationship between Moscow and Washington. It was thus a key ingredient in the basic agenda of détente, which sought to keep U.S.-Soviet competition within bounds. (One can argue about how effective it was, but it is worth noting that nuclear war didn't occur, and the U.S. and its allies triumphed over the Soviet Union without fighting a war with them.)
Accordingly, the main items on the arms control agenda involved direct negotiations with our Soviet adversaries (the SALT and START treaties, the INF treaty on intermediate nuclear forces in Europe, etc.). These efforts involved tough and protracted negotiations between more-or-less equals (even though the U.S. and its allies were a lot stronger than the Soviet Union and its various clients), and there was no possibility of either side issuing ultimatums or imposing a one-sided deal on the other. The other main arms control item was the Non-Proliferation Treaty (NPT), and this arrangement resulted from tacit collusion between the two superpowers to preserve their own nuclear superiority. After all, the basic NPT deal allowed nuclear powers to keep their own arsenals (in exchange for pledges to share nuclear technology and make some sort of long-term effort toward disarmament), while putting in place a regime that made it much harder for other states to join the nuclear club.
But what about now? Since the end of the Cold War, the "arms control" agenda has become decidedly one-sided. Yes, there's been a not-very-significant "New Start" treaty with Russia, which didn't alter the basic strategic relationship at all and which hardly anybody (including Governor Romney) has paid much attention to. The real action in arms control has been a series of U.S.-led efforts to get states to give up their existing arsenals or abandon existing nuclear programs. In the 1990s, we put tremendous pressure on Ukraine, Kazakhstan, and Belarus to give up the arsenals they inherited from the former Soviet Union, and we eventually succeeded. Then the United States nearly launched a preventive war against North Korea in 1994, and did various deals (e.g., the "Agreed Framework") to try to head off their development of nuclear weapons. We invaded Iraq in 2003 to stop Saddam's "Weapons of Mass Destruction" programs (which turned out to be fictitious -- our bad), and have been ratcheting up economic sanctions and waging a covert war against Iran to try to keep Tehran from getting too close to the nuclear weapons threshold. And we keep saying "all options are on the table," which is a threat to use force.
In short, instead of "arms control" being the product of mutual negotiation, as it was in the Cold War, it now consists of the United States making demands and ramping up pressure to get weak states to comply. Instead of being primarily a diplomatic process aimed at eliciting mutually beneficial cooperation (which might also help ameliorate mutual suspicions with current adversaries), arms control has become a coercive process designed to produce capitulation. This approach may have worked in a few cases (e.g., Libya, although even there the Bush administration made certain concessions to secure a final deal), but its overall track record is paltry. After all, North Korea eventually went ahead and tested a nuclear device, and escalating pressure on Iran has yet to convince its leaders to abandon their enrichment program. And as I've noted before, using military force would not eliminate Iran's ability to develop weapons if it wishes, and could easily convince them that they had no choice but to go ahead and weaponize.
Because material power is still the central currency in world politics, this tendency doesn't surprise me all that much. When the United States has to deal with near-equals, it understands that bargaining is necessary and that a successful outcome requires patience and compromise. But today, we think we can impose our will on almost anybody, so any sort of compromise is regarded as some sort of craven appeasement. But even a country as powerful as the United States cannot simply dictate to others -- as we should have learned by now from our experiences with Iraq, Afghanistan, and a few others -- and a disdain for genuine diplomacy (as opposed to merely issuing ultimatums and imposing sanctions) is getting in the way of potential deals that could reduce the risk of proliferation, dampen the danger of war, and enable U.S. leaders to turn their attention to other priorities. Being the world's #1 power confers many advantages, but it can also be a potent source of blind and counterproductive arrogance.
ALEXEY DRUZHININ/AFP/Getty Images
At the Big Think website, John Horgan argues that war is just a cultural practice that humankind could eventually abandon, unless we keep infecting ourselves with the "war virus" (h/t Andrew Sullivan). If one state gets infected by war-proneness, so his argument runs, its neighbors may have no choice but to follow suit and adopt similar measures in order to prevent themselves from being conquered. In Horgan's words (as reported by Mark Cheney here):
"Imagine your neighbor is a violent psychopath who is out for blood and land. You, on the other hand, are a person who wants peace. You would have few options but to embrace the ways of war for defense. So essentially your neighbor has infected you with war."
It's an arresting use of language, perhaps, but the history of social Darwinism should have taught us to be wary of bringing misplaced biological analogies into the study of world politics. Viral infections spread by very specific and well-known mechanisms -- e.g., they take over the DNA of neighboring cells and replicate themselves -- and that's not remotely like the mechanism that Horgan is identifying here. Instead, he's actually describing a situation where an external threat forces the leaders of neighboring states to rationally choose to adopt policies and strategies designed to ensure their survival. That's not how viruses spread: You don't catch a cold because you've decided the only way to protect yourself against your sneezing neighbor is to start sniffling and sneezing along with them.
The actual logic that Horgan is pointing to here is the basic "security dilemma" that realists have been talking about ever since John Herz. In a world where no agency or institution exists to protect states from each other, each is responsible for its own security. Because states cannot know each other's intentions with 100 percent certainty (either now or in the future), they have to prepare for the possibility that neighbors may do something nasty at some point. So they invest in their own armed forces or they look for powerful allies, especially if they think the possibility of trouble is fairly high. And once they do that, others have to worry about them in turn. This is the "tragedy" of great power politics identified by my colleague John Mearsheimer, and it's a much better explanation for security competition (and war) than some analogy to microbes.
To be fair, Horgan's larger point is simply that war is not a biological necessity; it is a specific political or cultural response to certain conditions and thus in theory could gradually be abandoned. This theme has been developed at length by John Mueller and more recently by Steven Pinker. I agree with Pinker's claim that the overall level of human violence has declined significantly over the past several centuries (mostly due to the emergence of increasingly stable domestic political orders, i.e., states), but I remain agnostic about the larger claims for a long-term reduction in inter-state violence. That trend is driven almost entirely by the absence of great-power war since 1945, and the absence of great-power war may have multiple and overlapping causes (bipolarity, nuclear weapons, the territorial separation of the U.S. and USSR during the Cold War, the spread of democracy, etc.) whose persistence is hard to forecast.
The absence of great-power war is a good thing, because major powers have the most capability and can do the greatest harm when their destructive capacities are fully roused. What we're seeing instead, however, is a combination of protracted conflicts among warlords, insurgents, or relatively weak states (think the Congo, Sudan, or Colombia) and wars of choice waged by the United States and other powerful states in various strategic backwaters, mostly against adversaries that we don't think can do much in response. At least we hope not.
Note: I've posted several times on the question of Sino-American relations. Today I feature a guest post by Yuan-kang Wang of Western Michigan University, who offers an interesting analysis of what China's past behavior might tell us about its future course.
By Yuan-kang Wang:
As a regular visitor who enjoys reading this blog, I thank Steve Walt for the invitation to contribute this guest post on the relationship between Chinese power, culture, and foreign policy behavior.
Steve (and others) have written about American exceptionalism. It won't surprise you to learn that China has its own brand. Most Chinese people -- be they the common man or the political, economic, and academic elite -- think of historical China as a shining civilization in the center of All-under-Heaven, radiating a splendid and peace-loving culture. Because Confucianism cherishes harmony and abhors war, this version portrays a China that has never behaved aggressively nor been an expansionist power throughout its 5,000 years of glorious history. Instead, a benevolent, humane Chinese world order is juxtaposed against the malevolent, ruthless power politics of the West.
The current government in Beijing has recruited Chinese exceptionalism into its notion of a "peaceful rise." One can find numerous examples of this line of thought in official white papers and statements by President Hu Jintao, Premier Wen Jiabao, and other officials. The message is clear: China's unique history, peaceful culture, and defensive mindset ensure a power that will rise peacefully.
All nations tend to see their history as exceptional, and these beliefs usually contain a heavy dose of fiction. Here are the top three myths of contemporary Chinese exceptionalism.
Myth #1: China did not expand when it was strong.
Many Chinese firmly believe that China does not have a tradition of foreign expansion. The empirical record, however, shows otherwise. The history of the Song dynasty (960-1279) and the Ming dynasty (1368-1644) shows that Confucian China was far from being a pacifist state. On the contrary, Song and Ming leaders preferred to settle disputes by force when they felt the country was strong, and in general China was expansionist whenever it enjoyed a preponderance of power. As a regional hegemon, the early Ming China launched eight large-scale attacks on the Mongols, annexed Vietnam as a Chinese province, and established naval dominance in the region.
But Confucian China could also be accommodating and conciliatory when it lacked the power to defeat adversaries. The Song dynasty, for example, accepted its inferior status as a vassal of the stronger Jin empire in the twelfth century. Chinese leaders justified their decision by invoking the Confucian aversion to war, arguing that China should use the period of peace to build up strength and bide its time until it had developed the capabilities for attack. In short, leaders in Confucian China were acutely sensitive to balance-of-power considerations, just as realism depicts.
Myth #2: The Seven Voyages of Zheng He demonstrate the peaceful nature of Chinese power.
In the early fifteenth century, the Chinese dispatched seven spectacular voyages led by Zheng He to Southeast Asia, the Indian subcontinent, the Middle East, and East Africa. The Chinese like to point out that Zheng He's fleets did not conquer an inch of land, unlike the brutal, aggressive Westerners who colonized much of the world. Instead, they were simply ambassadors of peace exploring exotic places.
This simplistic view, however, overlooks the massive naval power of the fleet -- 27,000 soldiers on 250 ships -- which allowed the Chinese to "shock and awe" foreigners into submission. The Chinese fleet engaged in widespread "power projection" activities, expanding the Confucian tribute system and disciplining unruly states. As a result, many foreigners came to the Ming court to pay tribute. Moreover, the supposedly peaceful Zheng He used military force at least three times; he even captured the king of modern-day Sri Lanka and delivered him to China for disobeying Ming authority. Perhaps we should let the admiral speak for himself:
"When we reached the foreign countries, we captured barbarian kings who were disrespectful and resisted Chinese civilization. We exterminated bandit soldiers who looted and plundered recklessly. Because of this, the sea lanes became clear and peaceful, and foreign peoples could pursue their occupations in safety."
Myth #3: The Great Wall of China symbolizes a nation preoccupied with defense.
You've probably heard this before: China adheres to a "purely defensive" grand strategy. The Chinese built the Great Wall not to attack but to defend.
Well, the first thing you need to remember about the Great Wall is that it has not always been there. The wall we see today was built by Ming China, and it was built only after repeated Chinese attacks against the Mongols had failed. There was no wall-building in early Ming China, because at that time the country enjoyed a preponderance of power and had no need for additional defenses. At that point, the Chinese preferred to be on the offensive. Ming China built the Great Wall only after its relative power had declined.
In essence, Confucian China did not behave much differently from other great powers in history, despite its different culture and domestic institutions. As realism suggests, the anarchic structure of the system compelled it to compete for power, overriding domestic and individual factors.
Thus, Chinese history suggests that its foreign policy behavior is highly sensitive to its relative power. If its power continues to increase, China will try to expand its sphere of influence in East Asia. This policy will inevitably bring it into a security competition with the United States in the region and beyond. Washington is extricating itself from the distractions of Iraq and Afghanistan and "pivoting" toward Asia. As the Chinese saying goes, "One mountain cannot accommodate two tigers." Brace yourself. The game is on.
Yuan-kang Wang is an associate professor in the Department of Sociology and the School of Public Affairs and Administration at Western Michigan University. He is the author of Harmony and War: Confucian Culture and Chinese Power Politics.
FREDERIC J. BROWN/AFP/Getty Images
You know a case for war is weak when its advocates have to marshal blatant untruths in order to convince people that their advice should be followed. Exhibit A is today's alarmist op-ed in the New York Times, in which former IDF general Amos Yadlin argues for a preventive strike against Iran's nuclear facilities.
He recites the by-now familiar arguments for an attack, and makes it clear that he thinks Obama should make an "ironclad" pledge to do it if Iran doesn't cease its nuclear activities. But the big historical howler comes in the middle of the piece, where he attempts to deal with the counter-argument that an attack would only delay an Iranian program, and probably not for all that long. He writes:
"After the Osirak attack and the destruction of the Syrian reactor in 2007, the Iraqi and Syrian nuclear programs were never fully resumed."
This claim is at best deeply misleading and at worst simply false. It's technically true that there hasn't been a resumption of either the Iraqi or Syrian programs since 2007, but what about the twenty-six-year gap between the Osirak raid in 1981 and the raid on Syria? What happened during those intervening years? As Malfrid Hegghammer, Daniel Reiter, and Richard Betts have all shown, the destruction of Osirak led to an elite consensus that Iraq needed its own deterrent, and led Saddam Hussein to order a redoubling of Iraq's nuclear program in a more clandestine fashion. This effort was so successful that the UN inspectors who entered Iraq after the 1991 Gulf War were surprised by how extensive the program was and how close it had come to producing a bomb. Indeed, if Saddam had been smart enough to wait a few more years, he might have crossed the nuclear finish line.
Thus, the true history teaches the opposite lesson from the one Yadlin is proposing. In the Iraqi case, a preventive strike reinforced Iraq's interest in acquiring a deterrent, and led Iraq to pursue it in ways that were more difficult to detect or prevent. That is what Iran is likely to do as well if Israel or the United States were foolish enough to strike them. U.S. intelligence still believes Iran has not made a final decision to weaponize; ironically, an Israeli or U.S. attack is the step that is most likely to push them over the edge.
It's hardly surprising that some Israelis would like the United States to shoulder the burden of bombing Iran. It's also not surprising that they would make up specious arguments or distort history to do this; the Bush administration got us into the Iraq war in the same way. But the Times' editors ought to insist that op-eds, whatever their positions, meet at least minimum standards for historical accuracy. And they don't even need to scour the academic literature; all they had to do was keep track of what they had already published.
In any case, if Americans fall for this sort of contorted historical analysis, we'll have only ourselves to blame. Instead of giving "ironclad" guarantees that we will launch preventive war, we'd be better served if Obama merely reminded Netanyahu that Israeli defense minister Ehud Barak doesn't think Iran is an existential threat, and that the former head of the Mossad, Meir Dagan, has called an attack on Iran "the stupidest thing I ever heard."
Win McNamee/Getty Images
One of the great puzzles of contemporary national security policy is why the mighty United States gets its knickers in a twist over lots of security issues in lots of unimportant places. After all, it has the world's most advanced economy and by far the world's most powerful military force, it is insulated from many world problems by two enormous oceans (which do still matter, by the way), and it has an array of stable allies in most corners of the world. And oh yes, it has a nuclear deterrent consisting of thousands of warheads, more than enough to devastate any country that threatened the United States directly or menaced our independence.
Yet Americans are constantly fretting about supposedly grave threats in far-flung corners of the world, and marching off to spend billions (or even trillions) fighting long and inconclusive wars in strategic backwaters like Afghanistan. To be perfectly blunt, it makes one wonder if the national security establishment in this country is even capable of a careful, sober, even-tempered analysis anymore.
I say all this as a preamble to a recommendation for your reading list: Micah Zenko and Michael Cohen's terrific Foreign Affairs article "Clear and Present Safety." It's a rare piece of analytic sanity, and I hope it gets widely read. Money quotation:
"Within the foreign policy elite, there exists a pervasive belief that the post-Cold War world is a treacherous place, full of great uncertainty and grave risks...There is just one problem. It is simply wrong. The world that the United States inhabits today is a remarkably safe and secure place. It is a world with fewer violent conflicts and greater political freedom than at virtually any other point in human history...The United States faces no plausible existential threats, no great-power rival, and no near-term competition for the role of global hegemon. The U.S. military is the world's most powerful, and even in the middle of a sustained downturn, the U.S. economy remains one of the world's most vibrant and adaptive...[Yet] this reality is barely reflected in U.S. national security strategy or in American foreign policy debates."
Joe Raedle/Getty Images
Imagine that you were the dean of a public policy school, and of course you wanted to boost your school's reputation and attract lots of outstanding applicants for admission. There are several ways to do this, but one familiar strategy would be to hire some really famous, world-class faculty: people with truly global reputations who would raise the visibility of your school and make more prospective students want to attend and rub shoulders with them.
I can think of lots of high-profile academics to go after in economics, political science, history, and a few other fields. For example, an ambitious dean could try to recruit global superstars like Paul Krugman, Robert Putnam, Amartya Sen, Joe Stiglitz, Theda Skocpol, Anthony Giddens, Martha Nussbaum, K. Anthony Appiah, Elinor Ostrom, John Lewis Gaddis, or Frank Fukuyama. Or you could go after a highly visible former politician or policy-maker (e.g., Kofi Annan, Condoleezza Rice, Javier Solana, etc.) and use their fame to generate buzz and attract more applicants.
So here's a puzzle: even though public policy schools are supposed to train people to work in and lead public sector organizations (including government agencies and non-profits), I can't think of a scholar of public management or public administration with the same sort of marquee value as the people I just mentioned, and whose hiring would catapult a school up the rankings dramatically.
Please note: I am not saying that there are no excellent scholars in these domains -- among other things, I think I have some pretty terrific colleagues who work in this area -- and I'm not saying that an ambitious dean couldn't raise his or her school's profile somewhat by recruiting the best people in this area. And I'm certainly not suggesting that scholars who work in this area aren't doing useful work teaching students and advising government agencies and other organizations about how they could operate more effectively. But my sense is that the sub-fields of public management or public administration aren't producing highly visible "public intellectuals" or attracting a lot of attention outside of the sub-field itself.
But I'm not sure why this is the case. For starters, intellectuals studying the workings of public sector organizations used to be a prominent part of sociology and political science (going all the way back to Max Weber), and this body of work was a central part of the social sciences for much of the twentieth century. I am thinking here of scholars such as Dwight Waldo, Robert K. Merton, Aaron Wildavsky, Charles Perrow, James Q. Wilson, Charles Lindblom, James March, Herbert Simon, or Anthony Downs, all of whom cast long shadows over their respective fields and had an enormous impact on how we think about bureaucracies and public organizations. Moreover, management experts at business schools have long enjoyed, and continue to enjoy, a lot of global visibility -- think Peter Drucker, Jim Collins, Clayton Christensen, Michael Porter, etc. -- which suggests that it is not the topic of "management" or organizational behavior itself that is the problem.
Finally, it's hard to argue that there isn't a continued need for bold ideas that would help improve the quality of public management. The public sector consumes more than 40 percent of GDP in a lot of advanced industrial countries, and the lack of effective public institutions is a major obstacle to economic and social advancement in many developing countries. So the lack of superstar figures in this field isn't because the topic itself is unimportant.
So how might one explain this pattern? I'm not sure. One possibility, which I'm not sure is correct, is that the long effort to discredit public sector organizations and to encourage privatization has made studying such organizations less fashionable. A second possibility is that the field of organizational behavior has gradually become more "micro-oriented" -- drawing more on social psychology than on political science, sociology, or history -- and that this trend has made the field more rigorous in purely academic terms but also less interesting to anyone but specialists within the field. Or perhaps the lack of towering figures in the study of public administration at present is just a manifestation of the broader "cult of irrelevance" that I've discussed before: even though we need public institutions that work well, the scholars that inhabit elite departments of political science, economics, or sociology just aren't that interested in doing that anymore. Which is too bad.
Chip Somodevilla/Getty Images
Christmas is traditionally thought of as a season of peace. Warring nations sometimes declare a Christmas ceasefire, the Pope's Christmas message is ordinarily a call for peace, and around the world churchgoers will hear sermons and offer prayers for an end to violence. Even as we watch the continuing struggles in Afghanistan, Syria, Yemen, Colombia, Somalia, and elsewhere, and even as the world's nations continue to devote more than one-and-a-half trillion dollars each year to preparations for war, billions of people remain united in the hope that such tragic waste will one day end.
This will be my last post before Christmas Day, and though I'm not a believer, today I'm thinking about peace. Realists are often portrayed as grim and gloomy hawks who believe that human beings can never fully overcome the insecurities of the state of nature, but that's a misleading caricature at best. True, realists are mindful of human frailties, convinced that the lack of a central authority in world affairs creates powerful incentives for states to compete, and aware that sometimes this competition leads to the use of force. But realists take no joy in this situation -- as John Mearsheimer emphasizes, this feature of power politics is a tragedy -- and realists are therefore deeply concerned with finding ways to keep these dangerous and destructive tendencies in check. Because realists appreciate the evils that war brings, it is hardly surprising that they have been at the forefront of opposition to foolish wars such as Vietnam or Iraq.
Given all we know about the costs and risks of war -- a lesson that the past decade should have seared into our collective consciousness -- what I find both striking and depressing is the enthusiasm that so many commentators still have for more of the same. We still have a chorus of pundits eager for war with Iran, for example, and there's another well-populated choir convinced that the answers to contemporary global problems are more drone strikes, more energetic use of special forces and covert action, and greater secrecy here at home.
And what is equally striking is that the goal of peace plays a minuscule role in contemporary political discourse. As my colleague Nicholas Burns points out in a must-read column in today's Boston Globe, with the exception of libertarian Ron Paul, none of the current presidential contenders have made peace a central theme in their campaign. It was not always this way: our first president, George Washington, once said that "My first wish is to see this plague of mankind, war, banished from the earth," and Abraham Lincoln understood that "war at the best is terrible." Woodrow Wilson may have lent his name to the sometimes overweening U.S. effort to spread democratic ideals around the globe, but he also warned his countrymen to exercise the "self-restraint of a truly great nation, which realizes its own power and scorns to misuse it." And let us not forget that Dwight D. Eisenhower, who knew as much about war as any American, once remarked that "America's leadership and prestige depend, not merely upon our unmatched material progress, riches and military strength, but on how we use our power in the interests of world peace and human betterment."
Yet such sentiments seem notably absent in the hearts of those who now seek to be commander-in-chief, including the present incumbent. As Burns observes, even Obama's speech accepting the Nobel Peace Prize was mostly a defense of the necessity of force. And today, most of the presidential aspirants seem more interested in convincing voters that they know how to channel their inner Rambo and that they will not hesitate to use force wherever and whenever they deem it necessary. Frankly, I'd be happier thinking that they would hesitate, and think twice -- or even thrice -- before sending the nation to another war.
Part of the problem, as the former president of the Council on Foreign Relations, Leslie Gelb, admitted a couple of years ago, is that a reputation for tough-minded hawkishness has become a prerequisite for advancement and credibility in the foreign-policy establishment. Think about it: even though the United States is probably the most secure great power in history, an ambitious up-and-coming policy wonk in D.C. is more likely to advance rapidly if he or she is a vocal proponent of using American power than if he or she is seen as skeptical or even somewhat averse to flexing U.S. military might at every occasion. And God forbid that someone who aspires to rise in Washington gets a reputation for being seriously interested in peace. That might get you a job at AID or at some left-wing think tank, but you aren't going to make a lot of short lists for State, Defense, or the NSC.
This tendency to reward bellicosity pervades our politics, and not in a good way. Look at the venom that pollutes talk radio, and the scorched-earth partisanship (mostly flowing from the GOP) that has paralyzed the legislative branch on a host of vital issues. Read the talk-backs on virtually any political website -- including this one -- and observe how brave commenters, safely cloaked in internet anonymity, devote hours to flinging vile insults at each other. Or consider the ease with which prominent figures here and abroad will condemn whole categories of people -- gays, Muslims, Jews, foreigners -- without having met a single one or taking any time to consider how the world might look from someone else's perspective. When one looks at political discourse -- even in America, this most secure and fortunate of countries -- it requires no great imagination to see why it is so hard to keep humans from fighting.
It is a discouraging picture, to be sure, but this is not the season for despair. For this week, at least, I choose to see the glass as half-full. This Christmas, I will reflect on the possibility that Steve Pinker, John Mueller, and others are right, and that humankind, for all its continued woes, is nonetheless moving away from its very violent past. I shall look for hopeful signs amid the tumult. I shall bask in the comforting embrace of family and friends, and think hard about what I can do better in the months and years ahead. I hope all of you do too. And whether you're someone who tends to nod in agreement when you read this blog, or someone who thinks I've yet to get anything right, may this season and the year to come bring you love, hope ... and peace.
And here, for your enjoyment, are two musical bonuses to accompany the title of this post:
Jewel Samad/AFP/Getty Images
It's the holiday season, but Death does not observe such man-made conventions. I've been more conscious of that fact this past week, in part because my mother would have been 84 last Thursday and she is woven into a whole tapestry of my holiday memories. It is at such times that the loss is most acute.
And as it happens, we have seen three notable departures this week. Herewith a brief comment on each.
1. Christopher Hitchens. I never met Hitchens (though my wife knew him slightly back in the 1980s), but I've enjoyed several of his books and a fair bit of his commentary over the years. His talents were considerable and his achievements worthy of note (and I'd give a fair bit to be as able and witty a writer as he was), but the outpouring of tributes this past week struck me as decidedly over-the-top. (I can't help but think that he would have been first in line to skewer most of them). I don't doubt the sincerity of his friends' affection or question their sense of loss, but as Glenn Greenwald notes, if you want people to say nice things about you when you're gone, make sure a lot of your friends are well-connected Establishment writers.
Like a lot of public intellectuals, Hitchens embraced an odd set of ideological fixations at various points in his career. He started out a Trotskyite, and ended up a cranky neoconservative fellow-traveler (at least regarding the Iraq War and the threat from radical Islam). And his public persona never seemed tempered by self-doubt, despite having been massively wrong on more than one occasion. A bit more humility might have made him a less successful writer, but also a more sensible one.
Is it possible that his oscillations reflected a lack of deep intellectual foundations? He was clearly formidably well-read, but apart from his outspoken atheism, I'm not sure he had a well-developed theory for how the world really worked. By his own account, the unifying core of his thinking was a hatred of "the totalitarian"--and especially any movement or ruler who tried to control what we think--but isn't that about the easiest target for anyone (and especially a writer) to pick? I mean, who's going to rise to totalitarianism's defense in this day and age, and especially inside the American Establishment? (Civil liberties may be under siege these days, but we have a ways to go before we come close to true tyranny.)
That said, I was also struck by one more thought upon reading all those commentaries on his career. I cannot imagine the American system of higher education producing anyone quite like him, and especially not the typical American Ph.D. program in the social sciences. Whatever his flaws may have been, Hitchens was wide-ranging, provocative, willing to take unpopular positions, and above all fun to read. Whereas graduate education in the United States is increasingly designed to take smart and ambitious young students, stamp most of the fire and creativity out of them, and make them safe, largely indistinguishable from each other, and above all, boring. (There's a reason we call them "academic disciplines"). So if Hitchens is your role model, for god's (note the small "g") sake don't go get a Ph.D.
David Levenson/Getty Images
I've detected a growing tendency to issue obituaries for the "Arab spring." This impulse is understandable given the relentless turmoil in Yemen, the brutal repression that continues in Syria, the simmering tensions in Libya and Bahrain, and the recent resurgence of sometimes violent protest against the military regime in Egypt. Not surprisingly, early hopes that the Arab world was at the dawn of a new era have been dashed -- or at least diminished. And that's why pundits like Tom Friedman are now crossing their fingers and hoping for the reincarnation of Nelson Mandela in each of these states.
But if the history of revolutions tells us anything, it is that rebuilding new political orders is a protracted, difficult, and unpredictable process, and having a few Mandelas around is no guarantee of success. Why? Because once the existing political order has collapsed, the stakes for key groups in society rise dramatically. The creation of new institutions -- in effect, the development of new rules for ordering political life -- inevitably creates new winners and losers. And everyone knows this. Not only does this situation encourage more and more groups to join the process of political struggle, but awareness that high stakes are involved also gives them incentives to use more extreme means, including violence.
Under these conditions, it is a pipedream to think that key actors in a complex and troubled society like Egypt or Libya (or in the future, Syria) could quickly agree on new political institutions and infuse them with legitimacy. Even if interim rulers write a quick constitution, hold a referendum, or elect new representatives, those whose interests are undermined by the outcomes are bound to question the new rules and the process and to do what they can to undermine or amend them. What one should expect, therefore, are half-measures, false starts, prolonged uncertainty, and highly contingent events, where seemingly random events (a riot, an accident, an episode of overt foreign interference, an unexpected flurry of violence, etc.) can alter the course of events in far-reaching ways. Tunisia notwithstanding, what you are unlikely to get is a quick and easy consensus on new institutions.
Remember the French Revolution? The storming of the Bastille took place in July 1789, the nobility was abolished by the National Assembly the following year, and Louis XVI tried unsuccessfully to flee in 1791 before being forced to accept a new constitution. Internal turmoil and foreign interference eventually led to war in 1792, Louis and Marie Antoinette were executed in 1793, and Paris was soon engulfed by the Jacobin terror, which eventually burned itself out. A new constitution was adopted in 1795, establishing a government known as the "Directory," which was eventually overthrown by Napoleon's coup d'etat on 18 Brumaire, 1799. By the time Napoleon seized power, it had been more than ten years since the initial revolutionary upheaval.
To judge by that timetable, the "Arab spring" has a long way to go. And other cases offer a similar lesson. The Russian revolution starts with the fall of the Tsarist regime in March 1917 and the formation of Kerensky's provisional government, which is subsequently overthrown by the Bolshevik coup a few months later. But the Bolsheviks' hold on power isn't secure until their victory in the Russian Civil War, which isn't over until 1923. The Soviet political order endured recurrent power struggles over the next decade, until Joseph Stalin vanquished his various opponents and established a personal dictatorship.
Or take a more recent case, Iran. The revolution begins in 1978, with a steadily escalating series of street demonstrations. The shah flees into exile in January 1979, the Ayatollah Khomeini returns in February and appoints Mehdi Bazargan as prime minister of an interim government. A new constitution is drafted by October, but there is a continuing struggle for power between liberal, Islamist, and other groups.
The first president of the new "Islamic Republic," Abolhassan Bani-Sadr, is impeached in 1981, and the outbreak of the Iran-Iraq war strengthens hardliners and provides an opportunity for a crackdown against some prominent members of the original revolutionary movement. The Islamic republic remains a work-in-progress to this day, with the role of the "Supreme Jurisprudent," the Revolutionary Guards, the clergy, the presidency, and the Majlis remaining in flux.
Even the comparatively benign American Revolution was hardly a done deal when the peace treaty with England was signed in 1783. Independence had required the colonists to fight a lengthy war, and the fledgling republic then faced several armed rebellions, most notably Shays' Rebellion in 1786. These challenges revealed the inadequacies of the original Articles of Confederation (1777-1789), leading to the drafting and adoption of what is now the U.S. Constitution.
In short, anybody who thought that the events that swept through the Arab world in 2011 were going to produce stable and orderly outcomes quickly was living in a dream world. To say this is not to oppose what has happened, or to believe that the old orders could or should have continued. Rather, it is to recognize that radical reform -- even revolution -- is a long, difficult, and uncertain process, and that the ride is likely to be a bumpy one for years to come.
History also warns that outside powers have at best limited influence over the outcomes of a genuine revolutionary process. Even well-intentioned efforts to aid progressive forces can backfire, as can overt efforts to thwart them. Overall, a policy of "benevolent neglect" may be the more prudent course, making it clear that outsiders are prepared to let each country's citizens choose their own order, provided that important foreign policy redlines are not crossed. But for a country like the United States, which still sees itself as a model for others and tends to think that it has the right and the wisdom to tell them what to do, patience and restraint can be hard to sustain. And patience is what is needed most these days.
Perhaps the single most remarkable development in 2011 is the wave of political protests that have occurred in widely-varying political contexts. In addition to the various upheavals that constitute the "Arab Spring," we've also seen tent cities in Israel, the "Occupy Wall Street" movement and its clones here in the United States, and various imitators in both Europe and Asia. This wave of political contagion is more widespread than the "velvet revolutions" of 1989 (though not yet as significant), and perhaps the nearest analogue would be the wave of youth revolutions and upheavals that occurred back in 1968.
What is going on here? Is there a common set of causes at work, or at least a common thread to otherwise diverse phenomena? I think so, because I see these upheavals as fueled by three important global developments.
The first factor is economic globalization, which has made many states both sensitive and vulnerable to events in far-away places, and led to rising inequality both between and within countries. Yet most governments have failed to enact remedial measures to soften the consequences of economic change and to restore a more level distribution of income, thereby ensuring some degree of economic pain and political discontent.
The second development is the globalization of information, which allows events and ideas to spread much more quickly. As a result, demonstrators in Cairo can watch what's happening in Tunis and imitate it, and then other people in other countries get the idea that protest can be effective, even if their particular grievances are somewhat different. And so it spreads, as the radical idea of ordinary people taking action against the seemingly impregnable becomes increasingly contagious. Plus, each group can learn from each other and feed off the sense of being part of a larger process, instead of feeling like isolated and powerless individuals with scant hope of success. This sort of thing has happened before in world history (e.g., in 1789, 1848, 1919, 1989, etc.), but never in so many far-flung and widely different contexts.
The third reason is the increasingly-evident incompetence and/or corruption of governing elites in many countries, and the tendency of governments to do too much to protect wealthy and powerful interests and not enough to help ordinary people. In Egypt, it was the overt corruption of the Mubarak regime, whether in the form of privileged deals for military officers or for Mubarak's son. In the United States, it was the taxpayer-funded rescue of "too big to fail" financial institutions as well as the "too-well connected to fail" recycling of some of the same people who helped create the whole mess in the first place. And then there's the continued recycling of policy ideas that had been discredited by events but never discarded. People may be disappointed by Obama, but real disenchantment comes from the growing realization that replacing him wouldn't make much difference and might make things much worse. You know the line: "Meet the New Boss....Same as the Old Boss." (Turns out Pete Townshend was a prophet when he wrote "Won't Get Fooled Again," which would be a nice anthem for many of these movements.)
There is, of course, a deeper taproot to all this. As my colleague Jenny Mansbridge reminded me in a superb talk I attended last week (which will be published next month in PS), the present combination of economic inequality and political gridlock is fatal to the proper functioning of democratic orders. In a capitalist democracy, corporate interests tend to be wealthier than the rest of society, and the state is the only actor powerful enough to intervene to prevent corporate interests from going too far and exploiting their position. This is what happened in the Gilded Age and again in the Roaring 20s, which eventually led to the Progressive Era and later the New Deal.
But if the political system is gridlocked, then the state cannot act quickly or decisively to retard corporate power. Even worse, as corporate interests grow stronger they tend to acquire greater political power (and especially when a tame Supreme Court helps them, as it did in the Citizens United decision). Instead of just hamstringing the state, corporate interests can get it to enact laws that favor them even more. The result will be rising economic inequality and precisely the sort of irresponsible and unregulated behavior that led to the Great Recession of 2007.
Put these three things together, and you have a recipe for global protests in very different countries. Despite the many differences between conditions in the United States, in Greece, in Egypt, in Syria, in Israel, or elsewhere, what unites the 2011 wave of global protest is the shared belief that the People in Charge do not know what they are doing, care more about their own wealth and well-being than they do about the common weal, or are simply too spineless and shallow to do what at least a few of them secretly know to be right.
Ask yourself: how many contemporary political leaders do you genuinely admire? How many of them would rate a paragraph, let alone a whole chapter, in a revised edition of Profiles in Courage? How many of them seem capable of giving you a straight answer to a hard question, as opposed to offering you a lot of happy double-talk? How many of them are better at making a powerful speech than they are at taking a principled stand and sticking to it? How many of them have really got your back, as opposed to pandering to the endless parade of well-heeled lobbyists and special interest groups? Is there a political leader in your country who is not for sale?
If you've been paying attention, and you can't find such leaders in your country, and you have been watching the obscenely wealthy get richer and more powerful, so that they can rig the game to make themselves richer still, then you'd probably think about painting a sign and getting out in the streets. And if I didn't already have this blog for my soap-box, maybe I would too.
So today I'm watching stock markets around the world go into free fall, and the following set of thoughts struck me. For starters, what if the world economy hits a "perfect storm"? The United States is already well on its way to a "lost decade," mostly because the Bush administration created an enormous mess and Obama, his advisors, and the Congress combined to do too little back in 2009. Europe is still teetering on the brink of meltdown, and some people have real concerns about China's overheated and opaque economy too. And these problems are all connected, and not just by bad loans, credit-default swaps, and the like. If any of these big economies heads back into recession, that will slow the others and could -- in the worst case -- send us spiraling back down into the sort of economic tailspin not seen since the 1930s.
I am not an economist, and I have no idea how likely that "perfect storm" scenario is. But remember that what ultimately got the United States out of the Great Depression was World War II. Suddenly there was a war to win, and the American people didn't mind deficit spending and didn't mind devoting over 40 percent of GDP to defense. And they also accepted that sacrifices would be needed -- rationing, scrap drives, a draft, and the like -- and the war muted the partisan wrangling of the 1930s. That gigantic Keynesian stimulus finally got the economy roaring to life.
So here's my question: in the nuclear age, the danger of a World War II-style global conventional war is greatly reduced, and such a war may even be impossible. And even the most hard-edged realist would have trouble finding the equivalent of Nazi Germany or Imperial Japan in today's world (by comparison, the Islamic Republic of Iran, with a $10 billion defense budget that is less than 3 percent of U.S. national security spending, isn't remotely in the same league). So if the world were to fall into an economic abyss and a big conventional war is neither likely nor desirable (and let me make it clear that I think replaying World War II would be a VERY BAD THING), then how would we dig ourselves out? And how long would it take, especially when you consider just how dysfunctional, fact-free, and irresponsible our politics has become?
I've been in Berlin since last Thursday, and it's been an interesting exercise in slightly rueful nostalgia. I lived in West Berlin for a semester in 1976, as part of an undergraduate overseas study program. It was the first foreign country I'd ever visited and one of the great formative experiences of my early adult life.
I've been back for very brief trips twice (in 1991 and again in 2007) yet this time I've found that my memories from that first trip aren't very reliable, and even supposedly familiar haunts look odd. Of course, this is partly because Berlin has been transformed by reunification -- most obviously in the areas where the Wall was -- but also because it has been thirty-five years. Cities can change a lot in that time, and my own memories have clearly faded with the passage of time. There are moments when the past comes back vividly, as when I read the U-bahn (subway) map and recall the names of the stations on the route from my apartment to class, or when I heard the recorded announcement saying "zurückbleiben!" just before the subway doors close. But apart from those Proustian moments, it mostly feels like I am visiting an unfamiliar place.
I took a walk last Thursday after I arrived, strolling from my hotel through the Tiergarten to the Holocaust Memorial -- which is very effective and moving, though not without controversy -- and then on to Pariser Platz. This is the area just east of the Brandenburger Tor, and it was an abandoned zone during the Cold War, with large empty spaces around the Wall itself. It has now been transformed into a vast and inviting public square, complete with fancy hotels, a Starbucks, the "Kennedy Museum," and other classic tourist attractions. There's a wonderful bit of not-quite-accidental symbolism in the fact that the British, French, and American embassies are all located there. These were the three Western powers that governed different German zones after World War II, and it is probably no accident that they ended up with this choice real estate in the very heart of reunified Berlin.
Yesterday I wandered through some old haunts in the center of what was West Germany (Kurfurstendamm, Savigny Platz, Zoologischer Garten, etc.), and then took the subway out to a trendy neighborhood in the old East Berlin (Prenzlauer Berg). There the contrast with 35 years ago was really striking; my overwhelming sense of the old DDR was drab and monotonal grey ... but today this neighborhood is funky and energetic and artsy. And I kept reflecting on how successive German governments made rebuilding and restoring Berlin a national priority and actually pulled it off, even if it hasn't become an industrial or financial center again. I wonder what it would take to get the United States to do something like that.
By the way, the conference I attended on "Social Science and the Public Sphere" was quite enjoyable, and I learned a lot from several of the papers and from the ensuing discussion. Sociologist Michael Burawoy gave two presentations, one on different modes of knowledge ("professional," "critical," "policy," and "public") and another on the threats facing the modern university (#1: excessive regulation, on the British model, and #2: excessive marketization, on the U.S. model). Not sure he persuaded me completely, but lots to think about. There was also a fascinating paper on the history of economic thought by Norwegian economist Erik Reinert, showing how economics evolved in a path-dependent fashion and that there were several forks in the intellectual road where the field could have gone in a more historical, institutional, and diverse direction, instead of the individualist, rationalist, and hyper-mathematical course the field has taken (at least in North America). He also quoted a passage from philosopher Francis Bacon's The Advancement of Learning on "degenerate knowledge" which could easily apply to lots of social science today:
Surely, like as many substances in nature which are solid do putrefy and corrupt into worms;--so it is the property of good and sound knowledge to putrefy and dissolve into a number of subtle, idle, unwholesome, and (as I may term them) vermiculate questions which have indeed a kind of quickness and life of spirit, but no soundness of matter or goodness of quality. This kind of degenerate learning did chiefly reign amongst the schoolmen, who having sharp and strong wits, and abundance of leisure, and small variety of reading, but their wits being shut up in the cells of a few authors (chiefly Aristotle their dictator) as their persons were shut up in the cells of monasteries and colleges, and knowing little history, either of nature or time, did out of no great quantity of matter and infinite agitation of wit spin out unto us those laborious webs of learning which are extant in their books. For the wit and mind of man, if it work upon matter, which is the contemplation of the creatures of God, worketh according to the stuff and is limited thereby; but if it work upon itself, as the spider worketh his web, then it is endless, and brings forth indeed cobwebs of learning, admirable for the fineness of thread and work, but of no substance or profit.
Yeah, what he said.
Economist Mark Thoma gave a nice presentation on his experiences as the author of a well-known economics blog, and historian Thomas Bender of NYU contributed a terrific paper on the evolution of the social sciences in the United States. Among other things, I learned from it that when Johns Hopkins University pioneered the Ph.D. degree here in America, it was not intended primarily as a credential for future academics. Instead, Bender writes, "it was intended to instill in [recipients] ‘the mental culture' that would serve them in careers in ‘civil service,' ‘public journalists' or, more generally, the ‘duties of public life.'" In other words, it took another few decades to create the inward-looking and frequently navel-gazing enterprises that the social sciences have become.
The audience offered up some challenging questions, and the other participants were a stimulating and likeable group. All in all, well worth the trip. And then yesterday I gave a lecture at the Deutsche Gesellschaft für Auswärtige Politik (DGAP, or "German Council on Foreign Affairs"), summarizing a forthcoming article on the "twilight of the American era." (You can get a preliminary sense of my argument here). I enjoyed the talk and especially the questions, and we could easily have continued the conversation longer. At dinner with some DGAP colleagues we spent a fair bit of time talking about the future of the Euro, and I would say that most of them were more optimistic than I have been. In particular, they emphasized the difference between public policy and public opinion: yes, German popular opinion is hostile to further bailouts, but German politicians understand that at the end of the day, letting Greece go down the tubes would be bad for everyone, including Germany. So long as they can make further aid conditional on genuine reforms, eventually the deal will get done. We'll see.
A final comment from the perspective of someone who bikes to work daily in Boston: Berlin is a wonderful city for bicyclists and there are lots of them. For one thing it's mostly flat, and doesn't get snow like we do in New England. But the Berliners have also gone to great lengths to make bike travel easy and safe, with dedicated lanes on streets and sidewalks. And confirming stereotypes of Teutonic orderliness, you find most of the cyclists observing all the traffic regulations, including waiting at street lights even when there are no cars around and it would be perfectly safe to cross. Definitely not instinctive scofflaws like me. Boston has been trying to do something similar for its cyclists, but let's just say we've got a ways to go. But once the price of gas gets high enough, maybe American cities will do more to encourage bicycle commuting. There would be less traffic, and we'd all be a lot healthier too.
I'm typing this from Lille, where I participated in a seminar on the "Arab spring" at the University and gave an evening lecture on U.S. Middle East policy and the role of the -- surprise -- Israel lobby. We had a good discussion, and the students asked some excellent questions. And now home to Boston, where I have a pile of neglected duties waiting to greet me.
The rebel victory in Libya is likely to gladden the hearts of liberal interventionists, who will see the NATO-aided triumph as vindicating the idea that great powers have the right and the responsibility to come to the aid of victims of tyrannical oppression. Add to that the general enthusiasm -- which I share -- for the broad effort to create more open and democratic orders in the Middle East, and it seems likely that the Wilsonian project that the U.S. foreign policy establishment has long embraced will get a shot in the arm. The debacles in Iraq and Afghanistan will be discounted, and the "Libyan model" (whatever that is) will become the latest strategic fad du jour.
If you'd like to read a good corrective to this sort of cheer-leading, I recommend Robert Kaplan's op-ed in this morning's Financial Times, entitled "Libya, Obama, and the Triumph of Realism." Kaplan is a self-acknowledged realist, and he offers a good defense of a broadly realist approach to the tumultuous events in the Arab world and Asia. He reminds us that a realist strategy in these regions paid major dividends for many years, and argues that a balanced, prudent, and cautious policy is more likely to preserve key interests than the idealistic crusades favored by neoconservatives and Wilsonian liberals alike.
As you might expect, I think Kaplan is basically right. As I've noted before, we still don't know how the "Libyan revolution" is going to turn out. Even if Qaddafi set a very low standard for effective or just governance, the end-result of his ouster may not be as gratifying as we hope. More importantly, we also ought to guard against the common tendency to draw big policy conclusions from a single case, especially when we don't have good theories to help us understand the differences between different outcomes.
Looking forward, the policy-relevant question is whether it is a good idea for powerful outside powers to use military force to cause regime change in weak states whose leaders are misbehaving in some way. This phenomenon has become known as "foreign-imposed regime change" (FIRC). To answer that question, the first thing one ought to ask is what the general baseline patterns are: how often do FIRCs succeed, based on various measures of success? If Libya turns out well but the vast majority of FIRCs were failures, for example, then a prudent policymaker would be wary of trying to repeat the Libyan operation elsewhere. (The logic is the same in reverse, of course: our failures in Iraq do not mean that all preventive wars are wrong, even if that one obviously was.)
The second step would be to identify the conditions associated with success or failure, and the causal mechanisms leading to one outcome or another. (Thus far, the academic literature suggests that FIRCs are more likely to fail when there are deep ethnic or religious cleavages in the target society, and when it is relatively poor.) Even if FIRCs usually failed, for example, there might be certain circumstances when success was much more likely and where attempting regime change would therefore be more attractive. Because social science is probabilistic rather than deterministic, knowing that conditions are favorable is no guarantee of success. But surely a smart policymaker would want to know both the general tendency and whether the case at hand might be an outlier.
The third step -- which should be informed by the first two -- would be to ask if there were specific policy steps that could be taken to increase the probability of success. (And the smart follow-up question is to ask whether one's opponents have readily available strategies that they could employ to thwart our efforts.) Even if FIRCs often fail, perhaps clever strategies and "policy learning" could improve the success rate over time, especially if leaders picked their spots carefully and if the other side had a limited repertoire of responses.
But notice one danger here: even when circumstances aren't propitious, advocates of intervention can fall prey to wishful thinking and convince themselves that they have figured out how to do these things properly, thereby avoiding the disasters that have befallen others. Right now, some people are undoubtedly thinking that the right combination of special forces, drones, local allies, and multilateral support is the magic formula for success. They may be right, but I wouldn't assume it blindly, and I wouldn't ignore the possibility that others will start thinking about ways to make sure the U.S. and its allies can't repeat this sort of thing elsewhere. Donald Rumsfeld was pretty sure he knew how the United States could avoid costly quagmires -- go in light and get out early -- but the only problem was that getting in turned out to be the easy part. And don't be surprised if a few countries conclude that the real lesson of the Libyan intervention was that Muammar al-Qaddafi blundered when he agreed to end his WMD programs and open up to the outside world. I'm glad he did, but I suspect that leaders in Iran and North Korea will draw their own conclusions.
I gave a talk in Washington the other day about the future of the EU and transatlantic relations more generally, and I thought FP readers might be interested in a short summary of what I said.
I began with the rather obvious point that the highwater mark of Europe's global influence was past, and argued that it would be of declining strategic importance in the future. The logic is simple: After dominating global politics from roughly 1500 to 1900, Europe's relative weight in world affairs has declined sharply ever since. Europe's population is shrinking and aging, and its share of the world economy is shrinking too. For example, in 1900, Europe plus America produced over 50 percent of the world economy and Asia produced less than 20 percent. Today, however, the ten largest economies in Asia have a combined GDP greater than Europe or the United States, and the Asian G10 will have about 50 percent of gross world product by 2050.
Europe's current fiscal woes are adding to this problem, and forcing European governments to reduce their already modest military capabilities even more. This isn't necessarily a big problem for Europeans, however, because they don't face any significant conventional military threats. But it does mean that Europe's ability to shape events in other parts of the world will continue to decline.
Please note: I am not saying that Europe is becoming completely irrelevant, only that its strategic importance has declined significantly and that this trend will continue.
Second, I also argued that the highwater mark of European unity is also behind us. This is a more controversial claim, and it's entirely possible that I'll be proven wrong here. Nonetheless, there are several obvious reasons why the EU is going to have real trouble going forward.
The EU emerged in the aftermath of World War II. It was partly intended as a mechanism to bind European states together and prevent another European war, but it was also part of a broader Western European effort to create enough economic capacity to balance the Soviet Union. Europeans were not confident that the United States would remain engaged and committed to their defense (and there were good reasons for these doubts), and they understood that economic integration would be necessary to create an adequate counterweight to Soviet power.
As it turned out, the United States did remain committed to Europe, which is why the Europeans never got serious about creating an integrated military capacity. They were willing to give up some sovereignty to Brussels, but not that much. European elites got more ambitious in the 1980s and 1990s, and sought to enhance Europe's role by expanding the size of the EU and by making various institutional reforms, embodied in the Maastricht and Lisbon treaties. This broad effort had some positive results -- in particular, the desire for EU membership encouraged East European candidates to adopt democratic reforms and guarantees for minority rights -- but the effort did not lead to a significant deepening in political integration and is now in serious trouble.
Among other things, the Lisbon Treaty sought to give the positions of council president and High Representative for Foreign Affairs greater stature, so that Europe could finally speak with "one voice." Thus far, that effort has been something of a bust. The current incumbents -- Herman Van Rompuy of Belgium and Catherine Ashton of Britain -- are not exactly politicians of great prominence or clout, and it is hardly surprising that it is national leaders like Nicolas Sarkozy of France and Angela Merkel of Germany who have played the leading roles in dealing with Europe's current troubles. As has long been the case, national governments remain where the action is.
Today, European integration is threatened by 1) the lack of an external enemy, which removes a major incentive for deep cooperation, 2) the unwieldy nature of EU decision-making, where 27 countries of very different sizes and wealth have to try to reach agreement by consensus, 3) the misguided decision to create a common currency, but without creating the political and economic institutions needed to support it, and 4) nationalism, which remains a powerful force throughout Europe and has been gathering steam in recent years.
It is possible that these challenges will force the EU member-states to eventually adopt even deeper forms of political integration, as some experts have already advised. One could view the recent Franco-German agreement on coordinating economic policy in this light, except that the steps proposed by Merkel and Sarkozy were extremely modest. I don't think the EU is going to fall apart, but prolonged stagnation and gradual erosion seem likely. Hence my belief that the heyday of European political integration is behind us.
Third, I argued that the glory days of transatlantic security cooperation also lie in the past, and we will see less cooperative and intimate security partnership between Europe and America in the future. Why do I think so?
One obvious reason is the lack of a common external enemy. Historically, that is the only reason why the United States was willing to commit troops to Europe, and it is therefore no surprise that America's military presence in Europe has declined steadily ever since the Soviet Union broke up. Simply put: there is no threat to Europe that the Europeans cannot cope with on their own, and thus little role for Americans to play.
In addition, the various imperial adventures that NATO has engaged in since 1992 haven't worked out that well. It was said in the 1990s that NATO had to "go out of area or out of business," which is one reason it started planning for these operations, but most of the missions NATO has taken on since then have been something of a bust. Intervention in the Balkans eventually ended the fighting there, but it took longer and cost more than anyone expected and it's not even clear that it really worked (i.e., if NATO peacekeepers withdrew from Kosovo tomorrow, fighting might start up again quite soon). NATO was divided over the war in Iraq, and ISAF's disjointed effort in Afghanistan just reminds us why Napoleon always said he liked to fight against coalitions. The war in Libya could produce another disappointing result, depending on how it plays out. Transatlantic security cooperation might have received a new lease on life if all these adventures had gone swimmingly; unfortunately, that did not prove to be the case. But this raises the obvious question: If the United States isn't needed to protect Europe and there's little positive that the alliance can accomplish anywhere else, then what's it for?
Lastly, transatlantic security cooperation will decline because the United States will be shifting its strategic focus to Asia. The central goal of US grand strategy is to maintain hegemony in the Western hemisphere and to prevent other great powers from achieving hegemony in their regions. For the foreseeable future, the only potential regional hegemon is China. There will probably be an intense security competition there, and the United States will therefore be deepening its security ties with a variety of Asian partners. Europe has little role to play in this competition, however, and little or no incentive to get involved. Over time, Asia will get more and more attention from the U.S. foreign policy establishment, and Europe will get less.
This trend will be reinforced by demographic and generational changes on both sides of the Atlantic, as the percentage of Americans with strong ancestral connections to Europe declines and as the generation that waged the Cold War leaves the stage. So in addition to shifting strategic interests, some of the social glue that held Europe and America together is likely to weaken as well.
It is important not to overstate this trend -- Europe and America won't become enemies, and I don't think intense security competition is going to break out within Europe anytime soon. Europe and the United States will continue to trade and invest with each other, and we will continue to collaborate on a number of security issues (counter-terrorism, intelligence sharing, counter-proliferation, etc.). But Europe won't be America's "go-to" partner in the decades ahead, at least not the way it once was.
This will be a rather different world than the one we've been accustomed to for the past 60 years, but that's not necessarily a bad thing. Moreover, because it reflects powerful structural forces, there's probably little we can do to prevent it. Instead, the smart response -- for both Americans and Europeans -- is to acknowledge these tendencies and adapt to them, instead of engaging in a futile effort to hold back the tides of history.
I was in New York City the past two days and left my laptop in my bag for a change. The main purpose of the trip was to pick up my daughter (who was flying home from a language immersion program), but we did manage to sneak in a benefit concert at the Beacon Theater. Go here for a peek at The Life I Could Have Had if I Had Talent.
Along the way I've been reflecting more on the shooting/bombing in Norway and the debates that have surfaced since last weekend. One of the striking features of Anders Breivik's worldview (which is shared by some of the Islamophobe ideologues who influenced his thinking) is the idea that he is defending some fixed and sacred notion of the "Christian West," which is supposedly under siege by an aggressive alien culture.
There are plenty of problems with this worldview (among other things, it greatly overstates the actual size of the immigrant influx in places like Norway, whose Muslim minority is less than 4 percent of the population). In addition, such paranoia also rests on a wholly romanticized vision of what the "Christian West" really is, and it ignores the fact that what we now think of as "Western civilization" has changed dramatically over time, partly in response to influences from abroad. For starters, Christianity itself is an import to Europe -- it was invented by dissident Jews in Roman Palestine and eventually spread to the rest of Europe and beyond. I'll bet there were Norse pagans who were just as upset when the Christians showed up as Breivik is today.
Moreover, even Christian Europe is hardly a fixed cultural or political entity. The history of Western Europe (itself an artificial geographic construct) featured bitter religious wars, the Inquisition, patriarchy of the worst sort, slavery, the divine right of kings, the goofy idea of "noble birth," colonialism, and a whole lot of other dubious baggage. Fundamentalists like Breivik pick and choose among the many different elements of Western culture in order to construct a romanticized vision that they now believe is under "threat." This approach is not that different from Osama bin Laden's desire to restore the old Muslim Caliphate; each of these extremists is trying to preserve (or restore) an idealized vision of some pure and sacred past, based on a remarkably narrow reading of history.
In fact, any living, breathing society is driven partly by its "inner life," but also inevitably shaped by outside forces. Indeed, as Juan Cole notes in a recent post, most societies benefit greatly from immigration, especially if they have strong social institutions (as Norway does) and the confidence to assimilate new arrivals into the existing order while allowing that order to itself be shaped over time. What is even more striking about conservative extremists like Breivik is their utter lack of confidence in the very society that they commit heinous acts trying to defend. On the one hand, they think their idealized society is far, far better than any alternative, which is why extreme acts are justified in its supposed defense. Yet at the same time they see that society as inherently weak, fragile, brittle, and incapable of defending itself against its cruder antagonists.
What's the most powerful political force in the world? Some of you might say it's the bond market. Others might nominate the resurgence of religion or the advance of democracy or human rights. Or maybe it's digital technology, as symbolized by the internet and all that comes with it. Or perhaps you think it's nuclear weapons and the manifold effects they have had on how states think about security and the use of force.
Those are all worthy nominees (no doubt readers here will have their own favorites), but my personal choice for the Strongest Force in the World would be nationalism. The belief that humanity comprises many different cultures -- i.e., groups that share a common language, symbols, and a narrative about their past (invariably self-serving and full of myths) -- and that those groups ought to have their own state has been an overwhelmingly powerful force in the world over the past two centuries.
Read the full article here.
Guest-blogging over at Andrew Sullivan's Daily Dish, Jonathan Rauch waxes eloquent about the "coolest (U.S.) war ever": the War of 1812. I'm not going to debate the "coolness" of that particular war (or any war, for that matter), though I've always thought trying to conquer Canada was an act of folly by the young American republic, even though it got lucky and managed to eke out a draw.
But this one line of the post caught my eye:
"The other lesson of 1812 is that Americans usually start wars pretty badly but end them pretty well."
Hmmm. Of course, this claim depends a bit on the criteria one uses for judging success, but here's a quick run-down of American wars and how well we started and ended them.
Revolutionary War: Started badly (i.e., the British won most of the early rounds) but ended well (we got a country!)
War of 1812: Started badly (i.e., the British occupied Washington and set fire to the White House) but ended ok.
Mexican-American War: Started well and ended well (if you like land grabs).
Spanish-American War: Started well but ended badly (the United States ends up occupying the Philippines and fighting a bloody counterinsurgency war, featuring widespread atrocities and causing the deaths of several hundred thousand Filipinos. Sound familiar?)
World War I: Started well for the United States (we got in late and on the winning side) but ended badly (i.e., the Paris Peace Conference produced one of the Worst Peace Settlements Ever)
[NOTE: I originally drafted this post on July 3, but the FP staff was on holiday too so it didn't get posted in time for the Fourth. I've updated it and reposted, with appropriate changes of verb tense]
Independence Day is when Americans celebrate their two-hundred-year-plus experiment with self-government. After two centuries it's not really an experiment anymore, though it certainly feels like we are still making it up as we go along. On July 4th, my family read the Declaration of Independence out loud (an annual ritual) and talked about what we thought it really meant. And across the country, Americans grilled, drank, watched fireworks, and listened to John Philip Sousa, and probably spent a lot of time being grateful that they are not living somewhere else.
But what exactly are we celebrating these days? We are in a sour phase of our history, when hardly anyone seems happy about our condition at home or our position abroad. The economy remains dismal, with only the rich enjoying comfort and security, and our politics gets nastier and more dysfunctional with each passing day. Instead of working together to meet a growing array of challenges, a toxic combination of pundits, poseurs, and provocateurs is choking the life out of the political system like so much kudzu. Our leaders continue to give speeches about our global responsibilities, but how many people now believe that America is leading the way to a safer, saner, or more just world? We don't bring peace to war-torn lands, we are not doing much to build more effective global institutions, and sometimes it feels like armed drones and special forces have become our primary export.
In such times, it is tempting to descend into world-weary fatalism, and merely chronicle the many ways that America's reality falls short of our Founders' hopes. But I am not going to succumb to that temptation -- at least not today. For although the Founding Fathers were in many ways consummate realists -- acutely aware of human frailties, mindful of the dangers facing a small, weak, and new nation, and ruthless in pursuit of hemispheric dominance -- they were also idealists who dreamt big. On Independence Day, we can honor our past by indulging in some dreams of our own.
On this 4th of July, I dreamt of an America at peace, no longer squandering its wealth and power in unnecessary global crusades. I dreamt of an America that knows there are risks in the world, but that does not allow fear to dominate its foreign policy agenda or its domestic discourse. I dreamt of an America that has regained the world's respect, and where others trust our judgment and value our competence. I imagined an America where economic inequality is declining, not growing, and where people are judged, as Martin Luther King put it, by the content of their character and not by their race, religion, or sexual orientation. I thought about an America that is not afraid to talk to its adversaries, because it is confident that it won't get bamboozled and knows that talking is often the best way to persuade others to change. I dreamt of an America that does not torture, and that has the integrity to prosecute anyone who does. I dreamt of an America that does not lead the world in the number of people in its prisons. Like Woodrow Wilson, I yearned for an America with the "self-restraint of a truly great nation, that knows its own strength and scorns to misuse it." I looked ahead to an America whose first concern is the well-being of all its citizens here at home, instead of trying to tell the rest of the world how to live. And I dreamt of an America where political debate is unfettered but civil, and where those who seek to win arguments by smearing their opponents or distorting their arguments are regarded by their fellow citizens with appropriate contempt.
Do I expect to see this America emerge? Sadly, no (I am a realist, after all). But if we are truly the political descendants of the brave men and women of 1776, then we have to believe in the power of imagination and the ability of human beings to chart a new course. And in that knowledge lies hope.
I hope you all had a pleasant and inspiring Independence Day, and that in the next year we move a bit closer to the ideals we celebrated on Monday.
When my clock radio went off this AM, the first story I heard was about a NATO air attack on Libyan leader Muammar al-Qaddafi's compound near Tripoli. Although NATO officials have denied that this was an attempt to kill Qaddafi, it is hard to believe that the officials responsible weren't hoping for a lucky shot. U.S. Senator Lindsey Graham told CNN that it was time "to cut the head of the snake off, go to Tripoli, start bombing Qaddafi's inner circle, their compounds, their military headquarters." Similarly, Senator Joe Lieberman called for "going directly after Qaddafi," saying that "I can't think of anything that would protect the civilian population of Libya more than [his] removal."
In a situation like this, it is obviously tempting to think you can solve the problem by removing the bad guy at the top. Instead of a prolonged civil war that kills lots of combatants and civilians and inflicts vast property damage, why not just get rid of the individual you think is causing all the trouble, and maybe a few of his closest associates? To take the most obvious case: with the benefit of hindsight, wouldn't it have been far better to take out Adolf Hitler sometime in the 1930s? By a similar logic, wouldn't a surgical strike on Qaddafi and his inner circle be preferable to a protracted civil war?
But before you conclude that targeted assassination is the way to go, I suggest you read Ward Thomas' 2000 International Security article "Norms and Security: The Case of International Assassination." Thomas traces the evolution of attitudes, norms, and practices regarding international assassination, and shows how they have changed significantly over time. He argues that assassination was a fairly common foreign policy tool a few centuries ago, but a combination of shifting material interests and evolving normative principles led to the emergence of a fairly strong norm against the killing of foreign leaders, even during wartime. This shift occurred in part because great powers preferred to confine conflict to the clash of armies on the battlefield (where they had the advantage over weaker states), and partly because it helped enshrine the idea that war was conducted by states and not by individuals. Thus, the norm helped reinforce the political legitimacy of the state itself, and it eventually grew so powerful that even deeply hostile states did not make serious efforts to kill each other's leaders.
Thomas also argues that the norm appears to be breaking down, for three separate reasons. First, as warfare became increasingly destructive, states began to look for cheaper alternatives. Second, terrorist groups routinely employ assassination against the states they oppose, and states have responded with targeted killings against suspected terrorist leaders. Third, and perhaps most interestingly, in the post-Nuremberg environment, national leaders are increasingly seen as individually responsible and morally accountable for acts undertaken at their behest. The creation of an International Criminal Court is another sign of a shifting moral and legal context in which raison d'etat no longer protects national leaders from accountability (if they lose, of course). And if individual leaders are seen as morally responsible, then it is easier to slip into viewing them as legitimate targets in war.
Of course, the United States (and some other countries) have been on this slippery slope for a while, given our reliance on targeted killings in Afghanistan, Pakistan, and Yemen. The practice is troubling on at least three grounds. First, due to the imperfect nature of intelligence and the inevitable "fog of war," targeted killings inevitably kill innocents along with the supposedly guilty. Second, and following from the first point, killing innocent bystanders may create more adversaries than it eliminates, thereby undermining the strategic purpose of the program itself.
Third, and perhaps most important of all, going after foreign leaders -- no matter how despicable -- helps legitimate a tactic that will eventually be visited back upon us. If the world's most powerful countries see fit to kill any foreign leader they don't like, what's to stop those same (presumably evil) leaders from threatening to pay us back in kind? Targeted assassinations of foreign despots may seem like a cheap and efficient way of solving today's problem, but we won't enjoy living in a world where foreign adversaries think attacking U.S. leaders (including the president and his inner circle) is a perfectly legitimate way of doing business. And notice that making targeted killings more legitimate tends to level the international playing field: you don't have to be a powerful or wealthy state to organize a few hit squads and cause lots of trouble for your enemies.
So even if this attempt at "decapitation" were to succeed in the short-term, the longer-term consequences may not be quite so salutary.
There's a fascinating piece in today's New York Times, summarizing the findings of a recent Science article on the origins of human language. Based on a mathematical analysis of phonetic diversity (i.e., the number of separate sounds in different languages), biologist Quentin Atkinson of the University of Auckland has determined that human language originated in southern Africa around 50,000 years ago (some scientists believe its origins may be even earlier).
You've got to hand it to our species: 50,000 years isn't that long a time. Think of all the good and bad ideas that we've produced in 50 millennia: Shakespeare, the "divine right of kings," both slavery and abolitionism, relativity, the Bhagavad Gita, fascism, a mind-boggling array of religious dogma, liberalism, Marxism, the movies of Fred Astaire, Mad magazine, Japanese manga, rap, hip-hop, and bebop. The list is infinite … and now there's the blogosphere.
But here's what I wondered as I finished the article: Who uttered the first pun? And did those early humans groan when they heard it?
The United States started out as thirteen small and vulnerable colonies clinging to the east coast of North America. Over the next century, those original thirteen states expanded all the way across the continent, subjugating or exterminating the native population and wresting Texas, New Mexico, Arizona and California from Mexico. It fought a bitter civil war, acquired a modest set of overseas colonies, and came late to both world wars. But since becoming a great power around 1900, it has fought nearly a dozen genuine wars and engaged in countless military interventions.
Yet Americans think of themselves as a peace-loving people, and we certainly don't regard our country as a "warrior nation" or "garrison state." Teddy Roosevelt was probably the last U.S. president who seemed to view war as an activity to be welcomed (he once remarked that "a just war is far better for a man's soul than the most prosperous peace"), and subsequent presidents always portray themselves as going to war with great reluctance, and only as a last resort.
In 2008, Americans elected Barack Obama in part because they thought he would be different from his predecessor on a host of issues, but especially in his approach to the use of armed force. It was clear to nearly everyone that George W. Bush had launched a foolish and unnecessary war in Iraq, and then compounded the error by mismanaging it (and the war in Afghanistan too). So Americans chose a candidate who had opposed Bush's war in Iraq and who promised to bring U.S. commitments back in line with our resources. Above all, Americans thought Barack Obama would be a lot more thoughtful about where and how to use force, and that he understood the limits of this crudest of policy tools. The Norwegian Nobel Committee seems to have thought so too, when it awarded him the Peace Prize not for anything he had done, but for what it hoped he might do henceforth.
Yet a mere two years later, we find ourselves back in the fray once again. Since taking office, Barack Obama has escalated U.S. involvement in Afghanistan and launched a new war against Libya. As in Iraq, the real purpose of our intervention is regime change at the point of a gun. At first we hoped that most of the guns would be in the hands of the Europeans, or the hands of the rebel forces arrayed against Qaddafi, but it's increasingly clear that U.S. military forces, CIA operatives and foreign weapons supplies are going to be necessary to finish the job.
Moreover, as Alan Kuperman of the University of Texas and Stephen Chapman of the Chicago Tribune have now shown, the claim that the United States had to act to prevent Libyan tyrant Muammar al-Qaddafi from slaughtering tens of thousands of innocent civilians in Benghazi does not stand up to even casual scrutiny. Although everyone recognizes that Qaddafi is a brutal ruler, his forces did not conduct deliberate, large-scale massacres in any of the cities he has recaptured, and his violent threats to wreak vengeance on Benghazi were directed at those who continued to resist his rule, not at innocent bystanders. There is no question that Qaddafi is a tyrant with few (if any) redeeming qualities, but the threat of a bloodbath that would "stain the conscience of the world" (as Obama put it) was slight.
It remains to be seen whether this latest lurch into war will pay off or not, and whether the United States and its allies will have saved lives or squandered them. But the real question we should be asking is: Why does this keep happening? Why do such different presidents keep doing such similar things? How can an electorate that seemed sick of war in 2008 watch passively while one war escalates in 2009 and another one gets launched in 2011? How can two political parties that are locked in a nasty partisan fight over every nickel in the government budget sit blithely by and watch a president start running up a $100 million/day tab in this latest adventure? What is going on here?
To read the full article, click here.
Stephen M. Walt is the Robert and Renée Belfer professor of international relations at Harvard University.