If you're troubled by the Justice Department's recent decision to secretly investigate the Associated Press and other journalists in an overzealous attempt to ferret out the source of some leaked information, you should be. But lost amid the outcry about this attempt to squelch press freedom is its connection to the broader thrust of U.S. foreign policy and our deeply ingrained tendency to exaggerate foreign threats. That tendency goes back at least to the early Cold War, when Dean Acheson told President Harry Truman to sell a proposed aid package to Greece and Turkey by going to Capitol Hill and giving a speech that would "scare the hell out of the American people." And he did.
When people are scared, they are more willing to let their government keep lots of secrets, lest supposed enemies find out about them and exploit them. Never mind that most of the mountains of classified information would be of little value to our foes, even if they got access to them. A population that is scared is also more willing to have the government go after anyone who tries to inform them by leaking information, even when knowing more might help ordinary citizens evaluate whether government programs were working as intended.
When people are scared, they are also more willing to support U.S. intervention in other countries, to prevent supposedly bad things from happening there or to prevent leaders we don't like from gaining or retaining power. In most cases, of course, neither U.S. prosperity nor security is directly affected by what happens in these various minor states, but threat-mongers are always good at inventing reasons why the outcome of some local struggle thousands of miles from our shores might actually threaten our prosperity or security. Remember the domino theory? Fear, not greed, was the primary motivation behind U.S. interventions in the Korean War, in Iran, in Guatemala, in Lebanon, in Indochina, in the Dominican Republic, in Nicaragua, and in many other places, including more recently in Iraq and Afghanistan. And that same fear that global trends might turn against us leads the United States to maintain a globe-encircling array of military bases and other installations, most of them completely unknown to the citizens whose taxes are paying for them. No other country -- not one! -- seems to think that its security depends on being able to wield lethal force on every single continent.
When people are scared, they are also more willing to support various sorts of covert operations, ranging from normal spying to the increasingly far-flung campaign of targeted assassinations and extra-judicial killings that the United States has been conducting for many years now. Never mind that a significant number of innocent foreign civilians have died as a result of these policies or that the net effect of such actions may be to make the problem of terrorism worse over time. It's impossible to know for certain, of course, because the U.S. government won't say exactly what it is doing.
Notice, however, that this cycle is self-reinforcing. The more places the U.S. intervenes, and the dirtier our methods, the more resentment we tend to generate. Sometimes entire populations turn against us (as in Pakistan); sometimes it is only a small but violent minority. But either possibility creates another potential source of danger and another national security problem to be solved. If a local population doesn't like us very much, for example, then we may have to jump through lots of hoops to keep a supposedly pro-American leader in power.
To make all this work, of course, our leaders have to try to manage what we know and don't know. So they work hard at co-opting journalists and feeding them self-serving information -- which is often surprisingly easy to do -- or they try to keep a lot of what they are really doing classified. And when the country's national security policy is increasingly based on drone strikes, targeted killings, and covert operations -- as it has been under the Obama administration -- then the government has to go after anyone who tries to shed even partial light on all that stuff that most U.S. citizens don't know their government is doing.
Needless to say, it is all justified by the need to keep us safe. As Attorney General Eric Holder put it when asked about the investigation of AP, these leaks "required aggressive action ... They put the American people at risk."
The greater but more subtle danger, however, is that our society gradually acclimates to ever-increasing levels of secrecy and escalating levels of government monitoring, all of it justified by the need to "keep us safe." Instead of accepting that a (very small) amount of risk is inevitable in the modern world, our desire for total safety allows government officials to simultaneously shrink the circle of individual freedoms and to place more and more of what they are doing beyond our purview.
Don't misunderstand me. Civil liberties and press freedoms in the United States are still far greater than in many other countries, and the outcry over the Department of Justice's recent behavior reveals that politicians in both parties are aware that these principles are critical to sustaining a healthy democracy. My concern is that the trend is in the wrong direction and that the current drift -- under the leadership of a supposedly "liberal" president who used to teach Constitutional law! -- is an inevitable consequence of the quasi-imperial global role we have slid into over the past five decades.
In December 1917, in the middle of World War I, British Prime Minister Lloyd George told the editor of the Manchester Guardian that "if the people really knew, this war would be stopped tomorrow. But of course they don't know and can't know. The correspondents don't write and the censorship would not pass the truth." I sometimes wonder how Americans would react if we really knew everything that our government was doing. Or even just half of it.
Many people believe that the United States is incapable of bold and ambitious responses to contemporary policy problems, largely because its political institutions aren't designed to act decisively. In this view, the United States is saddled with a federal system where government power is divided, with multiple veto points and various "checks and balances" that help prevent excessive concentrations of power. Add to that free speech, an intermittently vigorous press corps, a vast array of interest groups, and a degree of political partisanship, and you have a recipe for gridlock.
Or so it is said. There is a grain of truth in this caricature: the men who designed the U.S. Constitution were wary of centralized power (and standing armies!), and it is not at all surprising that they designed a system that seems to make radical change difficult. But there are some important exceptions to that general rule, and the exceptions themselves are instructive.
For example, during World War II the Manhattan Project assembled much of the world's most eminent scientific talent in a crash program that produced an atomic bomb in less than five years. Moreover, at its peak the Project was consuming ten percent (!) of the electricity produced in the entire United States, and its facilities contained more floor space than the entire U.S. auto industry. Despite this vast effort, only a handful of Americans were even aware of the project until the bombing of Hiroshima and Nagasaki in August 1945.
More recently, the 9/11 attacks produced a similarly rapid and far-reaching U.S. response, whose full dimensions are still not completely known by the U.S. public. In addition to the invasion of Afghanistan and the subsequent lame-brained decision to invade Iraq, the United States also passed the Patriot Act and launched a wide-ranging global effort to track down and kill as many al Qaeda members as it could find. In the process, it has created a large infrastructure of government agencies and private contractors and shifted the CIA's focus away from intelligence gathering and toward a global effort to eradicate al Qaeda and its affiliates through mostly lethal means.
The obvious conclusion to be drawn from these observations is that the U.S. system of government is quite capable of swift and ambitious policy initiatives when the public and key officials are really scared (as we were in 1941 and after 9/11). One might add the response to the invasion of South Korea in 1950 and to Sputnik's launch back in 1957. And this tendency in turn helps explain why threat-inflation is such a common tool within the foreign policy establishment. When Americans are feeling safe and secure, gridlock prevails. But when they are frightened, politicians are both able to launch big initiatives and motivated to do so by the fear that the public will punish them if they don't do enough to defend the nation.
The other important lesson is that big and bold initiatives are far easier when they are kept secret. Nobody knew about the Manhattan Project, and to this day nobody knows the full extent of what our national security machinery has been up to since 9/11. Books like Mark Mazzetti's new The Way of the Knife and the Washington Post's important articles on "Top Secret America" have peeled back the veil to a degree, but there are undoubtedly many things being done in America's name (and with U.S. tax dollars) about which taxpayers are still unaware.
The danger is twofold. First, if secrecy makes it easier to do big things, then policymakers will be tempted to make many issues as secret as possible. Classification will run amok, less to keep valuable information from our enemies than to keep it from citizens who might object. And with secrecy comes a greater danger that foolish policies won't be adequately debated or scrutinized. This problem is widespread in authoritarian regimes where dissent is squelched and open debate is impossible, but it can also happen in democracies if the circle of decision is tiny and the public is kept in the dark.
The second danger stemming from popular ignorance is blowback. If Americans don't know what their government is doing (or has done), they won't fully understand why other societies view the United States as they do. In particular, Americans won't understand why others are sometimes angry at the United States, and they will tend to interpret anger or resistance as evidence of some sort of primordial or culturally-based hatred. The result is a familiar spiral of conflict, where each side sees its own actions as fully justified by the other's supposedly innate hostility. And I'd argue that spiral dynamics are at the heart of a number of difficult foreign policy challenges, especially in our dealings with the Arab and Islamic world. Unfortunately, unwinding spirals is not easy, and all the more so if a country still doesn't understand exactly why others are ticked off.
Addendum: By a strange coincidence, my colleague Larry Summers has published a column in today's Financial Times, making a somewhat similar argument about the ability of the U.S. government to act more decisively than many people believe. You can find his views here.
A couple of years ago I devoted two blog posts to arguing that allowing gay Americans to serve openly in the military made good strategic sense. My logic was straightforward: We want to attract the best people to military service, and any sort of artificial restriction (such as banning gays, or any other social group) inevitably reduces the talent pool from which the country can draw. The result would be a weaker military than we would otherwise have. I'm certain my posts had exactly zero impact on President Obama's subsequent decision to end "don't ask, don't tell," but I was happy when he did.
I'm not a lawyer, and I don't have any firm views on how the Supreme Court is going to handle the issue of gay marriage that is now before it. But I do think a parallel argument can be made about the effect of allowing gay marriage on U.S. foreign policy and national security. Specifically, permitting gay people to marry in the United States would have positive effects on both.
First, ending discrimination against gay couples is going to make the United States a more attractive place for gay people to live, especially when compared to societies that do not permit gay marriage or that actively discriminate against gay people (and in some cases criminalize homosexuality). Accordingly, some number of gay people are going to seek to emigrate to the United States, just as some gay Americans are now choosing to live abroad so that their relationships can be legally recognized and protected. The United States has long benefited from its attractiveness as a place to live and work, especially by attracting talented people who are being persecuted elsewhere. The United States would have gained greatly had someone like Alan Turing known he could find a welcoming home here.
Permitting gay marriage isn't going to send a flood of gay foreigners to our shores, but at the margin, it will make the United States a more attractive destination for some. Which would be to our overall benefit.
Second, and perhaps more importantly, legalizing gay marriage would reinforce America's public commitment to individual liberty and freedom, and its parallel commitment to non-discrimination. More than anything else, that commitment is America's global brand. In this country, the government doesn't tell you where to live, doesn't tell you what job to pursue, doesn't tell you what God to worship, and doesn't tell you who to fall in love with. At the same time, the government also says that you should not discriminate against those who happen to be different from you in some way. Instead, you are supposed to treat them as individuals and to expect the same in return.
But in most parts of the United States, the government does tell you that if you are in love with someone of your own gender, you aren't eligible for the same recognition and benefits that heterosexual couples enjoy. That's not as punishing a policy as slavery or Jim Crow or some of the other forms of discrimination that our country has practiced (and gradually abandoned), but it is still a source of considerable unhappiness for many gay couples and it is fundamentally at odds with our normal claim to privilege individual freedom of choice over category distinctions.
This enduring commitment to individual freedom and choice, and this fundamental hostility to the idea that some groups are better or worse than others, is central to what the United States stands for as a society. In other countries, ethnic and sectarian differences abound and sometimes explode in violence. Similar things have happened here, and racial, religious, or ethnic tensions still exist in many places, but our abiding commitment to individual freedom is like a solvent that continually works to erode the idea that you can judge someone merely by knowing what social group they are from. Martin Luther King dreamt that his children "would live in a nation where they will not be judged by the color of their skin, but by the content of their character." And the same logic applies to sexual preference. In America, we should judge all people by their own individual characters, not by the gender they happen to prefer as lovers and partners.
Like those who once opposed interracial marriage or gays serving in the military, opponents of gay marriage have manufactured a bunch of bogus arguments about how allowing gays to marry would either damage children or undermine the institution of marriage itself. These arguments are pretty preposterous on their face. If anything, extending the right to marry to gay couples only reinforces the idea that stable, loving relationships between committed partners are a solid bedrock for society, as well as a profound source of long-term happiness. That's the main reason why opinion on this issue has shifted so rapidly in recent years. As homosexuality lost its stigma and straight Americans had more and more openly gay friends, the idea that married gay couples were some sort of subversive threat to society seemed increasingly ludicrous. As it should.
In American jurisprudence, the courts often look to whether the state has a "compelling interest" in regulating or interfering in some domain of activity. In this case, I'd argue that to the extent the state has an interest in this matter, that interest lies overwhelmingly in extending the privileges (and obligations) of marriage to all Americans. Not just because it is consistent with our commitment to liberty and to equality under the law, but also because it will be good for our global image, national cohesion, and even our long-term strength and prosperity.
So if you're still having trouble backing gay marriage on the simple grounds of fairness, you might consider supporting it on the basis of national security instead.
Over at the new, independent Daily Dish, Andrew Sullivan has been hosting an interesting thread on why academic writing is frequently abysmal. As someone who tries hard to make even my academic writing clear and accessible and who tries to instill that value in my students, I've followed the thread with interest.
For starters, I don't think the problem is that no one encourages future academics to write well. In my own case, for example, I was fortunate to study with Alex George at Stanford as an undergrad and with Kenneth Waltz at Berkeley during graduate school, and both repeatedly stressed the importance of writing well. Waltz didn't do a lot of line-editing of grad student papers or dissertations, but he certainly let me know when he thought my writing was obscure, verbose, disorganized, or just plain confused. He also spoke openly about the importance of writing in his graduate courses, encouraged students to read books such as Fowler's Modern English Usage, and was scornful of the trendy neologisms that infest academic writing like so many weevils.
I also don't think the problem is due to poor editing at journals or university presses. I've published in over a dozen academic journals, with a prominent university press, and with two different commercial publishers, as well as in a number of journals of opinion. Almost all of the editors or copy-editors with whom I've worked were helpful and attentive, and some were superlative. Indeed, I can think of only one case in nearly thirty years where a manuscript of mine was truly butchered by an editor (it was actually done by an intern), and fortunately the magazine let me repair the damage before the article appeared.
So why is academic writing so bad?
One reason academic writing is sometimes difficult is that the subjects being addressed are complicated and hard to explain in ordinary language. I have more than a little sympathy for philosophers grappling with deep questions about morality, time, epistemology, and the like, as these subjects are inherently slippery and it is easy to lose the reader in a fog of words. But it isn't inevitable even there. Some philosophers manage to write about very deep and weighty matters in prose that is crystal clear. You still have to pay attention and think hard to understand what is being said, but not because the author is making it more difficult than it needs to be.
A second reason is the failure of many scholars to appreciate the difference between the logic of discovery and the logic of presentation. Specifically, the process by which a scholar figures out the answer to a particular question is rarely if ever the best way to explain that answer to a reader. But all too often articles and manuscripts read a bit like a research narrative: "First we read the literature, then we derived the following hypotheses, then we collected this data or researched these cases, then we analyzed them and got these results, and the next day we performed our robustness checks, and here's what we're going to do next."
The problem is that this narrative form is rarely the best way to make a convincing case. Once you know what your argument is, really effective writing involves sitting down and thinking hard about the best way to present that argument to the reader. The most important part of that process is figuring out the overall structure of the argument -- what points need to be developed first, and then what follows naturally or logically from them, and so on. An ideal piece of social science writing should have a built-in sense of logical or structural inevitability so that the reader moves along the argument and supporting evidence as effortlessly as possible.
Achieving this quality requires empathy. You have to be able to step outside your own understanding of the problem at hand and ask how your words are going to affect the thinking of someone who doesn't already know what you know and may even be inclined to disagree with you at first. Indeed, persuasive writing doesn't just convince the already-converted; a really well-crafted and well-supported argument will overcome a skeptic's initial resistance.
Why does this matter? Because the poor quality of academic writing is both aesthetically offensive and highly inefficient. Academics should strive to write clearly for the obvious reason that it will allow many others to learn more quickly. Think of it this way: If I spend 20 extra hours editing, re-writing, and polishing a piece of research, and if that extra effort enables 500 people to spend a half-hour less apiece figuring out what I am saying, then I have saved humankind a net 230 hours of effort.
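(To spell out the back-of-the-envelope arithmetic: 500 readers × 0.5 hours saved apiece = 250 hours, minus my 20 hours of extra editing, for a net gain of 230 hours.)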
Which leads me to the real reasons why academic writing is often bad. The first problem is that many academics (and especially younger ones) tend to confuse incomprehensibility with profundity. If they write long and ponderous sentences and throw in lots of jargon, they assume that readers will be dazzled by their erudition and more likely to accept whatever it is they are saying uncritically. Moreover, jargon is a way for professional academics to remind ordinary people that they are part of a guild with specialized knowledge that outsiders lack, and younger scholars often fear that if they don't sound like a professional scholar, then readers won't believe what they are saying no matter how solid their arguments and evidence are.
The second problem is the fear of being wrong. If your prose is clear and your arguments are easy to follow, then readers can figure out what you are saying and they can hold you to account. If you are making forecasts (or if the theory you are advancing has implications for the future), then you will look bad if your predictions are clearly stated and then fail. If your argument has obvious testable implications, others can run the tests and see how well your claims stand up.
But if your prose is muddy and obscure or your arguments are hedged in every conceivable direction, then readers may not be able to figure out what you're really saying and you can always dodge criticism by claiming to have been misunderstood. (Of course, sometimes critics do deliberately misrepresent a scholarly argument, but that's another matter). Bad writing thus becomes a form of academic camouflage designed to shield the author from criticism.
In the endless war against academic obscurantism, I tell my own students to read Strunk and White's classic The Elements of Style and to heed their emphasis on concision. Most of us tend to overwrite (especially by using too many adverbs), and shorter is almost always better. Or as Strunk and White put it:
"Vigorous writing is concise. A sentence should contain no unnecessary words, a paragraph no unnecessary sentences, for the same reason that a drawing should have no unnecessary lines and a machine no unnecessary parts. This requires not that the writer make all his sentences short, or that he avoid all detail and treat his subjects only in outline, but that every word tell."
I'm also a fan of Anthony Weston's A Rulebook for Arguments, a very smart primer on the different forms of persuasive argument and the ways to make written arguments more convincing.
Finally, I encourage students to emulate writers they admire. If there are scholars whose books you enjoyed, read them several times and try to capture what it is that makes their use of language so effective. I've found inspiration in writers like Waltz, Thomas Schelling, James Scott, John Mueller, and Deirdre McCloskey. And you don't have to agree with someone to respect their ability to write: Charles Krauthammer's ideas usually appall me, but there's no question that he is an effective prose stylist.
In the end, it comes down to what a scholar is trying to achieve. If the goal is just narrow professional success -- getting tenure, earning a decent salary, etc. -- then bad writing isn't a huge handicap and may even confer some advantages. But if the goal is to have impact -- both within one's discipline and in the wider world -- then there's no substitute for clear and effective writing. The question is really pretty simple: do you want to communicate with others or not?
I've made this point before -- here and here -- and I suspect I'll have to make it again. But whatever you think of the outcome of yesterday's Super Bowl, the unexpected second-half power outage was a small blow against U.S. power and influence.
Why? Because one of the reasons states are willing to follow the U.S. lead is their belief that we are competent: that we know what we are doing, have good judgment, and aren't going to screw up. When the power goes out in such a visible and embarrassing fashion, and in a country that still regards itself as technologically sophisticated, the rest of the world is entitled to nod and say: "Hmmm ... maybe those Americans aren't so skillful after all."
Or maybe we've just spent too much money building airbases in far-flung corners of the world, and not enough on infrastructure -- like power grids -- here at home.
P.S. The other lesson of the Super Bowl is that strategy matters. As in: the abysmal play-calling by the 49ers when they had first-and-goal inside the ten-yard line, trailing by less than a touchdown. Four dumb plays, and the Ravens were champs. Sigh.
When Andrew Sullivan announced last week that he was taking his uber-blog, The Dish, independent and relying solely on reader subscriptions to fund the operation, the first thing I thought of was...
Not because the announcement made me yearn for a nice IPA, but because it made me wonder whether what is happening to the media environment is in some ways analogous to the extraordinary improvements in brewmaking over the past couple of decades, especially here in North America.
Back in my youth, beer in America was a consistently bland and homogeneous product. Watery lagers predominated, because the big brewing companies all sought to appeal to the median drinker. There just wasn't much difference between Bud, Miller, Schlitz, etc., which is why a beer like Coors -- which had even less flavor but was hard to get in much of the country -- could become a fad for a while. Beer snobs sometimes drank imports like Beck's or Guinness, but the major U.S. brands were boring, conventional, and competing to be more-or-less like each other. Kinda like Detroit's Big Three automakers or the three major TV networks.
Enter the microbrewery revolution. Beginning in the 1980s, enterprising Americans in search of good beer began drawing on artisanal brewing traditions and techniques from Europe, leading to an explosion of small craft breweries whose main selling point was creativity and diversity. Not to mention taste. Instead of trying to be like everyone else, microbrews thrived by presenting unique and interesting products that could actually hold a beer fan's interest. Instead of putting out a cheap product to be swilled in front of the TV or at a football game, microbrewers sought to produce something you could savor, discuss, and get seriously passionate about. No wonder I haven't sipped a Bud in years. Even the Obama White House has caught the bug, producing its own Honey ale in recent years.
So too with blogs. As Sullivan has realized, you don't have to be connected to some big media giant like the New York Times or the Economist in order to have a significant readership. It helps to be part of a well-known brand, of course, but it's not essential, especially if you're more interested in appealing to a smaller group of engaged readers than in grabbing as much market share and advertising revenue as you can.
Furthermore, as the diverse set of writers that Sullivan often features on his blog illustrates, those who work primarily in the blogosphere are usually more interesting, provocative, willing to experiment, and well-informed than the mainstream commentators and pundits writing for the big media outlets. There are exceptions, of course, but I'm constantly impressed by how many smart people and good writers now inhabit the internet, and I frequently find myself in awe of how well so many of them use language and how much genuine pleasure one can get from reading them. By contrast, outstanding writing is becoming harder to find in a lot of mainstream media platforms, and it's almost an endangered species in the hallowed halls of academe. It's not that they are bad writers; it's just that they are mostly so cautious, predictable, and bland. You know: like PBR.
Given the effectiveness of modern search engines, interested citizens can get lots of information from the web if they're willing to do a little bit of dedicated trolling, which in turn makes it harder for governments, interest groups, or big media conglomerates to control discourse. And that's why authoritarian governments in countries like China or Iran have worked so hard to slap restrictions on this free-wheeling environment, lest their own actions and legitimacy get undermined by the unconstrained flow of ideas.
None of this is big news by now, and Sullivan isn't the first blogger to rely solely on reader support. He's just the most visible and prominent, and his experiment reminds us that the information revolution that we are all living through is still in its early stages. But I hope Sullivan's venture succeeds and that others follow his lead. I don't know what the information industries will look like a decade or two into the future, but it's certain to be different than it is today and a lot different than it was when I was a kid. I'm already reconciled to the fact that I'll eventually have to give up my cherished morning newspapers and get almost everything in digitized form. I'll heave a nostalgic sigh when that happens, but in the end I think it will be for the best. Why? Because I also believe that the open exchange of information and ideas eventually leads to greater collective wisdom and better public policies. For this reason, the break-up of big media oligopolies and the proliferation of independent voices are good things.
And on that happy note, I think I'll have a beer.
A thought struck me as I was reading the obits of jazz legend Dave Brubeck, who passed away yesterday at the age of 92. Several accounts highlighted Brubeck's role as a cultural ambassador, through his participation in various goodwill tours sponsored by the U.S. State Department. A number of other prominent jazz artists -- including luminaries like Louis Armstrong -- were featured in these tours, which were intended to show off the appealing sides of American culture in the context of the Cold War competition with the Soviet Union. This was a Bambi-meets-Godzilla competition, btw, with the Soviets in the role of Bambi. I like Shostakovich and respect the Bolshoi, but Soviet mass culture was outmatched when pitted against the likes of Satchmo.
But here's my question: why isn't the United States doing similar things today? The State Department still sponsors tours by U.S. artists -- go here for a bit more information -- but you hardly ever hear about them and it's not like we're sending "A-list" musicians out to display the vibrancy of American cultural life. Celebrities and musicians are more likely to do goodwill tours to entertain U.S. troops in places like Iraq, but the sort of tours that Brubeck and others did in the 1950s and 1960s seems to have become a minor endeavor at best.
The problem, I suspect, isn't a lack of interest in cultural diplomacy or even a lack of funding. Instead, I think this is a consequence of globalization. Today, someone in Senegal or Indonesia who wants to hear American jazz (or hip-hop, or blues, or whatever) just needs an internet connection. The same is true in reverse, of course; I can download an extraordinary array of world music just sitting here in my study at home. And that goes for videos of performances too, whether we're talking music or dance or in some cases even theatre. Plus, top artists tour the world on their own in order to make money; they don't need to go as part of some official U.S. government-sponsored tour. And given the unpopularity of U.S. foreign policy in some parts of the world, official sponsorship is probably the last thing some artists would want.
But there may be some exceptions to that rule, in the sense that there are a few countries where artistic exchanges might open things up in ways that diplomats cannot. Iran isn't likely to welcome Madonna, Christina Aguilera, or Justin Timberlake, perhaps, but have we thought about an artistic exchange with some slightly less edgy U.S. performers? If table tennis could help thaw relations with Mao's China, maybe jazz, acoustic blues, or even classical music could begin to break the ice with Tehran. Iran has a large under-thirty population that is by all accounts hungry for greater access to world culture, so this sort of exchange would build goodwill with the populations that will be rising to positions of influence in the future. Plus, Iran has plenty of gifted performers who might find a ready audience here. And you can send a delegation of American musicians without violating UN sanctions or having to answer a lot of thorny questions about nuclear enrichment.
Update: In response to this post, Hishaam Aidi of Columbia University and the Open Society Institute sent me this piece, which takes a critical view of the State Department's more recent efforts to use hip-hop artists as a form of cultural outreach. Well worth reading, and my thanks to Hishaam for sending it to me.
Thanksgiving is a quintessentially American holiday, even though its origins can be traced back to Old World harvest festivals. It is based in part on a romanticized story of the Pilgrims, which took on new life after Abraham Lincoln's proclamation of a day of thanks intended to help reconcile North and South during the Civil War. It is also celebrated in Canada, however, so Americans don't have a monopoly on gratitude.
Still, if you're an American citizen or a green card holder, you've probably got a lot to be thankful for, especially compared to citizens of a lot of other countries. But of course, we Americans often forget to be properly thankful for many of our blessings; instead, we seem to think we deserve them because we are So Darn Exceptional. With that thought in mind, here's a slightly contrarian Top Ten List of Things Americans Should be Thankful For (But Often Aren't).
1. We have a state of our own. Americans could start by being thankful that the rebellious colonists won their war of independence, straightened out the Articles of Confederation, and built a strong state of our own. Having your own state means that you can protect yourself against enemies and there's a government to go to bat for you if you get in trouble. By contrast, stateless peoples like the Kurds, Chechens, Palestinians, Romany, Tamils, Jews before 1948, and many others live at the mercy of others. Given our own revolutionary past, you'd think we'd have a bit more sympathy for peoples trying to escape oppressive foreign rule, but never mind. In any case, in the dog-eat-dog world of global politics, having a state of our own is clearly something to be thankful for.
2. There are no great powers nearby. Given our propensity to exaggerate global dangers, Americans often forget that they are remarkably secure. We haven't had any powerful states near us since the 19th century, and we haven't had to worry seriously about defending our own territory against invasion. (This is what makes movies like Red Dawn so laughable). Or as the French Ambassador to the United States said back around 1910: "America is the most favored of the nations. To the north, a weak neighbor. To the south, a weak neighbor. To the east, fish. To the west, more fish." This extraordinary level of territorial security explains why Americans are free to go gallivanting all around the world "searching for monsters to destroy" and trying to tell the world how to live. We've forgotten what it is like to face a real threat to our independence, and that sort of amnesia is a luxury for which we should be very thankful indeed.
3. We didn't adopt the same austerity programs that Britain, Europe, and Japan did. A lot of Americans are still hurting from the after-effects of the Great Recession, and those who are still unemployed may not feel especially appreciative tomorrow. And it's clear with hindsight that the governmental response to the financial crisis could have been more effective. But compared with the other industrial democracies, the United States has done much better by eschewing austerity and focusing more attention on stimulus. So let's give thanks for that.
4. We got lucky when they handed out the natural resources. Americans like to attribute their rise to wealth and power to their virtuous and hardworking nature, embrace of capitalism, novel Constitution, and commitment to liberty. But just as important was the fact that the country happened to be founded on a continent with fertile soils, navigable rivers, abundant wildlife, and a temperate climate. It had lots of iron ore and other minerals, plenty of oil, and it turns out we've got more natural gas than we know what to do with. (Good for us, if not necessarily so great for the atmosphere). This Thanksgiving, Americans ought to silently acknowledge that our privileged status today owes as much to good fortune as it does to any unique American virtues.
And while we're at it, let's not forget that realizing our "Manifest Destiny" involved the deaths of millions of native Americans and taking vast territories from other countries by force. Recalling the uglier side of America's rise to world power is a good way to keep overweening national pride in check.
5. In (many) Gods we trust. I don't know about you, but I for one am thankful that the Founding Fathers didn't try to establish a state religion and instead celebrated theological diversity, including the freedom not to believe. Over the past two centuries, the idea that free men and women could worship whatever gods they choose has protected this country from a powerful cause of civil strife in many other parts of the world. We can give thanks that anti-Semitism has been discredited and marginalized and Islamophobia confined mostly to far-right whack jobs and a few desperate politicians.
Just look at the last presidential election: a Christian with a Muslim name got 70 percent of the Jewish vote, while his opponent -- the first Mormon to be nominated -- didn't lose by that much (i.e., he won roughly 47 percent of the popular vote). That's America.
And maybe one of these days we'll have a serious presidential candidate who openly proclaims her or his faith in science and reason and rejects allegiance to any unseen superhuman entity. Amen.
6. Another successful election. Whether you are a Republican or a Democrat, you should give thanks that this country has once again conducted an election where peace prevailed, citizens voted, and the losers conceded, mostly with good grace. Some GOP leaders may be baffled by the results, but they didn't take up arms or hire a lot of lawyers to try to reverse them. Who knows? They may even start pondering why they lost in a serious way, and begin to move their party away from some of its antediluvian notions. That would be something to be grateful for too.
7. Tolerance of diversity. In addition to religious freedom, Americans can be grateful for the progress we have made in embracing those who at first seem different. This includes immigrants, who are often viewed with suspicion yet consistently become some of our most ambitious, energetic, hardworking, and accomplished citizens. Consistent with our liberal ideals of individual human liberty, our country is gradually ending discrimination against gays. We continue to work to reduce the long legacy of racial discrimination. Our reward is a country whose cultural life has been enriched by diverse currents and whose society has managed to take advantage of the best the world has to offer. We are far from perfect, but the American melting pot remains a phenomenon that richly deserves our thanks.
8. No war with Iran. Having wound down one losing war and positioned us to end another, at least President Obama has had the good sense not to start a third war with Iran. Keep your fingers crossed that he remains as wise in his second term, but be grateful that he didn't succumb to all the fear-mongering, even in an election year.
9. Health care for all Americans. I don't want to go all partisan on you, but unless you're one of the One Percent (and maybe even if you are), you ought to be grateful that we've finally taken steps to ensure that all citizens get basic medical care. True, most of the industrialized world got there long before we did, but better late than never. It's not a perfect system and it's bound to need improvements over time, but we all ought to feel good about helping our fellow citizens feel good. And say a word of thanks, too.
10. What about the rest of you? Here's a suggestion: if you're not an American, this Thanksgiving you might give thanks if you haven't gotten a lot of attention from Uncle Sam lately. You might not want to be totally ignored (especially if the South China Sea laps your shores), but getting a lot of attention from the United States hasn't been such a good thing in recent years (see under: Iraq, Pakistan, Yemen, Afghanistan, etc.). So if you're a citizen of one of the many countries that Americans like to visit but American troops and drones don't, you can be thankful, too.
It's August, which means that students in America (and plenty of other places) are heading off to college for the first time. Some of them are undoubtedly thinking about preparing for careers in international affairs. As a public service to those eager future Secretaries of State (and the parents worrying about their college choices), here's my Top Ten Things that Future International Policy Wonks Should Learn.
1. History. Trying to understand international affairs without knowing history is like trying to cook without knowing the difference between flour and flounder. Not only does history provide the laboratory in which our basic theories must be tested, it shapes the narratives different peoples tell themselves about how they came to their present circumstances and how they regard their relationship to others. How could one hope to understand the Middle East without knowing about the Ottoman Empire, the impact of colonialism, the role of Islam, the influence of European anti-Semitism and Zionism, or the part played by the Cold War? Similarly, how could one grasp the current complexities in Asia without understanding the prior relations between these nations and the different ways that Chinese, Vietnamese, Koreans, Japanese, Pashtuns, Hindus, Muslims, and others understand and explain past events?
But don't just memorize a lot of names and dates: seek out teachers who can help you think about the past in sophisticated ways. Among other things, it's useful to know how other societies see the past even if you don't agree with their interpretation, so make sure you read histories written by citizens of other countries. And if you're studying in the United States, don't just study "Western Civilization." The world is a lot bigger than that.
2. Statistics. Most high schoolers have to learn a certain amount of math, but unless you're going into a technical field, a lot of it won't be directly relevant to a career in international affairs. But statistics is part of the language of policy discourse, and if you don't understand the basics, you won't be a discerning consumer of quantitative information and others will be able to dazzle you with data that may not be right. You can avoid this fate with a little study.
3. Foreign Language. If you grew up outside the United States and are headed for college, you probably already speak more than one language. If you're an American, alas, you probably don't. You should. I know that everyone is learning English these days, but learning at least one foreign language provides a window into another culture that you can't get any other way, and also provides a sense of mastery and insight that is hard to achieve otherwise. I'm not particularly good at languages, but I'd gladly trade my mediocre abilities in French and German for real fluency in one of them (or many others). Don't make my mistake: get to the language lab and acquire some real skills.
4. Economics. Economists aren't the wizards they think they are (see under: 1929, 2007-08), but you can't understand world affairs these days if you don't have a basic grasp of the key principles of international trade and finance and some idea how the world economy actually works. I might add that some forms of economics (e.g., game theory) can provide some useful ways of thinking about strategic interaction, provided you don't push it too far. So take enough economics to be able to read the WSJ op-ed page and know when they are BS-ing you.
5. International Law. You might think that a realist like me would dismiss international law completely, but I took a course in the subject as an undergraduate and have always been grateful that I did. Among other things, it reaffirmed my suspicion that international law is a pretty weak instrument, especially when dealing with great powers. Nonetheless, states and other international actors use international law all of the time, and they certainly invoke it to try to advance their own particular interests. So it's good to have some idea what international law is, how it works, and what it can and cannot do.
6. Geography. We often hear that we live in "one world," but it's divided up into lots of regions, countries, areas, and physical configurations, and these variations matter a lot. I don't know when or why we stopped teaching geography, but it is an important part of the world affairs tool kit. I might not go so far as to say "geography is destiny," but just look at all the international issues that you couldn't begin to understand without a detailed knowledge of the physical characteristics of the region in question. South China Sea? The West Bank? The new sea routes in the Arctic? The list is endless, yet I'm often struck by how little geography most students seem to know these days. Here's a good test: if you were given a map of the world with all the country names removed, how many could you fill in? If you can't get at least 75 percent, it's time to get out that atlas and start brushing up. The exercise will also tell you which regions you may know well and which ones you need to learn a bit more about. If you're still not convinced that geography matters, check out Robert Kaplan's new book.
7. Get some culture. Education in international affairs tends toward the technocratic, as the previous items on this list suggest. But some appreciation for art and culture is essential. The music, literature, and visual arts of different societies are where their collective souls reside, and more people have been inspired by poetry, art, and music than by the most compelling regression equations. If you don't know why Picasso, Kurosawa, Shakespeare, Solzhenitsyn, Austen, Ellington, Rushdie, Shankar, etc. matter, then you've missed out on an enormous part of the human experience, and your ability to understand what makes other societies tick will be impoverished.
8. Learn to communicate. Based on some of the graduate students I see, I'm not sure this is something most colleges teach anymore. But no matter what path you end up taking in life, being able to write clearly, quickly, and without enormous effort is a huge advantage. I'm not saying you have to aspire to be a prose stylist on the order of George Kennan, Joan Didion, or Paul Krugman, but overcoming the fear of the blank page or screen and developing the ability to write a clear, well-organized argument is an enormous force-multiplier.
While you're at it, hone your ability to speak effectively and persuasively. Regardless of what sort of career you pursue, being able to present your ideas orally will be very valuable. And I'm not just talking about formal lecturing or giving a keynote speech; I also mean knowing how to brief your boss in five minutes or less, and how to ask a good question. I go to lots of public lectures and seminars, and I'm often struck by how few people know how to ask a clear, sharp, and penetrating question. If you master that skill, you'll stand out.
Formal training and activities like debate can enhance these abilities, but mostly they come from practice. Repetition also helps overcome stage fright, and being relaxed while you're speaking is easily worth 10 or 20 IQ points.
9. What about science? Most of us had to take a lot of science in high school, and some of us continued to do so in college. Although in-depth knowledge of physics, chemistry, biology, computer science, etc., is not directly relevant to many aspects of international affairs, it is powerfully linked to a host of important political phenomena. How can one understand cyber-security, climate change, global pandemics, economic development, and many other issues without understanding the scientific knowledge that lies at their core? More importantly, a clear understanding of the scientific method helps protect you from the proud know-nothingism that is increasingly a badge of honor among some politicians. So stick with some science too. And by the way: if you happen to be interested in topics where science is central (such as arms control or the environment), you'd probably be better off majoring in a relevant scientific field rather than politics or history.
10. Find your ethical foundation. Universities teach classes on ethics, but apart from favoring free speech and opposing academic fraud, they don't endorse any particular ethical stance. So don't expect your college to teach you what is right or moral. Nonetheless, if you haven't figured these things out for yourself yet, college is a good time to get cracking on it. You'll meet lots of people with different views on this subject, and engaging with them will help you sort out where you stand. What's your view of the good or virtuous life? Where are the lines that shouldn't be crossed? How do you propose to handle the ethical tradeoffs that will inevitably greet you as you advance through life? And as you study, keep a sharp eye out for role models: which people strike you as admirable and worthy of emulation and which seem morally challenged? And on what basis did you decide?
Alert readers will have noticed that my list looks a lot like the classic liberal arts education. True enough: in a world that is both diverse and changing rapidly, a broad portfolio of knowledge is almost certainly the best preparation for a long career in the field. My list also leaves out various extracurricular activities that may be every bit as important as what you do in class, such as living for an extended period in a foreign country. But a solid knowledge of these fields and a serious effort to develop some key skills would stand you in good stead in a wide variety of global professions. And if you end up doing something entirely different, they certainly won't hurt.
And if you're just starting your freshman year, I hope you find the next four years challenging and inspiring. Learn as much as you can, because there will be plenty of tough problems for you to work on as soon as you graduate.
I had a relaxing vacation out on Fire Island, though of course I didn't get quite as much accomplished as I intended. But I did do a lot of reading, and I thought I'd pass a bit of what I learned on to all of you.
I started with Volume 4 of Robert Caro's monumental biography of Lyndon Johnson, which covers the period 1958-1964. In this period Johnson runs half-heartedly (and unsuccessfully) for the 1960 presidential nomination, accepts the vice-presidential nod, and then languishes miserably in a powerless position. He's mostly ignored (if not openly dissed) by Kennedy's inner circle, and thinks his political career is mostly over. But Kennedy's assassination in November 1963 suddenly places him in the Oval Office, and Caro offers a vivid description of how LBJ rises to the occasion, gets Kennedy's legislative program moving, and helps the country overcome a major national trauma.
The book is a great read, and Caro has few equals at sketching a character or describing how personalities operate within American institutions. He does have a weakness for stark contrasts and mano-a-mano confrontations (e.g., he makes much of the blood feud between LBJ and Bobby Kennedy, going back to the early 1950s), but such portraits are part of what makes the book difficult to put down.
But for me, a subtler message in the book (possibly overstated for dramatic effect) is that John F. Kennedy wasn't much of a president. He was smart, articulate, charming, and courageous (as his exploits in World War II revealed), and he often had sound political instincts. He had a knack for attracting talented acolytes and inspiring deep loyalty from them, and he knew how to use a gifted advisor/speechwriter like Ted Sorensen to great effect. But his record as a congressman and a senator was unremarkable, and Caro's account shows he didn't achieve much in his three years as president. The main elements of his legislative program were stalled in Congress, and his main foreign policy achievement was managing a crisis over Soviet missiles in Cuba that his own policies (e.g., the attempt to overthrow Castro and an unnecessary nuclear weapons build-up) had helped provoke. We obviously will never know what he might have achieved had he not been assassinated and had he won a second term, but this book makes it clear that the post-assassination hagiography has little basis in fact.
My next selection was David Kang's "East Asia before the West," which I recommend to anyone with a shaky grasp of East Asian history. It's a slim book that focuses primarily on explaining the Sino-centric trade and tributary order that existed in Asia from roughly 1400 to 1900. Kang's emphasis is on interpreting this history, and demonstrating how this order differed from the Westphalian model that has inspired most contemporary IR theory. In particular, he argues that relative power played a lesser role in relations between China and its principal neighbors (Korea, Japan, and Vietnam) than realist theories might suggest, and that status (defined largely in cultural terms) was in fact of critical importance. Instead of being competing billiard balls interacting on the basis of relative power, Kang depicts these societies as heavily (though not totally) shaped by Chinese cultural ideas (primarily Confucianism). Relations among them were governed by norms of deference that reflected not just power but also the degree to which other societies met Chinese cultural standards. He also depicts it as an unusually peaceful order -- at least with respect to state-to-state relations -- with the bulk of the violence directed at rebels, bandits, or nomadic tribes rather than by one government against another.
Not surprisingly, I thought the book downplayed the role of power somewhat. Given how much larger and stronger China was, it's not all that surprising that the lesser states didn't challenge it (and in the rare cases when they did, it didn't go well for them). But it is quite a thoughtful book, and well worth your time.
My last selection (apart from a few novels) was Fredrik Logevall's forthcoming book "Embers of War: The Fall of an Empire and the Making of America's Vietnam." It is a fascinating, beautifully-written, and deeply depressing account of the First Indochina War (i.e., the war between France and the Vietnamese resistance led by Ho Chi Minh), with particular emphasis on the background role played by the United States. Many parts of this story have been told before, but Logevall's account provides much new detail and important new insights. Among other revelations, he shows that Dwight D. Eisenhower was far more hawkish on Vietnam than is sometimes claimed, and that the U.S. came closer to intervening during the siege of Dienbienphu than I had previously believed.
It is impossible to read the book without being struck by contemporary parallels, and without concluding that the U.S. foreign policy establishment has learned virtually nothing over the past sixty years. Although the French clearly knew more about Vietnamese society than their American counterparts did, officials in both governments were often embarrassingly ill-informed about the actual state of Vietnamese society and opinion. Back in Washington, key decisions were often made by people (such as Dean Acheson or John Foster Dulles) who had little knowledge of Asian history or society and who were inevitably distracted and influenced by problems elsewhere. And alleged experts like Senator Mike Mansfield (whose opinions were heeded because he had once taught classes in Asian history) were blinded by Cold War ideology and simplistic ideas like the "domino theory." Meanwhile, the American public was chronically misinformed about Asian events by publishers like Henry Luce of Time and Life, and by well-organized propaganda campaigns.
Logevall never makes explicit comparisons between the events he describes and more recent counterinsurgencies, but the parallels are quite remarkable. Like the United States in Iraq and Afghanistan, the French forces in Indochina faced enormous logistical difficulties and were frequently vulnerable to ambushes (including what we would now call "improvised explosive devices"). The occupying powers were allied with local elites who were feckless, unreliable, and corrupt, and neither the French nor the United States ever had much leverage over their local clients. The French faced chronic manpower shortages, largely because the war was increasingly unpopular and French politicians could not institute a draft and deploy conscripts there. Instead, they had to rely on legionnaires, troops from their other colonies, or professional soldiers. Similarly, the Pentagon has always had trouble finding enough troops to run its occupations in Iraq and Afghanistan, and of course could never contemplate turning to a draft. The French thought that a heroic general (Jean de Lattre de Tassigny) would reverse their fortunes and produce a victory, just as U.S. leaders have occasionally pinned their hopes on the likes of David Petraeus or Stanley McChrystal. Both the French and the Americans tried to create local forces who could take over for them; neither effort succeeded to the extent necessary. Massive expenditures and much suffering were justified by baseless fears of falling dominoes, just as today U.S. pundits have somehow managed to turn impoverished Afghanistan into a "vital interest." Finally, Logevall shows that U.S. citizens had very little knowledge of what the United States was actually doing in Indochina -- especially in the period between the signing of the Geneva Accords and the escalation of direct U.S. involvement -- just as we are mostly kept in the dark about the full extent of our involvement in places like Yemen or Pakistan today.
All in all, a pleasant vacation, even if I spent a lot of it reading about unpleasant things and drawing depressing conclusions. Alas, that's an occupational hazard for people in this business, even when we're supposedly taking a break.
I am pleased to offer the following guest post by Nasser Rabbat of MIT:
Nasser Rabbat writes:
The euphoria sparked by the 2011 Arab uprisings has settled into realpolitik. The youth who initiated the protest movements split into myriad organizations or withdrew in despair. The Islamists, disciplined through decades of clandestine political action, took over in Tunisia and Libya, and are poised to wrest power from a recalcitrant army in Egypt. The secularists, assumed to be the natural allies of the West, are weak and divided. In Tunisia and Egypt, they garnered fewer votes in the elections than predicted. In Libya, they retreated from the National Transitional Council, leaving the Islamists to occupy its most powerful positions. In Syria, still struggling against a belligerent and criminal regime that is proving hard to dislodge, the secularists in the opposition are constantly bickering, whereas the Islamists are organized and goal-oriented. Arab secularism, the events seem to suggest, is a spent force. The United States and other Western governments, claiming to be responding to the realities on the ground, are engaging the Islamic parties as the defining new paradigm of Arab politics.
Is this a new turn for the West? Did the West support the secularists before the revolutions? And has Arab secularism really become irrelevant? My answer to all three questions is an emphatic no. To begin with, the record of the West in the Arab world is patently not pro-secularist. Indeed, if we are to limit our assessment to the regimes that have been consistently backed by the U.S. in the last fifty years, we will find at the top of the list Saudi Arabia, Qatar, the UAE, Oman, and Morocco, all avowedly Islamic regimes, at least in their claims to legitimacy or their application of Islamic law. Conversely, some of the most ardent opponents of the U.S. have been the secular regimes of the Baath party in Syria and Iraq, though their secularism proved skin-deep and opportunistic. Moreover, when the United States decided to avenge the attacks of 9/11, perpetrated as they were by an extremist Islamist militancy, its most decisive act was to destroy the secular regime of Iraq. Eight years later, when the Americans finally withdrew from Iraq, they left behind not only a flagrantly sectarian regime, but also a political class composed largely of religious movements umbilically linked to the Islamic Republic of Iran.
Nor does history show much Western support for the budding secular tendencies in the early twentieth century, which coincided with the colonization of most of the Arab world. Pragmatism may explain why colonial powers, Britain and France in particular, preferred to deal with traditional leaders. They had political influence, economic clout, and a wide base of clients. That they adhered to conservative forms of piety added to their usefulness: They understood the mechanisms of religious authority and could manipulate them to appease potential popular unrest. The few Arab secularists, on the other hand, even though thoroughly westernized and belonging to the social elite, were seen as troublemakers. Having been profoundly influenced by the principles of the Enlightenment, they formulated strong demands for liberation, democratization, and modernization. Many clashed with the colonial authorities and paid a heavy price of imprisonment or exile.
Independence, when it finally came, fell smack at the height of the Cold War. The West, which was eventually reduced to the United States, was seeking to build alliances of nations committed to countering the Communist threat. Conservative regimes, such as those of Jordan and Saudi Arabia, were obviously the most promising allies. So the West supported them regardless of their religious agendas. When military regimes came to power in Syria, Egypt, and Iraq after the defeat of these countries in the first Arab-Israeli war of 1948, they first toyed with accepting Western tutelage. Their subsequent turning to the USSR as a patron more sympathetic to their national causes, however, did not translate into espousing communism or rejecting religion. Ungodly these military regimes certainly were, but they were not secular. They neither believed in nor practiced the separation of religion and politics. They in fact relied heavily on religious symbolism to frame the image of their own inspired despot and his family or clan. This was the case with Anwar al-Sadat after Camp David and his successor Hosni Mubarak, as well as with Saddam Hussein, Muammar Qaddafi, and Hafiz and Bashar al-Assad. Fundamentalism and its defiant social expressions actually grew under their watch, even as they relentlessly suppressed all Islamic political organizations, or any other political activism for that matter.
Secularists had no place in such a system. Those who dared to speak out against it found themselves dismissed from their jobs, jailed, or forced to leave their countries. Some, who persisted in their criticism of the dictators or of the rigid views of the growing Islamist extremists, like the journalists Salim al-Lawzi and Samir Kassir in Lebanon, Hidaya Sultan Al-Salem in Kuwait, Farag Foda in Egypt, and Mohammed Taha in Sudan, were assassinated. Others, unable to cobble together a political structure to unite them like the Islamists had, channeled their political activism into more intellectual and artistic pursuits. Secularism, already accused of elitism because of the social background of its proponents, became even more rarefied as it migrated either away from the pulse of the street and into the confines of academia and art or out of the country altogether.
The 2011 uprisings seemed at first to bring secularism back to the forefront as a vociferous political force. Fueled by a new breed of activists -- young, globally networked, and unbothered by considerations of class, religion, or gender -- the uprisings wielded the same principles that earlier Arab secularists had advocated. But like those earlier Arab secularists, the youth did not translate their secularist rallying cries into political parties able to compete for power in the post-revolutionary governments. Some movements, notably the 6th of April Movement in Egypt, simply declared after the fall of Mubarak's regime that they had no plans to become political parties, then lived to regret that impulsive decision. The prominent and reasonably popular candidate for the presidency in Egypt, Mohamed ElBaradei, withdrew from the race before it began, citing as a reason the reprehensible way politics was conducted by his detractors. The few attempts to register a secularist political presence in the elections in Tunisia and Egypt were swept aside by the eminently more organized Islamist parties and by their shrewd appeal to the basic religiosity of the people, especially the poor and the illiterate.
Arab secularism, however, remains on the street and online. Though outdone in the current rush to power by the Islamists, it still has the ability to reassert itself in the political arena, if not as the ruling party, at least as a lawful opposition and guardian of the principles of civic freedoms. The culture of lawful opposition, long absent under the totalitarian regimes, needs to be reinserted into the political discourse. This is as important a function as good governance for the well-being of the nascent Arab democracies. To that end, the efforts of the discontented revolutionary youth and the seasoned secular intellectuals should be united under the umbrella of political parties. The West should help them by recognizing their crucial political role and by treating them as long-term partners, not just as recipients of training and aid.
In February 2011, after the victory of the Egyptian revolution in which they played no significant role, some of the most famous Islamic preachers gloated that the next government would be Islamic. Secularism, they contended, should be put to rest because it had reigned for fifty years and failed. But true secularism has never had a chance to rule in the modern Arab world, except perhaps in Tunisia under al-Habib Bourguiba (1957-87). Otherwise, religion was always enshrined in the fiat constitutions of all the Arab kingdoms and republics, even those that were ferociously hunting down Islamists. Moreover, Arab rulers who hid behind secular masks, whether they were civilian or military, never separated religion from their politics. Many enlisted docile forms of religion and compliant sheiks as part of their arsenal of control. In that, they were following in the footsteps of a long tradition of inglorious religion-based rule in the Arab world, which did not really end until the fall of the Ottoman Caliphate in 1924. It is thus more accurate to ask what Islamic rule of the kind imagined by the vocal Islamist organizations would bring that was not tried before during the long centuries of what they themselves believe was an Arab decline.
Nasser Rabbat is the Aga Khan Professor of the History of Islamic Architecture at MIT.
A couple of weeks ago, psychiatrist Robert Spitzer made the news by writing a short but sincere apology to the gay community for his earlier support of "reparative therapy" intended to "cure" homosexuality. He now regards the 2003 experiments that seemed to show success for this "treatment" as irredeemably flawed, and he regrets any role he might have played in reinforcing anti-gay stereotypes. Good for him.
Spitzer's recantation got me thinking: Why do we so rarely see foreign policy mavens offer similar apologies for obvious screw-ups? None of us is infallible, but powerful people sometimes make colossal blunders that lead to enormous human suffering. When that happens, it really does merit a mea culpa from those responsible. Yet with a few exceptions, I can't think of very many politicians, pundits, or government officials who have openly acknowledged their errors and apologized for them. Here in the United States, this only seems to happen when sexual indiscretion is involved, or when former officials are at the end of their careers and seeking some sort of absolution.
At this point, don't you think that William Kristol owes his fellow citizens an apology for his repeated war-mongering about Iraq, a war that cost the United States over a trillion dollars, killed thousands of people, and created millions of refugees? Wouldn't it be refreshing to hear George W. Bush and Dick Cheney admit their numerous mistakes and express some regret for them, instead of trying to stonewall the judgment of history? Couldn't a few of the ambitious "visionaries" who created the Euro say they're sorry they didn't listen to the skeptics who warned that Europe lacked the institutional mechanisms needed to make a common currency work? Shouldn't Elliott Abrams show some contrition about his role in fomenting the disastrous Fatah coup attempt against Hamas, which left the latter in charge in Gaza? And so on. Heck, we're still waiting to hear regrets from the folks who brought us the financial crisis of 2007-2008, although Bernie Madoff did offer up something of an apology for his massive swindle.
Admitting you were wrong really isn't that hard. I've been in this business for nearly three decades, and I've been blogging for three and a half years. In that time, I think I've gotten a number of things right, both in my scholarly work and my public commentary. I think I was mostly right about the core causes of alliance formation, right about the general direction NATO was headed after the Cold War, certainly right about the folly of invading Iraq, and right about the harmful impact of the Israel lobby on U.S. foreign policy. (Does anyone seriously believe that lobby isn't a very powerful force anymore?) And I think my skepticism about Obama's abortive peace efforts in the Middle East and his decision to escalate in Afghanistan have been borne out as well.
But I've been dead wrong on several occasions too. I was overly critical of post-modern IR theory back in the early 1990s, and overly optimistic about the Oslo peace process. I may have recognized the centrifugal tendencies that buffeted NATO following the Soviet breakup, but I also underestimated its staying power. And as I've noted before, I clearly missed the potential for contagion in the Arab spring. I regret every one of those errors, although I don't think very many people suffered as a result.
Of course, academia isn't quite like the policy world. Scholarship advances through vigorous criticism, and no matter how careful we try to be, every academic can look back and see how our earlier work could be improved. No scholar expects to be 100 percent right and all of us (should) understand that our prior work will eventually be overtaken and revised in light of new research. By contrast, people in the policy world or the commentariat can't readily admit mistakes, because their admissions will be seized upon by rivals and used to marginalize them. So instead of honest admissions of error, you mostly get silence, obfuscation, or denial. That's mildly offensive and morally dubious, but the real danger is that it allows serial blunderers to keep influencing policy or public discourse, no matter how many failures they've been associated with in the past.
I gave a lecture last night at the Cape Ann Forum, on the topic of America's changing position in the world and what it might (should) mean for U.S. grand strategy. My hosts were gracious and the crowd asked plenty of good questions, which is what I've come to expect when I speak to non-academic groups. Indeed, I'm often impressed by how sensible many "ordinary" Americans are about international affairs in general and U.S. foreign policy in particular. And so it was last night.
One of the attendees was iconoclastic journalist Christopher Lydon, who's been a friend for some years now. Chris asked a great question: Why is there so little accountability in contemporary U.S. policy-making, and especially regarding foreign policy? To be more specific: He wanted to know why some of the same people who got us into the Iraq debacle, mismanaged the Afghanistan war, and now clamor for war with Iran are still treated as respected experts, welcomed as pundits, and recruited to advise presidential campaigns.
I didn't have a particularly good answer for him, but I thought about it more as I drove home. I'm not sure why there seems to be so little accountability in the American establishment these days (though it is true that if you lose $2 billion, it does affect your job security), but here are a few thoughts.
Part of the problem is institutionalized amnesia. The United States is busy all around the world, and if the short-term results of some action look okay then we tend to move on and forget about what we've left behind. We fought a proxy war in Nicaragua in the 1980s, and it was a controversial issue at the time, with 40,000 or so Nicaraguans perishing as a result. But eventually the war ended, and we moved on with nary a backward glance. We intervened in the Bosnian civil war, patched together a Rube Goldberg-like structure to govern the place, gave ourselves high-fives, and spent the next fifteen years telling ourselves what a success it was. Except that it wasn't. Really. Last year we helped topple the Qaddafi regime in Libya, rejoiced at the fall of a despised and brutal dictator, and then moved on again, even as Libya descends into chaos. But it's not our problem anymore, unless a contraband MANPADS eventually finds its way to some unfortunate civilian airliner somewhere. And if that airliner doesn't have Americans on board, we won't worry about it very much.
Heck, I'll bet if Bush had just pulled all our troops out of Iraq after his "Mission Accomplished" photo op, we'd be hailing it as a great military victory no matter what condition Iraq was in today. ("Hey, we got rid of Saddam for them; it's not our fault if the Iraqis can't run the place...")
A second reason is the incestuous clubbiness of the foreign policy establishment. Mainstream foreign policy organizations like the Council on Foreign Relations thrive by being inclusive: It's not clear what a member in good standing would have to do in order not to be welcome there. This is actually a smart principle up to a point: Because none of us is infallible, you wouldn't want to live in a society where being wrong rendered anyone a pariah for life. But neither does one want a system where conceiving and selling a disastrous war has no consequences at all.
Third, the cozy relationship between mainstream journalists, policy wonks, and politicos reinforces this problem. All three groups live in a symbiotic relationship with each other, and you wouldn't expect to see many people in this world donning their brass knuckles and saying what they really think about other members of the club. And because their livelihoods and well-being aren't directly affected by catastrophes that happen Far Away, why should they worry about holding people accountable and conducting their relations in a more adversarial fashion? Bad for business, man....
A related reason has to do with career paths in the foreign policy world. I'm well aware that most foreign policy wannabes don't have the luxury of tenure, and a lot of them have to survive on soft money budgets at think tanks or as in-and-outers doing private sector work when their party is out of power. In a world like this, yesterday's adversary is tomorrow's ally, and that means pulling punches and doing a lot of forgiving and forgetting. In most cases, a bland conformism is the best route to long-term professional success, which diminishes the tendency to render harsh judgments, even when they are appropriate.
Fifth, as U.S. neoconservatives have long demonstrated, the best defense is sometimes a good offense. No influential political faction in America is more willing to engage in character assassination and combative politics than they are, in sharp contrast to most liberals and even most realists. I'm not talking about spirited debate over the issues -- which is a key part of effective democratic politics -- I'm talking about the tendency to accuse those with whom they disagree of being unpatriotic, morally bankrupt, anti-Semitic, or whatever. Their willingness to play hardball intimidates a lot of people, which in turn protects them from a full accounting for their past actions.
Finally, there is obviously less accountability for anyone who has reliable financial backing. It doesn't matter how often people at the Weekly Standard or American Enterprise Institute advocate failed policies, so long as somebody is willing to keep bankrolling them. If you've got the Koch Brothers, Rupert Murdoch, or Sheldon Adelson in your corner, you can stay in the game no matter how often you've been wrong about really big and important issues, and no matter how big a price others may have paid for your mistakes.
At the Big Think website, John Horgan argues that war is just a cultural practice that humankind could eventually abandon, unless we keep infecting ourselves with the "war virus" (h/t Andrew Sullivan). If one state gets infected by war-proneness, so his argument runs, its neighbors may have no choice but to follow suit and adopt similar measures in order to prevent themselves from being conquered. In Horgan's words (as reported by Mark Cheney here):
"Imagine your neighbor is a violent psychopath who is out for blood and land. You, on the other hand, are person who wants peace. You would have few options but to embrace the ways of war for defense. So essentially your neighbor has infected you with war."
It's an arresting use of language, perhaps, but the history of social Darwinism should have taught us to be wary of bringing misplaced biological analogies into the study of world politics. Viral infections spread by very specific and well-known mechanisms -- e.g., a virus hijacks the genetic machinery of host cells to replicate itself -- and that's not remotely like the mechanism that Horgan is identifying here. Instead, he's actually describing a situation where an external threat forces the leaders of neighboring states to rationally choose to adopt policies and strategies designed to ensure their survival. That's not how viruses spread: You don't catch a cold because you've decided the only way to protect yourself against your sneezing neighbor is to start sniffling and sneezing along with them.
The actual logic that Horgan is pointing to here is the basic "security dilemma" that realists have been talking about ever since John Herz. In a world where no agency or institution exists to protect states from each other, each is responsible for its own security. Because states cannot know each other's intentions with 100 percent certainty (either now or in the future), they have to prepare for the possibility that neighbors may do something nasty at some point. So they invest in their own armed forces or they look for powerful allies, especially if they think the possibility of trouble is fairly high. And once they do that, others have to worry about them in turn. This is the "tragedy" of great power politics identified by my colleague John Mearsheimer, and it's a much better explanation for security competition (and war) than some analogy to microbes.
To be fair, Horgan's larger point is simply that war is not a biological necessity; it is a specific political or cultural response to certain conditions and thus in theory could gradually be abandoned. This theme has been developed at length by John Mueller and more recently by Steven Pinker. I agree with Pinker's claim that the overall level of human violence has declined significantly over the past several centuries (mostly due to the emergence of increasingly stable domestic political orders, i.e., states), but I remain agnostic about the larger claims for a long-term reduction in inter-state violence. That trend is driven almost entirely by the absence of great-power war since 1945, and the absence of great-power war may have multiple and overlapping causes (bipolarity, nuclear weapons, the territorial separation of the U.S. and USSR during the Cold War, the spread of democracy, etc.) whose persistence is hard to forecast.
The absence of great-power war is a good thing, because major powers have the most capability and can do the greatest harm when their destructive capacities are fully roused. What we're seeing instead, however, is protracted conflicts among warlords, insurgents, or relatively weak states (think the Congo, Sudan, or Colombia), and wars of choice waged by the United States and other powerful states in various strategic backwaters, mostly against adversaries that we don't think can do much in response. At least we hope not.
Will China hold together? I'd say yes. But as scholars and pundits debate China's future, a critical issue is whether the government will face powerful internal challenges of the sort that eventually helped bring down the USSR. One piece of that puzzle is whether minority groups such as China's restive Uighur population in Xinjiang province will pose a significant threat to internal stability.
I know very little about this issue, but I found this brief commentary by Arabinda Acharya and Wang Zhihao, two researchers at the S. Rajaratnam School of International Studies in Singapore, to be rather eye-opening. Factoid #1: Acharya and Wang point out that China is one of the few countries in the world that spends more on domestic security than it does on defense, a fact that reflects the CCP's long-term concern about internal order.
Equally interesting was their reminder about the dearth of reliable information on the true situation in Xinjiang. Money quote:
"The Xinjiang situation is also characterized by a lack of facts. Accounts of events come mainly from two sources: state-sponsored media and overseas Uighur activists who claim to have sources within the region. Reporting by these two entities however cannot be independently verified, due to China's ban on the presence of outside media in the region. Therefore, it has become difficult to determine where facts end and embellishment begins.
State media attributes the incidents to rioters or terrorists belonging to the East Turkestan Islamic Movement (ETIM) also going by the name Turkistan Islamic Party (TIP). Beijing also accuses overseas Uighur organizations especially the World Uighur Congress for inciting unrests in Xinjiang. Uighur activist groups however, claim that the protests are acts of the local Uighur lashing out at Beijing's "systematic oppression." These incidents nevertheless, are being exploited to garner international support for resisting what is being termed as "state oppression" in Xinjiang. As the facts continue to be obfuscated, it has become difficult to distinguish protests against specific grievances by local Uighur from organized acts of terrorism."
As we've seen in many other contexts, the dearth of reliable information is exacerbated by the contending parties' incentives to misrepresent what is really going on, making it extremely difficult for outsiders to judge either the threat of instability or the appropriateness of the government's counter-measures. And insofar as internal instability poses a significant threat to China's continued economic expansion, it means that outsiders will find it even more difficult to forecast its trajectory with confidence.
For their part, Acharya and Wang offer a fairly sanguine forecast, opining that "Fortunately for China, the situation in Xinjiang is not and does not portend to be a problem of massive proportions." Nonetheless, they warn that overly harsh measures could fuel greater Uighur resistance, and they conclude that "Beijing would do well to temper its actions with appropriate sensitivity to overall issues involved rather than attempt to crush all dissent with mere force."
Good advice, I suspect, but it will be interesting to see if China's leaders can follow this prescription for subtlety, especially if internal discontent increases.
I'm pleased to present the following guest post from Nina Tannenwald of Brown University. Alert readers will note that she is writing from a constructivist rather than realist perspective, but when you're trying to avoid a foolish war, paradigmatic loyalty is a decidedly secondary consideration.
Nina Tannenwald writes:
At a time when anti-Iran hawks are beating the drums of war, the international community needs to pursue all possible routes to a peaceful solution to Iran's nuclear challenge. One route that has not been tried is harnessing moral and religious norms as a source of nuclear restraint. Incongruous as it may seem, Iran's leaders have repeatedly stated that nuclear weapons are "un-Islamic." Why not hold them to it?
Iran's Supreme Leader, Ayatollah Ali Khamenei, issued a fatwa, a religious decree, in 2004, describing the use of nuclear weapons as "immoral." In a statement to the International Atomic Energy Agency in Vienna in August 2005, the Iranian chief nuclear negotiator, Sirus Naseri, reiterated Khamenei's fatwa that "the production, stockpiling, or use of nuclear weapons is forbidden under Islam." Many regime figures have repeated the prohibition, including Khamenei himself, who said in 2010 that Islam considered weapons of mass destruction (WMD) "to be symbols of genocide and are, therefore, forbidden and considered to be haraam [forbidden in Islam]."
No other national leader anywhere has ever asserted that nuclear weapons are, say, "un-Christian" or "un-Jewish" (although Western religious leaders and scholars have expressed such views).
Iran's leaders could be dissembling, of course, as part of their effort to mislead the international community. But no one forced them to say this -- let alone to repeat it publicly -- and Khamenei has not repudiated this fatwa even as Iran's nuclear program has advanced. It would be strange for a regime that derives its legitimacy from its adherence to Islam to keep asserting this point if it were wholly insincere.
We don't need to take the Iranians at face value, but why not take advantage of the opening their own words provide? The international community should capitalize on this element of restraint. We should hold them to it.
How might this work? Diplomats should refer to the statements approvingly and frequently. President Obama should use his rhetorical gifts to publicly acknowledge the Iranian prohibition and state that, as a person of faith himself, he respects and welcomes the testament. The goal would be to invoke Islamic moral values as a positive contribution to both Iranian and global nuclear restraint.
A second approach would involve "Track II" diplomacy. This would entail holding conferences that bring together religious scholars and ethicists from different religions, along with government officials and nuclear strategists from key countries to discuss ethical constraints on nuclear weapons. This would be a good project for foundations to support.
This strategy -- a normative one -- would not replace sanctions. Rather, by invoking Islam's moral contribution in a positive way, and by connecting it to longer term efforts toward global nuclear disarmament, it could help provide Iranian leaders with the political cover and respect to engage in negotiations over their nuclear program.
International relations scholars have a term for this kind of normative strategy: "rhetorical entrapment." Developed especially in constructivist analyses of human rights, it refers to how NGOs especially, but also states and international organizations, seek to hold leaders accountable to their publicly stated commitments to moral values or norms. Leaders can become "entrapped" in a public debate over their adherence. The act of holding the debate increases the salience and legitimacy of the norms at issue and thereby raises the legitimacy, or normative, costs to the regime of violating its own commitments.
Thus, in contrast to a realist strategy for dealing with a recalcitrant state, which emphasizes imposing material costs (sanctions, military threats), a constructivist strategy emphasizes raising the normative (legitimacy) costs of a violation. This approach assumes that leaders care about certain kinds of legitimacy (in this case, fidelity to Islam), just as the realist strategy assumes that states will be vulnerable to material sanctions and threats.
Is this normative approach to Iran pie-in-the-sky? Realists may snicker, but, historically, religious and moral norms have played an important role in shaping our thinking about nuclear weapons. Christian churches and other religious groups played a key role in the anti-nuclear weapons movements of the 1950s and 1980s. Their moral critique of nuclear weapons made it impossible to think of such a weapon as "just another weapon." Perhaps most prominent was the American Catholic bishops' influential 1982 pastoral letter criticizing nuclear deterrence as "morally flawed." This powerful statement provoked a widespread debate about the ethics of the nuclear arms race and helped undermine public support for aggressive nuclear strategies.
Iran has good reason to harbor a special revulsion toward weapons of mass destruction. It is the second largest victim of WMD attacks after Japan. Iran suffered over 100,000 casualties, both military and civilian, from Iraqi chemical weapons attacks during the Iran-Iraq war in the 1980s. Iran did not retaliate in kind partly because it was unprepared but also because Ayatollah Ruhollah Khomeini believed that chemical weapons were prohibited by Islam.
This experience deeply affected the national psyche of a generation of Iranians. Adding to the bitterness is the Iranian perception that the West was mostly indifferent to this suffering. Western countries quietly sided with Saddam Hussein in the war and failed to strongly condemn the chemical weapons attacks. Thus Iran surely has something to contribute to the global moral discourse on weapons of mass destruction.
The repressive Iranian regime is distasteful for reasons that go well beyond nuclear weapons, and no one who cares about the fate of the Middle East should want Iran to acquire nuclear weapons. Yet significant evidence suggests that Iranian leaders, while clearly determined to acquire a nuclear capability, have not yet made a decision to actually build a nuclear warhead. Invoking the value and worth of the Iranian regime's own publicly stated moral norms may help to reinforce more realist reasons for restraint, such as economic sanctions, military threats, or fears of provoking a nuclear arms race in the region.
Like anything else, this moral appeal may not work. But there is little to lose. To date, the key international players have shown a striking lack of diplomatic imagination in dealing with the Iranian challenge. Harnessing cultural and religious resources might facilitate a peaceful solution to this looming crisis and contribute to restraint on all sides. Of course, there is also the "boomerang" effect: engaging Iran in this conversation might require us to confront the status of our own moral values with respect to nuclear weapons.
Nina Tannenwald teaches international relations at Brown University. Her book, The Nuclear Taboo: The United States and the Nonuse of Nuclear Weapons Since 1945, received the 2009 Joseph Lepgold Prize.
The family of former President Dwight D. Eisenhower is now weighing in against renowned architect Frank Gehry's proposed design for an Eisenhower Memorial on the Mall in Washington, D.C. Good for them. Their main objection is that the central representation of the former president in Gehry's design is a statue of Eisenhower as a young Kansas farm-boy. The rest of the four-acre memorial is an elaborate and soulless structure whose paved walkways also celebrate -- are you ready for this? -- the interstate highway system. Just the sort of message one ought to highlight in an era of climate change, right?
I'm with the Eisenhower family on this one, and the brouhaha has reaffirmed my belief that Gehry is one of the more overrated architects of the modern era. (OK, his Bilbao museum was visually arresting -- if you like chaos -- but you should thank your lucky stars you don't have an office in that building.) This incident may also mark the only moment in recorded history when I've agreed with something published in the National Review.
What's the real problem? Let's start with Gehry's witless decision to depict one of the architects of victory in World War II, as well as a two-term president whose standing has risen steadily over time, as a barefoot farm-boy. The other presidential memorials on the Mall are either majestic in their simplicity (e.g., the Washington Monument), or they pay homage to past leaders like Lincoln in their maturity, portraying them as they were when they made their singular contributions to our common heritage. To portray Eisenhower as a boy immediately diminishes him, and gives us no sense of his unique qualities as a leader or the achievements that we treasure. Instead, it invites us to see him as an untutored naïf, which is precisely what some of his political opponents mistakenly thought he was.
I should confess that I'm not a huge fan of presidential monuments anyway, because they reinforce popular deference to executive authority and strengthen the growing tendency to view our presidents as akin to monarchs but with term limits. But I'll concede that a handful of presidents have performed acts of leadership, wisdom and courage that can provide enduring inspiration for subsequent generations, and that memorials on the Mall to a very few might be in order.
When it comes to Eisenhower, therefore, I'd like to see a memorial that underscored his singular contribution to our understanding of post-World War II security problems: namely, his eloquent warnings about the danger of the "military-industrial complex" and his consistent efforts to advance the cause of peace. Think about it: here is a West Point graduate and five-star general, who had seen as much of war as any American, and who had presided over a significant expansion of America's strategic nuclear arsenal in the 1950s. Nonetheless, he ends his second term with a message to his countrymen about the dangers of unchecked military/industrial power.
And can anyone doubt that his warnings were prescient, when we realize that the United States still spends more on its military than the next ten or twenty nations combined, when its national security mandarins feel little or no compunction about ordering drones to kill suspected terrorists (and sometimes innocent bystanders) while refusing to reveal to the voters who fund these activities exactly what their government is doing (or even the legal basis being used to justify it), and when our post-9/11 panic has led to a massive expansion of secret agencies and contractors whose full extent is not known or understood by the politicians who are supposedly overseeing them?
And let's not forget Ike ended the Korean War faster than Obama got us out of Iraq or Afghanistan, declined to get ensnared in France's debacle in Indochina, quashed the boneheaded Anglo-French-Israeli invasion of Egypt in 1956, and generally avoided costly military entanglements afterwards. His foreign policy record wasn't perfect by any means, but he compares quite favorably to virtually all of his successors.
A proper memorial to President Eisenhower would highlight not his boyhood -- iconic and stereotypical though it might be -- but his maturity, and his wise concerns about the trajectory our nation was on. Such a memorial would bring into sharp relief his final presidential speech, as well as some of his other remarks, whose words could help reverse our robotic tendency to assume our greatness is measured primarily by how much we can destroy, rather than by how much we can provide.
So how about a memorial where quotations such as the following were carved in stone, for each new generation to read and ponder:
"This conjunction of an immense military establishment and a large arms industry is new in the American experience. The total influence -- economic, political, even spiritual -- is felt in every city, every State house, every office of the Federal government. We recognize the imperative need for this development. Yet we must not fail to comprehend its grave implications. Our toil, resources and livelihood are all involved; so is the very structure of our society.
In the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex. The potential for the disastrous rise of misplaced power exists and will persist.
We must never let the weight of this combination endanger our liberties or democratic processes. We should take nothing for granted. Only an alert and knowledgeable citizenry can compel the proper meshing of the huge industrial and military machinery of defense with our peaceful methods and goals, so that security and liberty may prosper together."
"Every gun that is made, every warship launched, every rocket fired signifies, in the final sense, a theft from those who hunger and are not fed, those who are cold and are not clothed. This world in arms is not spending money alone. It is spending the sweat of its laborers, the genius of its scientists, the hopes of its children.
The cost of one modern heavy bomber is this: a modern brick school in more than 30 cities.
It is two electric power plants, each serving a town of 60,000 population.
It is two fine, fully equipped hospitals. It is some 50 miles of concrete highway.
We pay for a single fighter plane with a half million bushels of wheat.
We pay for a single destroyer with new homes that could have housed more than 8,000 people.
This, I repeat, is the best way of life to be found on the road the world has been taking.
This is not a way of life at all, in any true sense. Under the cloud of threatening war, it is humanity hanging from a cross of iron."
Now that's a memorial I'd like to see us build. Back to the drawing board, Frank.
There's a must-read op-ed in today's New York Times by Yan Xuetong, the dean of the School of Modern International Relations at Tsinghua University. Writing as a self-described "realist," Yan acknowledges that the emerging Sino-American competition is a zero-sum game (an idea deemed politically incorrect by many inside the Beltway), and plainly states that "competition between the United States and China is inevitable." He approvingly quotes past Chinese sages as emphasizing that "the key to international influence was political power."
Part of the novelty in Yan's essay is his emphasis on political morality. Power is critical, he says, but "the central attribute of political power was morally informed leadership." Accordingly, the future struggle between the United States and China will be won by the government that best demonstrates what he terms "humane authority," which is material power fused with moral principle. In his words, "states relying on military or economic power without concern for morally informed leadership are bound to fail." Even more interestingly, he says that "humane authority begins by creating a desirable model at home that inspires people abroad."
There's a lot of wisdom in this essay, as well as a subtle warning. On the one hand, Yan offers a neat summary of America's current advantages over China: our model of governance, tarnished though it is, is still more attractive than Chinese-style authoritarianism. America's past efforts to stabilize key regions have won it a large array of allies around the world, although these ties have been weakened by a decade of folly and misplaced aggression. U.S. society remains far more open to talented immigrants, such as AIDS researcher David Ho, journalist Fareed Zakaria, the late General John Shalikashvili, or former Secretaries of State Henry Kissinger and Madeleine Albright. Yan offers a set of prescriptions clearly intended for Chinese readers: the country must assume more global responsibilities, open itself up to talented individuals from overseas, and "develop more high-quality diplomatic relationships."
But on the other hand, Yan also believes China "needs to create additional regional security arrangements with surrounding countries," and says its leaders "must play a larger role on the world stage and offer more security protection and economic support to less powerful countries." These words sound innocuous, but they actually reflect China's understandable desire to create a sphere of influence in key areas, and especially in East and Southeast Asia. Why should countries like South Korea, Japan, Vietnam, or Indonesia maintain security ties with the United States, if Beijing is willing to offer beneficial economic ties and "protection?"
This is what all great powers tend to do as they grow stronger: they extend "protection" to weaker states in their vicinity in order to make sure that those states adopt foreign policies that do not threaten the larger power's interests. ("Hmmmm. Nice country you've got there. Would hate to see anything happen to it.") This doesn't mean China wants to conquer its neighbors or incorporate them into a formal empire, because that would be hard to do in an era of nationalism and wouldn't be worth the effort. Instead, the long-term goal is merely to ensure that its weaker neighbors defer to Chinese interests on key issues, including the future role of the United States in the region.
And as I outlined last week, that is why Sino-American competition in the years ahead is going to be primarily a competition for allies. Yan maintains that "there is little danger of military clashes" and that "neither China nor America needs proxy wars to protect its strategic interests." He's right in theory -- neither state needs such things and both would do well to avoid them -- but that is no guarantee that they won't happen anyway.
And to bring this full circle: that is why the latest episode of Congressional dysfunction -- the failure of the inaptly named "supercommittee" -- is so worrisome. The United States possesses the basic ingredients needed to more than hold its own in a future competition with China -- a competition that is already underway -- were it not for our growing talent for podiatric marksmanship (i.e., shooting ourselves in the foot). Whether the issue is the GOP's stalwart effort to protect the super-wealthy, the bipartisan commitment to throwing good money after bad in Afghanistan, or the gradual hollowing out of the essential sinews of an advanced society (schools, roads, power grids, transport hubs, etc.), it is clear that our problem is not a rising China. On the contrary, the real problem is a befuddled and aimless political class, composed of men and women lacking knowledge, accountability, political courage, or any genuine commitment to the common weal. What they've got in spades is personal ambition, but not much else. If "morally informed leadership" is a prerequisite for success, then we are in big trouble.
What do Joe Paterno, Muammar al-Qaddafi, Silvio Berlusconi, and Rupert Murdoch have in common?
The obvious answer, of course, is that 2011 turned out to be a very bad year for each of them. There were clearly important differences between them -- Qaddafi was the only one with blood on his hands and is the only one who is dead -- but there are some striking similarities too.
For starters, all of these men -- and note, they are all men -- were not exactly ... umm ... young. Qaddafi was the youngest of the bunch at 69; Berlusconi is 75, Murdoch is 80, and Paterno is almost 85.
Second, all four held power in their respective domains for long periods. Qaddafi ruled Libya for 41 years; Berlusconi dominated Italian politics for roughly 17, Murdoch took over his first media company in the early 1950s, and Paterno became head football coach at Penn State way back in 1966.
Third, except for Qaddafi -- who did remarkably little for Libya despite the vast oil wealth at his disposal -- the other three could lay claim to a number of positive achievements. Whatever one thinks of Berlusconi's political career or Murdoch's journalistic standards, one has to concede that both men did create successful business empires. And whatever one thinks of Paterno's handling of the scandal that cost him his job, there's no question he was a highly successful college football coach for many years. But as dramatists have taught us since ancient Greece, success has a way of breeding hubris.
But the feature that unites these very different men is that each became less and less accountable, and increasingly insulated from candid, face-to-face criticism. Who was going to tell Qaddafi that he was mostly a despotic failure and increasingly unpopular, and that his "Green Book" of supposed "philosophy" was incomprehensible claptrap? Which News Corp. employee was going to warn Rupert Murdoch that his take-no-prisoners approach to journalism was leading the company into corrupt criminality? Did anyone in Berlusconi's inner circle try to tell him that he had become a self-indulgent and sybaritic laughingstock? Could any member of Penn State's cult of "JoePa" puncture the bubble and make it clear to him that there was something rotten in Happy Valley? It appears not.
As a result, each of them began to think that the normal rules didn't apply. Paterno seemed to think he was as effective a coach at 84 as he'd been twenty years previously, ignoring everything we know about the aging process. Berlusconi's media empire allowed him to shape what many Italians believed about him, despite the recurring scandals and his protracted failure to do anything to fix the anemic Italian economy. Murdoch and his associates seemed to think that spying on people and hacking their phones was perfectly legit as long as it helped sell papers. And at the extreme end, a megalomaniac like Qaddafi was willing to kill his own people to sustain his own kleptocracy, while somehow believing to the end that he deserved to govern. And in each case, the events that ended their long runs seemed to catch them unawares and unable to respond.
Finally, in each case, a culture of deference and sycophancy gradually blinded all of them to what was really happening. The personal tragedy is most apparent in the case of Paterno, a decent if stubborn man who failed to recognize or accept that a trusted associate was in fact a criminal sexual predator. But this same tendency is also evident in the other cases -- and with even greater effect -- as the vainglory of these powerful men inflicted great harm on many others.
"If men were angels," James Madison wrote in Federalist #51, "no government would be necessary." But we are not angels, and the dark side of human nature is likely to emerge whenever any of us becomes too big, too powerful, or too revered to be held accountable. The ignominious ends that these four men suffered in 2011 also remind us that even clever and powerful leaders cannot always escape their past sins.
Here's a question for you: does it make sense for the United States to open its best universities to students from China (or any other potential long-term rival) and to help them to acquire advanced scientific and technical knowledge?
On the plus side, you could argue that all universities ought to admit the best and brightest applicants no matter where they come from, because that will help these universities do better work. Having smart students is a powerful spur to continued progress. Moreover, this practice might help the United States cream off some of the best foreign talent by convincing them to remain here after they graduate, where they will be of great benefit to the U.S. economy. And even if some of the best foreign students get trained here and then go back home, they can help their own societies develop, generate economic growth, and create bigger markets for everyone, so that the whole global economy grows and we all benefit.
But the downside is obvious too: if more and more of these well-trained people head back home, then U.S. universities will be transferring knowledge that might reduce America's comparative advantage. Even worse, we might be making it easier for other states to catch up or eventually surpass us in areas of advanced technology that have military implications (including cyber-security). So maybe we ought to be limiting foreign access to U.S. higher education, in order to preserve our own advantages for as long as we can.
There, in a nutshell, is a key difference between realists and liberals. Although the latter concede that there is a competitive element to world politics, they tend to downplay it and to focus primarily on the gains to be had from mutual cooperation. This tendency is evident in the emphasis placed on "engaging" China, which has been a hallmark of U.S. policy since the Clinton administration. This view stresses the need for cooperation and the benefits that the United States (and others) will gain as China becomes wealthier, and one dimension of that would be opening up U.S. institutions of higher education and collaborating with Chinese universities.
By contrast, realists tend to worry more about long-term shifts in the relative balance of power between the two sides, and warn that enabling Chinese growth could eventually place the United States in a position where its own influence is reduced. If you believe that Sino-American rivalry will be hard to avoid and potentially costly, then you'd want to start thinking hard about ways to slow China's rise. But nothing is cost-free: taking steps like that could reinforce Chinese suspicions -- duh! -- and at a minimum would mean consigning millions of Chinese citizens to lower standards of living. And guess what? It would probably also reduce U.S. standards of living too, although perhaps not by as much.
Here's one way to think about these starkly contrasting worldviews. For liberals, world politics is like playing music, and states are just like members of a band or orchestra. Making good music requires teamwork and cooperation, and the quality of the music generally improves the more highly skilled the musicians are. Among other things, this means that helping your fellow players improve is good for the group as a whole; if your bass player or drummer gets better, then the overall group sound gets better too. So members of a band or an orchestra should help each other out, and not worry about whether one player is improving faster than the others are. And while there can be elements of rivalry or jealousy within a band (or between different groups), it's usually not a zero-sum activity. If La Scala improves and makes opera more popular, that's good for the Met; just as the Beatles and other English groups kicked the door open for lots of other bands too. Similarly, if Wynton Marsalis becomes famous and reignites interest in jazz, then other jazz musicians benefit too.
Musicians obviously have to agree on what piece of music to play, and it helps to have rules to guide them, whether it's a fully orchestrated score, a lead sheet, or even just a loose arrangement with a list of solos. Even more abstract forms of improvised jazz depend on hours of training and a shared understanding of musical language. Such norms or rules or tacit understandings facilitate cooperation, and make it possible for lots of individuals to play together without a lot of prior rehearsal.
Thus, music is a pretty good metaphor for the liberal view of world politics, which is why liberals emphasize the importance of international law, institutions, and hegemonic leadership. And that's why most American liberals like to talk about the indispensability of the United States: in their view, the world orchestra needs a conductor, and who is better positioned to play that role than Washington, D.C.? But the underlying image is still one where all will be better off if they work together, and where everyone has a common interest in helping others improve. No wonder E.H. Carr famously characterized idealist (i.e., liberal) approaches as emphasizing the "harmony of interests."
By contrast, realists see international politics as less like music and more like sports. We're not talking about exquisite harmonies and seamless group dynamics; we're talking NFL football or World Cup rugby. There are clear winners and losers, the competitors sometimes cheat, and athletes are fools if they spend time helping rivals improve. Players have an interest in helping teammates get better, but you wouldn't expect Albert Pujols of the St. Louis Cardinals to be giving hitting tips to a member of the Texas Rangers right now, and you wouldn't expect Roger Federer to call up Andy Murray and offer him some advice on how to improve his serve.
Unlike music-making, sports are inherently competitive, and the winners normally get a lot more benefits than the also-rans do. Rules exist to define the nature of the competition, but everyone understands that some people might cheat. By comparison, it's not even clear what it would mean to "cheat" when you're trying to play music, or how "cheating" would be of any benefit.
So which view provides a better metaphor for world politics? Although both metaphors can offer some revealing insights, it won't surprise you to learn that I think foreign policy is a lot more like sports than it is like music-making. Even if states can gain from collaboration, those gains are not evenly distributed and relative power still matters. More importantly, periods of close cooperation are occasionally disrupted by all-out struggles that redistribute power and leave the winners better off and the losers licking their wounds. When that occurs, of course, the rules tend to fall by the wayside. Imagine an NFL game played for high stakes, and with no referees on the field.
And because states know that such struggles can occur at any time, the possibility casts a grim shadow over much of their behavior.
Finally, let's not forget that relative power matters even in the supposedly collaborative world of music. Conductors and bandleaders (and sometimes financial backers) get to decide what pieces to feature, and minor players just play what they are told. It was Duke Ellington's orchestra, not Johnny Hodges', and there's a reason why most of the songs on the Beatles' albums are by Lennon or McCartney and not George Harrison or Ringo. Over time, changes in the worldwide distribution of power will determine who gets to call the tune, and we would do well to think about that before the set list changes in ways we might not like.
I was in New York City the past two days and left my laptop in my bag for a change. The main purpose of the trip was to pick up my daughter (who was flying home from a language immersion program), but we did manage to sneak in a benefit concert at the Beacon Theater. Go here for a peek at The Life I Could Have Had if I Had Talent.
Along the way I've been reflecting more on the shooting/bombing in Norway and the debates that have surfaced since last weekend. One of the striking features of Anders Breivik's worldview (which is shared by some of the Islamophobe ideologues who influenced his thinking) is the idea that he is defending some fixed and sacred notion of the "Christian West," which is supposedly under siege by an aggressive alien culture.
There are plenty of problems with this worldview (among other things, it greatly overstates the actual size of the immigrant influx in places like Norway, whose Muslim minority is less than 4 percent of the population). Such paranoia also rests on a wholly romanticized vision of what the "Christian West" really is, and it ignores the fact that what we now think of as "Western civilization" has changed dramatically over time, partly in response to influences from abroad. For starters, Christianity itself is an import to Europe -- it was invented by dissident Jews in Roman Palestine and eventually spread to the rest of Europe and beyond. I'll bet there were Norse pagans who were just as upset when the Christians showed up as Breivik is today.
Moreover, even Christian Europe is hardly a fixed cultural or political entity. The history of Western Europe (itself an artificial geographic construct) featured bitter religious wars, the Inquisition, patriarchy of the worst sort, slavery, the divine right of kings, the goofy idea of "noble birth," colonialism, and a whole lot of other dubious baggage. Fundamentalists like Breivik pick and choose among the many different elements of Western culture in order to construct a romanticized vision that they now believe is under "threat." This approach is not that different from Osama bin Laden's desire to restore the old Muslim Caliphate; each of these extremists is trying to preserve (or restore) an idealized vision of some pure and sacred past, based on a remarkably narrow reading of history.
In fact, any living, breathing society is driven partly by its "inner life," but it is also inevitably shaped by outside forces. Indeed, as Juan Cole notes in a recent post, most societies benefit greatly from immigration, especially if they have strong social institutions (as Norway does) and the confidence to assimilate new arrivals into the existing order while allowing that order itself to evolve over time. What is even more striking about conservative extremists like Breivik is their utter lack of confidence in the very society that they commit heinous acts trying to defend. On the one hand, they think their idealized society is far, far better than any alternative, which is why extreme acts are justified in its supposed defense. Yet at the same time they see that society as inherently weak, fragile, brittle, and incapable of defending itself against its cruder antagonists.
What role should academics play in public discourse about major social issues, including foreign policy? I've taken up this issue in the past, as has FP colleague Dan Drezner. The Social Science Research Council has a continuing project on the topic of "Academia and the Public Sphere," and they asked me to contribute an essay on the topic of "International Affairs and the Public Sphere." It just went up on the SSRC website, and you can find it here.
Briefly, in this paper I argue that academic scholars have a unique role to play in public discourse -- primarily as an independent source of information and critical commentary -- as well as an obligation to use their knowledge for the betterment of society. In particular, university-based scholars should resist the "cult of irrelevance" that leads many to limit their work to a narrow, obscure, and self-referential dialogue among academicians. But I also argue that greater involvement in public life has its own risks, most notably the danger of being co-opted or corrupted by powerful institutions that may be eager to enlist academics to help them justify policies that will benefit those same institutions. "Speaking truth to power" is not simple.
The article also includes six recommendations for improving academic participation in the public sphere.
I lay out the rationale for these suggestions in the paper, and you'll have to read it for yourself to find out what they are. But here's the bottom line:
If scholars working on global affairs are content with having little to say to their fellow citizens and public officials and little to contribute to solving public problems, then we can expect even less attention and fewer resources over time (and to be frank, we won't deserve either). By contrast, if the academic community decides to use its privileged position and professional expertise to address an overcrowded global agenda in a useful way, then it will have taken a large step toward fulfilling its true social purpose. Therein lies the good news: the fate of the social sciences is largely in our own hands.
Andrew Sullivan takes me mildly to task for my comments on the Murdoch/NewsCorp scandal, arguing that NewsCorp never had a monopoly on the news in Britain and pointing out that I failed to mention the BBC, which is the world's largest news organization and obviously a looming presence in British media.
Two points. First, I never said nor implied that Murdoch had a monopoly; my main point was that it is a problem when "any single company or individual exercises excessive influence in media circles." Judging from the information released thus far, it seems clear that British politicians and public officials were intensely aware of the power that Murdoch & Co. wielded, and did a variety of regrettable things in an attempt to curry favor with them.
Second, Andrew's point about the BBC is well taken, at least in the abstract. A government-sponsored media giant can also skew what citizens know or believe, as state-controlled media in various dictatorships demonstrate. In a democracy, however, these dangers can be ameliorated by regulatory measures designed to insulate state-subsidized media organizations from political pressure. I haven't researched it in detail, but I'd argue that the BBC's record over the years, while far from perfect, has displayed a level of journalistic integrity that far exceeds NewsCorp's. And any organization that could bring us both HardTalk and Monty Python can't be all bad.
But I take it that Sullivan and I agree on the main point: For democracy to function well, citizens have to be able to hear lots of competing views, including views that challenge powerful interests and the government. To me that is still the main lesson of the NewsCorp business.
Postscript: By the way, who has been Rupert Murdoch's most effective defender? Not his wife Wendi, who demonstrated superb reflexes and excellent hand-eye coordination when a moron tried to throw a shaving cream pie at Murdoch during his testimony. In fact, it was the pie-thrower himself who did the most to aid Murdoch's cause. Not only did this stupid act (temporarily) turn Murdoch into an object of sympathy, but it has led a raft of reporters and pundits to focus on Murdoch's wife and her entertainingly deft response. In short, all the assailant managed to do was distract us (once again) from the bigger issues. If I were a conspiracy theorist, I might even suspect that the pie-thrower had been hired by NewsCorp to stage the attack, but even I don't think they are that far gone.
The steadily expanding "phone hacking" scandal in Great Britain is a good reminder that understanding politics requires a healthy appreciation of the role of arrogance and stupidity. What began as a seemingly straightforward example of sleazy journalistic practice has grown into a full-blown scandal, and the circle of guilt keeps widening.
Just look at the repercussions so far: 1) NewsCorp's bid to take over all of British Sky Broadcasting has been scuppered; 2) News International CEO Rebekah Brooks has resigned and is now under arrest; 3) long-time Murdoch associate and Wall Street Journal publisher Les Hinton has also resigned his post; 4) Prime Minister David Cameron has been badly tarnished; and oh yes, 5) the head of Scotland Yard has resigned in the wake of revelations that the force had bungled the investigation (which is a charitable way of putting it). The WSJ and Fox News have been exposed as shills for their boss (Murdoch), which is hardly surprising but won't help their reputations.
Oh, what a tangled web we weave....
Gallons of ink (or gigabytes of blog posts) have already been devoted to this story, but one broader element has received less attention amid all the juicy personal stuff. What the scandal really teaches us is the danger that inevitably arises when any single company or individual exercises excessive influence in media circles. Why? Because a healthy democracy depends on a well-informed citizenry, and media oligarchs can use excessive influence to skew what the public knows or believes in order to advance their own political objectives. If the Murdoch scandal doesn't convince you, just look at how Silvio Berlusconi used his media empire to drive his political career -- and look where Italy is today.
Furthermore, politicians are likely to accommodate powerful media organizations that are willing to play hardball, punishing politicians they don't like and rewarding officials who play along. NewsCorp was a master at this, and it is no wonder that David Cameron and even Scotland Yard became compliant.
What's the most powerful political force in the world? Some of you might say it's the bond market. Others might nominate the resurgence of religion or the advance of democracy or human rights. Or maybe it's digital technology, as symbolized by the internet and all that comes with it. Or perhaps you think it's nuclear weapons and the manifold effects they have had on how states think about security and the use of force.
Those are all worthy nominees (no doubt readers here will have their own favorites), but my personal choice for the Strongest Force in the World would be nationalism. The belief that humanity is composed of many different cultures -- i.e., groups that share a common language, symbols, and a narrative about their past (invariably self-serving and full of myths) -- and that those groups ought to have their own state has been an overwhelmingly powerful force in the world over the past two centuries.
Read the full article here.
Back when I was in graduate school, Stanley Hoffmann wrote an essay in Daedalus entitled "An American Social Science: International Relations." Among other things, he argued that the field of international relations was dominated by scholars from North America, and especially the United States, in part due to the United States' dominant global role in the post-World War II era. (Foreign-born scholars like Henry Kissinger, Zbigniew Brzezinski, Peter Katzenstein, and the late Ernst Haas are exceptions that support the rule, as each received most if not all of their advanced training in the United States.)
Has this situation changed? I ask this in part because lately I've been thinking about faculty recruiting at Harvard's Kennedy School. We have a very strong IR faculty -- my colleagues include Joe Nye, John Ruggie, Graham Allison, Samantha Power (on leave), Ash Carter (ditto), Monica Toft, Nicholas Burns, Meghan O'Sullivan, etc. -- but notice that this is a very U.S.-centric group, even though over 40 percent of our students come from overseas. We are fortunate to have a few colleagues from other countries (such as Karl Kaiser and Jacqueline Bhabha), but the center of gravity is decidedly Washington-focused. And we're no different in this regard than peer institutions like Princeton's Woodrow Wilson School.
I was discussing this issue with a colleague in D.C. the other day, and he argued that one reason was the simple fact that there were hardly any world-class foreign policy intellectuals outside the Anglo-Saxon world. He wasn't saying that there weren't smart people writing on world affairs in other countries; his point was that there are very few people writing on foreign affairs outside North America or Britain whose works become the object of global attention and debate. In other words, there's no German, Japanese, Russian, Chinese, or Indian equivalent of Samuel Huntington's Clash of Civilizations, Frank Fukuyama's The End of History and the Last Man, or Joseph Nye's various writings on "soft power."
As readers of the New York Times (and Jewish Week) already know, the Board of Trustees at City University of New York voted to table the awarding of an honorary degree to playwright Tony Kushner after one member of the board, Jeffrey Wiesenfeld, accused Kushner of supposedly "disparaging" Israel. Kushner has been critical of some Israeli policies -- which hardly makes him unique among human beings, or among Jews, or even among Israelis. But none of his comments on these issues are outside the bounds of civil discourse or worthy of censure, especially by an institution that is supposed to be committed to freedom of thought and the open exchange of ideas. If you're curious, you can read Kushner's response here. Wiesenfeld is unrepentant, by the way, and defends his attack here. For an update on the evolving situation, see Justin Elliott here.
I have only two points to make about this incident, which is one of the many attempts by self-appointed "defenders" of Israel to control discourse on this issue.
First, the main reason that hardliners like Mr. Wiesenfeld go after someone like Kushner is deterrence. By denying critics of Israeli policy any honors, they seek to discourage others from expressing opinions that challenge the prevailing "pro-Israel" orthodoxy to which Wiesenfeld is committed. Kushner was not nominated for an honorary degree for his views on Middle East politics; he was obviously nominated because he is an exceptionally talented and accomplished playwright and literary figure. But if someone like him can also be critical of Israel's treatment of the Palestinians and receive an honorary degree, then -- horrors! -- other people who feel similarly might be empowered to speak out themselves, and pretty soon such comments will cease to be taboo. People like Mr. Wiesenfeld don't want that; they want people who do not share their views to be constantly aware of the price they might pay for expressing them. And it never seems to occur to them that Kushner's views might be not only more humane but also better for Israel than the position that Wiesenfeld apparently holds.
Second, what this incident also reveals is the reflexive timidity of many academic organizations. There doesn't seem to have been any sort of organized campaign to deny Kushner the honorary degree; instead, the board voted to table the nomination after one member (Wiesenfeld) made his disparaging remarks. I've spent more than a quarter-century in academia, including seven years as an administrator, and the board's reaction doesn't surprise me a bit. Despite their public commitment to free speech and open discourse, nothing terrifies deans and trustees more than angry donors, phone calls from reporters, and anything that looks controversial. By tabling the nomination, they undoubtedly thought they were avoiding a potentially uncomfortable controversy.
But in this case the CUNY board blew it big-time, both because Wiesenfeld's accusations were off-base and because they would not have been grounds for denying Kushner an honorary degree even if they had been true. And meekly caving as they did is contrary to the principles of intellectual freedom that universities are supposed to defend. The end result is that this incident will get a lot more attention than awarding the degree would have garnered (Kushner already has several), and the board's shameful lack of vertebrae has been publicly exposed.
And why does this matter for foreign policy? Because as John Mearsheimer and I wrote a few years ago: "America will be better served if its citizens were exposed to the range of views about Israel common to most of the world's democracies, including Israel itself. . . Both the United States and Israel face vexing challenges. . . and neither country will benefit by silencing those who support a new approach. This does not mean that critics are always right, of course, but their suggestions deserve at least as much consideration as the failed policies that key groups in the [Israel] lobby have backed in recent years" (pp. 351-52).
I was at a book party last night, and a colleague and I started talking about our favorite books in the field. I remarked that one of the odd things about IR (and most social science, for that matter) is that it is rarely entertaining. To be sure, a lot of the work is interesting, and when you read a really terrific book, there can be a genuine sense of intellectual excitement. But how often does one read a work of political science or international relations and find it a genuine pleasure to read? And in particular, how many scholars in the field of IR are truly amusing or entertaining writers?
I can't think of many. Make a list of the big names in the IR field: Waltz, Huntington, Mearsheimer, Nye, Jervis, Simmons, Wendt, Keohane, Krasner, Katzenstein, Waever, Sikkink, etc., etc. Most of them are lucid prose stylists, but with the partial exception of Waltz (who gets off some acerbic sallies on occasion), you'd hardly call any of them a particularly witty writer.
This may be partly due to the subject matter (it's tough to make a lot of jokes when you write about war and peace), but I think it also reflects the normal academic desire to Be Taken Seriously as a Social Scientist. Indeed, the conventions of most academic journals seem deliberately designed to encourage a dry, leaden prose style that is devoid of any personality whatsoever.
So here's my question: who are the most amusing, entertaining, or witty writers in the field of international relations and foreign policy? I don't mean books or articles that are "funny" because they are wildly off-base; I mean scholars who are a joy to read because their prose is lively, they offer amusing asides, and maybe even manage a laugh-out-loud witticism on occasion. And to narrow the field a bit more, let's exclude journalists (who are rarely all that amusing but usually have livelier writing styles).
My nominees would be John Mueller, James Scott, and Thomas Schelling. Honorable mentions might go to Dan Drezner (for his book on zombies), and Geoffrey Blainey (for his The Causes of War, though Blainey is really a historian/journalist). My three main nominees are all serious academics with long records of scholarly achievement, but each of them is also a joy to read, in part because their prose styles are relaxed and unpretentious and because each is capable of genuine wit.
So nominations are now open. "Mirror, Mirror on the Wall: Who's the wittiest IR scholar of them all?"
There's a fascinating piece in today's New York Times, summarizing the findings of a recent Science article on the origins of human language. Based on a mathematical analysis of phonetic diversity (i.e., the number of separate sounds in different languages), biologist Quentin Atkinson of the University of Auckland has determined that human language originated in southern Africa around 50,000 years ago (some scientists believe its origins may be even earlier).
You've got to hand it to our species: 50,000 years isn't that long a time. Think of all the good and bad ideas that we've produced in 50 millennia: Shakespeare, the "divine right of kings," both slavery and abolitionism, relativity, the Bhagavad Gita, fascism, a mind-boggling array of religious dogma, liberalism, Marxism, the movies of Fred Astaire, Mad magazine, Japanese manga, rap, hip-hop, and bebop. The list is infinite … and now there's the blogosphere.
But here's what I wondered as I finished the article: Who uttered the first pun? And did those early humans groan when they heard it?
This is a guest post by Sean Kay of Ohio Wesleyan University.
As the world goes green for St. Patrick's Day, it is good to reflect on what Ireland's experiences teach us. We might ask: why should a realist care about Ireland? What might be learned from the experiences of this small island in the North Atlantic -- home to just 4.5 million people?
Realists care about strategy, of course, which is one good reason to ponder Irish history. Ireland was for centuries a key component of England's rear defense against the risk of foreign enemies. Realists are also keen to understand new tactics in warfare, and anyone wishing to get a sense of how guerrilla campaigns proceed -- and how state responses to them can backfire -- would be well advised to study Michael Collins and the Irish quest for independence. Add to that the personal risks facing those who negotiate an exchange of land for peace; the fates of figures from Michael Collins to Yitzhak Rabin show this only too tragically. The Irish experience in managing its strategic relationship with Britain after independence -- by building tight transatlantic advocacy networks and by integrating into the European community -- also demonstrates how creative diplomacy can achieve major strategic goals.
Ireland is also an interesting case of a state applying realism and ideals in its foreign policy, a topic that realists and others have debated for decades. Ireland remained neutral in World War II because it wished to consolidate its independence and avoid conscription of its people into the British army. Nonetheless, Ireland cooperated in both overt and secret assistance to the allied powers -- likewise during the Cold War. Ireland also advocated the cause of self-determination for all nations at the United Nations -- out of moral sympathy, but also as a way to keep its own views towards Northern Ireland on the agenda of global politics. Ireland managed to show how small nations can lead on a range of issues from peacekeeping to nuclear proliferation. It is often forgotten, but the origins of the nuclear non-proliferation treaty can be found in speeches by the Irish foreign minister at the United Nations in the late 1950s.
Stephen M. Walt is the Robert and Renée Belfer professor of international relations at Harvard University.