Andrew Sullivan has offered a measured response to the Guardian's revelations about a massive effort by the U.S. National Security Agency (NSA) to collect metadata about ordinary Americans' phone calls. You can read his whole comment here, but the sentences that caught my eye were these:
"This kind of technology is one of the US' only competitive advantages against Jihadists. Yes, its abuses could be terrible. But so could the consequences of its absence."
There are two obvious counters. First, the United States (and its allies) are hardly lacking in "competitive advantages" against jihadists. On the contrary, they have an enormous number of advantages: They're vastly richer, better-armed, better-educated, and more popular, and their agenda is not advanced primarily by using violence against innocent people. (When the United States does employ violence indiscriminately, it undermines its position.) And for all the flaws in American society and all the mistakes that U.S. and other leaders have made over the past decade or two, they still have a far more appealing political message than the ones offered up by Osama bin Laden, Ayman al-Zawahiri, and the various leaders of the Taliban. The United States is still going to be a major world power long after the contemporary jihadi movement is a discredited episode in modern history, even if the country repealed the Patriot Act and stopped all this secret domestic surveillance tomorrow.
Second, after acknowledging the potential for abuse in this government surveillance program, Sullivan warns that the "consequences of its absence" could be "terrible." This claim depends on the belief that jihadism really does pose some sort of horrific threat to American society. This belief is unwarranted, however, provided that dedicated and suicidal jihadists never gain access to nuclear weapons. Conventional terrorism -- even of the sort suffered on 9/11 -- is not a serious threat to the U.S. economy, the American way of life, or even the personal security of the overwhelming majority of Americans, because al Qaeda and its cousins are neither powerful nor skillful enough to do as much damage as they might like. And this would be the case even if the NSA weren't secretly collecting a lot of data about domestic phone traffic. Indeed, as political scientist John Mueller and civil engineer Mark Stewart have shown, post-9/11 terrorist plots have been mostly lame and inept, and Americans are at far greater risk from car accidents, bathtub mishaps, and a host of other undramatic dangers than they are from "jihadi terrorism." The Boston bombing in April merely underscores this point: It was a tragedy for the victims but less lethal than the factory explosion that occurred that same week down in Texas. But Americans don't have a secret NSA program to protect them from slipping in the bathtub, and Texans don't seem to be crying out for a "Patriot Act" to impose better industrial safety. Life is back to normal here in Boston (Go Sox!), except for the relatively small number of people whose lives were forever touched by an evil act.
Terrorism often succeeds when its targets overreact, thereby confirming the extremists' narrative and helping tilt opinion toward their cause. Thus, a key lesson in dealing with these (modest) dangers is not to exaggerate them or attribute to enemies advantages that they do not possess. I suspect Sullivan knows this, even if he briefly forgot it when writing his otherwise thoughtful post.
According to the New York Times, President Obama is hoping to establish a genuine personal rapport with Chinese President Xi Jinping during the latter's visit to the United States this week. Hence the nature of the visit: an informal, in-depth, and deliberately casual event outside Washington, intended to give the two leaders the chance to get past the official talking points and really get to know each other.
It's easy to understand why Obama thinks one-on-one diplomacy is the best way to handle the delicate Sino-American relationship. Most politicians have an exaggerated sense of their own importance, as well as a certain faith in their innate ability to get on good terms with others and get them to do what they want. Although he's been criticized of late for lacking the "schmooze factor," Obama's whole career has been based on his ability to charm people, starting with the Democratic Party insiders who backed his campaign early and extending to the millions of voters who've now elected him twice. No doubt he's hoping to work a little of the same magic with Xi.
More importantly, given all the potential frictions between a rising China and a reigning U.S., what else is he going to do? Neither Obama nor Xi can alter the core interests of the two countries, or wish away the various issues where those interests already conflict or are likely to do so in the future. The best they can achieve is a better understanding of each other's red lines and resolve and some agreement on those issues where national interests overlap. In this way, each can hope to keep things from getting worse and at the margin make relations a bit warmer. In this sense, personal summitry of the sort being practiced this weekend is the only card either can play.
But even if Obama is successful this weekend, this effort is unlikely to prevent Sino-American rivalry from intensifying in the future. The basic problem is that the two states' core grand strategies are at odds, and good rapport between these two particular leaders won't prevent those tensions from re-emerging down the road.
Today, the United States has a dominant position in the Western hemisphere and faces no serious rivals nearby. As I've observed before, this basic level of territorial security is what allows the United States to roam around the world trying to shape events in far-flung places like Afghanistan, Iraq, Sudan, the Korean Peninsula, and the Balkans, and to concern itself with issues that are often of secondary importance (like Libya or even Syria). Moreover, the United States also has a major security presence in East Asia -- China's home region -- and is planning on bolstering that presence in the years ahead. Despite the missteps of recent years, current geopolitical realities still favor the United States.
By contrast, China faces a decidedly unfavorable regional environment. Its relations with India, Japan, Vietnam, the Philippines, and several other neighbors are wary at best, and many nearby countries have close security ties or formal alliances with the United States. (Imagine how we would feel if Canada were allied with China and Chinese warships had a base in Acapulco.) Unlike the autarkic Soviet Union, China is also increasingly dependent on foreign trade to supply it with raw materials, energy, and export markets, and this trade must travel through various ocean straits and choke points that leave it vulnerable to blockade. Moreover, its growing dependence on outside resources means that China's interests are increasingly global in nature; in the future, it will not be just a land power worrying mostly about events close to home.
Here's the rub: as long as the United States retains a significant military presence in Asia and a network of Asian allies, Beijing will have to worry a lot about security in its own region and it won't be able to interfere as often or as effectively in other parts of the world. And as long as this is the case, the United States will have a freer hand in the other places that it cares about.
But if China continues to rise economically, develops more military power, and uses its growing clout to slowly push the United States out of Asia, then the conditions that currently favor the United States will be gone. If China manages to create a "sphere of deference" in Asia and eventually convinces most Asian states to distance themselves from Washington, then it won't have to worry as much about its immediate neighborhood and it will be free to take a more active role elsewhere. China's geopolitical position would be more like that of the United States: it would be a regional hegemon that was increasingly free to intervene overseas when it felt that its interests required it to do so. And that might even include a more active role in the Western hemisphere, thereby forcing the United States to pay more attention to matters closer to home.
In short, the struggle for hegemony in Asia will be a crucial pivot point for the 21st century: if it goes one way, the United States will preserve much of the freedom of action that it has enjoyed since 1945. But if it goes the other way, the United States will be sharing the world stage with a peer competitor with a larger population, a larger economy (in absolute terms), and the same capacity to shape events around the world that the United States has long been accustomed to.
At the most basic level, this is why the United States is "pivoting" to Asia: to try to prevent China from establishing a dominant position there. It is this fundamental incompatibility between strategic objectives that will fuel Sino-American rivalry in the future, no matter how well Obama and Xi (or their successors) get on this weekend. And with so much potentially at stake for both countries, you can easily see why intense competition is likely.
Of course, this pessimistic scenario will not arise if Chinese economic growth stalls, or if internal problems force China's leaders to concentrate on domestic matters. Nor am I predicting a future war between the United States and China, or even a competition as nasty and intense as the Soviet-American "Cold War." After all, there are powerful economic incentives for both sides to keep the competition within bounds, and there isn't the same level of ideological animosity that gave the Cold War its particular Manichean character. And there's some comfort in the realist argument that bipolar worlds tend to be tense but also stable. But the tension between U.S. and Chinese grand strategies is bound to generate recurring frictions, and is likely to generate an intense competition for allies and influence, especially in Asia.
Nothing in international politics is inevitable, of course, and sometimes enlightened statecraft can overcome structural pressures. If U.S. leaders are consistently wise, far-sighted, judicious, calm, and resolute -- not just now but for the next forty years -- and if their Chinese counterparts are equally sensible, restrained and smart, then it is entirely possible that the two governments will navigate the future diplomatic rapids with skill and aplomb. But seriously: how likely is that optimistic scenario? Based on what we know of each country's history, can we be confident that both countries can go for thirty or forty years without eventually choosing a leader or two who aren't especially wise, astute, sensible, or restrained? For this reason, relying on personal rapport to manage relations between the world's two most powerful countries seems like a pretty weak reed to me.
I'm not an expert on Turkey and so I don't have much to add to the chorus of commentary about recent events there. My own view of the AKP era in Turkey is mixed: I've been impressed by its economic achievements and by the energy, creativity and acumen of the various foreign policy officials with whom I've had the pleasure of interacting over the past five years or so. In particular, I've often found their views on regional affairs to be insightful and well-informed, though of course not infallible.
But there's also been a worrisome authoritarian undercurrent to the AKP's rule, including assaults on press freedoms, badly-run prosecutions of political opponents and alleged coup plotters, and PM Erdogan's tendency to think that he knows what's best for Turkey's citizens, even when they disagree. And it's always a worrisome sign when a leader blames internal opposition to his policies on "foreign agents" and Twitter.
But let me offer a few (relatively uninformed) thoughts on recent events. The first is that the current upheaval may -- repeat, may -- turn out to be a salutary development in Turkey's political evolution. Turkey's democracy is still a work-in-progress, and both its formal institutions and the guiding norms are in flux. (Remember: Turkey was a politically moribund, economically stagnant, and sometimes brutal military dictatorship not so very long ago.) The current backlash against the Erdogan government is a reminder to the AKP that a parliamentary majority is not a license to impose whatever the ruling party leadership wants; at least not if those same leaders also want a reasonably tranquil society. And if the current tug of war eventually leads Turkey to develop institutions that limit the "tyranny of the majority," it will be a salutary development in the history of Turkish government. Stay tuned.
Second, Americans ought to recognize that their influence over these developments is limited. President Obama reportedly has a good working relationship with Erdogan and can offer constructive advice if asked, and he should make it clear that a continued drift toward authoritarianism will make it harder to maintain close U.S.-Turkish relations in the future. (Yes, I know the U.S. has close ties with other authoritarian governments, but we already expect more from Turkey and Ankara doesn't have lots of oil.) I have tried to make this point in my own conversations with Turkish officials, journalists, and scholars, though I doubt my words carried a great deal of weight. Turkey's leaders are likely to follow their own counsel; the big question is whether they will begin to recognize that no leader or party is infallible and that listening to popular sentiment -- including the sentiments of those who didn't vote for you -- is almost always a smart political strategy.
As I tweeted yesterday, if I could assign the AKP leadership one book to read, it would be James Scott's Seeing Like a State: Why Certain Schemes to Improve the Human Condition Have Failed. It's long been one of my favorite books, because it shows how authoritarian governments get into trouble when they adopt ambitious plans for social engineering (often based on some sort of far-reaching "modernist" ideology) and when there are no political mechanisms available to check their ambitions. The results are uniformly disastrous, as the cases of Stalinist collectivized agriculture or Mao's "Great Leap Forward" attest, largely because overly ambitious schemes inevitably generate unintended consequences and tone-deaf authoritarian leaders won't recognize things are going wrong until it is too late. (That can happen in democracies too, by the way, as the Bush administration's sorry experience in Iraq shows all too well).
Turkey under the AKP is a very long way from Stalinist Russia or Maoist China, of course, but the lessons of Scott's book are still useful. Democracy is a messy form of government, and as the current state of American and British politics shows, it has its own forms of gridlock and dysfunction. But healthy democracies do tend to be self-correcting (as the 2008 presidential election showed), and so are less likely to drive themselves completely off a cliff. Does anyone know if Seeing Like a State is available in Turkish?
Today I offer a brief comment on David Bosco's excellent FP piece on U.N. peacekeeping. Bosco points out that the United Nations draws its peacekeepers overwhelmingly from poor societies; in his words, "U.N. peacekeeping is an activity mostly paid for by the rich world and carried out by troops from poorer states."
My comment is twofold. First, much the same could be said of military activity conducted by the United States of America. Now that the country has an all-volunteer force, military service in the United States is increasingly reserved for the poorer segments of society. As Amy Lutz, a Syracuse University sociologist, concludes in a 2008 article: "as family income increases, the likelihood of having ever served in the military decreases … the economic elite are very unlikely to serve in the [U.S.] military." As with U.N. peacekeeping, in short, the "common defense" in the United States is an activity paid for by richer Americans and carried out (mostly) by poorer Americans.
Second, I suspect this tendency reflects the broad recognition that warfare is not an especially glorious or attractive activity: It may be necessary at times, but military service is not the best way to make a living if you have other alternatives. For the most part, Americans no longer share Teddy Roosevelt's belief that "a just war is in the long run far better for a man's soul than the most prosperous peace." It may also reflect the collective social awareness that the United States is actually very secure and that most citizens (and particularly those who are well off) do not need to serve in uniform in order to make a contribution to the national defense. Instead, they can just get a job and pay their taxes.
None of this should be seen as denigrating military service itself or questioning the choices of those Americans (including the relatively well-to-do) who opt for a military career. But as Karl Eikenberry and David Kennedy observed in a thoughtful New York Times op-ed this week, the gradual separation between the U.S. military and the rest of society has significant costs and may ultimately be quite unhealthy for the republic. (For a longer discussion, Eikenberry's recent article in the Washington Quarterly is well worth reading too.)
It's Commencement Day here at Harvard, and we are sending the Class of 2013 out into the world with congratulations, good wishes, and high hopes. My graduate students here at the Kennedy School are a remarkable group, and I look forward to watching them make their way in the complex and often troubling world of foreign policymaking.
At such times I tend to think about how we might have educated them better, and I want to draw an analogy to an interesting op-ed by Jacob Hamblin in today's New York Times. Hamblin's subject is biodiversity, and he traces the origins of our present concern for it to some rather chilling Cold War strategic planning. Specifically, war planners investigating ways to destroy enemy ecosystems gained new appreciation for the dangers of environments that were ecologically one-dimensional (such as vast farmlands sown with a single crop). In particular, loss of diversity leaves whole areas vulnerable to a single pathogen or event that wipes out the dominant species.
I would argue that the same is true of "intellectual ecosystems" as well. When academic disciplines become overly concentrated on one set of questions, one set of theoretical answers, one set of methods, or one body of data, what might seem at first glance to be a powerful engine of scholarly progress can be a source of danger as well. Having everyone working in more or less the same way can generate lots of publications and citations and even help knowledge advance in this particular area, but "normal science" of this sort also means that alternative approaches, questions, methods, or theories get short shrift. The danger is that scholars wake up one day and discover that the reigning method du jour has fatal limitations, or it turns out that some neglected skills (e.g., foreign languages, cultures, etc.) suddenly become very valuable.
In the IR field, for example, contemporary graduate training increasingly involves mastering an enormous arsenal of methodological skills, most of them statistical in nature. Because there are only 24 hours in a day and only five to six years in most Ph.D. programs, most students won't have the time to learn foreign languages, read broadly in history, do more than cursory field research on rather narrow topics, or even acquire a sophisticated understanding of social theory. There are important benefits to this type of training -- though perhaps fewer than is often alleged -- but privileging this particular set of skills comes with a cost as well. If today's graduate students increasingly resemble each other -- varying only in their raw talents or determination -- then we are in effect creating an intellectual monoculture that might leave us badly prepared for new developments. To take an obvious example: After the 9/11 attacks, wouldn't it have been nice to have had a few more people in academia who really understood Islam, the Middle East, the nature of terrorist movements, or even Arabic? Similarly, having a few more people who understood how financial markets and regulations really worked (as opposed to how they worked in theory) might have come in handy both before and after the Great Meltdown.
Of course, the combination of tenure and the abolition of mandatory retirement in the United States compounds this dilemma. Scholars rise to the top of their fields based mostly on their early work, which is bound to reflect the research norms and standards that prevailed at the time. Most academics try to grow and develop over time, and a few exhibit dramatic shifts in their thinking, but for the most part they tend to favor scholars whose work resembles their own. So they hire and promote people who are more or less like them, further diminishing the degree of intellectual diversity within the field.
I don't have an obvious antidote to this tendency. But one of the nice things about teaching at a public policy school is the presence of numerous disciplines within the same faculty (even if economists tend to dominate, or at least they often try to). In addition to being more interesting, such schools may be better equipped to handle new developments in the real world than most academic departments are. And because public policy schools are explicitly supposed to prepare students for careers in the real world (as opposed to the ivory tower), they are more likely to welcome practitioners, public intellectuals, and scholars who don't fit into neat categories. The result is a much richer intellectual environment and one that ought to be more adaptable over time.
If this theory is right, then public policy schools (and other explicitly inter- and multidisciplinary enterprises) should have an especially bright future -- not because they are necessarily better at any one thing, but because they will be less vulnerable to fads, changes of fashion, or shifts in the agenda of relevant problems. Like a diverse investment portfolio or a diverse ecosystem, building a diverse intellectual environment is the smart long-term strategy.
So to the graduates of the Class of 2013, I say: "Congratulations! Your degree is valuable today and is likely to be even more valuable in the future." At least I hope so, for their sake as well as my own.
You gotta give U.S. Secretary of State John Kerry credit for persistence -- or maybe just perverseness -- in his efforts to restart the Middle East "peace process." Given the complete failure of the past two decades of peace-processing, you might also wonder why he's bothering. My guess is that he does realize that the Israeli-Palestinian conflict is still a significant problem for the United States, as well as a source of continued human suffering. The fighting in Syria and the continued struggles in Iraq, Egypt, and elsewhere may command more attention these days, but the situation in Israel/Palestine remains a potent source of anti-Americanism and a constant headache for every president. Plus, Kerry is an ambitious guy, and who wouldn't like to be the hero who finally managed to put this century-old conflict to rest?
News reports suggest that Kerry is trying to advance this goal by employing a time-honored tool of Middle East diplomacy: bribery. No, I don't mean direct under-the-table payoffs to key leaders (although the United States has done plenty of that in the past and I wouldn't rule it out here). Instead, I mean offering the various parties big economic incentives to lure them back to the table. Back in the 1970s, for example, Henry Kissinger got Israel to withdraw from the Sinai by promising it enormous military aid packages and assorted other concessions. Jimmy Carter did the same thing when he brokered the Egyptian-Israeli peace treaty in 1979, and U.S. largesse also greased the subsequent peace deal between Israel and Jordan in 1994. When domestic politics make it impossible to use sticks, carrots are all you have left.
This time around, Kerry has reportedly assembled a $4 billion investment package for the Palestinian Authority, designed to improve economic conditions in the West Bank and demonstrate to the Palestinians the benefits of peace. Presumably all they need to do is agree to resume negotiations and the money will flow; the investment is supposedly not linked to a final-status agreement. This approach is also a familiar American tendency at work: The United States is happy if the parties are talking, even if they are simultaneously taking steps that are "not helpful" and if they never get to the finish line.
The real question is: Should Abbas & Co. take the money and resume discussions?
Of course they should, but not because it will produce an agreement. Any talks that do resume are going to lead nowhere, and the Palestinians might as well get paid for engaging in an otherwise meaningless activity. The talks are meaningless because Israel is not going to agree to a viable Palestinian state, and certainly not one based on the 1967 borders. Remember that Prime Minister Benjamin Netanyahu's entire career has been based on opposition to a Palestinian state and that the official platform of his Likud party "flatly rejects the establishment of a Palestinian Arab state west of the Jordan river." Netanyahu is under no domestic pressure to cut a deal either; on the contrary, he'd be in political hot water if he tried.
Ever since the Oslo Accords, the basic Israeli strategy has been to negotiate endlessly while continuing to expand settlements, with the number of settlers more than doubling since 1993. Even then-Prime Minister Ehud Barak's supposedly "generous" offer at Camp David in 2000 fell well short of an acceptable deal, as his own foreign minister, Shlomo Ben-Ami, later acknowledged. Netanyahu now leads the most right-wing government in Israel's history, and his government would collapse if he were to agree to allow the Palestinians anything more than a handful of disconnected bantustans under complete Israeli control. That's why Palestinian President Mahmoud Abbas has been reluctant to resume the negotiations; he knows that talks merely provide a cover for further colonization.
But acknowledging that reality could also be liberating. Given that negotiations are pointless and that more and more people know it, the Palestinians should simply take the money that Kerry has assembled and agree to the charade, while making it clear that they will not settle for less than the Clinton parameters. They can also hint that if a viable and sovereign state is not in the cards, then they will begin to campaign for full civil and political rights within the "Greater Israel" that now exists.
That's not the outcome Kerry has in mind, and it's not likely to materialize anytime soon. But neither will a final-status agreement, no matter how often Kerry drops in for a visit and how many dollar bills he waves.
In academic research, "rigor" is an especially cherished quality. If you want to praise a scholar's work, you talk about how "rigorous" it is. If you want to dis someone's scholarship politely, you might sniff and say, "Well, it's interesting, but it's not very rigorous."
But what we mean by "rigor" isn't always clear, and the way it is implemented in practice may even be counterproductive. Many academics tend to define "rigor" in narrow technical terms: 1) Did the researcher employ the most advanced methodological practices? 2) Did he or she consider and debunk alternative explanations convincingly? 3) Was the data-collection procedure especially careful? 4) Did he or she examine all the relevant archives or only a few? 5) Was the statistical model properly "identified"? Etc., etc. These criteria can be applied to both quantitative and qualitative research, by the way: In this sense, "rigor" is conceived as a measure of technical proficiency, designed to give us confidence that the claims being advanced are in fact valid.
The gold standard for "rigorous" research is publication in a "peer-reviewed" academic journal. By subjecting papers to anonymous peer review, academic fields supposedly weed out less "rigorous" works and publish only the best research. Different scholarly journals acquire reputations over time, and publishing in "top" journals is seen as the primary measure of a scholar's worth. University presses follow similar procedures when deciding which monographs to publish, and they too develop reputations of various sorts. Notice, however, that this is all inherently subjective: A journal or a publisher is regarded as prestigious if scholars in the field believe it is.
There's a lot to be said for this basic approach, which has generated a lot of progress in some fields. I've spent a lot of my own career writing articles for refereed journals, reviewing manuscripts for them, or co-editing a book series for a university press, so I'm hardly hostile to this way of doing business. But if we're really honest with ourselves, academics ought to acknowledge that the system is far from perfect and even encourages some counterproductive tendencies.
For starters, peer review doesn't guarantee that false results don't get published; academic journals are filled with articles that are subsequently shown to have contained significant errors. That's inevitable in the research enterprise, of course, but it is a reminder that peer review alone is not a guarantee of quality. And it certainly doesn't guarantee that a particular work of scholarship will be useful or important, because most published academic articles are read by very few people and essentially disappear without a trace.
Second, peer review isn't a mechanical process that automatically winnows the good from the bad. In my experience, journal editors play key independent roles in the evaluation process, and their autonomy can have a huge impact on which works actually get published. Editors don't have to blindly follow reviewers' advice if they think a particular manuscript has potential that the reviewers didn't see, and they can nurture a piece that they think makes a contribution. In this way, editors with a particular vision can guide journals in one direction or another. By the same token, lazy or narrow-minded editors can harm a journal (or a subfield) either by mindlessly following reviewers' advice or by relying too much on an intellectually narrow set of reviewers.
Third, peer review is probably overvalued because reviewers' comments are often less than helpful and rarely decisive. By the time most articles are submitted for publication, they've usually been presented at academic seminars and have gone through multiple drafts in response to suggestions from the authors' friends and colleagues. I've occasionally gotten useful suggestions from an anonymous reviewer's report, but I'd say that more than half the comments I've received over the years were of no value at all and I simply ignored them. Indeed, a dirty little secret is that a lot of "peer reviews" are no more than a couple of cursory paragraphs along with a recommendation to publish, reject, or revise and resubmit. If that's the reality of the review process, then why do we fetishize publication in "peer-reviewed" journals as much as we do? In other words, knowing that something got published in the American Political Science Review, World Politics, International Organization, or International Security doesn't tell you very much about its real value. You have to read it for yourself to make a firm judgment.
Fourth, fetishizing refereed journals (and their supposed rankings) encourages universities to make personnel decisions on the basis of supposedly "objective" indicators such as citation counts, number of "peer-reviewed" articles, and the like. These measures can be useful when used with caution, but they are at best an indirect measure of a scholar's real contribution. A high citation count may simply indicate that one is working in a faddish subfield and doing "normal science" that other scholars find acceptable but not necessarily pathbreaking. It may also be a sign that you've written something that got a lot of attention even though (or because) it was dead wrong. Again, the danger is that departments and university administrators will judge research output not by actually reading the work and making an informed assessment, but by looking at these various indirect indicators.
Fifth, the notion of rigor embedded in these practices may actually make it easier for incorrect or trivial scholarship to survive. If the desire to be seen as "rigorous" leads scholars to produce works that are difficult to understand (because they rely on rarefied techniques, specialized data, obscure historical sources, or arcane and confusing language), then it is going to be harder for anyone reading the work to evaluate their claims.
By contrast, a scholarly argument that is simple, straightforward, and fairly easy to grasp is inherently easier to evaluate. Accordingly, scholarship that is accessible -- i.e., that is easily read and understood -- will face a larger audience of potential critics than a piece of scholarship that can only be understood by a small, rarefied group of readers, many of whom may share a lot of the presuppositions of the study's author(s).
In short, publications whose clarity widens the circle of potential challengers can actually contribute to scholarly advancement, because the larger the audience that can understand and evaluate an argument, the likelier it is that errors will be exposed and corrected and the better the argument will have to be to win or retain approval. By contrast, a dubious argument that is presented in an opaque or impenetrable way may survive simply because potential critics cannot figure out what the argument is or because it is too time-consuming and difficult to try to replicate the published results. As mathematician Melvyn Nathanson observes, "The more elementary the proof, the easier it is to check and the more reliable is its verification."
Please note: I am not suggesting that academia discard peer review and discourage scholars from publishing in prestigious journals. Rather, I'm suggesting that the social sciences would be more useful and more rigorous if members of these disciplines adopted a less hidebound approach to the merits of different types of publication. "Should it really be the case," Bruce Jentleson correctly asks, "that a book with a major university press and an article or two in [a refereed journal] ... can almost seal the deal on tenure, but books with even major commercial houses count so much less and articles in journals such as Foreign Affairs often count little if at all?"
Instead of privileging one sort of publication over others, based on a narrow notion of "rigor," we ought to recognize that different types of scholarly writing reach different audiences and are exposed to different forms of outside scrutiny. In most cases, an article published in a prominent economics, history, or political science journal will be read by relatively few people, one or two of whom may then take issue with the work and challenge its findings. By contrast, if that same author presented the results in an article or report intended for a broader audience, so that it was read by a much larger number of informed citizens and by well-informed practitioners in the real world, then this larger population of readers might be quick either to hail its contribution or to identify obvious mistakes. This corrective capacity may be even more pronounced in the Internet age, which allows readers on every continent to challenge an author's claims -- assuming, of course, that the claims are not published in obscure venues or written in ways that make it harder for all but a few people to understand them.
Finally, fetishizing "peer review" is a good way to ensure that fewer and fewer people pay attention to what academics have to say about important world issues. This is especially true in fields like IR and public policy, whose main social value lies in what we (supposedly) can contribute to public and elite understanding of a complex world. But if universities reward only the things that scholars write for each other, we will be encouraging a narrow professionalism and contributing to the cult of irrelevance that rules many academic departments. And over time, we shouldn't be surprised if the outside world places less and less value on what we have to say and eventually decides to invest society's finite resources in other activities.
Permit me to indulge today in a bit of speculation, for which I don't have a lot of hard evidence. As I read this article yesterday on Hezbollah's involvement in the Syrian civil war, I began to wonder whether U.S. involvement in that conflict isn't more substantial than I have previously thought. And then I did a bit of web surfing and found this story, which seemed to confirm my suspicions. Here's my chain of reasoning:
1. The Syrian conflict has become a proxy fight between the opposition and its various allies (Qatar, Saudi Arabia, the United States, Turkey, etc.) and Bashar al-Assad's regime and its various outside supporters (Iran, Russia, Hezbollah).
2. For Washington, this war has become a golden opportunity to inflict a strategic defeat on Iran and its various local allies and thus shift the regional balance of power in a pro-American direction.
3. Israel's calculations are more complicated, given that it had a good working relationship with the Assad regime and is concerned about a failed state emerging next door. But on balance, a conflict that undermines Iran, further divides the Arab/Islamic world, and distracts people from the continued colonization of the West Bank is a net plus. So Prime Minister Benjamin Netanyahu won't object if the United States gets more deeply engaged.
4. Consistent with its buck-passing instincts, Barack Obama's administration does not want to play a visible role in the conflict. This is partly because Americans are rightly tired of trying to govern war-torn countries, but also because America isn't very popular in the region and anyone who gets too close to the United States might actually lose popular support. So no boots on the ground, no "no-fly zones," and no big, highly visible shipments of U.S. arms. Instead, Washington can use Qatar and Saudi Arabia as its middlemen, roles they are all too happy to play for their own reasons.
5. Since taking office, Obama has shown a marked preference for covert actions that don't cost too much and don't attract much publicity, combined with vigorous efforts to prosecute leakers. So an energetic covert effort in Syria would be consistent with past practice. Although there have been news reports that the CIA is involved in vetting and/or advising some opposition groups, we still don't know just how deeply involved the U.S. government is. (There has been a bit of speculation in the blogosphere that the attack on Benghazi involved "blowback" from the Syrian conflict, but I haven't seen any hard evidence to support this idea.)
6. In this scenario, the Obama administration may secretly welcome the repeated demands for direct U.S. involvement made by war hawks like Sen. John McCain. Rejecting the hawks' demands for airstrikes, "no-fly zones," or overt military aid makes it look like U.S. involvement is actually much smaller than it really is.
To repeat: The above analysis is mostly speculative on my part. I have no concrete evidence that the full scenario sketched above is correct, and I don't know what the level of U.S. involvement in the Syrian civil war really is. But that's what troubles me: I don't like not knowing what my government is doing, allegedly to make me safer or to advance someone's idea of the "national interest." And if you're an American, neither should you. If the United States is now orchestrating a lot of arms shipments, trying to pick winners among the opposition, sending intelligence information to various militias, and generally meddling in a very complicated and uncertain conflict, don't you think the president owes us a more complete account of what America's public servants are or are not doing, and why?
Stephen M. Walt is the Robert and Renée Belfer professor of international relations at Harvard University.