I learned this morning that Kenneth N. Waltz, who was arguably the preeminent theorist of international relations of the postwar period, had passed away at the age of 88. Ken was the author of several enduring classics of the field, including Man, the State, and War (1959), Foreign Policy and Democratic Politics (1967), and Theory of International Politics (1979). His 1981 Adelphi Paper on nuclear proliferation ("The Spread of Nuclear Weapons: More May Be Better") was also a classic, albeit a controversial one. One of his lesser achievements was chairing my dissertation committee, and he was a source of inspiration throughout my career.
I've written a tribute to Waltz's scholarship before, in the preface to a festschrift for Ken edited by Andrew Hanami. But today I want to celebrate his role as a teacher, based on some remarks I made at the 2010 meeting of the International Studies Association, where Waltz received an award for lifetime achievement. With a few edits, here's what I said back then:
Ken Waltz is widely recognized as one of the preeminent IR scholars of the postwar period, but he was also responsible for training an impressive number of graduate students, including Barry Posen, Stephen Van Evera, Bob Powell, Avery Goldstein, Christopher Layne, Benny Miller, Karen Adams, Shibley Telhami, Jim Fearon, William Rose, Robert Gallucci, Andrew Hanami, and many others. I want to say a few words about what it was like to have him as a teacher and advisor, and why I think he was so effective at it.
First, Ken was trained in political theory and renowned as a theorist of international relations, but he was deeply interested in real-world issues and his example showed us how theory could be used to illuminate crucial policy issues. In addition to his own theoretical work, Ken wrote about Vietnam, nuclear strategy, economic interdependence and globalization, nuclear proliferation, the U.S. defense budget, and even the Rapid Deployment Force. For those of us who were interested in international security affairs, his model was wonderfully liberating. Ken showed that you could be a theorist and a social scientist without joining the "cult of irrelevance" that afflicts so much of academia.
Indeed, Ken's work on these topics underscored why theory is so important. Having lots of facts at one's disposal didn't help if you were thinking about those facts in the wrong way. In a world where most people think theory and practice have little in common, Ken was teaching us that they were inextricably intertwined. That's why he got a lot of things right that others got wrong. He was right about Vietnam, right about which side was winning the Cold War, right about the basic principles of nuclear deterrence, and right about the continued relevance of politics, even in the era of economic "globalization." A little theory can go a long way, and in his case, it led in the right direction.
Second, Ken encouraged his students to ask big questions, largely by the force of his own example. Man, the State, and War organizes and critiques several centuries of writing on the causes of war. Theory of International Politics presents a powerful general theory explaining the behavior of self-regarding actors in anarchy. His essay on proliferation attacks the conventional wisdom with ruthless logic, just as his earlier essays on interdependence showed where liberal theories had gone off-course and why power was still central. Ken encouraged us to tackle puzzles whose answers were not immediately available and to be fearless about challenging entrenched orthodoxies.
Third, and perhaps most important, Ken held the bar high and encouraged his students to have equally high standards. The first time I laid eyes on Ken was at the orientation meeting for new grad students at Berkeley in 1977. Ken was director of graduate studies that year and had to give the welcoming speech. I don't remember most of what he said, except that he emphasized that grad school took too damn long and that we should all plan on finishing in four years ... or at most five. His message was simple: "Get your coursework done, write your MA paper, pass your qualifying exams ... then write the thesis ... four years! Why wait?" The average at Berkeley in those days was more like seven or eight years, so he was raising the bar from the very start.
I also remember my first day in Poli Sci 223, his graduate seminar in IR theory. I was already convinced that everyone else in the room knew more than I did, and Ken began by setting out his basic ideas about the field and about theory. At one point he made some critical remarks about two professors I had studied with as an undergraduate -- nothing overly disparaging, just some critical comments on their conception of theory -- which immediately made me think that not only did I know less than everyone else in the room, everything I had learned up till then was wrong. The real lesson, however, was that grad school was not about learning what other people thought, it was about learning to think for yourself. And Ken gave us the freedom to do that. He never tried to force his students to agree with his views or to write books and articles designed to reinforce his own work or burnish his own reputation.
Fourth, Ken placed great value on writing well. His students are a diverse group -- and certainly none of them are clones of Waltz himself -- but all of them are very clear writers, regardless of which methods or approaches they use. Ken used to tell us to read Fowler's Modern English Usage and Strunk and White's Elements of Style, and he'd give little mini-lectures on his linguistic pet peeves in the middle of a seminar. In Waltz's view, a scholar's first duty is to make it easy for the reader to figure out what you are saying. If the reader is confused, that's probably your fault.
This leads me to my most important encounter with him, which occurred as I was nearing the end of my dissertation. Writing a dissertation for Ken Waltz was intimidating from the start -- remember, his dissertation was Man, the State, and War -- and if you'd read that book and then read Theory of International Politics you knew you were dealing with someone with a razor-sharp ability to cut through a bloated argument and find the jugular. After two years of work I sent Ken the main analytical chapters of my thesis, and all I had left -- or so I thought -- was a short conclusion. Thinking I was nearly done, I accepted a post-doc for the following year.
And then I got a letter back from Ken, giving his comments on the chapters I had sent him earlier that month. His letter began by declaring that he had read the first twenty-five pages with "increasing dismay." "They are terrible," he wrote, and then went on: "Ask yourself why this is so. Were you trying to write too fast, or did you just not know what you were trying to say?" He continued in this vein for a few more paragraphs, making it clear that what I had sent was -- to quote the letter again -- "nowhere near ready to be an acceptable dissertation." His bracing conclusion: "You have to face this squarely, and you are the only one who can fix these problems. So enjoy a busy summer." By the way, there was a little P.S. at the end, telling me that he thought it would be an excellent thesis once I had worked out the kinks.
I was basically curled up in a ball under my desk by the time I was finished reading this missive, and it was too early in the day to go for a stiff drink. I didn't enjoy the experience very much at the time, and you might think he was being harsh or even cruel. In fact, Ken had done me an enormous service. He was telling me that there were no short-cuts if I wanted to make a serious scholarly contribution and reminding me that hasty or poorly thought-out work deserved to be treated harshly.
Looking back, I'm grateful that he didn't spare my feelings, and there's a lesson there for all of us. We professors aren't really helping our students when we go easy on them, and students should in fact be grateful when their advisors occasionally take them to the woodshed.
So apart from his extraordinary scholarly achievements, Ken Waltz was also an inspiring and accomplished teacher. I was extraordinarily fortunate to have the opportunity to learn from him, and the study of international politics is much the richer for his remarkable contributions.
Addendum: All I would add to this today is the reminder of Waltz's deep aversion to foolish military excesses. He served in the U.S. Army during the Korean War and was a realist rather than a pacifist. But like Hans Morgenthau, he was an early opponent of the Vietnam War and deeply skeptical of the paranoid threat-inflation that has informed so much of U.S. foreign and defense policy. Like many other realists, he also opposed the 2003 invasion of Iraq. The field of international relations would be better off with more people like Ken, and the world would be better off if more great powers -- especially the United States -- paid more attention to his insights.
I had planned to write about something else today, but instead I want to acknowledge the recent passing of Glenn Snyder, an important international relations theorist. I didn't know him well -- indeed, I think we met on only one occasion -- but I read a lot of his work over the years and admired both his intellectual ambition and the clarity of his thinking.
Snyder's scholarly career spanned more than four decades and he made contributions in several areas. He was a co-author of Strategy, Politics and Defense Budgets (1962), an important account of U.S. national security policymaking in the 1950s, contributing a lengthy study of Eisenhower's "New Look" in nuclear strategy. His 1961 book Deterrence or Defense: Toward a Theory of National Security was an early refinement of classical deterrence theory and one of the first applications of game theory to international affairs. In the 1970s, he and co-author Paul Diesing published Conflict among Nations: Bargaining, Decision Making and System Structure in International Crises, an ambitious attempt to integrate structural realism, game theory, and theories of decision-making to understand crisis outcomes. I pored over this book in graduate school and learned an enormous amount from Snyder's careful analysis; I must have read chapter 6 of that book ("Crises and International Systems") dozens of times. His 1984 World Politics article "The Security Dilemma in Alliance Politics" was another classic, especially its elaboration of the reciprocal risks of "abandonment" versus "entrapment" (concepts first proposed by Michael Mandelbaum). This last line of work culminated in his magisterial book Alliance Politics, which combined careful deductive analysis with a series of deeply researched case studies.
Snyder was primarily a theorist, although he was also clearly comfortable doing careful qualitative/historical research. And, like John Herz, he strikes me as someone who deserved a higher reputation in the field than he had. I think this may be due to the nature of his later work: Instead of picking a single big idea and promoting it incessantly, both Conflict among Nations and Alliance Politics contained a lot of different ideas and came at their subjects from several angles at once. This comprehensive approach had a great deal of scholarly integrity to it, but it also made his works harder to pigeonhole. They were also too long to put on most graduate course syllabi, which meant that over time fewer graduate students were exposed to his work.
In this way, the practical sociology of the IR business may have cost Snyder some recognition. Nonetheless, he was the author of not one but several classic books and articles, works that still reward a careful reading today. How many IR scholars can say the same?
Over at the new, independent Daily Dish, Andrew Sullivan has been hosting an interesting thread on why academic writing is frequently abysmal. As someone who tries hard to make even my academic writing clear and accessible and who tries to instill that value in my students, I've followed the thread with interest.
For starters, I don't think the problem is that no one encourages future academics to write well. In my own case, for example, I was fortunate to study with Alex George at Stanford as an undergrad and with Kenneth Waltz at Berkeley during graduate school, and both repeatedly stressed the importance of writing well. Waltz didn't do a lot of line-editing of grad student papers or dissertations, but he certainly let me know when he thought my writing was obscure, verbose, disorganized, or just plain confused. He also spoke openly about the importance of writing in his graduate courses, encouraged students to read books such as Fowler's Modern English Usage, and was scornful of the trendy neologisms that infest academic writing like so many weevils.
I also don't think the problem is due to poor editing at journals or university presses. I've published in over a dozen academic journals, with a prominent university press, and with two different commercial publishers, as well as in a number of journals of opinion. Almost all of the editors or copy-editors with whom I've worked were helpful and attentive, and some were superlative. Indeed, I can think of only one case in nearly thirty years where a manuscript of mine was truly butchered by an editor (it was actually done by an intern), and fortunately the magazine let me repair the damage before the article appeared.
So why is academic writing so bad?
One reason academic writing is sometimes difficult is that the subjects being addressed are complicated and hard to explain in ordinary language. I have more than a little sympathy for philosophers grappling with deep questions about morality, time, epistemology, and the like, as these subjects are inherently slippery and it is easy to lose the reader in a fog of words. But it isn't inevitable even there. Some philosophers manage to write about very deep and weighty matters in prose that is crystal clear. You still have to pay attention and think hard to understand what is being said, but not because the author is making it more difficult than it needs to be.
A second reason is the failure of many scholars to appreciate the difference between the logic of discovery and the logic of presentation. Specifically, the process by which a scholar figures out the answer to a particular question is rarely if ever the best way to explain that answer to a reader. But all too often articles and manuscripts read a bit like a research narrative: "First we read the literature, then we derived the following hypotheses, then we collected this data or researched these cases, then we analyzed them and got these results, and the next day we performed our robustness checks, and here's what we're going to do next."
The problem is that this narrative form is rarely the best way to make a convincing case. Once you know what your argument is, really effective writing involves sitting down and thinking hard about the best way to present that argument to the reader. The most important part of that process is figuring out the overall structure of the argument -- what points need to be developed first, and then what follows naturally or logically from them, and so on. An ideal piece of social science writing should have a built-in sense of logical or structural inevitability so that the reader moves along the argument and supporting evidence as effortlessly as possible.
Achieving this quality requires empathy. You have to be able to step outside your own understanding of the problem at hand and ask how your words are going to affect the thinking of someone who doesn't already know what you know and may even be inclined to disagree with you at first. Indeed, persuasive writing doesn't just convince the already-converted; a really well-crafted and well-supported argument will overcome a skeptic's initial resistance.
Why does this matter? Because the poor quality of academic writing is both aesthetically offensive and highly inefficient. Academics should strive to write clearly for the obvious reason that it will allow many others to learn more quickly. Think of it this way: If I spend 20 extra hours editing, re-writing, and polishing a piece of research, and if that extra effort enables 500 people to spend a half-hour less apiece figuring out what I am saying, then I have saved humankind a net 230 hours of effort.
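Spelling out the arithmetic behind that figure:

$$ 500 \times 0.5\ \text{hours} - 20\ \text{hours} = 250 - 20 = 230\ \text{hours, net}. $$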
Which leads me to the real reasons why academic writing is often bad. The first problem is that many academics (and especially younger ones) tend to confuse incomprehensibility with profundity. If they write long and ponderous sentences and throw in lots of jargon, they assume that readers will be dazzled by their erudition and more likely to accept whatever it is they are saying uncritically. Moreover, jargon is a way for professional academics to remind ordinary people that they are part of a guild with specialized knowledge that outsiders lack, and younger scholars often fear that if they don't sound like a professional scholar, then readers won't believe what they are saying no matter how solid their arguments and evidence are.
The second problem is the fear of being wrong. If your prose is clear and your arguments are easy to follow, then readers can figure out what you are saying and they can hold you to account. If you are making forecasts (or if the theory you are advancing has implications for the future), then you will look bad if your predictions are clearly stated and then fail. If your argument has obvious testable implications, others can run the tests and see how well your claims stand up.
But if your prose is muddy and obscure or your arguments are hedged in every conceivable direction, then readers may not be able to figure out what you're really saying and you can always dodge criticism by claiming to have been misunderstood. (Of course, sometimes critics do deliberately misrepresent a scholarly argument, but that's another matter.) Bad writing thus becomes a form of academic camouflage designed to shield the author from criticism.
In the endless war against academic obscurantism, I tell my own students to read Strunk and White's classic The Elements of Style and to heed their emphasis on concision. Most of us tend to overwrite (especially by using too many adverbs), and shorter is almost always better. Or as Strunk and White put it:
"Vigorous writing is concise. A sentence should contain no unnecessary words, a paragraph no unnecessary sentences, for the same reason that a drawing should have no unnecessary lines and a machine no unnecessary parts. This requires not that the writer make all his sentences short, or that he avoid all detail and treat his subjects only in outline, but that every word tell."
I'm also a fan of Anthony Weston's A Rulebook for Arguments, a very smart primer on the different forms of persuasive argument and the ways to make written arguments more convincing.
Finally, I encourage students to emulate writers they admire. If there are scholars whose books you enjoyed, read them several times and try to capture what it is that makes their use of language so effective. I've found inspiration in writers like Waltz, Thomas Schelling, James Scott, John Mueller, and Deirdre McCloskey. And you don't have to agree with someone to respect their ability to write: Charles Krauthammer's ideas usually appall me, but there's no question that he is an effective prose stylist.
In the end, it comes down to what a scholar is trying to achieve. If the goal is just narrow professional success -- getting tenure, earning a decent salary, etc. -- then bad writing isn't a huge handicap and may even confer some advantages. But if the goal is to have impact -- both within one's discipline and in the wider world -- then there's no substitute for clear and effective writing. The question is really pretty simple: do you want to communicate with others or not?
Here's a puzzle for all you academics and IR theory mavens out there. On the one hand, the most distinguished scholars in the IR field are theorists. Think of names like Kenneth Waltz, Alexander Wendt, or Robert Keohane, whose reputations rest on theoretical ideas rather than empirical work. Or look at the recent TRIP surveys, where virtually all the scholars judged to have had a major impact on the field are theorists. And most of the classic works in the field are also works of theory; by contrast, few empirical works have proven to be of lasting scholarly value.
But on the other hand, the amount of serious attention that IR scholars in the U.S. devote to theory is declining. (Interestingly, the same trend seems to be true of economics as well.) The field is moving away from developing or carefully employing theories and instead emphasizing the testing of empirical hypotheses through some combination of quantitative or qualitative analysis. Such work is not purely inductive or atheoretical, but theory plays a relatively minor role and most of the effort goes into collecting data and trying to draw reliable causal inferences from it.
Hence the paradox: theory is the most esteemed activity in the field, yet hardly anybody wants to do it anymore. John Mearsheimer and I explore this paradox in a new paper, and argue that this shift away from theory is a mistake. A revised version will be published later this year in the European Journal of International Relations, but you can read a working paper version here.
Here's the abstract:
"Theory creating and hypothesis testing are both important elements of social science. Unfortunately, in recent years the balance between theory creation/refinement and the testing of empirical hypotheses has shifted sharply toward the latter. This trend is unfortunate, because insufficient attention to theory can lead to misspecified models and overreliance on misleading measures of key concepts. In addition, the poor quality of much of the data in IR makes it less likely that these efforts will produce useful cumulative knowledge. The shift away from theory and towards hypothesis testing is due mostly to the professionalization of academia, and this trend is likely to continue unless there is a collective decision to alter prevailing academic incentives."
It's August, which means that students in America (and plenty of other places) are heading off to college for the first time. Some of them are undoubtedly thinking about preparing for careers in international affairs. As a public service to those eager future Secretaries of State (and the parents worrying about their college choices), here's my list of the Top Ten Things that Future International Policy Wonks Should Learn.
1. History. Trying to understand international affairs without knowing history is like trying to cook without knowing the difference between flour and flounder. Not only does history provide the laboratory in which our basic theories must be tested, it shapes the narratives different peoples tell themselves about how they came to their present circumstances and how they regard their relationship to others. How could one hope to understand the Middle East without knowing about the Ottoman Empire, the impact of colonialism, the role of Islam, the influence of European anti-Semitism and Zionism, or the part played by the Cold War? Similarly, how could one grasp the current complexities in Asia without understanding the prior relations between these nations and the different ways that Chinese, Vietnamese, Koreans, Japanese, Pashtuns, Hindus, Muslims, and others understand and explain past events?
But don't just memorize a lot of names and dates: seek out teachers who can help you think about the past in sophisticated ways. Among other things, it's useful to know how other societies see the past even if you don't agree with their interpretation, so make sure you read histories written by citizens of other countries. And if you're studying in the United States, don't just study "Western Civilization." The world is a lot bigger than that.
2. Statistics. Most high schoolers have to learn a certain amount of math, but unless you're going into a technical field, a lot of it won't be directly relevant to a career in international affairs. But statistics is part of the language of policy discourse, and if you don't understand the basics, you won't be a discerning consumer of quantitative information and others will be able to dazzle you with data that may not be right. You can avoid this fate with a little study.
3. Foreign Language. If you grew up outside the United States and are headed for college, you probably already speak more than one language. If you're an American, alas, you probably don't. You should. I know that everyone is learning English these days, but learning at least one foreign language provides a window into another culture that you can't get any other way, and also provides a sense of mastery and insight that is hard to achieve otherwise. I'm not particularly good at languages, but I'd gladly trade my mediocre abilities in French and German for real fluency in one of them (or many others). Don't make my mistake: get to the language lab and acquire some real skills.
4. Economics. Economists aren't the wizards they think they are (see under: 1929, 2007-08), but you can't understand world affairs these days if you don't have a basic grasp of the key principles of international trade and finance and some idea how the world economy actually works. I might add that some forms of economics (e.g., game theory) can provide some useful ways of thinking about strategic interaction, provided you don't push it too far. So take enough economics to be able to read the WSJ op-ed page and know when they are BS-ing you.
5. International Law. You might think that a realist like me would dismiss international law completely, but I took a course in the subject as an undergraduate and have always been grateful that I did. Among other things, it reaffirmed my suspicion that international law is a pretty weak instrument, especially when dealing with great powers. Nonetheless, states and other international actors use international law all of the time, and they certainly invoke it to try to advance their own particular interests. So it's good to have some idea what international law is, how it works, and what it can and cannot do.
6. Geography. We often hear that we live in "one world," but it's divided up into lots of regions, countries, areas, and physical configurations, and these variations matter a lot. I don't know when or why we stopped teaching geography, but it is an important part of the world affairs tool kit. I might not go so far as to say "geography is destiny," but just look at all the international issues that you couldn't begin to understand without a detailed knowledge of the physical characteristics of the region in question. South China Sea? The West Bank? The new sea routes in the Arctic? The list is endless, yet I'm often struck by how little geography most students seem to know these days. Here's a good test: if you were given a map of the world with all the country names removed, how many could you fill in? If you can't get at least 75 percent, it's time to get out that atlas and start brushing up. The exercise will also tell you which regions you may know well and which ones you need to learn a bit more about. If you're still not convinced that geography matters, check out Robert Kaplan's new book.
7. Get some culture. Education in international affairs tends toward the technocratic, as the previous items on this list suggest. But some appreciation for art and culture is essential. The music, literature, and visual arts of different societies are where their collective souls reside, and more people have been inspired by poetry, art, and music than by the most compelling regression equations. If you don't know why Picasso, Kurosawa, Shakespeare, Solzhenitsyn, Austen, Ellington, Rushdie, Shankar, etc. matter, then you've missed out on an enormous part of the human experience, and your ability to understand what makes other societies tick will be impoverished.
8. Learn to communicate. Based on some of the graduate students I see, I'm not sure this is something most colleges teach anymore. But no matter what path you end up taking in life, being able to write clearly, quickly, and without enormous effort is a huge advantage. I'm not saying you have to aspire to be a prose stylist on the order of George Kennan, Joan Didion, or Paul Krugman, but overcoming the fear of the blank page or screen and developing the ability to write a clear, well-organized argument is an enormous force-multiplier.
While you're at it, hone your ability to speak effectively and persuasively. Regardless of what sort of career you pursue, being able to present your ideas orally will be very valuable. And I'm not just talking about formal lecturing or giving a keynote speech; I also mean knowing how to brief your boss in five minutes or less, and how to ask a good question. I go to lots of public lectures and seminars, and I'm often struck by how few people know how to ask a clear, sharp, and penetrating question. If you master that skill, you'll stand out.
Formal training and activities like debate can enhance these abilities, but mostly they come from practice. Repetition also helps overcome stage fright, and being relaxed while you're speaking is easily worth 10 or 20 IQ points.
9. What about science? Most of us had to take a lot of science in high school, and some of us continued to do so in college. Although in-depth knowledge of physics, chemistry, biology, computer science, etc., is not directly relevant to many aspects of international affairs, it is powerfully linked to many important political phenomena. How can one understand cyber-security, climate change, global pandemics, economic development, and a host of other issues without understanding the scientific knowledge that lies at their core? More importantly, a clear understanding of the scientific method helps protect you from the proud know-nothingism that is increasingly a badge of honor among some politicians. So stick with some science too. And by the way: if you happen to be interested in topics where science is central (such as arms control or the environment), you'd probably be better off majoring in a relevant scientific field rather than politics or history.
10. Find your ethical foundation. Universities teach classes on ethics, but apart from favoring free speech and opposing academic fraud, they don't endorse any particular ethical stance. So don't expect your college to teach you what is right or moral. Nonetheless, if you haven't figured these things out for yourself yet, college is a good time to get cracking on it. You'll meet lots of people with different views on this subject, and engaging with them will help you sort out where you stand. What's your view of the good or virtuous life? Where are the lines that shouldn't be crossed? How do you propose to handle the ethical tradeoffs that will inevitably greet you as you advance through life? And as you study, keep a sharp eye out for role models: which people strike you as admirable and worthy of emulation and which seem morally challenged? And on what basis did you decide?
Alert readers will have noticed that my list looks a lot like the classic liberal arts education. True enough: in a world that is both diverse and changing rapidly, a broad portfolio of knowledge is almost certainly the best preparation for a long career in the field. My list also leaves out various extracurricular activities that may be every bit as important as what you do in class, such as living for an extended period in a foreign country. But a solid knowledge of these fields and a serious effort to develop some key skills would stand you in good stead in a wide variety of global professions. And if you end up doing something entirely different, they certainly won't hurt.
And if you're just starting your freshman year, I hope you find the next four years challenging and inspiring. Learn as much as you can, because there will be plenty of tough problems for you to work on as soon as you graduate.
For the past few years, a group of scholars at the College of William and Mary have been conducting surveys of the international relations discipline, as part of the Teaching, Research, and International Policy (TRIP) project. (FP has published excerpts from their reports in the past, such as this one.) Their latest results have just been released, and this year they've gone global, surveying nearly 3,500 IR scholars worldwide. You can download the whole survey -- "TRIP Around the World" -- here.
For me, the most interesting results are at the beginning, and they show that there's quite a bit of variation in how IR is taught in different parts of the world. For example, 9 percent of IR teachers in the U.S. say that they include material on Central Asia in their courses, but in Turkey that number is 25 percent. Forty percent of American IR scholars include material on East Asia, the same percentage as in Australia, but in Israel the number reported was zero. In other words, there's a lot of regional bias in the content of IR courses: what you teach depends in part on where your country sits. This pattern isn't that surprising, perhaps, but it does tell you that students in different countries (and future policy professionals) aren't absorbing quite the same view of the world.
Those aren't the only differences, of course. On average, U.S. scholars report that 28 percent of their courses deal with "policy analysis" of various sorts. But in Turkey the reported average is 49 percent, and in Finland and Singapore the average is only 14 percent. And then there's the question of which authors get assigned: in the United States, IR teachers report that 71 percent of the readings are by American authors, and both Singapore and Israel report a similar number. But the percentage of American authors drops to the mid-forties in the U.K., Canada, Colombia, France, and several other countries, and those independent-minded Finns assign only 27 percent. Other TRIP results show that American academics still dominate the lists of "most influential" scholars, but what students are reading clearly varies a lot by country.
I'm also happy to report that realism appears to be alive and well in the academy, at least as measured by the self-reported content of undergraduate "Intro to IR" courses. Once again, it's Finland where realism seems least widespread (only 11 percent of the course material), but none of the rival paradigms seems all that popular in Finland either.
Do these variations in basic IR teaching tell us anything about international politics and foreign policy itself? If students are being taught somewhat different views of the world (and if there's a lot of regional bias in what they are learning), then one could argue this will tend to create policy elites who don't see the world in the same way and will have more trouble finding common ground. It might be tempting to see this as a potent source of international conflict, but I'd be wary of such a facile explanation. For starters, international conflict and competition took place long before anyone started teaching undergraduate courses about it, and nation-states would still have conflicting interests even if everyone everywhere took exactly the same courses and read the same books. (Depending on which books they read, in fact, maybe reading the same ones would make things worse.) Furthermore, many of the people who ultimately are in charge of foreign policy aren't relying on what they learned in some undergraduate course, and at least some of them may have escaped some of the ethnocentrism within their earlier training. Understanding how potential antagonists think can be very useful, but it hardly guarantees you'll get along.
Most importantly, the TRIP survey covered only twenty countries, and some pretty interesting possibilities weren't included. China wasn't part of the survey, for example, and neither was Iran, even though both countries have significant academic institutions and a lot of young people taking international relations courses. I wonder what they are reading, and what conclusions they are drawing from the content of their courses.
Which university is more likely to defend academia's basic commitment to sharing ideas and knowledge in an open and unconstrained way, West Point or Yale? You'd probably think it would be Yale, that well-known bastion of tweedy academics and liberal values. How wrong you'd be.
As West Point faculty member Gian Gentile outlines in a fascinating piece in the Atlantic, Stanley McChrystal, the former commander of U.S. forces in Afghanistan, has been teaching a course at Yale's Jackson Institute for Global Affairs on strategy and leadership. Nothing wrong with that: Plenty of universities (including my own) hire practitioners to share insights from the real world with students. And I've got no problem having a former general teach a course. But in a shocking departure from normal academic practices, Yale requires students taking the course to sign a non-disclosure form, pledging that they will not divulge what is said in the course to outsiders. In other words, McChrystal is teaching an "off-the-record" course.
This restriction is so contrary to the normal practice of universities that it is hard to know where to begin. Academic institutions exist to pursue knowledge, to teach what we know to our students, and to instill in them an appreciation for free and open inquiry. The whole principle of academic freedom rests on the idea that knowledge is best advanced by allowing ideas to blossom and to be shared without restriction. In this way, good ideas can be validated and retained, and bad ideas or conjectures can be scrutinized and eventually excluded. By telling students in McChrystal's class that they cannot share what they learn with others, Yale is artificially constraining the normal give-and-take of ideas. There may be vigorous discussions inside that particular classroom, but the rest of Yale (and the larger world) won't know about them. Secrecy of any kind is fundamentally at odds with the principles that universities stand for, yet here Yale has enshrined it in one of its courses.
A commitment to free and open discussion also keeps the focus on the ideas themselves, rather than on the identity or the supposed prestige of the faculty member leading the course. Giving McChrystal a special exemption immediately tells Yale students that the general is a "Very Important Person" who gets to be treated differently from other members of the faculty. Again, that's not how universities are supposed to work: People taking my courses aren't supposed to accept what I tell them because I am the professor and they are mere students. They are supposed to accept what I tell them only if I've successfully convinced them it is useful and makes sense. And they are free -- even encouraged -- to disagree with me, especially if they have good reason to do so and can make their objections stick. And I want them to talk about my courses outside of class; maybe someone they know will point out a new way to think about an issue or identify a mistake I've made. But if I made my students sign a non-disclosure form, I would limit their capacity to hold me accountable.
Requiring students to sign a non-disclosure form also sends the subtle but unmistakable signal that the instructor is imparting secret knowledge that is too hot or potentially controversial to be shared with the outside world. I can easily imagine students lapping this up -- we all like thinking we're getting info that others aren't privy to -- but this is just not how universities are supposed to work.
Yale officials might argue that McChrystal is a unique asset for their teaching program, and that the only way they could convince him to teach there was to promise him that some student wouldn't blab about the course to the Yale Daily News or the New York Times. But that argument won't wash: If McChrystal really believes what he's teaching, then he should be willing to have it openly discussed. He shouldn't be able to win arguments in the classroom by saying, "Now let me tell you about some really secret stuff I did in Afghanistan, stuff you won't find out about in books. Trust me." He should be willing to be held accountable for what he says to his students, and not just by those who happen to be sitting there (and whom he might eventually be grading). If some students disagree with him, he should be willing to have them voice their disagreements to the rest of the class, but also to their roommates, friends, parents, other faculty members, and yes, even to reporters. That's the same risk that all of us run when we teach: All of our students are free to talk about what they learn with anyone they want. What's General McChrystal so afraid of?
Yale's abandonment of its principles is itself a symptom of the growing deference that Americans now grant the professional military (and to a lesser extent, top members of the broader national security establishment). The country has been at war for over a decade, and there's an inevitable tendency for civilians to start treating those who've been fighting these wars with kid gloves. This tendency is not healthy, however, because the professional military has its own interests and world view -- as such, it is not necessarily the best judge of what is in the overall interests of the nation. National security is a topic that affects all Americans, and it is more likely to be openly and intelligently debated when we don't give any of the participants (and especially not those with particular interests in the subject) a free pass.
Today I want to offer a few brief words of tribute to Paul Doty, who passed away yesterday at the age of 91. Paul was a distinguished biochemist and molecular biologist, as well as a pioneering figure in the field of arms control. He was head of the Federation of American Scientists, a founder of the Pugwash Conferences (which brought together scientists from both sides of the Iron Curtain to discuss arms control and war prevention), and a key figure in the renaissance of security studies that began in the late 1970s. A more detailed account of his life and career can be found here and here.
I am one of countless scholars who owe part of their professional success to Paul's vision and support. Back in the 1970s, Paul realized that his generation of policy-minded academics was not being replicated, and he convinced the head of the Ford Foundation, McGeorge Bundy, to finance new research centers at a number of prominent universities. This act led to the founding of the Center for Science and International Affairs (CSIA) at Harvard (with Paul as founding director), and to parallel centers at Stanford, UCLA, and Cornell.
The model for CSIA (subsequently renamed the Belfer Center) was a scientific lab. In addition to providing young scholars with the time and resources to conduct their research, these centers also provided an atmosphere where older scholars could mentor younger colleagues and where people with varying backgrounds could meet, exchange ideas, and build robust professional networks. Thus, a fellowship at CSIA was more than just an opportunity to finish or revise a dissertation. It was also a chance to interact with prominent academics and policymakers, to learn how to challenge a prominent expert with whom one disagreed and, in general, to comport oneself as an engaged and competent professional. My initial stint at CSIA (1981-1984) was central to getting my own career started, and there are now literally hundreds of CSIA alumni holding prominent positions in the academy and in key policymaking circles, including prominent Obama administration figures such as Michele Flournoy, Daniel Poneman, Kurt Campbell, and Ivo Daalder.
Paul had a lot of the "absent-minded professor" in him, and stories about some of his idiosyncrasies became legendary among his colleagues. But what I remember most was his rare ability to cut to the heart of an issue, and his quiet fearlessness in confronting those with whom he disagreed. I never saw him behave rudely to a visiting speaker, but he had little patience for arguments that didn't add up or for policy positions that made no sense. And it didn't matter if the person trying to sell some dubious idea was powerful or prominent; Paul would press the attack with quiet persistence. He was, in short, a truth-teller, who cared more about getting the right answer or the right policy than advancing his own personal fame or power. In that most basic of virtues, he was a model for us all.
Today, I'd like to try a bit of crowd-sourcing. Specifically, I'd like to ask readers of this blog for some help with one of my courses. The course is a graduate-level survey of international and global affairs, designed for public policy students concentrating in that area. One of the components I'm adding this year is a session explicitly focused on the topic of "policy analysis in international and global affairs." By "policy analysis," I mean the method (or for some, the art) of analyzing concrete policy problems and deciding which policy options will best achieve some intended goal.
Here's the problem. There is an extensive literature on policy analysis, including well-known works by Eugene Bardach, Michael Munger, John Kingdon, Edith Stokey and Richard Zeckhauser, Deborah Stone, and many others. Yet the bulk of these works focus on domestic policy analysis (i.e., on the analysis of problems that policy analysts face in purely domestic contexts). So far, I have yet to discover any serious work explaining how to do policy analysis in the realm of foreign policy or international and global affairs.
There is a large literature on the analysis of military budgets and defense management -- dating back to the heyday of "systems analysis" in the Pentagon -- but this literature views these problems as essentially a domestic issue (e.g., the choices decision-makers make between guns vs. butter, or between Weapon System #1 vs. Weapon System #2, etc.). There are also works like Wolfgang Reinicke's Global Public Policy, but this book is an extended argument for why we need to situate policymaking at the global rather than national level. It is not a primer explaining how one actually performs the analysis of a concrete global policy issue.
I'm not saying that such works do not exist; I just haven't been able to find them. And assuming that there aren't any/many, it's interesting to speculate on why that is the case. I think it is partly because scholars in international relations have tended to focus on grand theory (realism, liberalism, constructivism, etc.), or on trying to identify recurring laws or tendencies between states or other groups. In short, they are mostly engaged in a positivist search for regularities and in trying to devise theories that explain them. In other words, most scholars stand apart from the policy process and treat international affairs as something to be studied from a safe distance, much as a biologist might study animals in the wild. There's just not that much interest in the academy in giving students practical advice on how to solve problems, and it's not clear that most academics would have much to contribute even if they were interested. With the exception of some important work on environmental issues (which tend to be global in scope), that task has been mostly addressed by scholars of public management or public administration, not IR.
Similarly, the field of "foreign policy analysis" tends to focus on explaining why governments make the foreign policy decisions that they do, and not on developing methods or techniques for analyzing different foreign policy options. So this literature investigates how regime type, bureaucratic politics, interest groups, social and individual psychology, and any number of other "independent variables" influence government decisions. In other words, the subfield of "foreign policy analysis" does not tell you how to analyze a concrete policy problem or compare the merits of alternative policy choices.
For whatever reason, scholars working in the broad area of international and global affairs have not devoted much attention to helping would-be policy analysts learn how to do the jobs that most of them will eventually occupy. Instead, I suspect graduates of leading public policy schools end up learning this on the job.
One might ask: why can't we just take the existing literature on "policy analysis" and apply it to foreign policy? I think students can get some useful insights from that literature, and that some of the specific analytic techniques developed there (such as cost-benefit analysis) are clearly germane and valuable. But there are some key differences between the situation facing a domestic policy analyst and someone addressing an international or global problem. In general, policy analysts working on domestic issues are dealing with situations where there is clear legal authority and where politics, though never absent, is less salient. If your job is figuring out how to cut costs for an urban bus system, decide how to accommodate increased enrollment in a local public school, or come up with a proposal to improve health care, the main task is to identify the goals, figure out the alternatives, identify the likely results of different choices, and eventually decide which alternative will best accomplish the intended goal. Once the decision is reached, legitimate authority to implement it presumably exists (although one may also have to develop a strategy for building sufficient political support).
In global affairs, by contrast, the rule of law is far weaker and there are often competing power centers with very different interests. Strategic interactions loom much larger, and the success of a given policy choice often depends not just on the intrinsic merits of the specific initiative but on how other key actors will respond to it. (Among other things, this is why simple game theoretic models are often useful for analyzing certain international policy problems.) To the extent that the issues are truly global, the correct policy choice depends far more on bargaining, persuasion, and in some cases coercion, and on developing solutions that either elicit others' voluntary compliance or achieve the objective in the face of opposition. Such features are not entirely absent in domestic policy discussions, but they play a larger role in interactions between states, corporations, and non-state actors operating in the anarchic world of international politics.
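To make the parenthetical point above concrete, here is a minimal sketch of the simplest possible game-theoretic model -- the payoffs and labels are purely hypothetical, invented for illustration -- showing why an option's value cannot be scored in isolation, the way a domestic cost-benefit exercise might score it, but depends on the other actor's move:

```python
# A toy 2x2 "assurance game" with hypothetical payoffs, illustrating strategic
# interdependence: our best choice changes with what the other actor does.
# Payoff tuples are (our payoff, their payoff).

payoffs = {
    ("cooperate", "cooperate"): (4, 4),
    ("cooperate", "defect"):    (0, 3),
    ("defect",    "cooperate"): (3, 0),
    ("defect",    "defect"):    (3, 3),
}

def best_response(their_move: str) -> str:
    """Return our payoff-maximizing move, holding the other actor's move fixed."""
    return max(("cooperate", "defect"),
               key=lambda ours: payoffs[(ours, their_move)][0])

for theirs in ("cooperate", "defect"):
    print(f"If they {theirs}, our best response is to {best_response(theirs)}.")
```

Trivial as this is, it captures the analytical difference: neither option has a fixed score, so the analyst has to reason about the other side's incentives and likely responses, not just the intrinsic merits of each alternative.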
Whatever the reason, there seems to be a large and regrettable gap in the existing literature. Note to potential authors: we need a good book or article that gives students a useful guide to performing policy analysis in international and global affairs.
Unless, of course, such a work already exists. So here's your chance to shape what my students read next term: is there anything good to read about global policy analysis? Anybody got any good suggestions?
For those of you who are curious about the relationship between scholarship and the real world, with particular reference to the social sciences, I recommend FT columnist John Kay's recent essay "The Map is Not the Territory: An Essay on the State of Economics." Kay is an experienced professional economist himself, and the essay is a penetrating critique of the kind of divorced-from-reality thinking that has dominated a lot of macroeconomic research over the past few decades. As you'll see if you read the piece, he's especially irritated by the unwillingness of some prominent macroeconomists (including Nobel Prize winners like the University of Chicago's Robert Lucas) to acknowledge that the failure to anticipate the financial meltdown of 2007-2008 casts some well-founded doubt on the direction that economic thinking has taken in recent decades.
Kay's essay also contains some valuable lessons for political science and other academic disciplines. My favorite passage:
For many people, deductive reasoning is the mark of science, while induction - in which the argument is derived from the subject matter - is the characteristic method of history or literary criticism. But this is an artificial, exaggerated distinction. ‘The first siren of beauty', says [macroeconomist John] Cochrane, ‘is logical consistency'. It seems impossible that anyone acquainted with great human achievements - whether in the arts, the humanities or the sciences - could really believe that the first siren of beauty is consistency. This is not how Shakespeare, Mozart or Picasso - or Newton or Darwin - approached their task.
The issue is therefore not mathematics versus poetry. Deductive reasoning of any kind necessarily draws on mathematics and formal logic; inductive reasoning is based on experience and above all on careful observation and may, or may not, make use of statistics and mathematics. Much scientific progress has been inductive: empirical regularities are observed in advance of any clear understanding of the mechanisms that give rise to them. This is true even of hard sciences such as physics, and more true of applied disciplines such as medicine or engineering. Economists who assert that the only valid prescriptions in economic policy are logical deductions from complete axiomatic systems take prescriptions from doctors who often know little more about these medicines than that they appear to treat the disease. Such physicians are unashamedly ad hoc; perhaps pragmatic is a better word.
Needless to say, I like this argument because I believe it is important for the social sciences to be a diverse intellectual ecosystem instead of a monoculture where one approach or method reigns supreme. Even if one approach or theoretical model were demonstrably superior -- and that is rarely, if ever, the case -- there would still be considerable value in having lots of other scholars working in different ways. Sometimes we learn by exploring deductions in a formal model (though we often just restate the obvious when we do); at other times we learn by "soaking and poking" among policymakers, by constructing a data set and exploring patterns within it, by immersing ourselves in the details of historical cases, or by exploring the categories of thought and discourse that surround a given policy domain. Given that all these approaches yield useful knowledge, why would any serious department want to privilege one approach over all others?
But because academic disciplines are largely self-defining and self-policing (i.e., we determine the "criteria of merit" and success depends almost entirely on one's reputation among fellow academics), there is the ever-present danger that academic disciplines spin off into solipsistic and self-regarding theorizing that is divorced from the real world (and therefore unlikely to be refuted by events) and of little value to our students, to policymakers, or even interested citizens. This tendency occurs primarily because proponents of one approach naturally tend to think that their way of doing business is superior, and some of them work overtime to promote people who look like them and to exclude people whose work is different. Anybody who has spent a few years in a contemporary political science department cannot fail to have observed this phenomenon at work; there just aren't very many people who are genuinely catholic in their tastes and willing to embrace work that isn't pretty much like their own.
This situation creates a real dilemma: if you believe in academic freedom (and I do), then you don't want outside authorities interfering in the production of knowledge, telling academics how to do their work, or setting stupid criteria for evaluating scholarly contributions. But without some pressure to be at least potentially relevant, the social sciences are prone to drift off into what Hans Morgenthau once decried as "the trivial, the formal, the methodological, the purely theoretical, the remotely historical -- in short, the politically irrelevant." I've already touted my own prescriptions for this problem here, but I don't have enormous confidence that any of them will be heeded. But at the risk of seeming to tout my own employer (and similar programs elsewhere), that's why I increasingly expect the most interesting and relevant work to emerge from schools of public policy, and not from the increasingly arcane worlds of traditional disciplinary departments.
Apart from a few brief sojourns at various think tanks, I've spent most of my professional life in the academic world. Seven of these years were spent helping run various programs, first as deputy dean of social sciences at the University of Chicago and later as academic dean here at the Kennedy School. I have one child in college and another heading there in two years. You can therefore assume I have a certain professional and personal interest in the whole business of higher education.
Which is why I find discussions of how technology might transform this whole enterprise quite fascinating. It's hard not to read such articles and wonder how my own job might change in the years ahead, and to reflect on how I think it ought to change. I have not studied this issue in detail, so what follows are some purely impressionistic observations, based mostly on my own experience.
1. I think there's no doubt that the traditional model of the academic lecture is headed the way of the dodo. I say that with a certain wistful regret, because I enjoy lecturing and like to think I'm fairly good at it. But it's hardly an efficient mode of information-transmission, and there are plenty of studies suggesting that students don't learn particularly well in this sort of passive "I-speak-while-you-listen-and-take-notes" experience. Lecturing of the old-fashioned sort can be entertaining and inspirational, but real learning requires students to engage and wrestle with the material instead of just hearing some older person declaim about it.
2. Given that top-flight faculty are among any college or university's scarcest resources, having them stand in front of a handful of students and talk is especially inefficient, and all the more so in basic introductory courses. In other words, you probably don't want Nobel Prize winners teaching basic statistics, Economics 101, or even Intro to Biology -- especially when there may be lots of less renowned people who are actually better at doing that. But you do want students to have the opportunity to interact with the most brilliant minds, to argue with them, to see how they do their work, and to be inspired by their example. And that means creating different sorts of educational experiences (seminars, workshops, mini-courses, etc.) rather than just one.
3. Information technology is making it possible to transmit educational content at almost no cost; you can put course materials on the web and stream lectures to anyone with an internet hookup. This is what MIT is doing now, and it doesn't seem to be discouraging people from wanting to attend full-time and pay full freight. There are also online teaching programs that might do a better job of teaching basic materials (such as introduction to microeconomics, statistics, calculus, etc.) than that old model of the single lecturer with a chalkboard and a pile of notes. This suggests that we ought to be thinking of ways to use faculty rather differently -- in more interactive and personal modes -- where hands-on attention, genuine inspiration, and pedagogical ability can produce big payoffs, while using online tools to deliver basic factual or technical content.
4. I suspect that in the near future we are going to see a lot of experimentation with new forms of higher education, reflecting the fact that these institutions in fact serve many purposes other than merely transmitting knowledge/skills to students. One reason MIT can make its content available for free is that students understand there is a difference between watching lectures online and actually being in the class, being on the campus, and being immersed in the broader in-person environment. In the United States, at least, universities and colleges also provide a relatively safe space for making the transition from adolescence to adulthood. They are environments where young people can meet future spouses of similar class or social backgrounds, have lots of arguments with peers and with their professors, and get a lot of preconceived notions challenged. For many young people (though not all), college is about a lot more than just what they learn in class, which is one reason parents are willing to pay through the nose to make that whole experience possible.
What I'm describing here, of course, is the traditional model of a liberal arts education, and it's hardly the only model out there. Other institutions (e.g., commuter colleges, junior colleges, vocational institutes) serve somewhat different educational functions and are already organized differently. My guess, therefore, is that changes in information technology and the overall globalization of information and education are going to produce an explosion of innovation over the next few years. The traditional four-year university/college won't disappear, but it will be coexisting and competing with a lot of other models.
5. Lastly, this is going to be a painful process. Universities are filled with brilliant and innovative people -- as individuals -- but they are also incredibly conservative institutions (not politically, but in the sense of being wary of change). As a former Harvard president reportedly said, "trying to change the curriculum is like moving a graveyard." Faculties don't like having to retool, and alumni and other stakeholders often have powerful emotional attachments to traditional ways of doing business. And the older and more successful a university is, the more impervious to change it is likely to be.
Plus, coming up with new educational models is hard to do if you're already working pretty hard teaching the existing program. But there's no stopping this sort of Schumpeterian "creative destruction," and I'd hate to be working for the educational equivalent of Polaroid -- a brilliant and innovative company that proved unable to adapt to a rapidly changing technological frontier.
Now if we can just get universities out of the business of running semi-professional athletic teams...
I will be flying to Seattle tomorrow to attend the annual meeting of the American Political Science Association, so blogging for the rest of the week will be light. I'm on a roundtable discussing John Ikenberry's new book Liberal Leviathan, and plan to offer some friendly but provocative points about the book. I'm also running for the APSA Council, as part of a broad effort to democratize its governance structures, encourage more open elections, and support efforts to make academic political science more attentive to real-world policy issues.
But the really important shift this week is a structural change in my home life. As of tomorrow, I move from the multi-polar system in which I've lived for the past sixteen years to a tri-polar world. Translation: my wife and I are taking my eldest son off to college, with pride and high hopes and every nickel we can scrape together. I just hope that the gloom-and-doom accounts of U.S. higher education that I've been reading lately are overly pessimistic.
A graduate student at UC-Berkeley, Allan Dafoe, has asked for my help in a project he's conducting (in collaboration with researchers at Uppsala University) on the impact of perception in international crises. To be more precise, what he really wants is your help. Part of his project involves collecting responses from "foreign policy elites" to a hypothetical crisis scenario, and who better than the enlightened readers of this blog?
Because FP readers are hardly typical -- even among "foreign policy elites" -- there is obviously a potential problem of sampling bias here. But that is for Allan and his collaborators to sort out. If you'd like to become a data point in his project, go to this link and take the survey. It will take you about five minutes, and you'll be helping extend our knowledge of crisis behavior. Then you can go back to sticking pins in your voodoo doll of whichever U.S. politician you think is most responsible for the embarrassing spectacle that has been playing itself out in Washington.
What role should academics play in public discourse about major social issues, including foreign policy? I've taken up this issue in the past, as has FP colleague Dan Drezner. The Social Science Research Council has a continuing project on the topic of "Academia and the Public Sphere," and they asked me to contribute an essay on the topic of "International Affairs and the Public Sphere." It just went up on the SSRC website, and you can find it here.
Briefly, in this paper I argue that academic scholars have a unique role to play in public discourse -- primarily as an independent source of information and critical commentary -- as well as an obligation to use their knowledge for the betterment of society. In particular, university-based scholars should resist the "cult of irrelevance" that leads many to limit their work to a narrow, obscure, and self-referential dialogue among academicians. But I also argue that greater involvement in public life has its own risks, most notably the danger of being co-opted or corrupted by powerful institutions that may be eager to enlist academics to help them justify policies that will benefit those same institutions. "Speaking truth to power" is not simple.
The article also includes six recommendations for improving academic participation in the public sphere. I lay out the rationale for these suggestions in the paper, and you'll have to read it for yourself to find out what they are. But here's the bottom line:
If scholars working on global affairs are content with having little to say to their fellow citizens and public officials and little to contribute to solving public problems, then we can expect even less attention and fewer resources over time (and to be frank, we won't deserve either). By contrast, if the academic community decides to use its privileged position and professional expertise to address an overcrowded global agenda in a useful way, then it will have taken a large step toward fulfilling its true social purpose. Therein lies the good news: the fate of the social sciences is largely in our own hands.
Back when I was in graduate school, Stanley Hoffmann wrote an essay in Daedalus entitled "An American Social Science: International Relations." Among other things, he argued that the field of international relations was dominated by scholars from North America, and especially the United States, in part due to the United States' dominant global role in the post-World War II era. (Foreign-born scholars like Henry Kissinger, Zbigniew Brzezinski, Peter Katzenstein, and the late Ernst Haas are exceptions that support the rule, as each received most if not all of their advanced training in the United States.)
Has this situation changed? I ask this in part because lately I've been thinking about faculty recruiting at Harvard's Kennedy School. We have a very strong IR faculty -- my colleagues include Joe Nye, John Ruggie, Graham Allison, Samantha Power (on leave), Ash Carter (ditto), Monica Toft, Nicholas Burns, Meghan O'Sullivan, etc. -- but notice that this is a very U.S.-centric group, even though over 40 percent of our students come from overseas. We are fortunate to have a few colleagues from other countries (such as Karl Kaiser and Jacqueline Bhabha), but the center of gravity is decidedly Washington-focused. And we're no different in this regard than peer institutions like Princeton's Woodrow Wilson School.
I was discussing this issue with a colleague in D.C. the other day, and he argued that one reason was the simple fact that there were hardly any world-class foreign policy intellectuals outside the Anglo-Saxon world. He wasn't saying that there weren't smart people writing on world affairs in other countries; his point was that there are very few people writing on foreign affairs outside North America or Britain whose works become the object of global attention and debate. In other words, there's no German, Japanese, Russian, Chinese, or Indian equivalent of Samuel Huntington's Clash of Civilizations, Francis Fukuyama's The End of History and the Last Man, or Joseph Nye's various writings on "soft power."
As readers of the New York Times (and Jewish Week) already know, the Board of Trustees at City University of New York voted to table the awarding of an honorary degree to playwright Tony Kushner after one member of the board, Jeffrey Wiesenfeld, accused Kushner of supposedly "disparaging" Israel. Kushner has been critical of some Israeli policies -- which hardly makes him unique among human beings, or among Jews, or even among Israelis. But none of his comments on these issues are outside the bounds of civil discourse or worthy of censure, especially by an institution that is supposed to be committed to freedom of thought and the open exchange of ideas. If you're curious, you can read Kushner's response here. Wiesenfeld is unrepentant, by the way, and defends his attack here. For an update on the evolving situation, see Justin Elliott here.
I have only two points to make about this incident, which is one of the many attempts by self-appointed "defenders" of Israel to control discourse on this issue.
First, the main reason that hardliners like Mr. Wiesenfeld go after someone like Kushner is deterrence. By denying critics of Israeli policy any honors, they seek to discourage others from expressing opinions that challenge the prevailing "pro-Israel" orthodoxy to which Wiesenfeld is committed. Kushner was not nominated for an honorary degree for his views on Middle East politics; he was obviously nominated because he is an exceptionally talented and accomplished playwright and literary figure. But if someone like him can also be critical of Israel's treatment of the Palestinians and receive an honorary degree, then -- horrors! -- other people who feel similarly might be empowered to speak out themselves and pretty soon such comments will cease to be taboo. People like Mr. Wiesenfeld don't want that; they want people who do not share their views to be constantly aware of the price they might pay for expressing them. And it never seems to occur to them that Kushner's views might be both more humane and better for Israel than the position Wiesenfeld apparently holds.
Second, what this incident also reveals is the reflexive timidity of many academic organizations. There doesn't seem to have been any sort of organized campaign to deny Kushner the honorary degree; instead, the board voted to table the nomination after one member (Wiesenfeld) made his disparaging remarks. I've spent more than a quarter century in academia, including seven years as an administrator, and the board's reaction doesn't surprise me a bit. Despite their public commitment to free speech and open discourse, nothing terrifies deans and trustees more than angry donors, phone calls from reporters, and anything that looks controversial. By tabling the nomination, they undoubtedly thought they were avoiding a potentially uncomfortable controversy.
But in this case the CUNY board blew it big-time, not only because Wiesenfeld's accusations were off-base, but also because they would not have been grounds for denying Kushner an honorary degree even if they had been true. And meekly caving as they did is contrary to the principles of intellectual freedom that universities are supposed to defend. The end result is that this incident will get a lot more attention than awarding the degree would have garnered (Kushner already has several), and the board's shameful lack of vertebrae has been publicly exposed.
And why does this matter for foreign policy? Because as John Mearsheimer and I wrote a few years ago: "America will be better served if its citizens were exposed to the range of views about Israel common to most of the world's democracies, including Israel itself... Both the United States and Israel face vexing challenges... and neither country will benefit by silencing those who support a new approach. This does not mean that critics are always right, of course, but their suggestions deserve at least as much consideration as the failed policies that key groups in the [Israel] lobby have backed in recent years" (pp. 351-52).
Saturday's New York Times contained an interesting op-ed piece by Charles Blow, titled "American Shame." The main item was a table listing the 33 countries designated as "advanced economies" by the International Monetary Fund and comparing them on various social and educational characteristics. Specifically, Blow charted income inequality, unemployment rates, level of democracy, the "percentage thriving" (according to the Gallup Global Well-Being Index), food insecurity, prison population, and student performance in math and science. The bottom line: The United States is at the bottom of the heap on most of these measures, and at or near the top in none.
It's a sobering collection of data, to be sure, but I wish Blow had added two more columns to his chart: 1) percentage of GDP devoted to defense, and 2) defense spending per capita. According to the 2010 IISS Military Balance, here's what those columns would have looked like (the countries are in the order presented by Blow, which reflected their summary ranking on the various measures, from best to worst):
Country    Defense spending (% of GDP)    Defense spending per capita ($, 2008)
Australia 2.24 1,056
Canada 1.19 597
Norway 1.49 1,264
Netherlands 1.41 738
Germany 1.28 570
Austria 0.77 389
Switzerland 0.83 542
Denmark 1.94 344
Finland 1.33 693
Belgium 1.10 534
Malta 0.60 122
Japan 0.93 362
Sweden 1.30 736
Hong Kong n.a. n.a.
Iceland 0.27 (2006) 153 (2006)
New Zealand 1.39 420
Luxembourg 0.43 478
United Kingdom 2.28 998
Ireland 0.60 382
Singapore 4.20 1,663
Cyprus 2.16 503
South Korea 2.60 500
Italy 1.34 532
France 2.35 1,049
Czech Rep. 1.46 310
Slovenia 1.53 415
Taiwan 2.76 458
Slovakia 1.55 271
Israel 7.41 2,077
Spain 1.20 276
Greece 2.85 946
Portugal 1.53 349
United States 4.88 2,290
And just for fun, let's toss in:
P.R. China 1.36 45
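For anyone who wants to replicate the arithmetic behind those two columns, here is a minimal sketch of how they can be derived from raw budget, GDP, and population figures. The country names and numbers in it are hypothetical placeholders for illustration, not the IISS data used above.

```python
# Minimal sketch: computing defense spending as % of GDP and per capita.
# The entries below are hypothetical placeholders, not IISS figures.

# country: (defense budget in $bn, GDP in $bn, population in millions)
figures = {
    "Atlantis": (30.0, 1500.0, 25.0),
    "Freedonia": (12.0, 400.0, 18.0),
}

for country, (defense_bn, gdp_bn, pop_m) in figures.items():
    pct_gdp = 100 * defense_bn / gdp_bn      # defense spending as a share of GDP, in percent
    per_capita = 1000 * defense_bn / pop_m   # $bn / millions = $ thousands per person; x1000 gives dollars
    print(f"{country:<10} {pct_gdp:5.2f} {per_capita:8,.0f}")
```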
Like most residents of New England, I've spent the past day digging out from a major snowstorm. Unlike most of my neighbors, I've also spent many hours grading the take-home final from my course. It occurred to me that some of you might like to know what we asked our students, and what some of them had to say about it.
The exam was in two parts, and the first part consisted of the following hypothetical question:
Q1: "Due to an unexpected movement of tectonic plates, the United States and China have switched geographic locations. The United States is now located in East Asia, sharing borders with Russia, North Korea, India, Mongolia, Vietnam, etc., and is much closer to Japan, while China is now located in North America, in between Canada and Mexico. Assume that all other features of the two societies are unchanged (i.e., each state faces this new situation with the same populations they have today, along with the same natural resource endowments, military capabilities, economic systems, political institutions, etc.).
The question: how would this development affect contemporary international relations? Your answer should draw upon the theoretical material covered in this course (e.g., realism, liberalism, constructivism, etc.) but feel free to add your own ideas as well."
Students were given 1,250 words (5-6 pages) to address this question, and most of them did pretty well with it. The question is obviously designed to get them to think through what different theories tell you about how geography would affect relations between states. For instance: would U.S. relations with India and Japan deteriorate if the United States were located nearby, or would shared democratic values dampen potential rivalries? Would China try to establish regional hegemony in the Western hemisphere, and would states like Canada, Mexico, or Brazil try to contain it? Or would they "bandwagon" with China as they have done with the United States? Would the United States have to curtail its global ambitions in order to deal with security problems closer to home -- such as Pakistan, North Korea, Burma, or Russia -- or would it feel compelled to use force against a threatening neighbor like North Korea? There's no single "right answer" to this sort of question; what I'm looking for is a clear, logically consistent, and well-argued set of predictions.
Not surprisingly, many of the papers argued that switching places would be a tremendous benefit to China. In particular, students clearly recognized that the United States enjoys some enormous geographic advantages. In addition to being wealthier and more powerful than any of the other major powers, the United States is protected by two enormous oceanic moats and has no great powers in its immediate neighborhood. Moving from East Asia to the Western hemisphere would put China in this same favorable position, and place the United States in a much more problematic location in East Asia.
But what was really interesting was an implication that some (though hardly all) students drew from this line of argument. A number of them argued that China would be so secure in the Western hemisphere that it could focus even more attention on economic development, and not worry very much about military or security developments elsewhere. It would want to defend its own territory, and it would worry about securing energy supplies from Canada, Venezuela, Mexico, and elsewhere, but otherwise it would be sitting pretty and could remain aloof from lots of other security issues. The United States, by contrast, would be facing all sorts of challenges over in Asia and would have to try to deal with all of them.
An obvious question, therefore, is: why doesn't this same logic apply to the United States today? Instead of devoting trillions of dollars to transforming the Middle East, trying to bring Afghanistan into the 20th century (or is it the 19th?), and generally interfering all over the world, the United States could almost certainly do a lot less on the world stage and devote some of those resources to balancing budgets and fixing things here at home. Call it nation-building, except this time we'd be building our own nation and our own future, not somebody else's.
What some of our students have intuitively grasped (and not because we told them), is that there is in fact a very powerful case for a much more limited U.S. military posture overseas. Indeed, given the existence of nuclear weapons, there is even a cogent case to be made for something approaching isolationism, as laid out by people like the late Eric Nordlinger, by the Cato Institute's Chris Preble, or the team of Gholz, Press, and Sapolsky. I don't go quite that far myself (i.e., I'm an offshore balancer, not an isolationist), but I recognize that there is a serious case for the latter position. And because this view does have a certain appeal, the current foreign-policy establishment has to do a lot of threat-mongering and engage in a lot of ideological oversell in order to get Americans to keep paying for foreign wars and sending their sons and daughters out to garrison the globe. It also helps to portray anybody who advocates doing less as some sort of idealistic pacifist or naive appeaser.
But this debate is beginning to open up. When states and local governments are facing bankruptcy, when military adventures like Iraq or Afghanistan yield not victory but at best only prolonged and costly draws, and when there is in fact no ideologically motivated great power adversary out there trying to "bury us," then continuing to try to manage the whole goddamn planet isn't just foolish, it's unconscionable. It will probably take another decade for this reality to work its way through our hidebound national-security establishment, but the winds of change are already apparent. And not a moment too soon.
For years a number of political scientists have been complaining about the propensity for scholars to study topics that are of little real-world value or of interest only to a handful of fellow scholars. We've come to call this the "cult of irrelevance." At the same time, many academics cloud their analyses in obscure jargon or a fog of methodological "sophistication," and rarely bother to offer up translations for the busy policy-maker. To make matters worse, although academics defend the institution of tenure fiercely, most of them do not use the protection it affords to pursue topics that might be politically controversial.
These unfortunate tendencies are not universal, however, and a number of us have tried to address the broader issue in various ways. You can read about the general subject here, here, here, or here. In that spirit, I'm also happy to pass on the news that a group of political scientists have organized a week-long summer institute designed to tackle the problem head-on. Under the guidance of Bruce Jentleson of Duke, Steve Weber of UC-Berkeley, and James Goldgeier of George Washington, a new International Summer Policy Institute will "deliver an intensive curriculum designed to teach participants how to develop and articulate their research for a policy audience, what policymakers are looking for when they look to IR scholarship, whom to target when sharing research, and which tools and avenues of dissemination are appropriate." The Institute is part of a larger effort to "bridge the gap" between academia and policy, and you can find out more about its activities here.
Needless to say, I think this is a worthy enterprise. Together with efforts like the Tobin Project, it may encourage more academics to focus their research efforts on policy-relevant topics and teach them how to communicate their results in ways that policymakers will find more accessible. The point here, by the way, is not to "dumb down" scholarship or to imitate the plethora of partisan think tanks now located inside the Beltway. Academic scholars should be independent researchers first and foremost, and seekers of truth above all. But the topics that they choose to address can be chosen to illuminate important policy issues more directly, and devoting some time to figuring out how to communicate their results more broadly would surely be a good thing.
What is also needed is a change in academic practice, including the criteria that are used to make key hiring and promotion decisions. The standards by which we assess scholarly value are not divinely ordained or established by natural law; they are in fact "socially constructed" by the discipline itself. In other words, we collectively decide what sorts of work to valorize and what sorts of achievement to reward. If university departments placed greater weight on teaching, on contributions to applied public policy, on public outreach, and on a more diverse range of publishing venues -- including journals of opinion, trade publishers, and maybe even blogs -- then individual scholars would quickly adapt to these new incentives and we would attract a somewhat different group of scholars over time. If university departments routinely stopped the "tenure clock" for younger academics who wanted to do a year of public service, that would enable them to gain valuable real-world experience without short-changing their long-term academic futures. It would also send the message that academia shouldn't cut itself off from the real world. And it probably wouldn't hurt if deans, department chairs, and university presidents welcomed controversy, encouraged intellectual diversity, and defended the slaying of sacred cows. As I've said before, academics really shouldn't count it a great achievement when students have no interest in their classes, and when people outside the ivory tower have no interest in what we have to say.
2. Harvard students showed that they have clearer ethical vision than Harvard's leaders.
3. The Obama administration's loss is Just World Books' gain. (Translation: Ambassador Chas Freeman has written a book: America's Misadventures in the Middle East.) Buy it and read it and you'll be really annoyed that he was witch-hunted out of public service.
4. The Israeli human rights group Breaking the Silence was short-listed for the Sakharov Prize and right-wingers went bananas. The award eventually went to a prominent Cuban dissident, but anything that drives the WSJ op-ed page crazy is probably a good thing. See the Magnes Zionist here.
5. Britain's defense cuts confirm my view of NATO's future. Like Dorian Gray, the alliance is slowly fading into irrelevance while trying to keep up appearances. No matter how many new "strategic concepts" get written and how many nice meals they serve at the next ministerial meeting, the high-water mark of transatlantic security cooperation is behind us.
6. NYT columnist Tom Friedman had a moment of clarity.
7. NYT reporter Ethan Bronner did too! There are even hints that a few people in the Obama administration may be aware of just how badly they have screwed this one up. I'm not really smiling at this one, of course, but it is gratifying when occasional flashes of insight emerge from the cloud of propaganda and prevarication that normally surrounds this topic.
9. I finished my first Barry Eisler novel, and rejoiced in the fact that there is a whole bunch more that I haven't read. Combined with the new John Le Carre book, my addiction to espionage fiction will be sated for a while.
I hadn't intended to say anything further about the shameful Martin Peretz affair, and lord knows there are plenty of good reasons for me not to poke my finger in the eye of Harvard's current leadership. But seriously: You'd think after nearly 400 years the leaders of the university would have figured out what the principles of academic freedom and free speech really mean -- and also what they don't mean. But judging from the official university response to the furor, the people I work for appear to be somewhat confused about these issues.
To recap: A couple of weeks ago, Peretz made some offensive and racist statements about Muslims on his blog. Specifically, he wrote that "Frankly, Muslim life is cheap, especially for Muslims," and then went on to say that he didn't think American Muslims deserved the protections of the First Amendment, because he suspected they would only abuse them.
These statements were not an isolated incident or just a lamentably poor choice of words. On the contrary, they were the latest in a long series of statements displaying hatred and contempt for Muslims, Arabs, and other minorities. Peretz retracted part of his latest remarks after they were exposed and challenged by Nicholas Kristof (Harvard '82) in his column in the New York Times, but in his "apology," Peretz nonetheless reaffirmed his belief that "Muslim life is cheap." Indeed, he declared that "this is a statement of fact, not value."
A number of people then began to question whether it was appropriate for Harvard to establish an undergraduate research fund in Peretz's name and to give him a prominent role in the festivities commemorating the 50th anniversary of its storied Social Studies program. A University spokesman defended the decision to accept the money for the research fund and to have Peretz speak at a luncheon by saying:
"As an institution of research and teaching, we are dedicated to the proposition that all people, regardless of color or creed, deserve equal opportunities, equal respect, and equal protection under the law. The recent assertions by Dr. Peretz are therefore distressing to many members of our community, and understandably so. It is central to the mission of a university to protect and affirm free speech, including the rights of Dr. Peretz, as well as those who disagree with him, to express their views."
In a masterful display of understatement, the Atlantic's James Fallows (Harvard '70) termed this response "not one of the university's better efforts." As he (and others) pointed out, nobody was questioning Peretz's right to write or say whatever he wants. For that matter, nobody has even questioned whether Harvard ought to give him a platform to expound his views on this or any other subject. (For my own part, if the Kennedy School invited him to speak on any subject he chose, I wouldn't object.)
As should be obvious, this issue isn't a question of free speech or academic freedom. Rather, the issue is whether it is appropriate or desirable for a great university to honor someone who has repeatedly uttered or written despicable words about a community of people numbering in the hundreds of millions. And isn't it obvious that if Peretz had said something similar about African-Americans, Catholics, Jews, Asians, or gays, the outcry would have been loud, fierce, and relentless, and that some of his current defenders would have distanced themselves from him with alacrity?
And let's also not lose sight of the double standards at work here. After a long and distinguished career, journalist Helen Thomas makes one regrettable and offensive statement and she loses her job, even though she offered a quick and genuine apology. By contrast, Peretz makes offensive remarks over many years, reaffirms some of them when challenged, and gets a luncheon in his honor and his name on a research fund at Harvard.
And why? Because Peretz has a lot of wealthy and well-connected friends. Bear in mind that in 2003 Harvard suspended and eventually returned a $2.5 million gift from the president of the United Arab Emirates, after it learned that he was connected to a think tank that had sponsored talks featuring anti-Semitic and anti-American themes. As the Harvard Crimson said at the time, "no donation is worth indebting the university to practitioners of hate and bigotry." So the University clearly has some standards; it just doesn't apply them consistently.
For more on this unequivocally depressing business, you can read:
2. James Fallows' summary of recent developments.
3. A powerful statement by Ta-Nehisi Coates of The Atlantic, examining Peretz's achievements as an editor and questioning his liberal bona fides.
4. A comment by Alan Gilbert of the University of Denver, a former tutor in the same Social Studies program.
5. And while you're at it, you might read the Boston Globe's editorial whitewashing Peretz, and compare it with their reaction to the Helen Thomas affair.
And no, this isn't just a matter of Ivy League academic politics, unrelated to issues of foreign policy. As everyone knows, U.S. relations with the Arab and Muslim world are especially delicate these days. You can read this or this to understand why, but it certainly doesn't help when one of the nation's premier academic institutions decides to honor someone with such deplorable views, even after they have been widely exposed. This is obviously not the main reason why America's image in the Arab and Muslim world is so negative, but it surely adds fuel to the fires of bigotry.
To take this matter a step further, Islamophobia is on the rise here in the United States. Efforts to combat this pernicious and dangerous trend would be furthered if institutions like Harvard took a principled stand on this issue, and declined to honor anyone who has made bigoted remarks about Muslims (or any other group). This has not happened with Peretz, and history will not treat Harvard well for its behavior in this case.
Update: As I write this, I've received a couple of emails suggesting that Peretz was not going to be speaking at the Social Studies event after all. I don't know if that's true or not, but to me the issue is less about his being one of the speakers, and more about having his name permanently attached to an undergraduate research fund.
Update 2: James Fallows reports on the reported resolution of the dispute (i.e., Peretz won't have a speaking role at the event), and suggests that Harvard could address the controversy by creating a scholarship fund for students of Muslim background.
I had dinner a couple of weeks ago with a group of Harvard colleagues (and a visiting speaker), and we got into an interesting discussion about America's future as a world power. Nobody at the table questioned whether the United States was going to remain a very powerful and influential state for many years/decades to come. Instead, the main issues were whether it would retain its current position of primacy, whether China might one day supplant it as the dominant global power, and whether U.S. standards of living would be significantly compromised in the future.
One participant (a distinguished economist) was especially bullish. He argued that the United States enjoyed a considerable demographic advantage over Europe, Russia, and Japan, largely due to a higher birth rate and greater openness to immigration. These societies will be shrinking and getting much older on average, while the United States will continue to grow for some time to come. He also argued that the United States remained far more entrepreneurial than most other societies, and a better incubator of technological innovation. Despite our current difficulties, therefore, he was optimistic about the longer-term prospects for the U.S. economy and for America's position as a global power.
But then came the crucial caveat. After reciting this long list of American advantages, my colleague remarked: "of course, our political system could screw it all up." And everyone around the table nodded in agreement.
I'm back from my mini-break and digging out emails and correspondence, so I don't have an extended commentary today.
One piece in my mailbox did catch my eye, however, from the June 2010 issue of Perspectives on Politics. For those of you who aren't political scientists, PoP is a relatively new journal, founded eight years ago by the American Political Science Association. It was created in part in response to a bottom-up protest movement within the discipline known as the "Perestroika" movement ("Perestroika" was the pseudonym of the anonymous e-mailer who got it started). Although primarily motivated by a desire to defend methodological pluralism, one of the movement's related concerns was the "cult of irrelevance" within academic political science. In my judgment PoP has been a partial corrective to that tendency, and it often features articles that engage big political issues from an academic perspective.
In any case, the current issue has a provocative article by Lawrence Mead on "Scholasticism in Political Science." Mead argues that academic writings about politics are increasingly "scholastic," which he defines as being increasingly specialized, preoccupied with methods, non-empirical, and primarily oriented to other academic literature instead of engaging real-world issues. In his words:
"Today's political scientists often address very narrow questions and they are often preoccupied with method and past literature ... Scholars are focusing more on themselves, less on the real world. ... Research questions are getting smaller and data-gathering is contracting. Inquiry is becoming obscurantist and ingrown."
This sort of complaint is hardly new, of course. Hans Morgenthau offered a similar critique way back in the 1950s, when he warned of a political science "that is neither hated nor respected, but treated with indifference as an innocuous pastime, is likely to have retreated into a sphere that lies beyond the positive or negative interests of society. The retreat into the trivial, the formal, the methodological, the purely theoretical, the remotely historical -- in short, the politically irrelevant -- is the unmistakable sign of a 'non-controversial' political science which has neither friends nor enemies because it has no relevance for the great political issues in which society has a stake. History and methodology, in particular, become the protective armor which shields political science from contact with the political reality of the contemporary world. Political science, then, resembles what Tolstoy said modern history has become: 'a deaf man answering questions which no one has asked him.'" (Dilemmas of Politics, 1958, p. 31).
Morgenthau's Olympian denunciation was offered without a lot of supporting evidence, but Mead's warning is accompanied by an analysis of every article published in 1968, 1978, 1988, 1998 and 2007 in the American Political Science Review. You might get different results if you looked at different journals (i.e., the "scholasticism" of the APSR was one of the complaints of the original "Perestroikans"), but Mead's complaints are consistent with a lot of my own impressions of how the field is evolving. As Mead shows, the issue isn't method per se; it's the tendency of many scholars to ask smaller, less significant, and less controversial questions and to produce what he describes as "analyses of jewel-like precision that ... generate only minor findings and arouse little interest beyond specialists." This is accompanied by an aversion to topics that might make a scholar visible outside the academy -- or god forbid, controversial -- because that might screw up your shot at tenure or get you criticized in print.
This tendency is not universal, of course, and I'd argue that the willingness of younger scholars to take up blogging as a form of public engagement is a prominent counter-tendency. Could it be that younger scholars are just as bored producing "scholasticist" works as many of us are reading them, and that they find blogging far more fulfilling than adding another (largely) unread article to the catalog of academic journals? And if that's the case, what does it tell us about the priorities and values of contemporary academe?
My wife and I spent part of our honeymoon in Paris (where I am at the moment), and we had gorgeous weather the entire time. I've been back here without her three times since then, and the weather has been grey and gloomy (if not downright awful) each time. There's an obvious lesson to draw from this pattern of evidence (though the causality is murky), and I conclude that I need to stop coming here without her. Make a note....
But I digress. The hotel where I'm staying has rather primitive internet access, so posting will be light this week. I did manage to get online for a few minutes this morning, and caught up on some of the news. Two quick comments on things I read:
Via Sullivan and Yglesias, I picked up on Chris Beam's amusing essay in Slate asking "What if Political Scientists Covered the News?" Instead of breathlessly reporting every up-and-down in the news cycle, and trumpeting events that research has shown to be largely meaningless, as journalists tend to do, political scientists covering the news would undoubtedly provide better analysis of the underlying forces that shape political outcomes and help everyone see the forest for the trees. Political scientists would also be more inclined to discuss foreign policy in terms of long-term economic and demographic trends, underlying social forces within and across states, shifting balances of power, the role of interest groups, the impact of shifting normative discourses, etc. With perhaps a few exceptions, they'd be less inclined to highlight personalities and inside-the-Beltway gossip.
Movement in these directions would be an improvement. But speaking as a card-carrying political scientist who is (mostly) proud of my chosen profession, there'd be a pretty clear downside too. If political scientists wrote the news, we might see a lot of articles about trivial topics of little interest to anyone but a handful of other scholars. (Check out the next APSA annual program if you don't believe me). Moreover, most political scientists would be reluctant to tackle anything that might be controversial for fear that someone might say something mean about them in response. Journalists can be thin-skinned, but most academics are notoriously sensitive to even fair-minded criticism.
Here's a quick recommendation for all you terrorism mavens out there. The Chicago Project on Security and Terrorism (CPOST) has a fascinating and very useful website up and running, which you can access here.
According to its operators (a program headed by Professor Robert Pape), the site contains all known instances of suicide terrorism between 1981 and 2001, and will eventually be brought completely up to date. Three features of the site are especially interesting. First, it lets you perform interactive searches along multiple dimensions (location of attacks, the target type, the weapon used, demographic and biographical characteristics of attackers, etc.). For example, if you wanted to know how many suicide attacks were conducted by women in Kashmir between 1995 and 2000, you can enter those parameters and it will give you the results. Second, the site provides the external sources used to document each attack, so that you can check up on the coding of any specific incident. Third, each incident is linked to GPS data on location, so that you can explore the geographic patterns of contemporary suicide terrorism. On the latter point, by the way, the data shows that almost all these attacks are concentrated in Sri Lanka, Kashmir, Afpak, Iraq, and Israel/Palestine, a finding consistent with Pape's well-known argument that suicide terrorism is primarily a response to perceived foreign occupations.
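To give a concrete sense of what those interactive searches involve, here is a minimal sketch of the same sort of query written in pandas against a tiny made-up dataset. The column names are assumptions chosen for illustration, not CPOST's actual schema.

```python
# Sketch of a multi-dimensional filter like the CPOST site's search form.
# The dataset and column names here are hypothetical, for illustration only.
import pandas as pd

attacks = pd.DataFrame([
    {"year": 1996, "location": "Kashmir", "attacker_gender": "female"},
    {"year": 1999, "location": "Sri Lanka", "attacker_gender": "female"},
    {"year": 2000, "location": "Kashmir", "attacker_gender": "male"},
])

# Example query: suicide attacks carried out by women in Kashmir, 1995-2000
subset = attacks[
    (attacks["location"] == "Kashmir")
    & (attacks["attacker_gender"] == "female")
    & attacks["year"].between(1995, 2000)
]
print(len(subset))  # number of matching incidents in this toy dataset
```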
All in all, a very useful tool. But the skeptic in me has to ask the following question: will the existence of databases like this one tend to feed our fascination with conventional terrorism, a threat that is almost certainly exaggerated and overblown? (WMD terrorism is another matter, although we may be overstating that danger too). That's not a criticism of the Chicago Project -- which is doing excellent work -- it's just a warning to us all not to fixate on a phenomenon just because it's something we can count.
“Here is what I think a Hippocratic Oath for Quantitative Analysis in Security Studies should look like:
War is a human endeavor. I recognize that it is a phenomenon that does not conform to neat mathematical equations.
I will use quantitative analysis in conjunction with theory and qualitative analysis to describe what I see as phenomena in war and peace. I will be honest about the limits of both my theory and my analysis.
In war and peace, the variables are infinite, and not everything can be measured or assigned a numerical value.
I will not use numbers to signify what are fundamentally qualitative assessments without acknowledging to my reader that I have done so in order to satisfy a departmental requirement, gain tenure, or get published in the APSR. Or because I have been in graduate school for so long that I have forgotten how to effectively write in prose.
I recognize there are no mathematical equations in Vom Kriege and that it is nonetheless unlikely that my legacy will transcend that of Clausewitz.
I recognize that very few squad leaders in the 10th Mountain Division have ever taken a course in statistics yet probably know more about the conduct and realities of war than I do.”
Wise words indeed. I’d just add that Nobel prize-winning economist and strategic guru Thomas Schelling offered a similar warning in The Strategy of Conflict, cautioning against any tendency “to treat the subject of strategy as though it were, or should be, solely a branch of mathematics.”
That’s not to say that various types of mathematical analysis aren’t useful, whether one is talking about operations research, basic statistics, game theory, or whatever. But mathematics is just a tool, and it ought to be used in conjunction with other methods and with an appropriate degree of humility.
There has been an interesting flap in Cambridge this past week regarding some appalling remarks made by one Martin Kramer. As some of you undoubtedly know, Kramer is a hard-line Israeli-American commentator who has made something of a name for himself attacking the Middle East studies profession, and just about anyone who is remotely critical of Israel’s actions or the U.S.-Israeli “special relationship.” (Full disclosure: he’s taken various ill-aimed swipes at me in the past few years). He was an early supporter of Campus Watch (the organization Daniel Pipes founded to blacklist scholars it disapproved of), and Kramer has also sought to convince Congress to curtail or at least closely monitor the Title VI funding it provides to support Middle East studies and other area studies programs at American universities. He is affiliated with a number of right-of-center organizations in the United States and Israel, and for the past few years, he’s also been a research fellow at the Weatherhead Center for International Affairs here at Harvard, under the auspices of its National Security Studies program.
In any case, the ruckus started when it was revealed that Kramer had given a speech at the recent Herzliya Conference in Israel, where he advocated eliminating outside aid to Gazans (which he termed “pro-natal subsidies”) because -- according to him -- it encouraged them to reproduce, which led to the creation of what he termed “superfluous young males,” which, in turn, contributed to terrorism. He also suggested that Israel’s siege of Gaza was intended to deal with this problem. You can watch his remarks here, but the money quote is the following:
“Aging populations reject radical agenda and the Middle East is no different. Now eventually, this will happen among the Palestinians, too. But it will happen faster if the West stops providing pro-natal subsidies for Palestinians with refugee status. Those subsidies are one reason why in the ten years, from 1997 to 2007, Gaza's population grew by an astonishing 40%. At that rate, Gaza's population will double by 2030 to three million. Israel's present sanctions on Gaza have a political aim, undermine the Hamas regime, but they also break Gaza's runaway population growth and there is some evidence that they have. That may begin to crack the culture of martyrdom, which demands a constant supply of superfluous young men."
In other words, if Israel and the West can just keep those pesky Palestinians on a subsistence diet and stop them from having all those babies, the population will get increasingly older and smaller and the terrorism problem will eventually go away.
One rarely hears anyone make such horrific remarks in polite company here in the United States, especially someone associated with a college or university. Not surprisingly, Kramer’s remarks have stirred up a major controversy. Several prominent bloggers -- notably Ali Abunimah (who broke the story) and M.J. Rosenberg -- accused Kramer of advocating genocide. Juan Cole at Informed Comment referred to Kramer’s ideas as a form of eugenics, Richard Silverstein called it anti-Muslim racism, and a number of people complained to the leadership of the Weatherhead Center. I know that because I am on the center’s executive committee and I received several irate emails demanding that Harvard dismiss Kramer or least distance itself from him. In response, the center’s directors issued a statement saying, “It would be inappropriate for the Weatherhead Center to pass judgment on the personal political views of any of its affiliates, or to make affiliation contingent upon some political criterion. Exception may be made for statements that go beyond the boundaries of protected speech, but there is no sense in which Kramer's remarks could be considered to fall into this category.” They also said the charge that he was advocating genocide was “baseless.” The Harvard Crimson took a similar line, which you can read here.
I have three points to make about this matter.
First, although a good case can be made that Kramer’s remarks were tantamount to advocating genocide, I would not use that word to characterize them. The 1948 U.N. definition of genocide does include “imposing measures intended to prevent births within the group,” and Kramer’s call for an end to “pro-natal subsidies” is very close to that part of the definition. But despite my respect for Abunimah and Rosenberg, I think the word “genocide” has become a loaded term that gets tossed around too loosely, which makes it easy for Kramer and his defenders to portray legitimate criticism of his extreme views as over the top.
What word you use to describe his comments is actually not that important, because their substance is so offensive to any decent person that you don’t need to worry much about getting the right label for them. To illustrate this point, just imagine how Kramer would react if the Iranian government announced that it was worried its Jewish population (some 25,000 or so) was a potential “fifth column,” and that it was therefore imposing measures intended to discourage Iranian Jews from having more children? Or what if a prominent academic at Harvard declared that the United States had to make food scarcer for Hispanics so that they would have fewer children? Or what if someone at a prominent think tank noted that black Americans have higher crime rates than some other groups, and therefore it made good sense to put an end to Temporary Assistance for Needy Families (TANF) and other welfare programs, because that would discourage African-Americans from reproducing and thus constitute an effective anti-crime program? Americans of all persuasions would appropriately denounce such views as barbaric and racist, and that’s precisely how Kramer’s chilling remarks should be viewed.
Second, I take the issue of academic freedom very seriously and believe that the principle applies to Kramer, even though I found his remarks appalling. Thus, I believe that the Weatherhead administrators were correct in deflecting calls to dismiss him. (Some of you may recall that I thought that the head of Ben Gurion University of the Negev was wrong when she tried to censure Professor Neve Gordon, who is on her faculty and who called for a boycott of Israel. By the same logic, it would be wrong for Harvard officials to cut off Kramer because they disagreed with what he said or even found it offensive.)
But notice that the Weatherhead directors did not quite “refrain from passing judgment” on what Kramer said. The appropriate stance to adopt whenever a faculty member or affiliated researcher takes a controversial or unpopular position is strict neutrality; the institution, or its official representatives, should take no position at all about the validity of the person’s views. Therefore, they should have defended Kramer’s right to say what he did but refrained from commenting on whether the accusations against him were “baseless” or not.
It is also more than a little ironic that Kramer and his defenders are using the principle of “academic freedom” as a means of defense, given Kramer’s past efforts to bring external pressure to bear on academics who made arguments about the Middle East that he found objectionable.
Third, the principle of academic freedom does not prevent scholars from challenging Kramer’s racist ideas, and pointing out just how offensive they are. Nor does it prevent any of us -- and that includes academic administrators -- from questioning Kramer’s judgment on matters relating to U.S. Middle East policy or from questioning the judgment of anyone who thought that having him affiliate with Harvard was a good idea.
One final point. It is important to emphasize that many Israelis and most American Jews would undoubtedly find Kramer’s views offensive. At the same time, however, he is hardly an isolated extremist, or some messianic settler sitting in a trailer in an illegal outpost in the West Bank. On the contrary, he is an especially well-connected individual, with appointments at the Shalem Center in Jerusalem, the Washington Institute for Near East Policy, and of course Harvard. Moreover, he is not the only Israeli who has expressed such hateful views about the Palestinians. Of course, one can find equally hateful sentiments about Israeli Jews coming from Palestinians and Arabs. But the key difference is that they don’t hold appointments at prestigious institutions like Harvard.
Last week a colleague who has been facing repeated and unfair attacks in the media and the blogosphere (for making arguments that cut against the conventional wisdom) sent around an email asking a number of friends and associates (including me) for advice on how to deal with the attacks. Having been smeared in similar fashion myself, I circulated a list of the lessons I learned from my own experience with "grabbing the third rail." A few of the recipients thought the list was helpful, so I decided to revise it and post it here. If any readers are contemplating tackling a controversial subject -- and I hope some of you will -- you'll need to be ready should opponents decide not to address your arguments in a rational fashion, but to attack your character, misrepresent your position, and impugn your motives instead. If they take the low road, here are ten guidelines for dealing with it. (The advice itself is politically neutral: it applies regardless of the issue in question and no matter which side you're on.)
1. Think Through Your "Media Strategy" before You Go Public. If you are an academic taking on a "third rail" issue for the first time, you are likely to face a level of public and media scrutiny that you have never experienced before. It is therefore a good idea to think through your basic approach to the media before the firestorm hits. Are you willing to go on TV or radio to defend your views? Are there media outlets that you hope to cultivate, as well as some you should avoid?
Are you open to public debate on the issue, and if so, with whom? Do you plan a "full-court" media blitz to advance your position (an article, a book, a lecture tour, a set of op-eds, etc.), or do you intend to confine yourself to purely academic outlets and let the pundits take it from there? There is no right answer to these questions, of course, and how you answer them depends in good part on your own proclivities and those of your opponents. But planning ahead will leave you better prepared when the phone starts ringing off the hook and there's a reporter -- or even someone like Bill O'Reilly or Jon Stewart -- on the other end. Don't be afraid to seek professional advice here (e.g., from the media office at your university or research organization), especially if it's your first time in the shark tank. It's also a good idea to let your superiors know what's coming; deans, center directors, and college presidents don't like surprises.
2. You Have Less Control Than You Think. Although it helps to have thought about your strategy beforehand, there will always be surprises and you will have to think on your feet and improvise wisely. Sometimes real-world events will vindicate your position and enhance your credibility (as the 2006 Lebanon War did for my co-author and myself), but at other times you may have to explain why events aren't conforming to your position. A vicious attack may arrive from an unexpected source and leave you reeling, or you may get an unsolicited endorsement that validates your views. Bottom line: life is full of surprises, so be ready to roll with the punches and seize the opportunities.
3. Never Get Mad. Let your critics throw the mud, but you should always stick to the facts, especially when they are on your side. In my own case, many of the people who attacked me and my co-author proved to be unwitting allies, because they lost their cool in public or in print, made wild charges and ad hominem arguments, and generally acted in a transparently mean-spirited manner. It always works to your advantage when opponents act in an uncivil fashion, because it causes almost everyone else to swing your way.
Of course, it can be infuriating when critics misrepresent your work, and nobody likes to have malicious falsehoods broadcast about them. But the fact that someone is making false charges against you does not mean that others are persuaded by the malicious rhetoric. Most people are quite adept at separating facts from lies, and that is especially true when the charges are over-the-top. In short, the more ludicrous the charges, the more critics undermine their own case. So stick to the high ground; the view is nicer up there.
4. Don't Respond to Every Single Attack. A well-organized smear campaign will try to bury you in an avalanche of bogus charges, many of which are simply not worth answering. It is easier for opponents to dream up false charges than it is for you to refute each one, and you will exhaust yourself rebutting every critical word directed at you. So focus mainly on answering the more intelligent criticisms while ignoring the more outrageous ones, which you should treat with the contempt they deserve. Finally, make sure every one of your answers is measured and filled with the relevant facts. Do not engage in ad hominem attacks of any sort, no matter how tempting it may be to hit back.
5. Explain to Your Audience What Is Going On. When refuting bogus charges, make it clear to readers or viewers why your opponents are attacking you in underhanded ways. When you are the object of a politically motivated smear campaign, others need to understand that your critics are not objective referees offering disinterested commentary. Be sure to raise the obvious question: why are your opponents using smear tactics like guilt-by-association and name-calling to shut down genuine debate or discredit your views? Why are they unwilling to engage in a calm and rational exchange of ideas? Let others know that it is probably because your critics are aware that you have valid points to make and that many people will find your views persuasive if they get a chance to judge them for themselves.
6. The More Compelling Your Arguments Are, The Nastier the Attacks Will Be. If critics can refute your evidence or your logic, then that's what they will do and it will be very effective. However, if you have made a powerful case and there aren't any obvious weaknesses in it, your adversaries are likely to misrepresent what you have said and throw lots of mud at you. What else are they going to do when the evidence is against them?
This kind of behavior contrasts sharply with what one is accustomed to in academia, where well-crafted arguments are usually treated with respect, even by those who disagree with them. In the academic world, the better your arguments are, the more likely it is that critics will deal with them fairly. But if you are in a very public spat about a controversial issue like gay marriage or abortion or gun control, a solid and well-documented argument will probably attract more scurrilous attacks than a flimsy argument that is easily refuted. So be prepared.
7. You Need Allies. Anyone engaged on a controversial issue needs allies on both the professional and personal fronts. When the smearing starts, it is of enormous value to have friends and associates publicly stand up and defend you and your work. At the same time, support from colleagues, friends, and family is critical to maintaining one's morale. Facing a seemingly endless barrage of personal attacks as well as hostile and unfair criticisms of one's work can be exhausting and dispiriting, which is why you need others to stand behind you when the going gets tough. That does not mean you just want mindless cheerleaders, of course; sometimes allies help us the most when they warn us we are heading off course.
One more thing: if you're taking on a powerful set of opponents, don't be surprised or disappointed when people tell you privately that they agree with you and admire what you are doing, but never say so publicly. Be realistic; even basically good people are reluctant to take on powerful individuals or institutions, especially when they might pay a price for doing so.
8. Be Willing to Admit When You're Wrong, But Don't Adopt a Defensive Crouch. Nobody writing on a controversial and contested subject is infallible, and you're bound to make a mistake or two along the way. There's no harm in admitting to errors when they occur; indeed, harm is done when you make a mistake and then try to deny it. More generally, however, it makes good sense to make your case assertively and not shy away from engaging your critics. In short, the best defense is a smart offense, even when you are acknowledging errors or offering a correction. For illustrations of how my co-author and I tried to do this, see here, here, and here.
9. Challenging Orthodoxy Is a Form of "Asymmetric Conflict": You Win By "Not Losing." When someone challenges a taboo or takes on some well-entrenched conventional wisdom, his or her opponents invariably have the upper hand at first. They will seek to silence or discredit you as quickly as they can, so that your perspective, which they obviously won't like, does not gain any traction with the public. But this means that as long as you remain part of the debate, you're winning. Minds don't change overnight, and it is difficult to know how well an intellectual campaign is going at any particular point in time. So get ready for an emotional roller coaster -- some days you might think you're winning big, while other days the deck will appear to be stacked against you. But the real question is: are you still in the game?
The good news is that if you have facts and logic on your side, your position is almost certain to improve over time. It is also worth noting that a protracted debate allows you to refine your own arguments and figure out better ways to refute your opponents' claims. In brief, think of yourself as being engaged in a "long war," and keep striving.
10. Don't Forget to Feel Good about Yourself and the Enterprise in Which You Are Engaged. Waging a battle in which you are being unfairly attacked is hard work, and you will sometimes feel like Sisyphus rolling the proverbial stone endlessly uphill. But it can also be tremendously gratifying. You'll wage the struggle more effectively if you find ways to keep your spirits up, and if you never lose sight of the worthiness of your cause. Keeping your sense of humor intact helps too, because some of the attacks you will face are bound to be pretty comical. So while you're out there slaying your chosen dragon, make sure you have some fun too.
My copy of Mein Kampf sits on a shelf in my study, along with a couple of dozen books on World War II. It was the first book ever translated by the late Ralph Manheim (who also translated the works of Günter Grass and others) and published by Houghton Mifflin in 1943. I've used it to prepare lectures on the Second World War, where I quote a few of Hitler's more lurid and bizarre passages in order to convey to students the dangerous world-view from which Nazism sprang.
I mention this because authorities in Bavaria are reportedly trying to prevent new editions of this book from being published in Germany (where it has been banned), now that the original copyright (which is controlled by the Bavarian government) is about to run out. Their concern, which is understandable but in my view overstated, is that neo-Nazi groups will use the expiration of copyright as an opportunity to disseminate Hitler's hateful ideas anew.
I think this is a mistake. First, in addition to being filled with a lot of appalling racist claptrap, Mein Kampf is an awful book -- turgid, tedious, badly organized, and mostly boring -- so the danger that a new German edition will win a lot of new converts seems remote. Second, it's widely available in pirated versions on the Internet and in plenty of other countries (including the United States), so anybody with neo-Nazi sympathies can get a copy already.
Ever since Hiroshima, the role of nuclear weapons in international politics has been a central part of the security studies field. Think of the seminal works of Bernard Brodie, Albert Wohlstetter, and Thomas Schelling, as well as the somewhat less enduring but still important work of people like Pierre Gallois, William Kaufmann, Herman Kahn, Hedley Bull, and others. (If you want a real hoot, try to re-read Henry Kissinger's Nuclear Weapons and Foreign Policy (1957), the book that made his early reputation but has -- to put it politely -- not aged well.) Discussions of nuclear strategy were a cottage industry in the 1970s and 1980s (think Robert Jervis, Colin Gray, Desmond Ball, Bruce Blair, Paul Bracken, John Steinbruner, Ken Waltz, etc.), and former statesmen and policy wonks routinely weighed in on the issues of nuclear proliferation and arms control.
Indeed, when I got my first job at Princeton in 1984, I was hired in part to teach a course on nuclear weapons and arms control, and it routinely attracted 50-100 students. The Cold War was still going strong and the Reagan administration was raising the nuclear temperature in various ways, so concerns about nuclear weapons were front and center. Interest in the topic hasn't vanished entirely since then, but there's no course of that kind at Harvard these days (or at Princeton, for that matter), and I haven't detected much student demand for one. (That may also reflect the fact that there is only one regular faculty member in Harvard's Government Department whose main research interest is the study of war and peace, but that's another story.)