Ignorance Is Not Always Bliss

Contrary to the views of some skeptics, I think that political science deserves the second half of its name, and I therefore consider myself to be a working scientist. The longer I’ve worked at it, though, the more I wonder if that status isn’t as much a curse as a blessing. After more than 20 years of wrestling with a few big questions, I’m starting to believe that the answers to those questions are fundamentally unknowable, and permanent ignorance is a frustrating basis for a career.

To see what I’m getting at, it’s important to understand what I take science to be. In a book called Ignorance, neurobiologist Stuart Firestein rightly challenges the popular belief that science is a body of accumulated knowledge. Instead, Firestein portrays scientists as explorers—“feeling around in dark rooms, bumping into unidentifiable things, looking for barely perceptible phantoms”—who prize questions over answers.

Working scientists don’t get bogged down in the factual swamp because they don’t care all that much for facts. It’s not that they discount or ignore them, but rather that they don’t see them as an end in themselves. They don’t stop at the facts; they begin there, right beyond the facts, where the facts run out.

What differentiates science from philosophy is that scientists then try to answer those questions empirically, through careful observation and experimentation. We know in advance that the answers we get will be unreliable and impermanent—“The known is never safe,” Firestein writes; “it is never quite sufficient”—but the science is in the trying.

The problem with social science is that it is nearly always impossible to run the kinds of experiments that would provide us with even the tentative knowledge we need to develop a fresh set of interesting questions. It’s not that experiments are impossible; they aren’t, and some social scientists are working hard to do them better. Instead, as Jim Manzi has cogently argued, the problem is that it’s exceptionally difficult to generalize from social-scientific experiments, because the potential causes are so numerous and so complex, and the underlying system, if there even is such a thing, is continually evolving.

This problem is on vivid display in a recent Big Think blog post in which eight researchers billed as some of the world’s “top young economists” identify what they see as their discipline’s biggest unanswered questions. The first entry begins with the question, “Why are developing countries poor?” The flip side of that question is, of course, “Why are rich countries rich?”, and if you put the two together, you get “What makes some economies grow faster than others?” That is surely the most fundamental riddle of macroeconomics, and yet the sense I get from empirical economists is that, after centuries of inquiry, we still just don’t know.

My own primary field of political development and democratization suffers from the same problem. After several decades of pondering why some countries have democratic governments while others don’t, the only thing we really know is that we still don’t know. When we pore over large data sets, we see a few strong correlations, but those correlations can’t directly explain the occurrence of relevant changes in specific cases. What’s more, so many factors are so deeply intertwined with each other that it’s really impossible to say which causes which. When we narrow our focus to clusters of more comparable cases—say, the countries of Eastern Europe after the collapse of Communism—we catch glimpses of things that look more like causal mechanisms, but the historical specificity of the conditions that made those cases comparable ensures that we can never really generalize even those ephemeral inferences.
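
To make the entanglement concrete, here is a minimal sketch of the kind of pattern that greets us in those cross-national tables. Everything in it is invented for illustration: a toy Python simulation (using only numpy) in which a shared “history” drives income, education, and democracy alike, so all three correlate strongly while the correlations themselves say nothing about which, if any, is doing the causal work.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 160  # a hypothetical sample of countries

    # A latent historical factor drives both putative causes and the outcome.
    history = rng.normal(size=n)
    income = history + rng.normal(scale=0.5, size=n)     # stand-in for GDP per capita
    education = history + rng.normal(scale=0.5, size=n)  # stand-in for average schooling
    democracy = history + rng.normal(scale=0.7, size=n)  # stand-in for a democracy index

    # All three pairwise correlations come out strong, and the matrix would look
    # the same whether income causes democracy, education does, both do, or neither.
    print(np.round(np.corrcoef([income, education, democracy]), 2))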

It’s tempting to think that smarter experimentation will overcome or at least ameliorate this problem, but on broad questions of political and economic development, I’m not buying it. Take the question of whether U.S.-funded programs aimed at promoting democracy in other countries actually produce the desired effects. This sounds like a problem amenable to experimental design (what effect does intervention X have on observable phenomenon Y?), but it really isn’t. Yes, we can design and sometimes even implement randomized controlled trials (RCTs) to try to evaluate the impacts of individual interventions under specific conditions. As Jennifer Gauck has convincingly argued, however, it’s virtually impossible to get clear answers to the original macro-level questions from the micro-level analyses that RCTs on this topic must entail, when the micro-to-macro linkages are themselves unknown. Add thick layers of politicization, power struggles, and real-time learning, and it’s hard to see how even well-designed RCTs can push us off of old questions onto new ones.
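
For a sense of what a single trial can and cannot deliver, here is a sketch of the basic arithmetic of an RCT, again with invented numbers rather than results from any real program: a difference in means between treated and control communities, with a rough standard error. The estimate answers “did intervention X move indicator Y in these places at this time” and nothing more; the macro-level question stays untouched.

    import numpy as np

    rng = np.random.default_rng(7)

    # Hypothetical scores on some local-participation index for 100 control and
    # 100 treated communities in one country in one year; the "true" effect is
    # built in at 3 points.
    control = rng.normal(loc=50.0, scale=10.0, size=100)
    treated = rng.normal(loc=53.0, scale=10.0, size=100)

    effect = treated.mean() - control.mean()
    se = np.sqrt(treated.var(ddof=1) / treated.size + control.var(ddof=1) / control.size)
    print(f"estimated effect: {effect:.1f} +/- {1.96 * se:.1f}")
    # Even a clean estimate like this one is local: change the country, the
    # program design, or the year, and there is no principled way to carry it over.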

I’m not sure where this puts me. To be honest, I increasingly wonder if my attraction to forecasting has less to do with the lofty scientific objective of using predictions to hone theories and more to do with the comfort of working on a more tractable problem. I know I can never really answer the big questions, and my attempts to do so sometimes leave me feeling like I’m trying to bail out the ocean, pouring one bucket at a time onto the sand in hopes of one day catching a glimpse of the contours of the floor below. By contrast, forecasting at least provides a yardstick against which I can assess the incremental value of specific projects. On a day-to-day basis, the resulting sense (illusion?) of progress provides a visceral feeling of accomplishment and satisfaction that is missing when I offer impossibly uncertain answers to deeper questions of cause and effect. And, of course, the day-to-day world is the one I actually have to inhabit.
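
That yardstick can be as simple as a proper scoring rule. The sketch below, with made-up forecasts and outcomes, shows the sort of comparison I have in mind: score two sets of probabilistic forecasts of the same events with the Brier score and see which sits closer to zero.

    import numpy as np

    def brier_score(probabilities, outcomes):
        """Mean squared error between forecast probabilities and 0/1 outcomes; lower is better."""
        p = np.asarray(probabilities, dtype=float)
        y = np.asarray(outcomes, dtype=float)
        return float(np.mean((p - y) ** 2))

    # Invented forecasts from a baseline model and a revised one, for five events.
    outcomes = [1, 0, 0, 1, 0]
    baseline = [0.6, 0.4, 0.3, 0.5, 0.2]
    revised  = [0.7, 0.3, 0.2, 0.6, 0.2]

    print(brier_score(baseline, outcomes))  # about 0.14
    print(brier_score(revised, outcomes))   # about 0.08, a small but legible improvement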

I’d really like to end this post on a hopeful note, but today I’m feeling defeated. So, done.
