Forecasts derived from prediction markets did an excellent job predicting last night’s Academy Awards.
PredictWise uses odds from online bookmaker Betfair for its Oscars forecasts, and it nearly ran the table. PredictWise assigned the highest probability to the eventual winner in 21 of 24 awards, and its three “misses” came in less prominent categories (Best Documentary, Best Short, Best Animated Short). Even more telling, its calibration was excellent. The probability assigned to the eventual winner in each category averaged 87 percent, and most winners were correctly identified as nearly sure things.
Inkling Markets also did quite well. This public, play-money prediction market has a lot less liquidity than Betfair, but it still assigned the highest probability to the eventual winner in 17 of the 18 categories it covered (it "missed" on Best Original Song), and, for extra credit, it correctly identified Gravity as the film that would probably win the most Oscars. Just by eyeballing, it's clear that Inkling's calibration wasn't as good as PredictWise's, but that's what we'd expect from a market with a much smaller pool of participants. In any case, you still probably would have won your Oscars pool if you'd relied on it.
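For anyone who wants to run this kind of scorecard themselves, here's a minimal sketch of the two checks used above: how often the market favorite actually won, and how much probability was put on the eventual winners. The categories and numbers are invented placeholders for illustration, not the actual PredictWise or Inkling data.

# Scoring probabilistic forecasts against known outcomes.
# Each entry: (probability the market gave the eventual winner,
# whether that winner was also the market favorite).
# All figures below are hypothetical.
forecasts = {
    "Best Picture": (0.95, True),
    "Best Director": (0.90, True),
    "Best Documentary": (0.40, False),
}

n = len(forecasts)
hits = sum(1 for _, favorite in forecasts.values() if favorite)
avg_winner_prob = sum(p for p, _ in forecasts.values()) / n

# Brier-style penalty on the winner's probability: 0.0 is perfect;
# small values mean winners were priced as near-certainties.
brier = sum((1.0 - p) ** 2 for p, _ in forecasts.values()) / n

print(f"favorite won in {hits} of {n} categories")
print(f"average probability on the eventual winner: {avg_winner_prob:.0%}")
print(f"Brier-style score on winners: {brier:.3f}")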
This is the umpteen-gajillionth reminder that crowds are powerful forecasters. "When our imperfect judgments are aggregated in the right way," James Surowiecki wrote in The Wisdom of Crowds (p. xiv), "our collective intelligence is often excellent." Or, as PredictWise's David Rothschild said in his live blog last night:
This is another case of pundits and insiders advertising a close event when the proper aggregation of data said it was not. As I noted on Twitter earlier, my acceptance speech is short. I would like to thank prediction markets for efficiently aggregating dispersed and idiosyncratic data.
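As a rough illustration of what "aggregated in the right way" can mean, here's a toy sketch of a linear opinion pool: average several individuals' probability estimates, then extremize the mean, a standard adjustment from the forecasting literature. This is not how Betfair or Inkling prices actually form; the function and all numbers are invented for illustration.

# Toy linear opinion pool with an extremizing step.
def pool(probs, push=1.5):
    """Average the probabilities, then push the mean away from 0.5
    on the odds scale, to offset individuals' shared but under-used
    information. 'push' is an illustrative tuning parameter."""
    mean = sum(probs) / len(probs)
    odds = (mean / (1.0 - mean)) ** push  # extremize on the odds scale
    return odds / (1.0 + odds)

# Four imperfect individual judgments about the same award:
print(f"{pool([0.65, 0.75, 0.70, 0.60]):.2f}")  # ~0.75, vs. a raw mean of 0.68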