Super Forecasting – Super Analysis
Posted: December 18, 2015
A few weeks ago, I received Superforecasting to review and, given its provocative title, started reading it at once. By the time I finished it, I had dog-eared more than a dozen pages for re-reading, close to a personal record.
Let me give you the background. In a project funded by the US intelligence community, specifically the Intelligence Advanced Research Projects Activity (IARPA), a forecasting competition was set up among five teams. The focus was on the kinds of questions that intelligence agencies deal with every day: in other words, assessing the likelihood of certain events happening, based on research.
One of the teams was run by Professor Tetlock. Hundreds of questions about international affairs were put to the teams over a four-year span, producing over one million responses to evaluate, a researcher's dream. Superforecasting is an in-depth report on that multi-year study of political and economic forecasting, written in a clear, straightforward, even amusing style.
Putting aside its style, what the authors report is startling. Tetlock's team, the Good Judgment Project (GJP), was composed of "ordinary volunteers," several of whom are profiled in Superforecasting. Here is the official summary of the results of the competition:
“In Year One, GJP beat the official control group by 60%. In Year two, they beat the control group by 78%. After two years, GJP was doing so much better than its competitors that IARPA dropped the other teams.”
In other words, the amateur analysts not only beat the professionals, but got better at what they were doing year over year (thus eliminating the possibility of chance driving the first year results).
How did they do that? The long answer is the core of Superforecasting; the short answer is that good forecasting, i.e., analysis, involves the following, among other steps:
- Get your evidence (data) from a variety of sources;
- Learn to think (and then think) probabilistically (math skills seem highly correlated with super forecasting);
- “Unpack” the questions to be answered into smaller components;
- Synthesize a wide variety of different views into a single vision;
- Work in teams, provided that they are truly diverse;
- Keep (honest) scores; and
- Be willing to admit error and to change direction (quickly).
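The "keep (honest) scores" step is worth dwelling on: the IARPA tournament scored forecasters with Brier scores, the squared error between a stated probability and what actually happened. A minimal sketch of that scoring rule (the function name and examples are my own illustration, not from the book):

```python
def brier_score(forecast: float, outcome: int) -> float:
    """Brier score for a yes/no question, summed over both outcomes.

    forecast: stated probability that the event happens (0.0 to 1.0).
    outcome:  1 if the event happened, 0 if it did not.
    Lower is better: 0.0 is a perfect call, 2.0 the worst possible.
    """
    # Squared error on the event, plus squared error on its complement.
    return (forecast - outcome) ** 2 + ((1 - forecast) - (1 - outcome)) ** 2

# A forecaster who said 70% when the event happened:
print(round(brier_score(0.70, 1), 2))  # 0.18
# The same 70% call when the event did not happen:
print(round(brier_score(0.70, 0), 2))  # 0.98
# Hedging at 50% scores 0.5 either way.
print(round(brier_score(0.50, 1), 2))  # 0.5
```

The scheme rewards both calibration (your 70% calls should come true about 70% of the time) and decisiveness: a perpetual 50% fence-sitter can never beat a well-calibrated forecaster who commits.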
If that sounds simple, it is; yet it is rarely done this way. "Superforecasters" actually follow these rules, instead of merely acknowledging that they exist, as so many analysts do. Superforecasting gives good, solid tips on how to do all of this (and more).
For those working as CI analysts, whether doing your own analysis or doing it for others, this is a must-read. Those to whom strategic and competitive intelligence is being provided should read it too, sooner rather than later, so that they can become better and more critical intelligence end-users.
Philip E. Tetlock and Dan Gardner, Superforecasting: The Art and Science of Prediction, Crown Publishing, New York, 2015, 341 pages, $28.00.
Anyone teaching intelligence analysis should immediately immerse themselves in the methodology to see how to strengthen their teaching of analysis.
For example, the appendix with the "Ten Commandments for Aspiring Superforecasters" actually contains eleven commandments. Is that a tip of the hat to Monty Python?
This should dispel the common misconception that better CI analysis is always provided by analysts who are more familiar with the industry or the company being served.