Philip Tetlock’s recent Superforecasting says, basically, that some people do better at forecasting than others and, furthermore, that networking higher-performing forecasters and giving them access to pooled data can produce impressive results.
This is a change from Tetlock’s first study – Expert Political Judgment – which lasted about twenty years and concluded, famously, that “the average expert was roughly as accurate as a dart-throwing chimpanzee.”
Tetlock’s recent research comes out of a tournament sponsored by the Intelligence Advanced Research Projects Activity (IARPA). This forecasting competition fits with the mission of IARPA, which is to improve assessments by the “intelligence community,” or IC. The IC is a generic label, according to Tetlock, for “the Central Intelligence Agency, the National Security Agency, the Defense Intelligence Agency, and thirteen other agencies.”
It is relevant that the IC is surmised (exact figures are classified) to have “a budget of more than $50 billion … [and employ] one hundred thousand people.”
Thus, “Think how shocking it would be to the intelligence professionals who have spent their lives forecasting geopolitical events – to be beaten by a few hundred ordinary people and some simple algorithms.”
Of course, Tetlock reports, this actually happened – “Thanks to IARPA, we now know a few hundred ordinary people and some simple math can not only compete with professionals supported by a multibillion-dollar apparatus but also beat them.”
IARPA’s motivation, apparently, traces back to the “weapons of mass destruction (WMD)” uproar surrounding the Iraq war –
“After invading in 2003, the United States turned Iraq upside down looking for WMDs but found nothing. It was one of the worst – arguably the worst – intelligence failure in modern history. The IC was humiliated. There were condemnations in the media, official investigations, and the familiar ritual of intelligence officials sitting in hearings …”
So the IC needs improved methods, including utilizing “the wisdom of crowds” and practices of Tetlock’s “superforecaster” teams.
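To make “the wisdom of crowds” concrete, here is a minimal sketch, in Python, of the simplest version of the idea – pooling independent probability estimates by averaging them. The question and the individual forecasts below are invented for illustration; the tournament’s actual aggregation methods were more sophisticated, but the basic intuition is the same.

```python
# Minimal "wisdom of crowds" sketch: pool independent probability
# forecasts by averaging them. The forecast values are invented for
# illustration; real tournaments use more sophisticated aggregation.

def pool_forecasts(probabilities):
    """Return the simple average of a list of probability estimates."""
    return sum(probabilities) / len(probabilities)

# Hypothetical question: "Will event X happen by year-end?"
individual_forecasts = [0.60, 0.75, 0.55, 0.80, 0.65]

crowd_estimate = pool_forecasts(individual_forecasts)
print(f"Pooled probability: {crowd_estimate:.2f}")  # prints 0.67
```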
Unlike the famous M-competitions, the IARPA tournament collates subjective assessments of geopolitical risk, such as “Will there be a fatal confrontation between vessels in the South China Sea?” or “Will either the French or Swiss inquiries find elevated levels of polonium in the remains of Yasser Arafat’s body?”
Tetlock’s book is entertaining and thought-provoking, but many in business will page directly to the Appendix – Ten Commandments for Aspiring Superforecasters.
- Triage – focus on questions which are in the “Goldilocks” zone where effort pays off the most.
- Break seemingly intractable problems into tractable sub-problems. Tetlock really explicates this recommendation with his discussion of “Fermi-izing” questions such as “How many piano tuners are there in Chicago?” The reference here, of course, is to Enrico Fermi, the nuclear physicist (a rough sketch of this kind of decomposition appears after this list).
- Strike the right balance between inside and outside views. The outside view, as I understand it, is essentially the base rate – the “big picture.” If you are trying to understand the likelihood of a terrorist attack, ask how many terrorist attacks have occurred in similar locations in the past ten years. The inside view then brings in facts about this particular time and place that adjust the quantitative risk estimate.
- Strike the right balance between under- and overreacting to evidence. The problem with a precept like this is that its reversal is obviously false – nobody would suggest “do not strike the right balance between under- and overreacting to evidence” – which makes it close to a truism. I guess the takeaway is to keep the weight of the evidence in mind.
- Look for clashing causal forces at work in each problem. This reminds me of one of my own models for predicting real-world developments – tracing out “threads,” or causal pathways. When several of these threads or chains of events converge, possibility can develop into likelihood. You have to be a “fox” (rather than a hedgehog) to do this effectively – open to diverse perspectives on what drives people and how things happen.
- Strive to distinguish as many degrees of doubt as the problem permits but no more. Another precept that could be cast as a truism, but it points to an interesting discussion in the book of how the IC now brings quantitative probability estimates to the table when questions – such as where Osama bin Laden was living – come under discussion.
- Strike the right balance between under- and overconfidence, between prudence and decisiveness. I really don’t see the particular value of this guideline, except that it prompts you to ask whether you are being overconfident or indecisive. Worth some thought, I suppose.
- Look for the errors behind your mistakes but beware of rearview-mirror hindsight biases. I had an intellectual mentor who served in the Marines and who was fond of saying, “we are always fighting the last war.” In this regard, I’m fond of the saying, “the only certain thing about the future is that there will be surprises.”
- Bring out the best in others and let others bring out the best in you. Tetlock’s following sentence is more to the point – “master the fine art of team management.”
- Master the error-balancing bicycle. Good to think about managing this, too.
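As promised above, here is a rough sketch of “Fermi-izing” the piano tuner question – breaking one seemingly unanswerable estimate into sub-estimates you can actually reason about, then multiplying them through. Every input number is an illustrative guess, not a researched figure.

```python
# "Fermi-izing" the classic question: how many piano tuners are there
# in Chicago? Break the big unknown into smaller, guessable pieces.
# Every number here is an illustrative guess, not a researched figure.

chicago_population = 2_700_000          # rough population of Chicago
people_per_household = 2.5              # average household size (guess)
share_of_households_with_piano = 0.05   # 1 in 20 households (guess)
tunings_per_piano_per_year = 1          # pianos tuned about once a year
tunings_per_tuner_per_year = 1000       # ~4 tunings/day, ~250 working days

households = chicago_population / people_per_household
pianos = households * share_of_households_with_piano
tunings_needed = pianos * tunings_per_piano_per_year
tuners = tunings_needed / tunings_per_tuner_per_year

print(f"Estimated piano tuners in Chicago: {tuners:.0f}")  # roughly 54
```

The point is not the final number but that each sub-estimate can be examined, defended, and revised on its own.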
Puckishly, Tetlock adds an 11th Commandment – don’t treat commandments as commandments.
Great topic – forecasting subjective geopolitical developments in teams. Superforecasting touches on some fairly subtle points, illustrated with examples. I think it is well worth having on the bookshelf.
There are some corkers, too, like when Tetlock highlights the recommendations of Galen, the second-century physician to Roman emperors and the medical authority for more than a thousand years.
Galen once wrote, apparently,
“All who drink of this treatment recover in a short time, except those whom it does not help, who all die…It is obvious, therefore, that it fails only in incurable cases.”