
Thursday, December 13, 2007

Changing Our Minds About Iran

There appear to be mathematical answers to the question of when we should change our minds about the threat of a nuclear Iran ... and the results may surprise you.

Those worried that the recent NIE is wrong in estimating that Iran is no longer pursuing a nuclear weapons program have grounds for concern: intelligence assessments are often wrong. That may not square with the glamorous movie portrayals of spies and espionage, but that’s how it is.

...

Whether we like it or not, there are limits to what intelligence can know at any one time. The inescapable uncertainties may make it impossible to settle the status of Iran’s nuclear program “once and for all”. As with the Soviet Union, changes in the situation and in leadership happen all the time, and honest analysts must keep revising the picture as new information comes to light. While Washington politics brands any change in an intelligence estimate as ‘lying’ or incompetence, the plain fact is that altering assessments is endemic to the process. An unchanging intelligence picture is a wrong picture. Changing your mind is a natural thing to do.

What’s needed is a way to keep improving the picture with each successive measurement. Bruce Blair, a former Senior Fellow at the Brookings Institution, notes that as long as “changing one’s mind” is done scientifically, using the mathematical tool known as Bayesian analysis, the result is a more accurate intelligence estimate.

...
Let’s look at an example of how repeated measurement affects our estimate of how likely a threat is. Suppose the US received a single report, like an NIE, stating that dictator X had given up his WMD program after a long series of reports claiming the contrary. Should its leaders believe it? The answer is: not right away.

If the leader interpreting the intelligence reports holds the initial opinion that it is virtually certain that the dictator is amassing mass-destruction weapons – an opinion that may be expressed as a subjective expectation or probability of, say, 99.9 percent – then what new opinion should the leader reach if the intelligence community (or the head of a UN inspection team) weighs in with a new comprehensive assessment that finds no reliable evidence of actual production or stockpiling?
Adhering to the tenets of Bayes’ formula, the leader would combine the intelligence report with the previous opinion to produce a revised expectation. Upon applying the relevant rule of inductive reasoning, which takes into account the 25 percent error rates, the leader’s personal subjective probability estimate (the previous opinion) would logically decline from 99.9 percent to 99.7 percent! The leader would remain highly suspicious, to put it mildly, indeed very convinced of the dictator’s deceit. …
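For readers who want to check the arithmetic: writing W for “the dictator has an active weapons program” and N for “a report finds no weapons,” and assuming (to reproduce Blair’s numbers) that the 25 percent error rate applies in both directions, Bayes’ formula gives

$$
P(W \mid N) \;=\; \frac{P(N \mid W)\,P(W)}{P(N \mid W)\,P(W) + P(N \mid \bar{W})\,P(\bar{W})}
\;=\; \frac{0.25 \times 0.999}{0.25 \times 0.999 + 0.75 \times 0.001}
\;\approx\; 0.997,
$$

which is the drop from 99.9 percent to 99.7 percent described above.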

Believe it or not, a rational leader could receive four negative reports in a row from the spy agencies and still harbor deep suspicion of the dictator, because the leader’s logically revised degree of belief that the dictator was amassing weapons would only fall to 92.5 percent.
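To see how the suspicion erodes one report at a time, here is a small sketch of the repeated update. It is our own illustration, not Blair’s code, and it assumes (as above) a 25 percent error rate in both directions and independent reports:

```python
# Repeated Bayesian updating of the leader's belief that the dictator
# has an active WMD program, after successive "no weapons" reports.
# Assumption (ours, to match Blair's numbers): each report has a 25%
# chance of being wrong in either direction, and reports are independent.

def update(prior, error_rate=0.25):
    """Posterior P(weapons) after one report that finds no weapons."""
    p_neg_given_weapons = error_rate          # false negative
    p_neg_given_clean = 1.0 - error_rate      # true negative
    numerator = p_neg_given_weapons * prior
    return numerator / (numerator + p_neg_given_clean * (1.0 - prior))

belief = 0.999  # the leader's initial subjective probability
for n in range(1, 5):
    belief = update(belief)
    print(f"after report {n}: {belief:.3f}")
# Prints roughly 0.997, 0.991, 0.974, 0.925; the last figure matches
# the 92.5 percent quoted above.
```

Put another way, each negative report multiplies the odds in favor of a weapons program by 0.25/0.75 = 1/3, so four reports cut the initial 999-to-1 odds down to roughly 12-to-1, or about 92.5 percent.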
