
Can Subjective Probability Be Expressed As A Number? What Does The CIA Say?

The Psychology of Intelligence Analysis summary:

This volume pulls together and republishes, with some editing, updating, and additions, articles written during 1978–86 for internal use within the CIA Directorate of Intelligence. The information is relatively timeless and still relevant to the never-ending quest for better analysis. The articles are based on reviewing the cognitive psychology literature concerning how people process incomplete and ambiguous information to make judgments. Richards Heuer has selected the experiments and findings that seem most relevant to intelligence analysis and most in need of communication to intelligence analysts. He then translates the technical reports into language that intelligence analysts can understand and interprets the relevance of these findings to the problems intelligence analysts face.

Money quote, Chapter 12, pages 152–156:

Expression of Uncertainty

Probabilities may be expressed in two ways. Statistical probabilities are based on empirical evidence concerning relative frequencies. Most intelligence judgments deal with one-of-a-kind situations for which it is impossible to assign a statistical probability. Another approach commonly used in intelligence analysis is to make a “subjective probability” or “personal probability” judgment. Such a judgment is an expression of the analyst’s personal belief that a certain explanation or estimate is correct. It is comparable to a judgment that a horse has a three-to-one chance of winning a race.

Verbal expressions of uncertainty—such as “possible,” “probable,” “unlikely,” “may,” and “could”—are a form of subjective probability judgment, but they have long been recognized as sources of ambiguity and misunderstanding. To say that something could happen or is possible may refer to anything from a 1-percent to a 99-percent probability. To express themselves clearly, analysts must learn to routinely communicate uncertainty using the language of numerical probability or odds ratios. As explained in Chapter 2 on “Perception,” people tend to see what they expect to see, and new information is typically assimilated to existing beliefs. This is especially true when dealing with verbal expressions of uncertainty.

By themselves, these expressions have no clear meaning. They are empty shells. The reader or listener fills them with meaning through the context in which they are used and what is already in the reader’s or listener’s mind about that context. When intelligence conclusions are couched in ambiguous terms, a reader’s interpretation of the conclusions will be biased in favor of consistency with what the reader already believes. This may be one reason why many intelligence consumers say they do not learn much from intelligence reports.

It is easy to demonstrate this phenomenon in training courses for analysts. Give students a short intelligence report, have them underline all expressions of uncertainty, then have them express their understanding of the report by writing above each expression of uncertainty the numerical probability they believe was intended by the writer of the report. This is an excellent learning experience, as the differences among students in how they understand the report are typically so great as to be quite memorable.

In one experiment, an intelligence analyst was asked to substitute numerical probability estimates for the verbal qualifiers in one of his own earlier articles. The first statement was: “The cease-fire is holding but could be broken within a week.” The analyst said he meant there was about a 30-percent chance the cease-fire would be broken within a week. Another analyst who had helped this analyst prepare the article said she thought there was about an 80-percent chance that the cease-fire would be broken. Yet, when working together on the report, both analysts had believed they were in agreement about what could happen. Obviously, the analysts had not even communicated effectively with each other, let alone with the readers of their report.

Sherman Kent, the first director of CIA’s Office of National Estimates, was one of the first to recognize problems of communication caused by imprecise statements of uncertainty. Unfortunately, several decades after Kent was first jolted by how policymakers interpreted the term “serious possibility” in a national estimate, this miscommunication between analysts and policymakers, and between analysts, is still a common occurrence.

I personally recall an ongoing debate with a colleague over the bona fides of a very important source. I argued he was probably bona fide. My colleague contended that the source was probably under hostile control. After several months of periodic disagreement, I finally asked my colleague to put a number on it. He said there was at least a 51-percent chance of the source being under hostile control. I said there was at least a 51-percent chance of his being bona fide. Obviously, we agreed that there was a great deal of uncertainty. That stopped our disagreement. The problem was not a major difference of opinion, but the ambiguity of the term probable.

The table in Figure 18 shows the results of an experiment with 23 NATO military officers accustomed to reading intelligence reports. They were given a number of sentences such as: “It is highly unlikely that. . . .” All the sentences were the same except that the verbal expressions of probability changed. The officers were asked what percentage probability they would attribute to each statement if they read it in an intelligence report. Each dot in the table represents one officer’s probability assignment.

While there was broad consensus about the meaning of “better than even,” there was a wide disparity in interpretation of other probability expressions. The shaded areas in the table show the ranges proposed by Kent.
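The fix Heuer recommends later in the chapter is to attach a numeric range to each verbal qualifier. As a sketch of what that might look like in code: the table below is my paraphrase of the ranges commonly attributed to Kent's "Words of Estimative Probability" scheme, not a transcription of Figure 18, so the exact boundaries are an assumption.

```python
# Illustrative mapping of Kent-style estimative phrases to percentage
# ranges. The specific numbers are an assumption based on commonly
# cited versions of Kent's scheme, not a quote from Heuer's figure.
KENT_RANGES = {
    "almost certain":       (87, 99),
    "probable":             (63, 87),
    "chances about even":   (40, 60),
    "probably not":         (20, 40),
    "almost certainly not": (2, 12),
}

def numeric_qualifier(phrase: str) -> str:
    """Render a verbal expression with its numeric range appended,
    e.g. 'probable (63 to 87 percent)'."""
    low, high = KENT_RANGES[phrase]
    return f"{phrase} ({low} to {high} percent)"
```

Appending the parenthetical range, as in `numeric_qualifier("probable")`, is exactly the convention Heuer suggests below for avoiding misinterpretation.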

The main point is that an intelligence report may have no impact on the reader if it is couched in such ambiguous language that the reader can easily interpret it as consistent with his or her own preconceptions. This ambiguity can be especially troubling when dealing with low-probability, high-impact dangers against which policymakers may wish to make contingency plans.

Consider, for example, a report that there is little chance of a terrorist attack against the American Embassy in Cairo at this time. If the Ambassador’s preconception is that there is no more than a one-in-a-hundred chance, he may elect to not do very much. If the Ambassador’s preconception is that there may be as much as a one-in-four chance of an attack, he may decide to do quite a bit.

The term “little chance” is consistent with either of those interpretations, and there is no way to know what the report writer meant. Another potential ambiguity is the phrase “at this time.” Shortening the time frame for prediction lowers the probability, but may not decrease the need for preventive measures or contingency planning.

An event for which the timing is unpredictable may “at this time” have only a 5-percent probability of occurring during the coming month, but a 60-percent probability if the time frame is extended to one year (5 percent per month for 12 months). How can analysts express uncertainty without being unclear about how certain they are? Putting a numerical qualifier in parentheses after the phrase expressing degree of uncertainty is an appropriate means of avoiding misinterpretation. This may be an odds ratio (less than a one-in-four chance) or a percentage range (5 to 20 percent) or (less than 20 percent). Odds ratios are often preferable, as most people have a better intuitive understanding of odds than of percentages.
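A side note on the arithmetic in the quote: "5 percent per month for 12 months" is a linear shortcut. If the months are treated as independent trials, the cumulative probability is 1 − (1 − p)^n, which comes out closer to 46 percent than 60. A minimal sketch:

```python
def cumulative_probability(p_monthly: float, months: int) -> float:
    """Probability the event occurs at least once over `months`
    independent months, each with probability `p_monthly`."""
    return 1 - (1 - p_monthly) ** months

# The quote's linear shortcut (5% x 12 = 60%) slightly overstates
# the compounded figure under an independence assumption:
print(round(cumulative_probability(0.05, 12), 2))  # 0.46
```

Either way, the quote's larger point stands: stretching the time frame raises the probability substantially, so "at this time" matters.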

I’ll probably (heh) use the ranges in the figure to do Bayesian updates in the app I’m coding.
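For the Bayesian updates, the odds form is the natural fit with Heuer's preference for odds ratios over percentages. This is a generic sketch of such an update, not code from the app; the starting point of 0.75 for "probable" is just an illustrative midpoint of a verbal range.

```python
def bayes_update(prior: float, likelihood_ratio: float) -> float:
    """Update a probability given a likelihood ratio
    P(evidence | hypothesis) / P(evidence | not hypothesis),
    working in odds form: posterior_odds = prior_odds * LR."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Start from the midpoint of a verbal range, e.g. "probable" ~ 0.75,
# and update on evidence twice as likely under the hypothesis:
print(round(bayes_update(0.75, 2.0), 3))  # 0.857
```

Working in odds makes the update a single multiplication, which also keeps the intermediate numbers easy to sanity-check by hand.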


Posted on December 27, 2017 in Bayes
