Using the analytic techniques contained in this primer will assist analysts in dealing with the perennial problems of intelligence: the complexity of international developments, incomplete and ambiguous information, and the inherent limitations of the human mind. This is what cognitive science calls "prejudices". It's not a pejorative word: prejudices are what allow us to fill gaps in the information with data that is generally correct.
Understanding the intentions and capabilities of adversaries and other foreign actors is challenging, especially when either or both are concealed.
The first hurdle for analysts is identifying the relevant and diagnostic information in the ever-increasing volume of ambiguous and contradictory data acquired through open-source and clandestine means. Analysts must also pierce the shroud of secrecy—and sometimes deception—that state and nonstate actors use to mislead. A systematic approach that considers a range of alternative explanations and outcomes offers one way to ensure that analysts do not dismiss potentially relevant hypotheses and supporting information, resulting in missed opportunities to warn.
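One "systematic approach" of this kind in the Tradecraft literature is Analysis of Competing Hypotheses (ACH): score each piece of evidence against every hypothesis and rank hypotheses by how much evidence argues *against* them, rather than settling on the first plausible story. The sketch below is purely illustrative; the hypotheses, evidence, and scores are made up for the example:

```python
# A minimal ACH-style consistency matrix (all data here is hypothetical).
# Each piece of evidence is scored against each hypothesis:
# +1 consistent, 0 neutral, -1 inconsistent.

hypotheses = ["H1: routine exercise", "H2: preparation for attack"]
evidence = {
    "troop movements near border": [+1, +1],  # consistent with both
    "no logistics build-up":       [+1, -1],  # argues against H2
    "jamming of civilian radio":   [-1, +1],  # argues against H1
    "units recalled from leave":   [-1, +1],  # argues against H1
}

# ACH ranks hypotheses by inconsistent evidence: the hypothesis with
# the *least* evidence against it is the strongest survivor.
inconsistency = [
    sum(1 for scores in evidence.values() if scores[i] < 0)
    for i in range(len(hypotheses))
]

for h, n in zip(hypotheses, inconsistency):
    print(f"{h}: {n} piece(s) of inconsistent evidence")
```

The point of the exercise is not the arithmetic but the discipline: every hypothesis gets tested against every datum, so an analyst cannot quietly ignore evidence that conflicts with a favoured explanation.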
Cognitive and perceptual biases in human perception and judgment are another important reason for analysts to consider alternatives. As Richards Heuer and others have argued, all individuals assimilate and evaluate information through the medium of “mental models” (sometimes also called “frames” or “mind-sets”). These are experience-based constructs of assumptions and expectations both about the world in general and more specific domains.
For example, if I say the word "car", the picture in your mind is likely a vehicle with four wheels. Not all cars have four wheels, of course, but it's a reasonable default: we are prejudiced to believe that, unless otherwise specified, a car has four wheels. Moving right along...
These constructs strongly influence what information analysts will accept—that is, data that are in accordance with analysts' unconscious mental models are more likely to be perceived and remembered than information that is at odds with them.

I'm teaching a course at the ANU called "Software Requirements Elicitation and Analysis Techniques". The first part, Elicitation, is essentially intelligence-gathering. The second part is about figuring out what the data—always incomplete, often contradictory—means in terms of what the requirements of the system must be. What it must do. It matters not that no one is deliberately trying to mislead or obfuscate the situation; there is enough miscommunication and mutual incomprehension between stakeholders that there may as well be deliberate deception.
Mental models are critical to allowing individuals to process what otherwise would be an incomprehensible volume of information. Yet they can cause analysts to overlook, reject, or forget important incoming or missing information that is not in accord with their assumptions and expectations. Seasoned analysts may be more susceptible to these mind-set problems as a result of their expertise and past success in using time-tested mental models. The key risks of mind-sets are that analysts perceive what they expect to perceive; that, once formed, mind-sets are resistant to change; that new information is assimilated, sometimes erroneously, into existing mental models; and that conflicting information is often dismissed or ignored.
The techniques in the Tradecraft Primer have general applicability to analysing incomplete data, not just in the world of spooks and espionage. Recommended reading.