
Book Review – Psychology of Intelligence Analysis by Richards J Heuer

This is certainly not a new book (the most recent edition came out in 2018), but I only recently became aware of its existence and relevance to risk and crisis management.  Richards Heuer enjoyed a distinguished career in the CIA and wrote the book primarily for his fellow intelligence professionals (and consumers of intelligence such as politicians).  However, the central focus on making sense of complex and confusing situations, where the impact of mistakes is very high, has obvious relevance to the fields of risk and crisis management.

Heuer approaches a broad sweep of psychology, as it relates to intelligence analysis, beginning with three chapters on our “Mental Machinery”.  Of particular interest is his discussion of how we store and retrieve memories.  This provides the foundation for the core of the book, “Tools for Thinking”.  These tools include various strategies for generating hypotheses on the basis of limited information and for choosing amongst competing hypotheses.  As regards the latter, he repeatedly emphasises the need to focus on seeking evidence that enables you to reject a hypothesis, rather than looking for confirmation of what you already believe to be true.  The section concludes with Heuer’s most significant contribution to practice, a step-by-step process for the “Analysis of Competing Hypotheses”.
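To give a flavour of the idea, the scoring step at the heart of the Analysis of Competing Hypotheses can be sketched in a few lines of code.  This is only an illustration of the principle as I understand it, not Heuer’s own worked example: the hypotheses, evidence ratings, and scoring rule below are invented placeholders.

```python
# Sketch of the ACH scoring step: rate each piece of evidence against
# each hypothesis, then rank hypotheses by how much evidence is
# INCONSISTENT with them -- the aim is to reject hypotheses, not to
# confirm a favourite.  All data here is illustrative, not from the book.

# Ratings per evidence item: "C" = consistent, "I" = inconsistent, "N" = neutral
matrix = {
    "H1: equipment failure": ["C", "I", "I"],
    "H2: human error":       ["C", "C", "N"],
    "H3: deliberate attack": ["I", "I", "I"],
}

def inconsistency_score(ratings):
    """Count how many evidence items contradict the hypothesis."""
    return sum(1 for r in ratings if r == "I")

# Hypotheses with the fewest inconsistencies survive longest.
ranked = sorted(matrix, key=lambda h: inconsistency_score(matrix[h]))
for h in ranked:
    print(h, "-> inconsistent items:", inconsistency_score(matrix[h]))
```

The point the sketch tries to capture is that a hypothesis consistent with lots of evidence is not necessarily the strongest one; what matters is how little evidence actively contradicts it.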

The penultimate section of the book, “Cognitive Biases”, arguably repeats some material that is available elsewhere but, once again, Heuer’s practitioner viewpoint illuminates elements that are not routinely highlighted.  In particular, his discussion of how initial evidence of uncertain accuracy, even if it is subsequently demonstrated to be false, can still colour our judgement is a useful warning to anybody engaged in crisis management; as are his observations on the attention that we pay to the consistency of evidence.  Meanwhile, the discussion of our endless search for cause-effect relationships and his analysis of how people interpret verbal descriptions of probability are both very relevant to risk management.  The concluding chapter, “Improving Intelligence Analysis”, is perhaps less directly relevant to risk and crisis management, but serves well to wrap up the various themes discussed throughout the book.