Meeting with President Eisenhower. President Kennedy, President Eisenhower, and military aides at Camp David, Maryland, April 22, 1961. Robert L. Knudsen/Wikimedia Commons

Cognitive Science Helps Explain How We Blunder Into War

A new book explores how common flaws in human reasoning drew the U.S. into Vietnam — and how tomorrow’s leaders can avoid them.

In 1945, Vietnam’s Ho Chi Minh wrote to President Truman, thanking him for U.S. assistance in their mutual fight against the Japanese and asking for help against France’s effort to reassert colonial control in Indochina. Truman never got the letter — but there’s little reason to think it would have diverted America from its path to war in Southeast Asia. In his new book, Road to Disaster, U.S. Naval Academy professor Brian VanDeMark explores why. By using the insights of cognitive science to dissect the flawed perceptions and decisions of the Vietnam era, he teaches today’s leaders to spot their own.    

Fearful of jeopardizing French cooperation with the Marshall Plan and Western European defense—particularly the creation of NATO with Paris at its core (France had the largest army in Western Europe)—Truman at first tolerated France’s neo-colonial effort in Vietnam, then indirectly aided it, and finally began actively assisting it after Communist North Korea invaded South Korea in June 1950. Just a few months later, China intervened against U.S. forces and inflicted heavy casualties, intensifying Americans’ belief that all Communist actions were elements of a single, implacably aggressive global war against freedom. Driven by this perception, U.S. military aid by 1954 financed 80 percent of the French war effort in Indochina. Truman’s acceptance of the re-imposition of French rule in Indochina seemed an unpleasant but minor trade-off at the time, with no thought given to the long-term implications for his successors.

Indeed, Truman had bigger things on his mind. Less than a year before the Korean War, the Soviet Union had detonated an atomic bomb and Mao Zedong’s Communists had won the Chinese Civil War. The resulting anti-Communist hysteria in America, led by Republican Senator Joseph McCarthy (one of whose staff assistants was Bobby Kennedy), froze perceptions and attitudes. Constantly reinforced anti-Communism had what cognitive scientists call a “priming effect” among Americans, evoking information that was compatible with it and discarding that which was not. Incidents in people’s lives that are especially vivid or recent are likely to be recalled with special ease, and thus to be disproportionately weighted in any judgment. Amos Tversky and Daniel Kahneman called this the “availability heuristic,” which leads to systematic biases.

Tversky and Kahneman demonstrated this bias through their K experiment. Participants were asked, “If a random word is taken from a text in English, is it more likely that the word starts with a K, or that K is the third letter?” Participants overestimated the number of words that began with K and underestimated the number of words with K as the third letter—even though roughly three times as many English words can be formed with K as the third letter as with K as the first letter, and texts in English typically contain only half as many words beginning with K as words with K in the third position. It is simply easier to think of words that begin with K than words with K as the third letter; people’s judgments are informed by how easily they recall things. The priming effect and the availability heuristic made American policymakers prone to believe too strongly what they already believed. These combined with a tendency of policymakers—like people in general—to share basic assumptions that they rarely questioned.
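
The imbalance behind the K question is easy to check for yourself. The short Python sketch below simply tallies words that begin with K against words whose third letter is K; it assumes nothing beyond the standard library, and the small sample passage is a made-up stand-in for a real English corpus. On ordinary English text the second tally tends to be the larger one, which is exactly the comparison intuition gets backwards.

    # A minimal sketch of the comparison behind the K question: count words
    # that begin with "k" versus words whose third letter is "k".
    # The sample passage is invented; any longer English text (e.g. a
    # public-domain book) would serve the same purpose.
    import re

    sample_text = """
    The market makers looked at the banking stocks and checked the weekly
    tickers, asking whether the broker knew the likely risks. Workers walked
    quickly past the kiosk, taking stock of the smoky skyline.
    """

    # Lowercase the text and pull out alphabetic words.
    words = re.findall(r"[a-z]+", sample_text.lower())

    starts_with_k = sum(1 for w in words if w.startswith("k"))
    third_letter_k = sum(1 for w in words if len(w) >= 3 and w[2] == "k")

    print(f"words starting with k:      {starts_with_k}")   # 2 here (knew, kiosk)
    print(f"words with k as 3rd letter: {third_letter_k}")  # 4 here (makers, asking, likely, taking)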

Another troublesome factor was what cognitive scientists call “theory-induced blindness”: the fact, writes Kahneman, that “once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws . . . You give the theory the benefit of the doubt, trusting the community of experts who have accepted it.” Because established theories give a coherent view of reality, contradictory facts are often overlooked or ignored until the theory is displaced. Said simply, once people adopt a particular interpretation, they find it very difficult to see things any other way. Theory-induced blindness led nearly all Americans to uncritically assume the monolithic nature of Communism. A stereotype warped their judgment (as happened to many Communists who had a stereotypical view of the United States). Reinforcing these tendencies was people’s habit of reaching conclusions on the basis of limited evidence—what Kahneman calls WYSIATI (“What You See Is All There Is”). WYSIATI leads people to focus on existing evidence and ignore absent evidence. People do so because this makes it easier to fit things into a coherent pattern. “It is the consistency of the information that matters for a good story,” notes Kahneman, “not its completeness.” He continues, “We often fail to allow for the possibility that evidence that should be critical to our judgment is missing—what we see is all there is.” Almost perversely, it is by incompleteness that we complete: we construct interpretations on the basis of partial evidence because this facilitates achievement of consistency and coherence, which makes an interpretation plausible. Explains Kahneman, “You build the best possible story from the information available to you, and if it is a good story, you believe it . . . Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”

Limited knowledge can make for a compelling story, and sometimes that story can lead (even by accident) to a favorable outcome. Indeed, since we can almost never know “it all,” most of our decisions are to some extent based on information that, while fragmentary, can be sufficient. But WYSIATI also generates a limited set of basic assessments, reinforces biases, dulls the pursuit of completeness, and feeds overconfidence. It makes people ignorant of their ignorance. Afflicted by the priming effect, the availability heuristic, theory-induced blindness, and WYSIATI, American policymakers did not realize just how little they really knew about Vietnam—including that there was much more to know.

As a result, American policymakers failed to grasp, because they did not seek to discover, the historical enmity between China and Vietnam, and thus failed to recognize that there were significant differences between Communists in China and Vietnam and that Ho Chi Minh, though a Communist, behaved primarily as a Vietnamese nationalist. They did not know the Vietnamese adage that the shape of Vietnam’s coastline reflects a spine bent under the weight of China, its great neighbor to the north, with which it had fought more than a dozen wars dating back to 39 C.E. They did not know that, as a result, the Vietnamese developed an unyielding determination to resist foreign occupation at all costs, usually adopting guerrilla warfare to wear down more powerful armies through hit-and-run attacks, avoiding major engagements whenever possible, and using their knowledge of local terrain to establish weapons caches and inaccessible hideouts. They did not know about the Battle of Bach Dang in 938 C.E.—as famous in Vietnamese history as the 1775 Battles of Lexington and Concord are in American history—in which Vietnamese guerrilla fighters dealt a fatal blow to superior Chinese troops through the tactic of feint and strike. They did not know about the Lake of the Restored Sword in the heart of Hanoi, named for the legend that when the Ming Dynasty ruled Vietnam, a fisherman named Le Loi found in his net a magical sword that empowered him to lead his people in a ten-year struggle that drove the Chinese out in 1428 C.E.; when Le Loi offered gratitude to the spirit of the lake, a giant golden tortoise snatched the sword and restored it to the depths. They did not know that when faced with the threat of French domination at the end of the nineteenth century, the Vietnamese had set aside their ancient suspicion of Chinese domination and pleaded with Beijing to come to their aid. American policymakers were so conditioned that even when someone brought them such information, they paid it no attention or discarded it as irrelevant.

The atmosphere of Cold War America did not encourage such attention and awareness, or the perceptions and distinctions that went with them, so government decision-makers in Washington—not for the last time—remained dangerously ignorant and therefore seriously overestimated ideological factors and seriously underestimated historical and nationalistic ones.

This excerpt from Road to Disaster: A New History of America’s Descent into Vietnam (HarperCollins, 2018) is used with permission.