Error, Folly and Intelligence
John D. Stempel
Patterson School of Diplomacy
and International Commerce
It ain't what you don't know that will hurt you, it's what you think you know that ain't so.
In this brief observation, the astute Will Rogers captures the essence of today's problem. In the context of governments and nations, intelligence-gathering has gone hand-in-glove with statecraft throughout recorded history. People don't like surprises, and both individuals and organizations believe that the more information they have, the better off they will be. A corollary is that, given the laws of human nature, people will try to hide information when they believe it is to their advantage. Hence the creation of intelligence organizations to obtain such information, as well as counterintelligence units to protect it from others.
Despite the apparent simplicity of this paradigm, some problems have occurred with consistency across history. First, erroneous or misleading information has been developed. Second, the analysis used to prepare information for use by others has been flawed, producing poor advice, occasionally (some would say often) leading to disaster. Third, at key junctures of history error has been compounded by stubbornness over time to produce what we call "folly." In the context I am discussing it, folly is the continued persistence of error to produce action, usually adverse, which fails to accomplish the desired goals and eventually lead to disaster for those persisting in it.
Historically, the American political system disregarded the need for formal, organized intelligence, developing none for nearly 100 years. Permanent U.S. military intelligence organizations first came into existence in the late 19th century. A make-do national intelligence/information office was cobbled together for the first World War, but it was not until the Second World War that the first professional intelligence organization, OSS, was created, and not until the National Security Act of 1947 that a peacetime professional intelligence organization was established. The U.S. State Department did not develop a 24-hour crisis management/intelligence center until 1961.
Americans have generally adopted the view that intelligence was somehow immoral (though this never bothered our founding fathers), and the relative safety of the American nation was sufficient to prevent the country from disabusing itself of this notion until December 7, 1941 when the Japanese surprise attack on Pearl Harbor destroyed our illusions.
In the post-World War II period, intelligence activities became mixed up with covert operations -- undercover efforts to shape the political environment. Here we focus on "intelligence" in the informational sense; we shall leave to another time and venue the discussion of the error and folly relating to covert operations which, as a former CIA director said, "require only five per cent of our budget, but create 95 per cent of our problems."
Here we shall note the errors that directly led or contributed significantly to Pearl Harbor, the Battle of the Bulge, the Iranian Revolution, and the Gulf War, as well as the follies involved in the American Revolution, American involvement in Vietnam, and contemporary Iran-American relations.
Follies and errors have their genesis in both individual and organizational failures or inadequacies. Individual deficiencies lay the groundwork for organizational problems and thus need to be dealt with first. Rational theories of policy/decision making emphasize complete and extensive fact-gathering and perception. In fact, we know this is not the case. Individuals perceive events according to their own makeup and biases.
U.S. Army and Navy commanders in Hawaii were convinced that Japan would not attack Pearl Harbor. In the face of mounting evidence that something was afoot, they interpreted each new piece of evidence according to their own preconceptions: The Japanese carriers could not be located because of radio silence--they were headed for Malaysia. Small two-man submarines surfaced off Oahu very early Sunday December 7--simply reconnaissance.
An entire group of men were so certain that Japan would not attack Pearl Harbor that they even decided not to alter the fleet and naval base training exercises in any way to increase readiness and reconnaissance, disregarding entirely the possibility that they could be wrong. Similarly, Allied commanders in Europe in December, 1944 were so certain that the Germans would adopt a defensive deployment that they did not even look for signals that Hitler might not take a fully rational approach to the problem of defending Germany, and hence missed the German buildup. In the 1979 Iranian Revolution, the Shah deluded himself up to and beyond the last moment that a serious challenge to him and his regime was growing.
Individuals are frequently in error, but more often than not they realize their mistakes when matters begin to go wrong and events turn out differently than anticipated. There are cases, however, more numerous than one would like to think, where persistence in error leads to folly because self-correcting mechanisms fail to come into play for various reasons: an individual's ego is simply too tied up in a fixed position to permit change, or his or her arrogance simply will not admit a wrong view.
Hard-line British statesmen and politicians in the pre-revolutionary and revolutionary period -- extending over 20 years -- fall neatly into this category. From the time of the Stamp Act forward, British prime ministers and lord chancellors were outraged at the colonies' reaction to governance from London without representation.
As matters grew worse and led to war, the willful blindness of Frederick, Lord North and Lord George Germain brought on the unity of the American colonies and the military defeats at Saratoga and Yorktown which brought down the British Government. Statesmen on both sides of the Atlantic, most notably Edmund Burke in England and Benjamin Franklin in America as well as most historians in the years since, believed that absent the stubborn, willful blindness of British statesmen, America would have maintained some sort of a political relationship with Great Britain.
A similar phenomenon was exhibited during America's engagement in Vietnam from 1955-1975. Through four presidents, two from each party, America gradually escalated her involvement in Vietnam. In the early years, from 1955 until 1965, an argument could be made in the Cold War context for the proportionality of U.S. engagement to the goal sought -- keeping Vietnam free. However, when the steady march to a full commitment of half a million fighting men and an overwhelming percentage of American military and diplomatic resources in a small corner of the globe is assessed in a broader context, the term "folly" is not too grand. In the end, the American effort tore the U.S. apart, cost the President his job, alienated a generation of young men and women, created the behavior that led to Watergate and a deep distrust of government -- and permitted the conquest of South Vietnam by North Vietnam.
A generation of American statesmen, David Halberstam's "best and brightest," proved unable to reexamine their own faulty assumptions about America's ability to do whatever it wanted without the solid, persevering support of the American people. Key leaders such as President Lyndon Johnson, Secretary of Defense Robert McNamara, Secretary of State Dean Rusk and others, including military leaders such as General William Westmoreland, commander of U.S. forces in Vietnam, persisted in asserting and defending positions that an increasing number of political and diplomatic leaders came to believe were untenable. The parallel with the British leadership in respect to the American Revolution is striking: arrogance and stubbornness persisted beyond the time when any rational calculation showed benefit, compounded by a simple unwillingness to back down. This latter trait of stubbornness seems more common the higher one ascends the political and military ladder.
It ought to be a simple matter for leaders to learn from the past--but as many have suggested and history has shown, this is more difficult than it appears. At this point, if wisdom were possible, leaders would reexamine their direction and a change of course should be possible. But it seldom happens. If reassessment does not occur, pursuit of the original mission, in the words of historian Barbara Tuchman, "enlarges the damages until it causes the fall of Troy, the [Protestant] defection from the Papacy, the loss of a trans-Atlantic empire, the classic humiliation in Vietnam."
All the above problems are often magnified if the issues and players come from different cultures. In the Pearl Harbor case, for example, Americans assumed the Japanese thought the same way they did about strategy and the relative hierarchy of goals. No one questioned the Roosevelt administration's embargo on shipments of strategic materials in July 1941. Yet instead of inducing Japan to stop attacking its neighbors, the embargo drove the Japanese warlords to consider attacking the United States rather than "submit" to it. For the Japanese, "honor" was more important than "rationality."
Similar cross-cultural dissonances can be found elsewhere. British statesmen in London in the 1700s who pressed for a hard line against the colonies had little or no understanding of how the American experience had affected the colonists. Likewise, American statesmen in the 1950s and 1960s knew little about Southeast Asia and less about guerrilla warfare, despite America's experience in the Philippines 60 years earlier. Similar issues have bedeviled the U.S.-Iranian relationship on both sides from the mid-1950s to today.
The problems noted above are multiplied many times when organizational issues become involved. Perhaps the best description of this can be found in Irving Janis' book, Groupthink.
At Pearl Harbor, the major organizations involved -- the separate Army and Navy commands in Hawaii (a unified theater commander came only later, after creation of a unified Defense Department in 1947) and the War Council in Washington (precursor of today's Joint Chiefs of Staff) -- did not even share similar views of the danger of the Japanese threat in late 1941. The Navy high command thought the Army had gone to a high state of readiness after the November 24 war alert from Washington, but neither bothered to check with the other, and the Navy assumed that Army radar and antiaircraft batteries were fully activated when this was not the case.
Janis suggests that cohesive organizational relations inhibit critical thinking about issues. As a consensus builds around an issue or a problem, dissenters are marginalized. The effect is to stifle doubts and discourage the expression of alternative theses, let alone serious consideration of them. This is the phenomenon known as "groupthink": the striving for consensus and cohesion overrides individual members' motivation to take a hard look at alternative courses of action. It especially affects intelligence organizations, which are more likely than other bureaucracies to generate dissenting information vis-à-vis other units. Where there is uncertainty about such information, individuals frequently resolve that uncertainty in favor of the data supporting either their personal or organizational views.
Examples of this abound: The commitment of high command leadership at Pearl Harbor to the view that the Japanese would not attack brushed aside and discouraged all contradictory views. The Pearl Harbor hearings are replete with instances where officers said they had misgivings about something but decided not to raise the issue. In the case of Vietnam, intelligence from the field was slanted and sometimes withheld if it did not agree with "the party line" expressed from Washington by senior leadership.
In the case of the Iranian Revolution, evidence was often "interpreted" by Washington agencies in ways which coincided with the American policy of support for the Shah, minimizing the danger to him and thus to America's position in Iran. In some cases, information was interpreted differently in the State Department from the way it was viewed within the CIA. The resulting organizational conflict paralyzed American policy on at least two critical occasions.
In gathering intelligence through a large organization, management style becomes a serious issue. Given problems of misperception, bias and uncertainty, how do managers correct for them? Strategic planning units of some organizations are often given the role of exploring different concepts or providing independent analysis. At Pearl Harbor, there was no such mechanism for bringing an alternative view to bear. During Vietnam, the CIA differed frequently with the Defense Department's intelligence estimates. In the end the CIA was proven right in its disagreements with Defense on such arcane matters as unit strength and the insignificance of "body counts."
Since elements of the old OSS, precursor to the CIA, were integrated into the State Department just after World War II, the Secretary of State has had his/her own intelligence subunit, the Bureau of Intelligence and Research, to provide an alternative view to that offered by the Department's geographic bureaus, as well as other departments.
Organizational leaders thus have a considerable amount of responsibility for the effectiveness of their organizations in acquiring and assessing information. Obviously, one key element of this is to seek intelligent, well-balanced, curious people to work in an organization, and then to train them well. This has important implications for recruiting which are just beginning to be fully explored in corporate and government circles.
But recruitment and training of such individuals is not enough--the organization itself must be structured to promote effective acquisition of information, timely in-depth analysis, and smooth integration of information into policy/decision-making processes. Beyond that, the organizational culture and top leadership must be such that flexibility and openness mark the cooperative interaction process, or else informal, personal norms of behavior will defeat formal organizational exhortations.
A study of errors and follies suggests that arrogance, rigidity and inflexibility, plus cross-cultural unawareness, are to blame for the failure of individuals and organizations to deal successfully with their environments. Since those environments change, adapting historical insights to the future requires a careful look ahead rather than a rote application of past lessons learned.
Preparing for the Future
Rapid change seems to be the hallmark of the current epoch, though we are more wary than our ancestors of the last century of assuming that change automatically means progress. Rapid change implies the need to obtain information about the changing environment, but those who wish to abolish the CIA clearly wish to throw the baby out with the bathwater.
It seems self-evident that the United States needs good intelligence. Organizations from General Motors to the Episcopal Church to the Boy Scouts all need good information, or "intelligence." Moreover, this information needs to be well analyzed and shaped into an effective guide to action. Otherwise, these organizations will become increasingly marginalized in contemporary life. The same is true of individuals, who draw on the work of organizations to increase their knowledge.
But there are impediments. The increased speed of communication and the faster flow of transportation require us to know more about more things, more quickly. Ten years ago someone in the field of international relations could easily prepare a short talk on key international issues--the Cold War, trade issues, and the like. Today the structure of the world is less hierarchical, both politics and trade are more decentralized, and there are 182 members of the U.N. instead of the 51 which formed the organization in 1945.
The emergence of the computer, e-mail, and the internet has greatly aided the expansion of our awareness, but these technologies also affect intelligence and intelligence-gathering in another important way--by multiplying the volume of information available, which in turn complicates analysis. Analysts must now also monitor key internet sites and electronic exchange groups. Searching adequately has become more important than it used to be, as well as more difficult to do.
Another implication of burgeoning information is the growing need to focus on what intelligence agencies call "all source" information. The distinction between classified and unclassified information is breaking down. As Kentucky Prof. Vince Davis noted 25 years ago, "the invention of the xerox machine seriously diluted our ability to keep secrets." The growing power of "secret-leakers," as opposed to "secret-keepers," beginning with the Pentagon Papers in 1971, is testimony to this.
The mushrooming of information also makes it even more difficult to enforce the traditional distinction between "secrets" and "mysteries." A secret is something that can be discovered, a fact that is not yet known--for example, the number of Soviet MIRV missiles, how many divisions Iraq has on the Kuwaiti border. A mystery is much more ephemeral--Saddam's intentions, Yeltsin's plans, why men and women bicker.
A certain consequence for policy/decision makers will be the need to deal with more probabilities and to develop action strategies to cope with less, rather than greater, certainty. The current controversy over the use of biochemical weapons in the Gulf War exemplifies the difficulties inherent in the search for certainty as well as the difficulty of trying to draw secrets out of a mystery.
Sorting and searching, as well as developing an appropriate model or paradigm of what is relevant will take on increased importance for intelligence organizations. For example, given limited resources, should the United States focus more on China, or on Russia? How much costly intelligence structure should we devote to the states of the former Yugoslavia?
When one adds the problem of disinformation, deliberately put forward by nations to mislead others, the difficulty multiplies again. Being taken in by a deception is an error; continuing to believe it as it unravels is a folly. In the future, awareness of these possibilities must remain even closer to the analyst's consciousness, and setting parameters for assessment will become even more critical than it is now.
From governments to corporations, the need for counter-intuitive and probability analysis, the "what if..." scenario builders, will grow. Finding those who can do this work without excessively alienating their colleagues will be a real coup. More important, and perhaps more difficult, will be developing leaders who can live with and manage the resulting uncertainty.
Studying error and folly gives us a powerful sense of the "what ifs" of history and government: What if John Kennedy had lived? Would he have been more flexible and less rigid than Lyndon Johnson and withdrawn American troops from Vietnam, cutting American losses? Could a different group of British leaders have managed to mollify the colonists, thereby creating a great trans-Atlantic federation? Would we have been better or worse off if those "what ifs" had happened?
While it is true that a "good" decision can turn out poorly without any reference to the way it was made, and a "bad" decision can turn out well despite the deficiencies of the process, in the long run most of us would bet on things done well to produce better results. To accomplish this, the most important quality an individual needs is a sense of humility. Humility helps him or her avoid the arrogance of misplaced certainty, and it helps defuse the cocksureness that leads men and women in all stations of life to ignore the warnings that would save them from both error and folly.
Scholars, intelligence officers, political and business leaders would all do well to bear in mind the remark of former Baltimore Orioles manager Earl Weaver, who said "It's what you learn after you think you know everything that really counts."