America’s Midlife Crisis
The article focuses on the idea of "America's Midlife Crisis," and traces America's cultural development and the ruptures that have occurred over the last decade. The authors draw on history, sociology, political science, and other disciplines to examine this development. They also discuss the place of the Bush and Obama presidencies in this narrative, and how those presidencies reflect American cultural values and development. Finally, they look at the directions in which American culture and cultural values may move in the future.
About the authors
Dr. Gary Weaver is founder and Executive Director of the Intercultural Management Institute (IMI) and for four decades a professor in the School of International Service at American University. Adam Mendelson is former Managing Editor of The Middle East Journal and serves on the Editorial Review Board of Intercultural Management Quarterly (IMQ). They are the co-authors of America’s Midlife Crisis: The Future of a Troubled Superpower (Intercultural Press, 2008).
by Gary R. Weaver and Adam Mendelson
posted 16th March 2011
The Growth of American National Identity
On his Inauguration Day last year, Barack Obama’s approval rating hovered between the mid-sixties and low seventies. According to Gallup, just before the Inauguration, over 3 in 4 Americans viewed him favorably, an exceptional statistic in an exceedingly partisan era. His sweep to power and cresting popularity drove observers to seek comparisons. Would he be a new Eisenhower, whose approval ratings averaged 65% over his eight years in office? The next Kennedy, whose approval peaked at 83% and never dropped below 56%? Observers rushed to pronounce judgment on what Obama’s presidency would be like before it even began.
Over a year later, many new comparisons and judgments are now being bandied about as the administration fends off economic, political, and foreign policy crises. Is Obama Reagan in ’82? Clinton in ’94? Or, depending on your political viewpoint, is he the next Jimmy Carter – whose approval ratings trickled steadily down from a post-inaugural 75% to the 30s and 40s? Or perhaps Harry Truman (a comparison that George W. Bush liked to make with his own presidency) – whose popularity oscillated wildly, dipping to a low of 22%, before ultimately being vindicated by historians?
In the waning days of the Bush administration, observers tried to take the measure of his presidency. In April 2008, the History News Network released an informal survey of 109 professional historians, asking whether the Bush presidency was a success or a failure. All but two — more than 98% — called it a failure. Nearly two-thirds called it the worst presidency ever. Clearly, much of Obama’s early popularity was fueled by the fact that he was simply not George W. Bush.
With such a consensus (informal as it may be) among the historians who will quite literally write the books on how we remember the Bush presidency, is it even possible that the administration will be characterized as anything but the darkest of days? Similarly, how can we gain perspective on the Obama presidency when we are constantly deluged with commentators purporting to give the final, definitive judgment on the meaning and trajectory of his still young administration?
Once history makes its judgment on a president, it is exceedingly hard to challenge or even add nuance to it. Americans love black and white. We grew up on cowboy movies in which the hero wears a white hat, the villain wears a black hat, and the general store doesn’t stock gray hats, because there is no in between. In practice, we often echo the mantra of John Wayne, who said, “If everything isn’t black and white, I say ‘Why the hell not?’”
Many historians tell us that James Buchanan was one of the worst presidents, if not the worst. Numerous surveys and polls over the years have placed Buchanan in the presidential cellar, where he is typically kept ignominious company by the likes of Warren Harding, Franklin Pierce, and Andrew Johnson.
Maligned in his day, Buchanan gruffly argued before he died that “history will vindicate my memory from every unjust aspersion.” Yet the opposite has been the case: the book on Buchanan was written quickly and, it seems, irrevocably. Buchanan remains a president held in such low esteem that one would think that in addition to fiddling as the embers of the Civil War grew into flames, he kicked puppies and pushed old ladies, among countless other base outrages.
Yet the real picture is slightly cloudier. While Buchanan certainly did not do all he could to save the Union and prevent the Civil War, many events were beyond his control and were true Gordian knots for which no solution ever came; the country was evolving in ways that perhaps could not be controlled by one person. Moreover, his administration was not all doom and gloom and should be seen in more shades than just black and white. The first transatlantic cable was completed. Oil was found in Pennsylvania. The legendary Pony Express began. And…well, 400 gallons of oysters were eaten at his inaugural ball, which one assumes must have been some sort of record.
Buchanan’s successor, Abraham Lincoln, is often found at the other end of the spectrum: he is the frequent top dog of polls and surveys assessing who the best president was, and in case we forget, Mount Rushmore and the Lincoln Memorial are always there to remind us of his greatness. His “official” historical representation is nearly pure white with no black or shades of gray.
However, this popular perception also oversimplifies the controversies that plagued his presidency. The suspension of habeas corpus set a dangerous precedent for the compromising of civil liberties during times of war. Draft riots, sometimes envenomed by racial hatred, painted a bloody background to the conduct of the Civil War in the North. And the dreaded income tax was first implemented in America during his presidency.
These two presidents’ successes and failures depended in part on their actions, personalities, and initiative (or lack thereof). Yet many of their actions and levels of accomplishment also were due to the times in which they were in office. We can never know how one president would fare during another’s term. How would Teddy Roosevelt have done during the Cold War? What would have happened if Woodrow Wilson had been at the helm during Reconstruction? While Lincoln’s superior skills may have enabled him to excel during many periods other than his own (and the inverse may be true for Buchanan), we should never discount the importance of when a president was in office to their success or failure.
The development and character of the country at the time of a presidency, and its standing in the world, are essential to judging a president’s skill. Neither nations nor people develop in a vacuum; rather, we judge them both by their pasts and by their peers. When we’re six, we’re expected to be able to tie our shoelaces; our peers can do so. Soon after, we should be able to join the Boy Scouts and Girl Scouts and learn to tie even more intricate knots. Not only is it what many of our peers are doing, but our past behavior has shown that we’re capable knot-tiers. Eventually, we tackle more and more complex things — going to school, searching for a steady job, getting married, having and raising children — based not only on the accumulation of our skills and experience, but in order to do things “appropriate” to our age. We develop our unique character, and judge and locate ourselves based on our past accomplishments and our standing relative to our age and peers.
Countries evolve in much the same way. National economies start relatively simple and become increasingly complex and mature. Political systems are conceived of, tested, and reformed. As a country ages, it develops a collective historical experience and national identity as well as a national image of itself in the global system of nation-states.
The national identity of a people grows and develops as a result of unique shared historical experiences and natural resources. While there are individual, ethnic, racial, and regional differences, most Americans share common national values and beliefs which are continually reinforced in schools and the mass media. Just as individuals mature, over time the American national identity has changed, and there are stages of growth and crises we have gone through that have altered our worldview and national image. It’s important to understand where we came from in order to best assess where we are as a country today and to appropriately judge our leaders’ actions. As the historian Arthur Schlesinger Jr. has written, “History is to the nation as memory is to the individual. As persons deprived of memory become disoriented and lost, not knowing where they are going, so a nation denied a conception of the past will be disabled in dealing with its present and its future.”
When we take a bird’s eye, whirlwind tour of the development of the nation, what do we see? Until the late 19th century, American national identity was analogous to the state of an over-protected child with a worldview of naive utopianism. Immigrants came to the country chasing the American Dream of economic success and personal freedom. The country was seen as a land in which anything was possible, where a “city upon a hill” could be achieved that would be a “beacon to all humanity,” in the words of John Winthrop. Buffered by two massive oceans from the intrigues of the Old World, and with no local rivals for power, the country grew in power, optimism, and self-confidence. While far from wholly abstaining from the international community, the country followed the advice of George Washington’s Farewell Address, in which he cautioned that “The great rule of conduct for us, in regard to foreign nations, is in extending our commercial relations, to have with them as little political connection as possible. Europe has a set of primary interests, which to us have none, or a very remote relation… [T]herefore, it must be unwise in us to implicate ourselves, by artificial ties, in the ordinary vicissitudes of her politics, or the ordinary combinations and collisions of her friendships or enmities.”
This prolonged childhood and its contemporaneous idealism were perpetuated by the nation’s incredible economic success and its isolation and insulation from the rest of the world. The continual economic growth and political stability of this very young nation were taken as evidence that somehow the United States was an exceptional country with exceptional people — a prodigy among the family of nations.
Just as children go through the emotional turmoil and ambivalence of adolescence, with its conflicting bouts of overconfidence and fear as they socialize with others outside the warm protection and love of their family, so the United States eventually, haltingly entered onto the world stage, oscillating between periods of total involvement and relative withdrawal.
With the Spanish-American War the United States left its relatively peaceful isolation and insulation from the turmoil of the rest of the world. The mood of the country was for expansion and for the display of its palpably felt domestic power on the international stage. “A new consciousness seems to have come upon us — the consciousness of strength — and with it a new appetite,” wrote a Washington Post editorial in the late 1890s. “The taste of Empire,” it continued, “is in the mouth of the people as the taste of blood in the jungle.”
The period from the beginning of the Spanish-American War until the end of World War II can be regarded as the adolescence of the United States. Adolescents leave the warmth and security of the family to enter both friendly and unfriendly relationships with outsiders. This involvement is often emotionally intense, highly idealistic, marked by unrealistic and naive expectations, and usually very awkward. Sometimes it leads to acceptance, but it often ends in rejection. At that point, many adolescents vow never to get involved again, and become mired in apathy or withdrawal. For example, unrequited love can lead to withdrawal and the false security that stems from noninvolvement. However, most of us venture out again, but with the wisdom of experience. We are more realistic in our expectations; we know that there is the possibility of failure, and we realize that there is both pain and pleasure in all intense human relationships. While noninvolvement is secure and risk-free, it offers no possibility of growth or pleasure. Total head-over-heels idealistic involvement and total cynical withdrawal from relationships are often characteristic of the ambivalent extremes of adolescence.
Of course, during the childhood period, the new country was always commercially involved in the rest of the world because trade, especially agricultural trade, was a primary way to bring capital to the New World. Even the earliest British colony at Jamestown in Virginia was a business venture, not just a matter of Europeans fleeing religious persecution and political stagnation. However, when it came to military involvement and foreign policy, historically the United States has shown something of a pendulum pattern: A period of total — and often military — internationalism, often based upon some kind of crusade or high idealism, is followed by relative foreign policy and military disengagement or isolationism and a focus on internal affairs.
The United States went into the Spanish-American War with the noble and idealistic intention of liberating Puerto Rico, Cuba, and the Philippines from a colonial power, Spain. While the war was a military success, it was followed by a period of overwhelming disillusionment and disappointment when the US was viewed as an imperialistic occupier by many in these newly acquired territories. Soon after American troops landed in the Philippines, fighting broke out between Filipinos trying to establish their independence and Americans trying to establish their control. Years of fighting resulted in the deaths of thousands of Americans and tens (if not hundreds) of thousands of Filipinos.
Perhaps the most dramatic — and famous — example of adolescent idealism came when Woodrow Wilson led the United States into a major war in Europe — World War I. This war was not depicted as simply a matter of helping friendly nations or crass national interests. It was seen as a “war to end all wars” and a “war to make the world safe for democracy.”
When the war ended, Wilson tried to extend American constitutionalism and values to the entire world, through the vehicle of the League of Nations and his Fourteen Points, many of which reflected classic liberal American values. This was internationalism with a Yankee Doodle missionary zeal; if we could just create the structure where people could come together with good hearts and minds, they would choose to negotiate and cooperate rather than fight wars. In many ways Wilson’s sweeping post-World War I dreams were similar to George W. Bush’s visions of nation building in Iraq, with Bush’s unilateral interventionism substituting for Wilson’s Americanized internationalism.
Wilson’s efforts failed spectacularly. Realpolitik combined with an isolationist desire to withdraw from the world trumped Wilson’s global version of American idealism. He once said, “Sometimes people call me an idealist. Well, that is the way I know I am an American.” The settlement of the war did not occur according to Wilsonian discourse or ideals. Without any kind of enforcement mechanism, the League of Nations was a paper tiger. Indeed, in the League all nations were formally equal, as each nation had the same vote on world issues; rather than having a United Nations-style Security Council, it was an international version of the US Senate. In the end, the US Senate would not even allow the US to join the League, and most Americans simply wanted to withdraw from the world of power politics following the experience of World War I.
Adolescent America had experienced another instance of rejection — thus jilted, the country turned away from the world in the subsequent decades. Warren Harding seized on this mood in the 1920 election, proclaiming that “America’s present need is not heroics, but healing; not nostrums, but normalcy; not revolution, but restoration … not the dramatic, but the dispassionate; not experiment, but equipoise; not submergence in internationality, but sustainment in triumphant nationality.”
The country’s adolescence—and its ability to effectively avoid international commitments—ended with World War II. The United States was attacked at Pearl Harbor in 1941, Hitler had invaded most of Europe, and our allies were being occupied by Nazi troops. Franklin Roosevelt had to take action and defend his country. This time the US faced genuine hegemonic military aggression and its involvement in this war was a matter of realism, not idealism.
Young adulthood began with World War II, and the United States could no longer remain withdrawn behind its own borders. The Cold War, beginning after the conclusion of the Second World War, demanded continual involvement and a more mature, adult America to handle its challenges. There was no possibility of withdrawal again. Like a young adult who must now get a steady job, pay the bills, and act maturely, the US could no longer shirk its role as a world leader. Nevertheless, the longing to return to the simple innocence of youthful noninvolvement without the complexity and responsibility of adulthood was a sentiment which always had some degree of resonance in the country.
A national crisis can increase patriotism and strengthen a people’s resolve to stand by traditional civic values, but it also can move the country in new and unexpected directions. World War II gave Americans a deep and broad sense of national unity, and for two or three decades, traditional American “can-do” optimism soared. Throughout this period, there was a strongly held belief—despite many national and international crises—that Americans could achieve great things. Men and women with enormous talent and vision gave us the United Nations, the Marshall Plan, NATO, the G.I. Bill, the interstate highway program, the Peace Corps, the space program, the Civil Rights Act, and much more. Although this was the era of the Cold War and the atomic bomb — along with the Korean War and the Vietnam War — which put Americans in continual or potential military conflict, it also led to the increased use of diplomacy and greater American involvement in international organizations. Like a young adult finding a secure job and a stable family life, the country was settling into stability at home and mature engagement with others in the community of nations.
When the Cold War ended with the collapse of the Soviet Union in 1991, Americans hailed it as further proof of the exceptional nature of the country. But there were no parades for this victory of the American way, and the US seemed adrift: not only was the role of the world’s lone superpower a difficult, often lonely burden, but it was also one in which no clear enemy, no cowboy in a black hat, could be easily identified. The country was extraordinarily successful and an economic and military juggernaut. Although it was far from a primary concern, some wondered how America’s role in the world could be defined as it grew into a powerful, mature member of a seemingly less bellicose international community. This created a certain level of stress in the country as we struggled to identify friends and foes (for example, the consternation over Japan’s rise in the 1990s) and argued over whether our preeminent power should make us more or less involved with the rest of the world and its problems.
Adults under stress can regress to earlier, immature behaviors that once provided security. There have been times of regression to earlier eras of oppression and extremism, simplistic utopianism, and jingoism throughout American history. With the end of the Cold War, many nostalgically wanted to withdraw into a pre-Cold War era of isolation from international alliances and their potential obligation to fight wars.
Nevertheless, following the end of the Cold War, the US developed alliances to control nuclear armaments and it participated in numerous international forums and organizations to foster better care of the environment, control the spread of diseases such as AIDS, and bring war criminals to trial before international tribunals.
However, George W. Bush, elected in 2000, was reluctant to support the internationalism and interventionism of the previous decade. He viewed US involvement in the Bosnian War as “nation building” and believed that such involvement in the world was not in the national interest of the US. Bush refused to sign the Kyoto Protocol to control global warming, he withdrew the US from the Anti-Ballistic Missile Treaty with Russia, he would not allow the US to join the International Criminal Court, and he threatened to withhold money from the UN. If the Clinton years (and, to some extent, the administration of President George H.W. Bush) represented the more internationalist argument, the George W. Bush years represented the more domestic-centric argument — the other end of the spectrum, a withdrawal from world politics and international organizations and agreements. Whereas George H.W. Bush saw globalization as internationalism, his son, George W. Bush, viewed internationalism as unnecessary with the end of the Cold War and the dawn of a unipolar world. Globalization was seen as a teleologically Americanizing process, which could be promoted through unilateral, and perhaps bilateral, involvement.
With the attacks of 9/11, there was no way the President could continue to withdraw from the international community. An attack from beyond America’s borders similarly required the country to go beyond its borders to punish the perpetrators. But the US’s involvement after 9/11 was primarily one of unilateralism, and the use of military force rather than the international negotiation and consensus-building which the US had increasingly practiced as it aged. The Bush administration declared that there was an “Axis of Evil” comprising Iran, Iraq, and North Korea that threatened the US and world peace, recalling the black and white, Manichean cowboy movies of old. The unilateralism evident in the Iraq War showed an America wary of the world beyond its borders and willing to go it alone to get what it wanted — like a pioneer of old going out West. This was a return to the earlier, less mature behavior of the US during its early years — its childhood and early adolescence. These actions recalled the US of the Mexican-American War and the Monroe Doctrine, not the more recent (and more mature) US of the Marshall Plan and NATO.
Continuing our analogy between the development of an individual and the nation, we can say that the stress caused by 9/11 catalyzed a sort of national midlife crisis in the US. Just as individuals undergoing a midlife crisis exhibit behaviors incommensurate with their maturity and station in life, our country did the same following 9/11, and returned in many ways to the simplicities of an idealized past and past behavior.
When we go through a traumatic and stressful life event we often long for the simplicity and security of childhood or carefree adolescence. Some of us regress to these youthful behaviors and identities when we are faced with the pain and responsibilities of adulthood. A country experiencing the fear and insecurity of war similarly will often return to the national images of its past. Faced with a difficult, hard to define crisis (much like a midlife crisis), the US couldn’t buy a sports car to make itself feel better.
The country needed to address what was perceived as both an existential and very real threat with concrete actions — with the “can do” spirit. Yet, this desire for action can be rash and misdirected. In their hurt and anger, individuals will sometimes irrationally lash out at any apparent enemy; the people of a country also often want to strike out at any enemy who seems to be a threat. A year after the attacks of 9/11, over 60 percent of Americans believed that “we should get even” with anyone who crosses the US — a 20 percent increase over a pre-9/11 poll asking the same question. Rather than accepting the tragic reality and uncertainties of a dangerous world where there are no clearly defined good guys and bad guys, we perhaps regressed to an earlier worldview. In many ways, the reaction of the American government was very traditional: We resorted to a dualistic, black and white, unambiguous and melodramatic worldview.
As with any individual under stress, there is a nostalgic desire to return to a perceived past when life was simple and clear-cut — a time when there were very explicit distinctions between right and wrong, and our own family provided a safe haven from the threats of the outside world. Post-9/11 insecurities caused Americans to accept simple answers to the many complex problems and ambiguities of the world. We accepted idealized, oversimplified panaceas to complicated problems — believing that the war on terrorism was quantifiably winnable through military means alone, and that our soldiers in Iraq would be greeted with roses and candy. Somehow a “victory” in Iraq and Afghanistan was believed to be able to create a domino effect and bring about democracy throughout the Middle East. We took insufficient time to examine our motives and the repercussions of our actions — a behavior the country exhibited in its youth, one that caused such debacles as the Constitution’s accommodation of slavery and the Mexican-American and Spanish-American wars.
Presidents tap into the temper of the times — and also shape it. If they do neither, they are ineffectual. If they do both, they are bound for the history books, for better or worse. To some degree, Bush operated in the context of his time — America’s station and development as a nation and the national values of American culture. And in a country where a premium is placed on “doing,” after 9/11, no American president could refrain from taking some kind of bold action.
Yet how and what the president “did” is what matters. One hundred and sixty years ago, the mood of the country was for national expansion. Cries of manifest destiny animated the politics of the nation and the wagon wheels of American settlers pushing West. President James Polk seized on this mood and sought to help shape it through his own policies. One way that he did this was by embarking upon the Mexican-American War. Lincoln, who protested the war from its start, wrote to a colleague: “Allow the President to invade a neighboring nation whenever he shall deem it necessary to repel an invasion, and you allow him to do so whenever he may choose to say he deems it necessary for such purpose, and you allow him to make war at pleasure…If to-day he should choose to say he thinks it necessary to invade Canada to prevent the British from invading us, how could you stop him? You may say to him, ‘I see no probability of the British invading us’; but he will say to you, ‘Be silent: I see it, if you don’t.’” Lincoln was not alone in his protests — Thoreau’s famous Civil Disobedience was penned as he sat in a Massachusetts jail for opposing the war. (Ralph Waldo Emerson visited his friend Thoreau in jail, asking in horror, “Henry, what are you doing in there?” Thoreau calmly replied, “Waldo, the question is what are you doing out there?”) The dissenters’ protestations were borne out: the debate over whether or not to extend slavery into the territory the US acquired through the Mexican-American War helped to foment the Civil War.
The Mexican-American War (1846-48) was ultimately disastrous for the US, although it achieved its primary aim of acquiring territory and slaking the nation’s manifest destiny-inspired fever. Few thought through the war’s consequences. And so it was with Iraq. Bush picked up on the burgeoning midlife crisis and took the country down a course of action which, though resonant with some of the country’s cultural values, was not resonant with the country’s maturation. Context was ignored — in America and in Iraq itself. It is no coincidence that at the beginning of the occupation, Iraq was put under the authority of those who placed too little value on local context. Nor is it coincidental that General David Petraeus’ context-savvy approach has proven much more successful. Context matters, at home and abroad.
Is Bush destined to join Buchanan on the slow train to presidential infamy? It’s highly likely. Few Americans dwell upon their midlife crises as high points, and few will remember the years of the Bush presidency, the time of America’s midlife crisis, as one.
Is this fair? Yes and no. Buchanan did little to prevent the impending national crisis of the Civil War and Bush certainly did little to stymie the country’s midlife crisis. Just as doing is a national value, so are optimism and a willingness to tackle new obstacles with ingenuity — to solve every problem. These values, however, were not much drawn upon in the administration’s response to the crisis. And of course the catalytic event of this crisis, 9/11, would have been very hard, if not impossible, for the Bush Administration to avert. Like Buchanan, Bush was an average or below average president in the most trying of times. Therefore, a great – possibly unfair – amount of hope was placed on Obama that he could be the Lincoln of our times. During his campaign, Obama certainly tapped into the nation’s temper. Cries for change could not have been met with an audience much readier to receive them. While Obama has unequivocally attempted to implement bold policy shifts and programs, the jury must remain out on their long-term effects and consequences.
When we emerge from a midlife crisis we can rely upon the stable, established parts of our life (our job, family, friends, personal character, etc.) to overcome the personal crisis. Moreover, having weathered the crisis we emerge seasoned, with new strength, wisdom, and maturity. The same may be true for a nation. Just as an often black and white, Manichean perspective and action for action’s sake (despite the potential consequences) have been hallmarks of American culture, so have egalitarianism, a focus on a better future, and optimism. The country must draw upon values such as these in order to successfully move beyond its midlife crisis and into the next stage of its development. Indeed, there are many signs that the US could be emerging from its midlife crisis and growing into the steady and resolute maturity that many in the US — and beyond — look forward to.
Many saw Obama’s election as a symbolic, if not actual, first step in this development or rebirth. Breaking America’s longest-held racial taboo, the election of an African-American to the highest office in the country marked a dramatic break from the past and a willingness to embrace change and move ahead, rather than the more backward-looking, navel-gazing qualities that marked its midlife crisis. While domestic policy has been contentious (and the need to address the aftershocks of last autumn’s economic earthquake certainly would have derailed any president’s agenda), more discernible shifts can be observed in the realm of foreign policy. The Obama administration’s greater engagement with international institutions (as well as pariah states such as Iran) and concern for improving the global image of the US — seen most dramatically in Obama’s public diplomacy initiatives and speeches in the Muslim world — are evidence of this. These moves were not ideologically motivated, but rather drew upon the country’s mood and that of its people.
Indeed, when it comes to national policy, the American people may be more sophisticated, mature, and progressive than many of their political leaders. They do not want to withdraw from the world and they do not support unilateral military intervention; they favor international involvement and international organizations; and they would prefer diplomacy and negotiation to military force. While the majority want to end the war in Iraq, they are unsure how the US can disengage militarily with the least harm to both the US and Iraq — but it is clear that they want American troops to return home. A similarly nuanced portrait could be drawn of the public’s attitude towards Afghanistan.
This was also the situation in the late 1960s and early 1970s when it came to the Vietnam War. The majority of Americans were united in the firm belief that the war must end, but there was a great deal of uncertainty as to how that might be done. Yet when the Vietnam War ended, the US remained fully involved in international affairs, it expanded its involvement in international organizations, and today the US has granted Vietnam “most favored nation” status.
At times in the past, Americans have longed to withdraw from the brutish and tragic aspects of world politics and retreat into some form of isolationism following a costly military engagement, and yet just the opposite seems to be true today. Furthermore, although Americans are not fatalistically cynical, they are also not blindly utopian idealists. They seem to understand that involvement may mean sacrifice and compromise, but that it also requires values and ideals to guide public policy. This is a remarkable indication of a more mature and realistic attitude among Americans toward the US role in the world community. As a nation, we have come to realize that it is impossible to disengage from the international system of nations. In a 2007 Pew poll tracking 20 years’ worth of American political values and “core attitudes,” nearly 9 in 10 Americans asserted their belief that it is best for the US to be active in international affairs, a number that had been consistently high since the poll’s inception. Despite the wars in Iraq and Afghanistan, frustrations with Iran and North Korea, economic tensions with China, and palpable American unpopularity around much of the world, this number had barely declined since the question was last asked in 2003. The most recent (2008) report of the Chicago Council on Global Affairs – which has been asking a similarly worded question since 1974 – also found the majority of Americans in favor of an active role in world affairs, as they had been whenever the Council polled on the question over the last 36 years. If this number were “soft” — i.e., if Americans did not feel strongly about it — we would expect it to drop precipitously given all of these frustrations; we would expect Americans to want to retreat behind their two oceans. Yet the numbers have not declined, showing that Americans are willing global citizens.
Similarly, the Chicago Council’s Worldviews 2002 survey found that 41 percent of the problems Americans mentioned as facing the nation were related to foreign affairs. Certainly people in other countries might find this percentage fairly low in light of the impact the United States has on the rest of the world. However, one must bear in mind that in 1998 this number was just 7 percent. The trend has continued: the number of Americans who take an active interest in foreign affairs news has increased markedly over the last several years, and is statistically greater than the respective numbers of Chinese, South Koreans, and Indians who follow foreign affairs.
Americans as a whole are also moving toward a multilateral rather than unilateral global posture, even if this posture has not yet fully “bubbled up” to the upper echelons of government. In 2007, the lowest percentage of respondents in the 20 years of Pew Values Surveys agreed with the statement “The best way to ensure peace is through military strength.” There is strong support for many multilateral treaties and organizations. According to the Chicago Council’s 2008 polling, over 70 percent of Americans support an international treaty to address climate change and greenhouse gases. Nearly 90 percent support the Nuclear Test Ban Treaty. Around 70 percent support US participation in the International Criminal Court. And majorities favor giving the UN more powers in a wide variety of areas.
The dichotomy between the people and the government of the US, which many international observers have noted, is often real. Yet, just as someone going through a midlife crisis must eventually make their actions appropriate to their age in order to pass through the crisis, the US government must better reflect the majority sentiments of its people in order to pass through its own midlife crisis. Obama’s increased multilateralism may be reflective of such a trend. In the next decade, political leaders who focus on the future and share these more progressive, internationalist views are likely to win the support of the American people. While fear may have motivated the either/or, dualistic, melodramatic national mind-set after 9/11, it appears that Americans are returning to their traditional optimism and that they are seeking leaders who will restore the moral confidence of the American people.
If there has been a regular oscillation between international involvement and a tendency to withdraw behind the two oceans, that oscillation has probably ended with the Iraq War. The United States is politically, economically, and militarily interconnected with the rest of the world, and it is now impossible to maintain a policy of noninvolvement in international affairs — or to conduct a foreign policy based on unilateral militarism. Polls clearly show that the national image held by most Americans has indeed become more sophisticated; the country expects its leaders to be actively involved in the world. As the country begins to assume this role, its midlife crisis will pass, and it may look forward to years of a proud, steady, even rarefied, maturity. What kind of president Barack Obama is remembered as may well rest on his ability to fully tap into this era’s mood and make the country’s direction match the vision of its people. If he can do this, his presidency will be judged not against the trajectories or comparative polling numbers of his predecessors, but on its own unique merits.
 “Presidential Approval Ratings,” Roper Center Public Opinion Archives, http://webapps.ropercenter.uconn.edu/CFIDE/roper/presidential/webroot/presidential_rating.cfm.
 Andrew F. Smith, “The first suppers: A tradition of inaugural meals,” Los Angeles Times, January 14, 2009.
 See, inter alia, Daniel Yankelovich, New Rules: Searching for Self-Fulfillment in a World Turned Upside Down (New York: Random House, 1981) and Edward C. Stewart and Milton J. Bennett, American Cultural Patterns: A Cross-Cultural Perspective, Revised Edition (Yarmouth, Maine: Intercultural Press, 1991).
 Arthur M. Schlesinger Jr., “Folly’s Antidote,” The New York Times, January 1, 2007.
 Cited in Howard Zinn, A People’s History of the United States (New York: HarperCollins Publishers, 2003), p. 299.
 Letter to William H. Herndon, February 15, 1848, cited in Paul M. Angle and Earl Schenk Miers, The Living Lincoln: The man and his times, in his own words (New York: Barnes & Noble Books, 1992), p. 110.
 Pew Research Center, Trends in Political Values and Core Attitudes: 1987-2007, p. 19.
 Chicago Council on World Affairs, “Worldviews 2002: American Public Opinion & Foreign Policy,” p. 11.
 The Chicago Council on Global Affairs, “Global Views 2006,” p. 6.
 Forty-nine percent agreed, 47% disagreed. Pew Research Center, Trends in Political Values and Core Attitudes: 1987-2007, p. 21.
 The Chicago Council on Global Affairs, “Global Views 2006,” pp. 16-18.
American Studies Today Online is published by the
American Studies Resources Centre, Aldham Robarts Library, Liverpool John Moores University, Maryland Street, Liverpool L1 9DE, United Kingdom.
Tel 0151-231 3241
The views expressed are those of the contributors, and not necessarily those of the Centre or the University.
© Liverpool John Moores University and the Contributors, 2010
Articles and reviews in this journal may be freely reproduced for use in subscribing institutions only, provided that the source is acknowledged.