INTRODUCTION
Insanity Defense
“The House believes the war on terror has been its own worst enemy.” I was asked to argue this proposition before the Oxford Union for a debate on October 28, 2018. At first I was uncomfortable accepting the invitation. The topic was provocative. The setting—the world’s oldest and most prestigious debating society—was intimidating. But I screwed up my courage and joined the team arguing for the proposition. As I worked through my presentation and supporting arguments, I realized how central the issue is to explaining America’s mistakes in national security policy. The culprits are hubris, complacency, and an inability to comprehend a world in which others may reject our political values and economic model.
Everyone knows that the definition of insanity is doing the same thing over and over and expecting a different result. My work in the defense and intelligence space spans more than three decades, and I am vexed by the fact that policies designed to protect America are actually making us less safe. I call this “insanity defense”: doing the same thing again and again and expecting it to enhance our security. This book chronicles how four administrations have failed to confront some of the toughest policy issues, and it suggests achievable policy fixes that can move us toward a safer future. It is also an account by someone who was, to paraphrase the score from the musical Hamilton, in the rooms where it happened.
Consider the track record of the past thirty years:
- Slashing defense and intelligence spending at the end of the Cold War without a strategy for what the world would look like, then defaulting to military force repeatedly after 9/11 with increasingly dismal results.
- Blowing off multiple terrorism warnings and then creating a homeland security apparatus that successive presidents and Congresses neglected and misused. If 9/11 was a wake-up call, then COVID-19 is a five-alarm fire for revamping how we prepare for and respond to security threats.
- Running the intelligence community on a 1947 business model, reforming it after the Iraq debacle, then undermining it through repeated purges of experienced career leadership.
- Ignoring the Constitution and the Geneva Conventions when detaining and interrogating so-called enemy combatants in the name of preventing the next attack; when faced with the ugly consequences, failing to enact a sustainable legal framework.
- Failing to use the lawful tools available to prevent post-9/11 terror attacks, then adopting a massive extrajudicial domestic surveillance program; now letting recently adopted legal provisions lapse, potentially leaving America dangerously exposed.
- Allowing successive presidents to ignore constitutional checks and balances, most egregiously after 9/11, with military operations, drone strikes, and arms sales launched without congressional approval or oversight.
- The Congress, weakened by toxic partisanship, enabling its own demise as a coequal branch of government by failing to replace the 2001 Authorization for the Use of Military Force (AUMF) and conceding other powers to the executive, all at a time when bipartisan consensus and action are needed to take on America’s hardest problems.

* * *
For most of this period I was there: as witness, legislator, exhorter, enabler, dissident, and, eventually, outside advisor and commentator. Before 9/11, I confronted the impact of defense and intelligence cuts on my aerospace-dependent congressional district—and, ultimately, on America’s national security capabilities—in an era when domestic concerns dominated the political agenda. After 9/11, I was a leading congressional voice on intelligence and counterterrorism issues. I had the opportunity to play a role in the Bush administration’s efforts to fashion a new security architecture in the years following the attacks. When bipartisan cooperation was still valued, President George W. Bush made a point of cultivating a group of us on the Democratic side. I was invited to high-level meetings and, as the ranking member of the House intelligence committee (that is, the most senior Democrat on the committee, which at the time was controlled by Republicans), received the most-classified briefings. I crafted key legislation with Republican counterparts, creating new government structures for homeland security and national intelligence. I also provided bipartisan support for some policies and approaches—on detentions, surveillance, and military intervention—that I later came to regret.
After leaving the Congress, I continued to engage on these issues from my perch as president and CEO of the nonpartisan Woodrow Wilson International Center for Scholars. There I succeeded my friend and mentor Congressman Lee Hamilton, who co-chaired the 9/11 Commission from what later became my office. As an observer, analyst, and in some cases advisor, I saw the Obama administration attempt to “turn the page” on the war on terror while continuing with many of the same approaches—if not policies and people—as the Bush-Cheney administration: an escalating drone assassination campaign, a military “surge” in Afghanistan, and a policymaking approach that, in conjunction with a swollen White House staff, marginalized the administration’s own cabinet departments along with the Congress. And despite the rhetoric about ending “endless wars,” the Trump administration followed roughly the same playbook as its predecessors: more drone strikes and Special Forces raids, continued use of the detention camp at Guantanamo Bay Naval Base, and more troop deployments and arms sales to support autocrats in the Middle East.
The political and strategic entropy is real and cannot be ignored. The reasons are varied and complex, in some cases going back generations. American leaders didn’t realize soon enough that the institutions and habits formed during the Cold War were no longer effective in an increasingly multi-power world transformed by digital technology and riven by ethno-sectarian conflict. Nations that became rising centers of economic and political power, freed from the fear of the Soviets, no longer deferred to America as before. Yet we settled into a comfortable, at times arrogant, position as the lone superpower or “indispensable nation.” At the same time our governing institutions, which had stayed resilient, however imperfectly, through multiple crises, began their own unraveling. Our post–Cold War miscalculations and vulnerabilities were exposed traumatically on September 11, 2001, and have not been fundamentally addressed in the years since.
* * *
To be sure, I have some great stories to tell. But this is not a typical political memoir or retrospective. Arguably, the time for that book was soon after I left the Congress in 2011, when memories were clearer, passions were hotter, and leading personalities (and culprits) were still in office. This is a story of people—leaders who strove to do their best under complex circumstances but were too often undermined by personal, ideological, or bureaucratic blind spots. In many respects, it is also a story of institutions: their cultures, their processes, and, too often, their inability to adapt from an industrial-age analog mindset to our digital world.
The U.S. intelligence community (IC) was the focus for most of my time in the Congress.1 I have great affection and respect for those toiling in the shadows to protect our country, often at great personal risk. I pushed to ensure they had sufficient funding, tools, and authority to do their jobs, and pushed back when they were scapegoated unfairly, including after 9/11. But I also became disillusioned as legitimate requests by the Congress were ignored and trust was shattered. Until recently it was the political left that tended to be most skeptical of America’s intelligence agencies, due to Cold War–era abuses. A large segment on the right still believes a “deep state” conspired against President Trump. Our IC is the so-called tip of the spear in confronting threats against us, everything from terrorism to great power competition to global pandemics. We undermine the IC at great risk to America.
By necessity, presidential power will always grow during times of conflict and crisis. Temporarily diminished by the end of the Cold War and by the distractions of partisan investigations and impeachment, executive power would rebound with a vengeance after 9/11. I watched up close as multiple administrations operated with an expansive view of the president’s authority as commander in chief. The most egregious executive branch abuses of the Bush-Cheney era were pared back by Congress over time. But a new administration promising “hope and change” would cling just as tenaciously to executive prerogatives with respect to warmaking, counterterrorism, and secrecy.
I spent seventeen years in the Congress as an elected representative and, much earlier in my career, five years as a lawyer and committee staff in the Senate. My time on Capitol Hill is the source of some of my dearest and most enduring friendships with members of both political parties.
Article I of the Constitution provides Congress significant authority and responsibilities in national security, too often unexercised. Notwithstanding some genuine bipartisan achievements before and after 9/11, Congress’s role in national security has succumbed to the toxic and divisive forces that began to permeate electoral politics in the 1980s and 1990s. Years later we still lack a coherent, comprehensive, and politically sustainable legal framework for dealing with surveillance, detention, and interrogation of terror suspects. Guantanamo is still in business, and the main perpetrators of 9/11 remain there in tropical captivity, untried and unconvicted.
* * *
This book is an explanation, from one participant’s informed—but hardly omniscient—perspective, of how we got here: the series of fits and starts, insights, and misjudgments that put the United States in the position it is in today. America cannot afford a fourth lost decade while threats continue to rise. Yet, as a government and as a nation, we seem to cycle over and over through the same problems and make the same mistakes again and again; this is the definition of insanity. The chapters in this book discuss seven national security challenges that have been addressed inadequately or, in some cases, barely addressed at all. In addition to critiquing the failures and omissions that brought us to this point, this book suggests politically realistic pathways to make significant progress on, if not solve, these perennial hard problems.
ONE
Paying the Price for Over-Militarizing Security
I walked into Rayburn 2118, the stately hearing room for the House Armed Services Committee. It was February 24, 1993, and I had been in office just a few weeks. As the most junior committee member, I took the last seat on the lower dais, eye level with the top Pentagon brass testifying on the defense budget. The committee’s chair, Ron Dellums, a marine and community activist turned congressman from Berkeley, California, opened the hearing: “For the first time in forty-five years we are in a ‘window of opportunity’ where we do not face a major military threat from abroad.” In the coming months Dellums posed a series of fundamental questions about America’s national security policies and institutions. Were the threats more economic and technological than military? Dellums pointed out that the United States did not have a road map for this post–Cold War world. Indeed, we had no real strategy. We were just “treading water.” Few good answers were forthcoming from either the generals or the members of the committee, including me.
We didn’t realize it at the time, but the lack of such a strategic road map would have disastrous consequences. Massive cuts to the defense and intelligence procurement budgets by the administration of President George H. W. Bush (often referred to as “Bush 41” to distinguish his administration from that of his son George W. Bush, or “Bush 43”) hollowed out America’s aerospace industry and skilled workforce, which are so necessary to maintain military superiority. They also crushed my aerospace-dependent congressional district. A diminished military would not, however, stop the United States from pursuing a series of foreign deployments during the 1990s against countries that posed no serious security threat to America. Over that same period the growing terror threat got inadequate attention. After 9/11, we declared a “war on terror” that resulted in a series of military interventions, none more disastrous than Iraq. The result has been more instability, more terrorists, and calls for yet larger military budgets. At the same time, tools of “soft power” that could bring more of the world to our side were underfunded and undermined.
A saner approach then—and now—would be to live our values as Americans and defeat bad ideas (terrorism and authoritarianism) with good ideas (freedom and human dignity). That would require a significant investment in diplomacy, development, and other means of persuasion and inspiration. America will always need a strong military and a willingness to use it, but we can be much smarter about how much we spend on defense—and where.
* * *
The roots of America’s post-9/11 national security fumbles go back to our response to the end of the Cold War. In many respects the 1990s represented a lost decade. I had a front-row seat and was often in the room as a member of the congressional class of 1992, the first elected after the final collapse of the Soviet Union. When I took office in January 1993 the “peace dividend” declared by the Bush 41 administration was widely touted.
What was a dividend for most of the country turned out to be an economic disaster for my congressional district, which ran along the Pacific coast of Los Angeles County and included many small cities that were home to most of California’s satellite production. We did not have big assembly plants churning out hundreds of vehicles and aircraft. What the South Bay, as the area is known, did produce was some of the most technically sophisticated and sensitive assets of the U.S. intelligence enterprise. More important was the human capital—not so many wrench turners, but quite a few of what I called the “triple PhDs who won the Cold War.” As I told the Los Angeles Times during my 1992 campaign: “This is a community of people terrified about losing jobs who have enormously sophisticated training and skills.”
I had asked to join the House Permanent Select Committee on Intelligence (HPSCI)—a so-called leadership committee, with plum appointments made by the Speaker of the House. I already had a close relationship with Speaker Tom Foley. But he told me he had to give the last remaining seat on HPSCI to a fellow Californian who was senior to me: Nancy Pelosi.
Instead I joined the House Armed Services Committee (HASC)—my second choice, but still important to my district. It was during one such HASC hearing that Ron Dellums posed serious questions about America’s strategic priorities after the Cold War. The questions he posed were the right ones, both back then and, in many respects, still today.
Strategy is not just a set of goals and aspirations. A real strategy sets priorities and makes trade-offs. What is not done is often more important than what is done. During the Cold War our strategy was to defeat communism, a battle that basically split the world into two teams. After the fall of the Soviet Union, a handful of senior officials in the Bush 41 Pentagon, supported by a number of conservative writers and analysts, suggested a rather extreme strategy: seize the historical opportunity to establish untrammeled U.S. hegemony, a global Pax Americana sustained by a military as large and powerful as it had been during the Cold War, if not more so.
But few people in either political party were inclined to go in that direction. Or really any direction, as U.S. foreign policy became increasingly ad hoc and personality-driven. Some in the Congress close to President Clinton pressed hard to intervene in Haiti—so he did. NATO was expanded to encompass many of the former Warsaw Pact nations, right up to the borders of the former Soviet Union. Knowledgeable people have different views on the wisdom of moving NATO eastward. As George Kennan, the original author of the policy of containment, pointed out, this move would certainly exacerbate Russian fears and insecurities. But UN ambassador Madeleine Albright, with her inspiring personal story as a refugee from behind the Iron Curtain, carried the day within the Clinton administration.
Some of us hoped the Cold War could give way to a more noble purpose. Military force could be used for the sake of good in places where ethnic cleansing and other human rights abuses violated core moral values. A talented young war correspondent named Samantha Power became one of the most forceful champions of what would later be codified as the foreign policy principle of “responsibility to protect.” But as a practical matter, European allies had to be on board with any proposed military action, and the risk of casualties had to be close to zero. The Balkans would become the scene of U.S.-led military interventions—albeit limited to airpower and conducted well after the worst atrocities were committed. The genocide in Rwanda was left to burn itself out, much to the later anguish of a young State Department official named Susan Rice. At the time I largely supported Clinton’s foreign policy—from NATO expansion to the Kosovo air campaign—though with lingering reservations.
In the absence of real strategy and with defense budgets shrinking, congressional and Pentagon leaders chose to focus on raw numbers: personnel, weapons platforms, and pork. The assumption was that future conflicts would resemble a scaled-down version of the Gulf War (which ended in 1991) or possibly the Korean War (which ended in 1953), with lots of tanks, fighters, and warships doing a better job of what they’d done for decades—fighting other large armies, navies, and air forces. General Colin Powell, near the end of his tenure as chair of the Joint Chiefs of Staff, had endorsed the concept of a “base force” large enough to fight two regional wars at the same time.
Yet it should have been evident even then that the threats to America, indeed the character of conflict, were fundamentally changing. Just one month into my first term, in late February 1993, Middle Eastern terrorists attempted to take down New York’s World Trade Center with a vehicle bomb. Later that year the U.S. public watched in horror as the bodies of American helicopter pilots were dragged through the streets of Mogadishu by Somali militants. International forces were ambushed there using tactics similar to what soldiers and marines would face a decade later in Iraq. Flush with Cold War victory, we failed to appreciate the significance of these events and chose to focus on other, mostly domestic, concerns. The World Trade Center perpetrators were arrested, ultimately convicted, and given long prison sentences. The main lesson from Somalia? No more U.S. boots on the ground, at least in Africa.
In the summer of 1993, Congress dived into debating and approving a defense budget that in constant dollars was about a third less than it had been at the height of the Reagan defense buildup. The chief executives of America’s aerospace and defense firms were summoned to the Pentagon for what was called “the Last Supper,” at which they were informed there would not be enough money to go around to support so many companies—they would have to merge to survive. From about a dozen so-called prime contractors we ended up with six by the end of the decade.1 Such consolidation was necessary, perhaps, but it translated into less competition and innovation.
Under those circumstances, congressional oversight by the House Armed Services Committee devolved into what I called a “widget protection” program. I don’t intend to come across as sanctimonious; I could fairly be labeled a “widget protector” at the time. For example, in 1993 I pulled out all the stops to prevent the Los Angeles Air Force Base from being moved to Colorado, and in 1995 I fought to prevent it from being closed. I advocated for Southern California programs outside my district as well—C-17 cargo planes in Long Beach, the B-2 bomber in Palmdale. (One of my vivid—and somewhat embarrassing—early memories was of trying to climb onto the massive B-2 in a skirt and heels.) Two area firms, Hughes Aerospace and McDonnell Douglas (which built the C-17), were later acquired by Boeing; after practicing some “tough love” toward Boeing when a satellite procurement went off the tracks, I would become known as “Boeing’s mother.”
But good local politics can also produce good national defense. While most of the U.S. military’s major platforms had been recapitalized (that is, replaced with newer, more advanced submarines, combat vehicles, ships, missiles, and aircraft) during the Reagan administration, much of the air force’s mobility fleet—refueling tankers and cargo aircraft—was going on thirty years of service life at the time (and today some tankers are older still, put into service more than a half century ago). And no matter what the future held, America would always need top-flight surveillance and reconnaissance satellites—the kind made in my district. If anything, in a fragmenting world—nations splitting into multiple states, radical movements arising from the ashes—we would need more situational awareness capacity, not less.
Unfortunately, the 1990s saw the United States go in the opposite direction with respect to space capabilities. My district paid the social and economic price then, and our country is paying a strategic price today. Early in the 1990s, the top employer in the South Bay was Hughes Aerospace, with more than 5,000 employees working in El Segundo. In 1988, defense-related projects accounted for 85 percent of Hughes’s revenue. When I accompanied deputy defense secretary William Perry in June 1993 for a tour of Hughes, that proportion had fallen to about 50 percent.
The following year (February 1994) I testified as a witness—unusual for a junior representative—before the House Intelligence Committee. I pointed out that “if action is not taken now to stabilize the intelligence industrial base, we could lose important capabilities that have required years to develop … losses likely to be permanent, because the companies and the workforce that make up this industrial base will not be around when the Federal government decides in three or six or ten years that it needs them again.” Once these workers retired or gave up in frustration, their critically important skill set would go with them—human capital that would take years, perhaps even decades, to reconstitute.
I urged the committee to “consider letting intelligence suppliers commercialize their capabilities, where doing so does not undermine U.S. security.” One example entailed using a common satellite “bus,” or basic model, for both defense and commercial purposes. Then there were exports of U.S. commercial satellites to help keep American aerospace providers healthy. A political scandal later in the decade involving the satellite communications company Loral and information that was revealed in violation of the Arms Export Control Act would prompt Congress to make it extremely difficult for U.S. firms to sell overseas—for export control purposes, commercial satellites were now treated like cruise missiles. According to the Aerospace Industries Association, the U.S. global market share of satellite exports fell from 75 percent in 1995 to 25 percent in 2005.
During my third term, from 1997 to 1999, I finally got the coveted seat on HPSCI. It was an entry into a world that, intellectually, represented the high point of my Hill experience. The United States no longer had to worry about the Soviet Union, but a toxic admixture of other perils had emerged out of its ashes and from the shadows in other parts of the globe. These included Al Qaeda, a transnational terrorist network headed by the charismatic son of a Saudi construction magnate—a group that had taken credit for destroying two U.S. embassies in East Africa the previous August. The growing ideological and security threats from Islamic extremism, which should have been apparent since 1993, were finally getting the attention they were due—at least by HPSCI. The CIA was well into the fight against Al Qaeda—or at least trying to be. But it was apparent even then that the U.S. government overall, including the Congress, was not well organized or, frankly, motivated to deal with a threat unlike any we’d faced before.
* * *
The phone call came sometime in early June 1999. The House minority leader, Richard Gephardt, was on the line. The prior summer I had come up short in the Democratic primary for governor of California, a quest that required me to give up my House seat. In recent months there had been rumblings in California political circles that I might have another go at the seat I had relinquished for the governor’s race, though an April 7 Associated Press headline had mostly quashed them: “Former Congresswoman Will Not Seek Her Old Seat.” I initially assumed Gephardt, with hopes of taking back Democratic control of the House in 2000, was trying to change my mind (again) about running. But after some brief pleasantries he got right to the point: would I be willing to serve as a member of the newly authorized National Commission on Terrorism? I didn’t think twice, accepting immediately—and eagerly.
In Washington, D.C., appointing outside commissions to “study” a problem is often seen as a way to avoid dealing with the problem, especially one that is complex and controversial. I did not see this terrorism commission in that light at all; if I had, I would not have joined. Rather, it was a congressionally chartered response to the 1998 Al Qaeda bombings of U.S. embassies in East Africa. Before leaving Congress, I had seen the intelligence showing Al Qaeda growing in power, reach, and ambition. It also helped that the commission was chaired by a former senior State Department official, L. Paul “Jerry” Bremer, a protégé of Henry Kissinger who had served as ambassador to the Netherlands. A Yale and Harvard graduate, son of the former president of Christian Dior Perfumes, he was known at the time as one of the country’s most talented diplomats. He’d also been a counterterrorism chief at the State Department.
My prior committee assignments on defense and intelligence had schooled me to some degree on the issues involved. But this commission was the deepest dive by any American nongovernmental body into the global terrorism problem since the end of the Cold War. There were other outside commissions doing excellent work relating to terrorism, most notably the panel led by former senators Gary Hart and Warren Rudman. But ours was singularly focused on global terrorism in all its dimensions—ideological, diplomatic, financial—and the readiness of the U.S. government to prevent or respond to a major attack. The commission was a diverse, bipartisan group. (I was the only member who had run for political office.) We got the necessary security clearances, traveled the world, and met with people inside and outside government—U.S. and allied intelligence agencies, academics, NGOs, and more.
While I was working on the commission, the “seduction of Jane” campaign (as I called it) by Gephardt and other Democratic congressional leaders continued, and eventually it achieved its intended result: in December I declared officially my intention to run for my former congressional seat. The early stages of the campaign still left plenty of time to focus on the commission, which wrapped up its work and published the report “Countering the Changing Threat of International Terrorism” in June 2000.
Our report, in retrospect, demonstrated the limitations of even the most expert and sophisticated assumptions about terrorism. We correctly identified the emergence of a new brand of terrorists, who are “less dependent on state sponsorship and are, instead, forming loose, transnational affiliations based on religious or ideological affinity and a common hatred of the United States … [which] makes terrorist attacks more difficult to detect and prevent.” Al Qaeda had effectively declared war on the United States with a 1996 “fatwa,” and shortly after the East Africa bombings, CIA director George Tenet declared war on Al Qaeda in an internal message (though his posture did not extend to the rest of the IC, the Pentagon, the Congress, or the White House).
But the traditional paradigm of state sponsorship of terrorism still loomed large. Ideological groups active in Europe during the Cold War remained very much in memory. Readers of the report will be surprised by the number of mentions of Greece, where there had been 146 attacks against Americans since 1975, only one of which was solved (we recommended that Greece and Pakistan both be cited for “not cooperating fully” in counterterror efforts—prompting angry responses from both countries’ embassies). The most recent horrific attack involving American civilians—besides Oklahoma City—was the 1988 downing of Pan Am flight 103 over Scotland at the direction of Libyan intelligence services. Before 9/11, the greatest killer of American troops and diplomats overseas was Hezbollah and related groups, supported by Iran. They were considered responsible for the bombings in 1983 of the U.S. embassy and Marine Corps barracks in Beirut—killing more than 250 American troops and diplomats—and the 1996 attack on Khobar Towers in Saudi Arabia, which killed nineteen U.S. Air Force personnel being housed there and wounded hundreds of other people. Hezbollah, still an Iranian proxy group, also bombed Jewish and Israeli sites in Buenos Aires in 1992 and 1994, which showed they had the reach to attack the Western Hemisphere.
Iran would be mentioned in the report more than any other terrorism source—thirty-three times (Al Qaeda was mentioned four times). Afghanistan received some attention as the known home of Al Qaeda leader Osama Bin Laden, but the report focused more on the official U.S. list of state sponsors of terrorism. The Taliban was not then recognized as a state. Iraq was not mentioned once.
Four months later, an Al Qaeda–linked suicide bomber on a speedboat struck the USS Cole, anchored in a Yemeni harbor, killing seventeen sailors and nearly sinking the ship. With the U.S. presidential election less than a month away, the response to the Cole bombing would be left to the next administration. The presidential candidates, Al Gore and George W. Bush, issued statements of condemnation and resolve. Then, immediately after the election, nearly all media and government attention turned to the Florida recount and the issue of hanging chads.
In January 2001, the Hart-Rudman Commission issued the third and final installment of its two-year study, “Road Map for National Security: Imperative for Change.” Commissioned originally by the Defense Department, this was supposed to be a comprehensive assessment of all national security challenges. Yet, as Rudman testified after 9/11, the group kept coming back to homeland security and terrorism—rather than conventional military threats—as the overwhelming priority. The report’s first installment, in 1999, said: “Large numbers of Americans will die on American soil, victims of terrorism, in the coming century.” Its final installment shaved the time frame for these deaths to the next twenty-five years.
* * *
September 11 was so catastrophic that referring to it as a crime—which it was, albeit on a massive scale—seemed inadequate. For many people, the most devastating attack ever on U.S. soil had to be an act of war, which called on us to wage war in response. Thus the phrase “war on terror” was coined almost immediately. Apart from being illogical—terror is a tactic—the phrase played into the hands of Al Qaeda by elevating them from criminals to combatants (holy warriors or mujahedin, as they saw it, and as they effectively sold it). The phrase “war on terror,” and more consequentially the mentality that informed it, would prove costly—hundreds of thousands of lives, trillions of dollars, millions displaced, a region shattered, adherents of a major world religion alienated, and other consequences we still can’t predict. It would also deplete the resources and leadership bandwidth needed to protect the homeland—or to pursue diplomatic and political avenues that leaders could use, by working closely with allies and new partners, to make America safer. Over the next decade the fear of another 9/11-style attack would ease. But the war on terror—military deployments and follow-on “surges,” drone strikes, and other “kinetic” measures—would march on apace.
The Congress passed the Authorization for the Use of Military Force on September 14, 2001. It was worded broadly to give the president latitude. Nearly two decades on, as will be discussed later, that AUMF is still being used to authorize a variety of military operations. Afghanistan would turn out to be a long, frustrating, and ultimately unpopular campaign. But there we had little choice but to strike hard with military force—at least initially. The timid U.S. reactions to past terror attacks—Bin Laden had taken note of America’s aversion to any military casualties during the 1990s—demanded a major correction.
To the extent America was going to have a “war on terror” in a militarized sense, Afghanistan is where it should have begun—and ended. But, of course, it would not end there. I believe, though it is hard to prove, that the relative ease of the initial Afghan campaign turned out to be a curse, as it led many Americans to believe that our military was invincible and history was on our side. This attitude led to the invasion of Iraq. A tough, prolonged fight to topple the Taliban in Afghanistan might have spared the United States—and the people of the Middle East—much agony later on.
Intelligence experts define “black swans” as shocking world-historical events: Pearl Harbor, the Yom Kippur War, the fall of the shah of Iran, the Soviet invasion of Afghanistan, and—the blackest of them all—the 9/11 attacks. For all the reforms to intelligence systems—and pledges of “never again”—these surprises are almost impossible to avoid. According to Israeli scholar Martin Kramer, the key is not “in predicting black swans, but in responding to them. Each … is an opportunity to be either seized or wasted.” Kramer concludes that the Bush 43 team “wasted their black swan” following 9/11. With the rest of the world recoiling in horror at what Al Qaeda had done, and with sympathy and support for the United States at an all-time high (even in Iran), the opportunity existed to, in Kramer’s words, “forge a new paradigm out of the wreckage.” Instead of a new paradigm, we got Iraq, waterboarding, surges, and drones.
* * *
The proposition that Saddam Hussein was a serious threat was uncontroversial among national security officials and experts from both parties. The Clinton administration spent billions each year policing no-fly zones over Iraq and launched a multiday bombing campaign in 1998 after UN weapons inspectors withdrew in frustration. Bush upped the ante by citing the “Axis of Evil” (Iraq, Iran, and North Korea) in his January 2002 State of the Union address. Then in September, one year after the 9/11 attacks, the administration issued a new National Security Strategy. Various strategy documents are required by Congress but usually make for soporific reading. Not this time. President Bush declared that the United States “must be prepared to stop rogue states and their terrorist clients before they are able to threaten or use weapons of mass destruction against the United States and our allies and friends.” (Even after 9/11, the administration seemed more focused on state actors—countries with borders and governments—than on the loosely organized transnational groups that had begun to emerge as the principal sources of terrorism.) “To forestall or prevent such hostile acts by our adversaries,” the strategy concluded, “the United States will, if necessary, act preemptively.”
Hence the doctrine of preemption was unveiled—essentially the inverse of the steady, multifaceted containment strategy authored by Kennan that outlasted the Soviet Union. If America lacked a strategy after the Cold War, this was Bush 43’s answer. The problem was, its assumptions were almost all wrong. In the preamble to his National Security Strategy, Bush pledged that any decision for war would be reached only after “using the best intelligence and proceeding with deliberation.” In this respect, what followed in Iraq was a violation of the president’s own doctrine: the intelligence was far from good, and the process was anything but deliberate. But the drumbeat was becoming louder.
The White House pushed for a congressional vote authorizing force against Iraq during the month before the midterm elections. Democratic members with presidential aspirations, especially those in the Senate, would be put on the spot. A number who had voted against the 1991 Persian Gulf resolution—most notably Joe Biden and John Kerry—would ultimately authorize military action against Iraq the second time around. My constituents, as I told the Los Angeles Times, were “much more concerned about the potential suicide bomber next door than they are about what Saddam Hussein may have in store for us in a year or so when he gains nuclear capability.” The government apparatus for securing the homeland was still a mess. As far as I was concerned, the homeland was still the main front against terrorism.
By then I was back on the HPSCI, having won back my seat and regained seniority just behind Pelosi, who was ranking member. Determined to get Iraq right, I did what had come naturally since my earlier days on the Hill: I went to school on the subject. Not only did I study the highly classified CIA analyses available to me as a member of the House Intelligence Committee, but I went to the United Kingdom to learn what their intelligence agencies had to say. The consensus was overwhelming—Iraq had the supplies and facilities to create biological and chemical weapons (and possibly radiological ones), and some delivery mechanisms. These could be easily transferred to nonstate groups to strike the U.S. homeland. With memories of 9/11 still fresh, the inclination of most of the Congress was to err on the side of assuming—imagining, even—too much rather than too little.
Iraq presented a conundrum. Saddam had started two major wars, had attacked multiple neighbors (including Israel), and was a sworn enemy of the United States. For more than a decade intelligence agencies under the Clinton and both Bush administrations concluded that Iraq almost certainly had weapons and materials that could be used to attack the United States. A regional U.S. military buildup, bolstered by bipartisan support from the Congress, provided the best chance of forcing Saddam to disclose fully his clandestine weapons programs. It was also true that any military operation came with great risks—above all that troops might encounter a chemical gas attack as part of a last-ditch bid by Saddam to survive, though hopefully his generals would take things into their own hands at that point. The notion that Saddam would go to such lengths of obstruction as preventing UN inspectors’ access to suspected weapons sites—and thus expose his country to suffering through international sanctions and war—for nonexistent weapons was unthinkable.
We knew deposing the regime meant that we would have to maintain a military presence in Iraq for a sustained period. And it would be expensive—costing billions of dollars, and possibly tens of billions. The administration and most of us in the Congress envisioned a postwar situation akin to that in the Balkans during the mid- to late 1990s. The stabilization phase would be mostly a peacekeeping operation, likely multilateral, using the United Nations and possibly NATO. The administration had been tight-lipped about post-regime plans. But the Bush team—the vice president, secretary of defense, and secretary of state—all had formidable national security chops (two of them had been secretaries of defense in earlier administrations, and a third had been chair of the Joint Chiefs) and reputations as strong managers.
The congressional resolution ultimately negotiated with the White House seemed like a reasonable compromise: the most bellicose language was deleted, and a reference to pursuing diplomacy at the UN was inserted. I told the media outlet The Hill that the new provisions addressed “many of my constituents’ concerns, while also allowing us to address the clear and present threat posed by Saddam Hussein.” This all might sound hopelessly naive and unrealistic, and indeed it turned out to be so. But it represented mainstream national security thinking by serious people at the time. (The far left and isolationist right predictably opposed any military engagement.)
However, not everyone in the establishment bought it. Former Bush 41 national security advisor Brent Scowcroft published an op-ed in the Wall Street Journal opposing a military invasion on the grounds it would destabilize the region and undermine the diplomatic and intelligence efforts against terrorism. He was promptly ostracized by the White House, and he was castigated, mocked even, in the conservative media as out of touch and burdened with a pre-9/11 mentality.
Another figure who did not agree with this view was Senate Intelligence Committee chair (until January 2003) Bob Graham, who represented Florida. Graham conceded the likely presence in Iraq of banned weapons. Although he had been one of only ten Democratic senators to vote in support of the Gulf War in 1991, he feared that using force against Iraq this time would cause the real war on terror—against Al Qaeda and its offshoots—to suffer as a result. Pro-Bush and pro-military sentiments were strong in Florida, but Graham bravely and presciently voted no.
One tragic figure in the lead-up to the vote in the Congress was House majority leader Dick Armey, a Texas Republican. Because he was one of the House’s top Republicans, his support was assumed. However, Armey had his doubts from the outset, and when he was shown the National Intelligence Estimate (NIE) on Iraq, he was underwhelmed (as was Bush 43, thus prompting CIA director George Tenet to say, “Mr. President, it’s a slam dunk”). As we would learn later, the sourcing for many NIE conclusions was unreliable and outdated—stretched to the most alarming possible interpretations. Cheney would eventually lobby Armey personally, showing him additional and more alarming Iraq material (later proved false). Armey’s deep misgivings continued, but he felt he had no choice but to vote yes. We’ll never know if a principled stand by a senior Republican—in the House, Armey was second only to the Speaker—could have slowed the rush to war.
As deputy secretary of defense Paul Wolfowitz reportedly said, the case for Iraq’s weapons of mass destruction (WMD) was the one thing everyone could agree on—it could be sold to allies concerned about Iraq’s perceived violation of multiple UN Security Council resolutions, it was frightening to the American public, and it was most convincing to members of Congress like me. However, those of us who weren’t neoconservatives but still supported the Iraq War resolution failed to consider two core questions of strategy: What if all our assumptions—not just about WMD but also about the motivations of the Iraqi regime—were fundamentally wrong? And, more important, even if those assumptions were mostly right (if, say, stockpiles of chemical and biological weapons turned up), what next? As David Petraeus, at the time a major general, said, “Tell me how this ends.” As we found out later with Afghanistan, whatever the merits of the initial intervention, once a war starts it has a way of perpetuating itself at enormous costs in ways far removed from its original purpose.
I had little interest in the grand designs of the Iraq War’s most ardent supporters. My focus was on how Iraqi WMD could be used to harm the United States. I was impressed by the presentation by secretary of state Colin Powell to the UN Security Council. Shortly afterward I commented to CNN about Powell’s speech: “I think the administration spent too many months using cowboy rhetoric and not enough months putting the facts out.… That’s the right way to build support hopefully for peaceful disarmament in Iraq.… Doing nothing is the worst option at this point.”
* * *
In the months that followed the quick regime change in Baghdad, no weapons of mass destruction were found and the growing insurgency foretold a long, bloody, and costly military campaign. I was encouraged when my friend and fellow commissioner Jerry Bremer was appointed to lead the Coalition Provisional Authority, or CPA (effectively making him the U.S. quasi-colonial viceroy of Iraq). Jerry would later get tarred with the disastrous decision to fire all Baath Party members from the Iraqi government and disband the Iraqi army—dismantling the last remaining cohesive (and respected) institution of Iraqi society. However, very little happened in Iraq without the approval of the Pentagon leadership: the secretary of defense, Donald Rumsfeld, and his lieutenants, deputy secretary Paul Wolfowitz and undersecretary of defense for policy Doug Feith. My law school friend Walter Slocombe was also involved in the CPA and had a major role in designing the de-Baathification policy.
My goal for Iraq during this period was minimal: to reverse a deteriorating situation on the ground, pride and ideals be damned. That meant honoring prewar construction contracts with Saddam Hussein’s government held by France, Germany, and Russia. However, the Defense Department did the opposite—excluding countries that had opposed the war from bidding on huge infrastructure contracts. At that point, given the chaotic situation in Iraq and our failure to find weapons of mass destruction, the United States was in no position to lecture or alienate anyone. The fleeting opportunity to build a new strategy to confront shared threats had been lost.
In September I told CNN: “I want the president to tell us what’s really in store for Americans. How much are we going to pay? What is the possible loss of life going forward? And how is he going to repair the damage to our relationship with international organizations, so that they step up and bear a reasonable share of this?” I later told the Washington Post, “There’s no possible way that we can pay those costs in Iraq.” Yet pay we would: first the $18 billion for Iraqi reconstruction requested by the administration that fall, and then in ever-increasing war-funding requests thereafter.
* * *
I never subscribed to the “Bush lied, people died” battle cry that became popular with many Democrats in the wake of the invasion. Senior leaders in the administration were sincere in their beliefs about Iraq’s WMD program, and most of those beliefs—though not all—were supported by intelligence estimates provided by career professionals. But looking back, it became clear that claims about WMD were simply a means to a long-sought end. The more idealistic neoconservatives thought that a pro-Western democracy in Iraq would have a virtuous liberalizing domino effect throughout the region, crippling the appeal of extremists and even leading to a Palestinian-Israeli peace (“The road to Jerusalem goes through Baghdad” was the phrase used). Whatever the motivation and agenda, the most ardent supporters of regime change were going to have their war no matter what. And we in the Congress let them do it and in some cases helped them.
By 2007, more than 3,000 U.S. troops had been killed in Iraq, tens of thousands had been grievously wounded, and hundreds of billions of dollars had been spent. Iraqi losses and suffering were many times greater. For those of us old enough to remember Vietnam, it was like reliving a nightmare. A debate arose about adding 30,000 ground troops to Iraq (called the “surge”), a proposal championed by a new team including America’s top diplomat, Ryan Crocker, and its savviest general, David Petraeus. I wrote: “A surge in troops may have been a great idea three and a half years ago but it makes no sense now. There is no way to achieve success in Iraq using military force. If, and it’s a big if, stability can be achieved in Iraq, it will only be accomplished by getting buy-in of Sunnis, Shiites and Kurds and making certain that the government is strong enough to act. Neither of these conditions exists now.” I endorsed the idea put forward by Les Gelb, past president of the Council on Foreign Relations, and Senator Joe Biden to create three autonomous regions inside the country for Sunnis, Shiites, and Kurds. I saw a glimmer of hope, strangely enough, in the Bosnia example. As I explained, “After brutal ethnic cleansing following the death of Tito, Yugoslavia separated. People now think of the Balkans as a reasonable success story or at least not a failure.”
Later, after violence in Iraq declined dramatically, administration officials and senators including Lindsey Graham and John McCain would berate those like me who had opposed the surge. It certainly was far more successful militarily than prior performance in Iraq would have led us to expect (even its biggest supporters considered the surge a gamble). The surge bought some time and space for the Iraqi government to get its act together and for the different factions to reconcile. But that didn’t happen. The collapse of Iraqi security forces in the face of an ISIS assault years later showed how shallow and fragile the progress was. It also showed how stability depended on a large and open-ended U.S. presence (McCain mentioned during his 2008 presidential run that it might need to last “100 years”)—a presence that would be unacceptable to most Americans and their elected representatives. Iraq in the first decade of the twenty-first century was not Germany or Japan after World War II, or Korea in the 1950s.
Barack Obama said when he ran for president in 2008 that he didn’t just want to end the enormously unpopular Iraq War, “but I want to end the mindset that got us into war in the first place.” That mindset was an inclination to use military force—or other violent “kinetic” instruments, such as targeted assassinations—as the principal means of dealing with difficult security challenges. (Ironically, Obama himself would succumb to this mindset on more than one occasion after becoming president.)
* * *
One of the most cited, and politically most damning, arguments against U.S. military action in Iraq was that it fatally undermined the “good war” in Afghanistan—the campaign directly tied to the 9/11 attacks. During his presidential campaign Obama, trying to shore up his national defense bona fides against a genuine war hero, John McCain, promised to increase troops in Afghanistan as he withdrew from Iraq. But in clamoring to give full attention to the “good war,” too many leaders, Democrats especially, succumbed to the same militarized instincts that got us into trouble in Iraq in the first place. Because a significant increase in U.S. troops in Iraq in 2007 dramatically lowered levels of violence through the use of counterinsurgency tactics, it was tempting to believe another surge could do the same in Afghanistan. When President Obama took office, he was trapped in two ways: by his own campaign rhetoric and by the Pentagon top brass’s push for escalation.
The seven grim years since 9/11 had convinced me that simply pouring more U.S. forces and massive aid projects into these fractured societies was never going to work. In January 2009, before Obama’s inauguration, I told CNN: “I don’t think inserting 30,000 troops or spending $30 billion to $50 billion is going to turn the corner on this problem.… What will turn the corner on this problem is if the Afghani government cleans up its rampant corruption.… If they are not willing to do it, and on present facts they are not willing to do it, there’s no way inserting any number of troops will win any kind of victory there.”
The original justification of the Afghanistan campaign was to punish the perpetrators of 9/11 and their allies while ensuring the country would not become an Al Qaeda sanctuary again. But since that time the nature of the terror threat had changed under our noses. The main terrorist threat to the U.S. homeland had morphed from a centralized Al Qaeda into a loose horizontal affiliation of groups in Yemen, in the Horn of Africa, and in Iraq after the U.S. invasion. Sending tens of thousands more troops—the total number would reach more than 100,000 at the height of the Afghan surge—did little to address the evolving terrorist threat to Americans at home. A bigger military footprint in Afghanistan might have been stabilizing in the first years after the Taliban fled, and perhaps might even have made a lasting difference. But not later.
I was not willing to write off Afghanistan. The United States could not afford a humiliating defeat there, something akin to the chaotic helicopter evacuation of Saigon in 1975. Millions of girls and women had begun attending school and were joining Afghan civic life in ways forbidden under the Taliban (though the harshest sharia rules and punishments still held sway in parts of the country). I was not on the House committees—Armed Services, Foreign Affairs, or (after 2006) HPSCI—with direct oversight of the Afghan campaign. But I kept well informed from a variety of official and unofficial sources.

One of the latter was an extraordinary woman named Sarah Chayes. She first went to Afghanistan as a reporter for NPR, then led an aid project in Kandahar, and eventually became a civilian advisor to the chair of the Joint Chiefs of Staff. Sarah also happened to be the daughter of my Harvard Law School mentor, Abram Chayes, one of the leading international lawyers of his generation. Her mother, Antonia Handler Chayes, an impressive lawyer in her own right, was the first woman undersecretary of the air force, a post she held during the Carter administration. Unlike other Western reporters and aid workers, Sarah lived and worked directly with the Afghan people in a women-run cooperative. She had no security detail. Feeling much affection (and responsibility) toward her family, I was worried about what might happen to her—a kidnapping for ransom, or worse.

I’ll never forget visiting her during one of my trips to Afghanistan. We sat around with some of the village elders. Sarah, who spoke Pashtu fluently, was the translator when I asked the elders if she was safe in their community. With some umbrage, they told me that Sarah was their “sister” and would be protected. Still, it was reported that Sarah slept with a Kalashnikov rifle by her bed for a couple of years.
Sarah’s experiences on the ground outside the bubble of the capital, Kabul, provided unique insight into the biggest enemy in Afghanistan: not the forces of the Taliban, as dangerous and vicious as they could be, but corruption. Average Afghans had such terrible experiences dealing with the government at the local, provincial, and national levels that they felt little allegiance to their leaders. Sarah spent a fair amount of time around the ruling Karzai family, and she saw them receive packages of $100 bills from the CIA. They and other Afghan elites eventually received billions of dollars of Western largesse, much of it spent on lavish mansions (de facto escape pods) in the United Arab Emirates and other Gulf monarchies.
A decade and thousands of American casualties later, we are still in Afghanistan, albeit with a vastly smaller force. The Trump administration has negotiated a “peace deal” with the Taliban, and U.S. troop levels are being drawn down further. Many, including me, are skeptical that this deal will succeed.
* * *
Other Obama military interventions would follow the Afghanistan surge—first in the skies over Libya to depose Muammar Gaddafi, then in Iraq and Syria to beat back ISIS. Though each operation had some merit, the cumulative outcome was more use of U.S. military force to address what are fundamentally political problems in the Middle East—between Sunnis and Shiites, between Islamists and secularists, between authoritarians and democrats. During the 2010s, which turned out to be another lost decade, other important U.S. interests and instruments of U.S. power continued to be shortchanged. The Obama administration later pledged to pivot toward Asia and away from the focus on terrorism, but the cycle of military involvement in the Middle East—a geopolitical version of the interminably repeating events in the film Groundhog Day—would continue. President Trump came to office pledging to stop the “endless wars.” Instead, his impulsive, process-free approach—pulling U.S. forces from Syria without consulting allies, ordering the drone assassination of a top Iranian military leader—only created more instability.
* * *
The return to “great power competition” with Russia and China—as stated in the Pentagon’s 2018 National Defense Strategy2—has been the justification for another surge in defense spending. The regions and adversaries may change, as do the weapons and equipment, but the militarized character of American statecraft does not. I’ve seen the Pentagon—with firm support from the Congress—continue to spend astronomical sums on conventional weapons systems, including fighter jets, bombers, helicopters, and large surface ships. The air force’s biggest procurement priority in recent years has been the F-35 Joint Strike Fighter, which will cost approximately $400 billion to buy and hundreds of billions more to operate. The U.S. Navy’s most sacrosanct program is the Ford-class aircraft carrier, averaging $10 to $15 billion per ship (not counting the air wing).
There is no shortage of smart, visionary people in our military who could reimagine and redesign defense spending. The question is whether the White House or the Congress will let them. For example, the Defense Advanced Research Projects Agency (DARPA) has developed an underwater drone that can operate for sixty to ninety days at a small fraction of the cost of a conventional crewed attack submarine. The new chief of naval operations has advocated for a more realistic and affordable mix of crewed warships and autonomous vessels such as the ACTUV Sea Hunter and the Orca extra-large unmanned undersea vehicle.
There is a U.S. Air Force–led effort to connect all defense platforms over one digital network. The network would allow the various branches of the military to constantly share, sort, and analyze information about enemy threats, new targets, friendly movements, the weather, or anything else relevant to modern combat. Putting the network together and then keeping it secure from hacking or jamming is a huge technological challenge. This new project is being met with skepticism, even fear, in parts of the Pentagon and Congress because it would shift the emphasis (and some funding) away from crewed aircraft to the new network.
In the past, government-funded R&D would fuel innovation in the commercial sector (two good examples are the internet and satellite navigation). My priority in the 1990s was getting traditional defense contractors to adapt their skill sets and product lines for “dual use” products. Today, technologies with military applications may start out in the commercial tech sector and be repurposed for defense. Internet giants are becoming defense and intelligence contractors (Amazon for cloud hosting, Microsoft for high-tech combat goggles). The Air Force and National Geospatial-Intelligence Agency leadership rely on inexpensive commercial satellites—some as small as 100 pounds—instead of billion-dollar purpose-built military satellites.
Given the pace of technological change, we can no longer spend ten to twenty years developing major weapons systems that may be obsolete by the time they are produced. This is where Congress—or its authorizing committees, anyway—has been helpful in providing new “rapid acquisition” authorities. These allow the military to skip the usual steps and acquire prototypes, “fail fast” if need be, and continuously upgrade. The Pentagon is getting on board, but the habits of risk aversion—no one wants to see another headline about $800 toilet seats—die hard within the acquisition bureaucracies.
* * *
Getting the military right-sized and buying the right things does not address the more fundamental shortcomings of U.S. statecraft. In many cases, they reach back to the end of the Cold War. During the 1990s, the U.S. Foreign Service stopped hiring new officers, and the U.S. Information Agency, which was responsible for public diplomacy, was folded into the State Department. Budgets for diplomacy increased after 9/11—though not nearly at the scale of defense and intelligence budgets. But much of this new funding was consumed by security requirements in Iraq, Afghanistan, and other unstable posts like Libya. The Trump administration slashed these important “soft power” functions. Also absent was fresh thinking. The State Department personnel system is designed principally to staff embassies, and many more of them have opened since the end of the Cold War. America will always need foreign missions and talented diplomats to staff them. But many of the most relevant bilateral and international interactions are taking place outside official channels through commercial, research, social, and cultural exchanges between nongovernment groups and persons, many of them online. Various attempts at bringing public diplomacy into the twenty-first century—in particular, the Global Engagement Center, whose task is to counter disinformation by everyone from ISIS to the Russians—have had mixed success. Ultimately the real war on terror will be won in the hearts and minds of young people across the world, Muslim and otherwise. For too long their views of what America stands for have been distorted by a succession of U.S. military actions (including drone strikes).
In many respects we’ve come full circle. Nearly twenty years ago, America’s formidable conventional war machine could do little to prevent a handful of radicalized militants using box cutters from bringing our country to a standstill. In 2020, after trillions of dollars in military expenditures and multiple wars, a virus originating in a Chinese “wet market” inflicted even more economic and human damage. Overcoming the most lethal threats of the twenty-first century—at least those threats that pose the greatest risk to the health and well-being of the average citizen—will require staying the itchy trigger finger of militarized statecraft. Ultimately, achieving true security will require embracing a broader “whole of government” and “whole of nation” set of tools that reflects the full strength of America.