1
1990: Houston
Cowboy Cosmopolitanism and the End of History?
For a new breeze is blowing, and a world refreshed by freedom seems reborn; for in man’s heart, if not in fact, the day of the dictator is over.
—GEORGE H. W. BUSH, INAUGURAL ADDRESS, JANUARY 20, 1989
On Monday morning, January 1, 1990, the president of the United States and his wife awoke in Suite 271 of the Houstonian, a 22-acre resort in Houston, Texas, that served as their official voting address. The plush setting, with 66,000 square feet of athletic facilities in the shadow of Houston’s gleaming office towers, epitomized the city’s—and the president’s—cowboy cosmopolitanism, mixing Texas pioneers’ traditional, hardscrabble, frontier values with the glitzy, gilded, boom-time sensibilities of the Eighties. Charging $12,500 in initiation fees, the swanky spa was where God would live “if he’d been rich,” locals joshed. The president’s enemies sneered that between his ancestral Kennebunkport, Maine, estate and his years in Washington, America’s chief executive was using this $550-a-night hotel address as both tax dodge and subterfuge, helping this Connecticut Yankee aristocrat impersonate a Texan.
December 31 began for George H. W. Bush with a quick flight to San Antonio, to visit soldiers wounded in the invasion of Panama. He had launched Operation Just Cause eleven days earlier to depose the drug-running, election-sabotaging Panamanian dictator Manuel Noriega. That afternoon, the president golfed at the exclusive Houston Country Club. The game ended ominously with a missed putt.
As the president returned to his hotel suite, Houstonians were planning splashy celebrations to ring in the 1990s. Nearby, the Yellow Rose Carriage Company was offering horse-drawn tours around the Houston Galleria Mall, the gleaming campus of consumption that was one of America’s ten largest malls. Modeled after Milan’s glass-domed nineteenth-century Galleria Vittorio Emanuele II, the mall covered 1.6 million square feet, featuring Neiman Marcus as its luxurious anchor, a skating rink, an office tower, and two hotels.
Across town, the Washington Avenue Showbar was hosting its “New Year’s Eve Party of the Decade,” a “going-away party for Ronnie, Nancy, Ollie, Jim & Tammy, Oprah & Geraldo … and the other crazies of the ’80s.” Houstonians were finishing a wild decade. As of January 1990, Americans were enjoying an eighty-six-month economic expansion that the Reaganaut economist Martin Anderson called “the greatest economic expansion the world has ever seen.” More remarkably, America was winning the Cold War, burying the fundamental assumption that the United States and the USSR would be facing off for centuries. Texan swagger seemed appropriate for the entire country: richer, safer, more confident than anyone would have dared imagine a decade earlier.
Houston We Have Problems
Still, for all the progress, Americans’ fantasy that they were free of history and headaches was overblown. The oil glut that helped trigger the Reagan boom nationwide ruined many Texans. Salaries drooped as the crude oil price dropped—from $37.42 per barrel in 1980 to $18.33 in 1989. “The Capital of the Sunbelt” also suffered as Ronald Reagan’s budget cuts reduced federal aid to cities by two-thirds, from $37.3 billion in 1980 to $12.1 billion a decade later. With housing prices sagging and jobs disappearing, Houston was troubled.
Desperate and clever, Texans started modernizing. Houston was more than crude cowboys and speculating oilmen. Hosting NASA’s Mission Control at the Johnson Space Center, the nation’s largest concentration of petrochemical processing plants, and Rice University’s sophisticated medical facilities, “Space City” evolved into yet another white-collar urban R and D center. During the decade, Texas grew twice as fast as the rest of the nation.
America’s fourth most populous city, with 1.6 million people, Houston was one of its most sprawling, nearly half the size of Rhode Island. Reflecting modern urban America’s two great shames, H-Town’s poverty rate hit 15 percent in 1990, and crime jumped by as much as 29 percent during the 1980s. The crack epidemic tripled the number of cocaine users nationwide from 1986 to 1989, reinvigorating America’s post-1960s crime wave. More than twenty thousand Americans were murdered annually, often in crack-related disputes or frenzies.
President Bush unveiled a $1.2 billion crime package in May 1989. But fear and resignation prowled too many streets. “When people are afraid to walk out of their houses, between sundown and sunup, it’s a big problem and to ignore it is a political mistake,” recalls Al From, who was challenging fellow Democrats to restore party credibility by fighting crime seriously.
A decade later in 1999, Houston’s homicide rate would be down by 63 percent from its peak. Statewide, despite a population that would grow 25 percent, the number of crimes would drop more than 20 percent. Effective policing initiated by Houston’s Lee Brown, among others, worked. Brown would become New York’s first African American police commissioner in January 1990. Prosperity, an aging population, more police on the streets, more criminals in jail, and, most important, fewer crackheads helped too. More broadly, America’s “recivilization” during the 1990s, as the Harvard neuropsychologist Steven Pinker calls it, would reduce the number of violent assaults against the body nationwide, although many feared mounting assaults against the soul.
The wave of Hispanic immigration continued, doubling the number of Hispanics in America from 1990 to 2010, reaching 50 million, 1 in 6 Americans. In absorbing what one demographer would call “an entire Venezuela’s worth of Hispanics,” America’s capacity for diversity expanded exponentially. Jesse Jackson’s rainbow rhetoric of the 1980s became a demographic reality of the 1990s nationwide. In 1980, 64 percent of Houston was non-Hispanic white; 19 percent was black; 15 percent was Hispanic. By 1990, only 57 percent was non-Hispanic white and 18 percent was black, while Hispanics had grown to 21 percent of the city’s population. Many of those Hispanics also accounted for the 13 percent of Houstonians who were foreign born.
A mass of “illegal” immigrants, ranging from 4.5 million to as high as 13 million people nationwide, posed social, political, and ideological challenges—even if advocates prettified the phenomenon by calling them “undocumented” or “unauthorized.” Texas had at least 438,000 illegals in 1990. The number would more than double to over 1 million in 2000.
America in the 1990s would welcome more immigrants than ever. Both the authorized and unauthorized immigrants enriched and enlivened America. Still, a functioning democracy could not have millions of residents living in the shadows. A proud nation could not have the rule of law flouted and its borders violated. Individuals in a democracy could not be marginalized, exploited, abused, and perpetually branded illegal, with the economy addicted to these shadow workers who earned less than citizens for menial jobs.
For all these challenging changes, a charming traditional streak persisted, even in this muggy, traffic-choked, oil refinery–smelly, sprawling, corporate tower–dominated modernist city. Many Houstonians’ New Year’s Day celebrations would include black-eyed peas, a Southern talisman for good luck, perhaps because the beans swelled when cooked, suggesting expanding horizons for the New Year, perhaps because the humble dish was the rare food marauding Northern troops did not bother seizing during the Civil War.
The president and First Lady had a more pedestrian takeout Chinese dinner by candlelight. “We were the earliest people in bed in America, I think. Nine o’clock, reading in bed,” Barbara Bush reported. She and the president hadn’t “seen midnight” in forty New Year’s eves. In 1980, this big-boned, no-nonsense, matronly woman’s down-to-earth Waspy refusal to dye her hair had prompted mean gibes that the fifty-six-year-old George Bush was campaigning with his mother. Now, her authenticity fed her popularity as the antidote to the imperious, nouveau riche Nancy Reagan.
From Cold War to World Peace?
Mrs. Bush deflected questions about her New Year’s resolutions by vowing to “give up desserts … until tonight, maybe.” Her husband, who had been outed that Saturday crumbling Butterfinger candy bar bits on his oat bran while standing on the resort’s breakfast line, took the question more seriously. His New Year’s wish, he said, was “Peace. World peace.”
Bush’s formulaic answer seemed heartfelt—and suddenly attainable—that magical New Year’s Day. The nuclear-tinged Cold War conflict between the United States and the Soviet Union itself was ending, smoothly, peacefully, surprisingly quickly. In Prague, 5,450 miles away, the dissident playwright Václav Havel had just been sworn in as Czechoslovakia’s ninth president, only eight months after international intervention freed him from prison. Havel ascended following the lightning-fast, student-spurred, forty-one-day “Velvet Revolution.” This mustachioed modern prophet promised democratic elections within six months and freed thousands of prisoners, to start the post-Communist healing.
Bush’s cautious streak had made him mealy-mouthed throughout 1989. His formidable mother Dorothy Walker Bush had taught him not to gloat. His national security adviser Brent Scowcroft and others doubted the Communist implosion and feared infuriating the Soviets. Scowcroft insisted “The Cold War is not over,” two days after Bush’s inauguration. The president wanted a “very deliberate” foreign policy: “encouraging, guiding, and managing change without provoking backlash and crackdown.” When asked whether the Cold War had ended, the president sputtered: “so I—but if the—in the—I want to try to avoid words like Cold War.”
Meteoric progress throughout 1989 compelled a new tone. Bush had interrupted his post-Christmas hunting trip to send a message promising American support for Czechoslovakian democracy. The White House statement said Havel’s “astonishing” December 29 election “marks a fitting end to a year of astonishing change in Eastern Europe.”
In Moscow, the general secretary of the Soviet Communist Party and the president of the Supreme Soviet, Mikhail Gorbachev, the man most responsible for this peaceful upheaval, was acknowledging 1989 as “the most difficult year of perestroika,” his restructuring and reform program. Stores were empty. Workers were mobilizing. Provinces were restive, with ethnic extremists rioting sporadically. Nevertheless, Gorbachev, who had discussed these changes with Bush in Malta in early December, toasted 1989 as “the year of the ending of the cold war” while predicting: “The 1990’s promise to become the most fruitful period in the history of civilization.”
The giddy crowds popping champagne bottles and launching streamers on Prague’s Wenceslas Square, its cobblestones made slick with spilled spirits, to hail Havel, Czechs’ long-sought freedom, and this new, bold Soviet leader, had much more to celebrate. On Christmas Day, Romanian rebels had executed their Communist dictator Nicolae Ceaușescu along with his imperious wife Elena. Two weeks before that, Bulgaria’s Communist government had approved multiparty elections. Seven weeks earlier, on November 9, 1989, the people of East and West Germany had together dismantled the Berlin Wall, that despised Cold War symbol Reagan had begged Gorbachev to “tear down.”
That July, Bush had traveled to Poland and Hungary. In June, the dissident labor Solidarity movement had swept Poland’s free elections. In May, Hungary had inched toward lifting the repressive Iron Curtain dividing the free West from the oppressed Communist East by dismantling its 150-mile-long border fence with Austria. “The world is inspired by what is happening here,” Bush, finally animated, had gushed in Warsaw.
And eight months before New Year’s 1990, the Soviet Union had held free national legislative elections, for the first time in seven decades. Even then, few imagined those would be the final elections of what Reagan had called “The Evil Empire,” which would dissolve in December 1991. Had any of these events occurred in isolation, they would have been considered transformative. Together, the cumulative impact was overwhelming.
Bush Inherits Reagan’s Horseshoe
President Bush was finishing a very good rookie year. His call for a “kinder, gentler nation” when accepting the Republican nomination in 1988 kindly, gently, chided his boss. Bush repudiated 1960s permissiveness and 1980s greed. Liberalism was listing but many Americans had soured on Reaganite materialism. In one poll, majorities perceived that yuppies, stockbrokers, and drug users were “losing favor” among their peers. The paradoxical package gaining favor included “parents spending more time with children,” “being concerned about the less fortunate,” “putting one’s career first,” and “having only the best quality things.”
More doer than talker or thinker, Bush would deemphasize rhetoric, ideology, and what he dismissed as “the vision thing.” One Bush aide lamented that “the movie actor’s White House was the one that was hospitable to new ideas. Not the Yalie’s.” Raised for stewardship more than leadership, Bush knew where he stood, not where America was heading—or where he wanted to take it. Ultimately, the great bipartisan success of Reagan and the other Cold War presidents in helping Soviet Communism collapse propelled him.
Bush got results. While tiptoeing around Eastern Europe’s transition, he bravely tackled the Reagan-era Central American impasse. He brokered a bipartisan accord with the Democratic House Speaker Jim Wright to support the Contra insurgents economically, not militarily, while building up to Nicaragua’s elections in April 1990. Again cooperating with Congress, he created the Resolution Trust Corporation to manage the huge costs still menacing the federal budget from the 1980s’ Savings and Loan banking crisis. Ultimately, the Sandinistas would lose the free elections and the bank bailouts would stabilize the economy.
After watching him govern, Americans liked this more moderate Republican. “Ronald Reagan left his horseshoe under George Bush’s pillow,” grumbled David Axelrod, a young Chicago-based Democratic political consultant. The short, sweet, successful Panama invasion reinforced this positive new impression of the once-unpopular president. Bush’s first year approval rating of 76 percent competed with John Kennedy’s 79 percent and Dwight Eisenhower’s 70 percent. “He actually achieved his goals,” said Roger Stone, a Republican political consultant. While others compared Bush to Theodore Roosevelt, speaking softly while carrying a big stick, Stone noted that Bush’s boldness exorcised “the ‘wimp’ word … from the political lexicon forever.”
Still, like recurring pains, three problems would haunt the Bush presidency—and the American people throughout the coming decade. The culture wars that began with the youth rebellion of the 1960s and 1970s, then intensified with Reagan’s counterrevolution of the 1980s, persisted. Cultural controversies clustered sensitive issues together including race, gender identity, sexual practice, individual morality, and collective confidence in the nation’s virtue and future. Similarly, the Reagan-era debate about budget deficits and the welfare state, about tax burdens on the middle class and moral responsibilities to the poor, continued to irritate raw national economic and political nerves. These chronic domestic problems competed with the world’s chaos for presidential attention. Ending the Cold War did not eliminate regional conflicts. Some hostile forces once checked by the Soviets now menaced America directly.
A Polluted Public Square?
More cold warrior than culture warrior, George H. W. Bush feared that fights over art, education, sexuality, and ideology would ruin his “kinder, gentler” stewardship. Nevertheless, the tensions persisted. That New Year’s Eve 1990, as a twenty-foot lighted Lone Star rose up at midnight alongside Houston’s thirty-seven-story Texas Commerce Tower, the soundtrack kids listened to often distressed their parents.
Rap’s rise intensified this age-old problem. Perhaps America’s most demonized song that New Year’s was the lurid, sexually domineering, misogynist “Me So Horny,” from 2 Live Crew’s album As Nasty As They Wanna Be. This monster crossover hit from the rap charts, peaking at 26 on the more staid Billboard 100, was sold on a record album whose cover warned about the explicit lyrics. Still, legislators in at least sixteen states demanded more specific warnings. Prosecutors in Florida and Alabama were preparing cases against record store owners who sold the album to minors. The album’s parent-friendly version, As Clean As They Wanna Be, sold only 200,000 copies in the same four months the dirty version sold 1.3 million.
Anti-porn activists and Christian evangelists increasingly viewed America as Vulgaria, a land with no limits where nothing was sacred. New York senator Daniel Patrick Moynihan would soon lament that Americans were “defining deviancy down.” Fears of a Naked Public Square—stripped of religious values to separate church from state—now paled beside fears of a Polluted Public Square—sullied by X-rated lyrics, images, and language. Concerned parents also denounced all-white nihilistic, exhibitionist Heavy Metal acts, especially Guns N’ Roses, Metallica, and Ozzy Osbourne.
Free speech absolutists and entertainers counterattacked. “The true winner is mediocrity—saccharin, overproduced garbage like New Kids on the Block, so middle-class, suburban and clean it makes your teeth hurt,” Steve Marmel, a comedy writer, warned in USA Today. “The losers are diversity and the minority viewpoint, the very things the First Amendment was designed to protect.” Less elegantly, when Florida’s governor Bob Martinez encouraged 2 Live Crew’s prosecution, the band’s July 1990 album, Banned in the U.S.A., would feature a song with a repeated refrain, “Fuck Martinez.”
In March 1990, most record companies agreed to place stickers on “potentially offensive” albums saying “Explicit Lyrics—Parental Advisory.” In The New York Times, a Penthouse editor exposed a more “pernicious” artistic danger circulating unstickered, Richard Wagner’s The Ring of the Nibelung. Peter Bloch sarcastically warned of the libretto’s inclusion of incest, suicide, and other Wagnerian sacrileges.
In Cincinnati, conservatives were advising the Contemporary Arts Center not to run an exhibit displaying Robert Mapplethorpe’s photographs of men inserting a finger, a hand, and a whip into other men’s private parts. That spring, prosecutors would indict the Center and its director Dennis Barrie. In October, a mostly working-class jury would return a “not guilty” verdict, anticipating 2 Live Crew’s Florida acquittals. The ambiguity around definitions of obscenity and the clarity of the First Amendment protection swayed most judges and juries. Still, the museum’s legal bills hit $300,000. American sensibilities had changed dramatically, the columnist E. J. Dionne noted. In 1955, a Gallup poll found that 55 percent of men and 73 percent of women disapproved of “women wearing Bermuda shorts on the street.” Thirty-five years later, “Gallup wouldn’t even think of asking that question … most Americans, reluctantly and uneasily, are prepared to let adults be as nasty as they want to be.”
Republicans happily exploited what Walter Isaacson of Time called this new “age of escapist politics.” In 1990, the Supreme Court would invalidate a federal law prohibiting flag burning. Senate Republican leader Bob Dole warned that if any legislator opposed a constitutional amendment to ban flag desecrations, that vote would “make a good 30-second spot” election time. “Values are always important,” Democratic representative Dick Durbin of Illinois admitted, remembering Michael Dukakis’s 1988 presidential campaign blunders. For Democrats to win post-Reagan, they would have to match the Republicans in the values combat zone.
The Two George Bushes
Bush disliked playing these cultural cards and lacked Ronald Reagan’s certainty in managing the nation’s economy. Reagan perched his vision of the three Ps, prosperity, patriotism, and peace, on a three-legged stool of cutting taxes, fighting Communism, and boosting defense spending. Bush and the Republicans would stumble with no Cold War to fight and no Republican majority in Congress to fight Democratic demands for tax hikes.
By October 1990, with the economy slowing down and the budget deficit building up, Bush waffled. He felt hog-tied by his “Read my lips: no new taxes” campaign pledge but pressured by government shutdown threats—especially after a three-day taste of it over Columbus Day weekend. During three days of negotiations with the Democratic-dominated Congress, he reversed himself four times. Bush wanted to preserve capital gains tax cuts, which favored America’s wealthiest investors. He feared massive automatic cuts imposed by the Gramm-Rudman-Hollings Act. He began negotiating about an income tax surtax, what Reagan had misleadingly, cravenly, labeled a “revenue enhancement.”
Reporters interrupted Bush on a Florida jog, and he answered crassly, “Read my hips,” while patting his backside. On November 5, when he signed the Omnibus Budget Reconciliation Act of 1990, he signed his presidency’s death warrant. One Democratic pollster, Harrison Hickman, rejoiced, that in one day, Bush exposed his “two Achilles’ heels—‘rich’ and ‘wimp.’”
Ultimately, this politically self-destructive act was one of great statesmanship. Bush’s five-year $482 billion deficit reduction package started taming America’s budget deficit, triggering what would be called the Clinton boom. The journalist Jonathan Rauch estimates that Bush’s cuts were 27 percent larger than Bill Clinton’s. “By breaking his promise,” Rauch notes, “Bush put out Reagan’s fiscal house fire, and he enabled Clinton and the strong economy to rebuild the house. Bush thus made two presidents’ reputations and unmade his own.”
Bush also led effectively when Iraqi dictator Saddam Hussein unleashed his army on neighboring Kuwait in August 1990. Saddam’s soldiers easily overran the small Arab monarchy, seizing the oil fields, looting the stores, sending truckloads of shiny Mercedes-Benzes and glittering jewels to Baghdad. Margaret Thatcher, Great Britain’s Reaganite prime minister, challenged her American colleague: “Remember, George, this is no time to go wobbly.”
He didn’t. Thanks to sixty-two phone calls to government leaders in the first thirty days after Iraq’s aggression, Bush forged an impressive coalition of twenty-eight countries, including Saudi Arabia and Egypt. He prevailed upon Israel to stay out of the coalition—and not retaliate when Saddam bombed the Jewish State with Scud missiles.
Bush’s actions that fall would make him look like a foreign policy president uninterested in leading domestically. Time would select “the two George Bushes” as 1990’s “Men of the Year,” a first, for having influenced the world for better and worse. Bush’s aristocratic, statesmanlike “I’ll prevail, or I’ll be impeached” foreign policy spine of steel turned rubbery when domestic issues arose. He too candidly admitted at a press conference: “I enjoy trying to put the coalition together and keep it together.… I can’t say I just rejoice every time I go up and talk to [House Ways and Means chairman Dan] Rostenkowski.… about what he’s going to do on taxes.”
Happy but Pessimistic
Throughout the Christmas season, the focus on Iraq boosted the president’s spirits—and poll ratings. Ninety percent of Americans polled felt good about their lives. This “Don’t worry, be happy” mind-set was remarkable considering the despair that greeted Reagan and Bush when they were inaugurated in 1981. Yet this new Nineties’ optimism was brittle. Half of those surveyed feared the nation was on the wrong track. Americans worried about Japan’s predominance, the limits of the Reagan boom, crime, the $3 trillion national debt, homelessness, drugs, pollution, and moral drift. “We’re not number one anymore,” said Barbara Greer, forty-seven, a California real estate agent. “We’re so caught up in our yuppie existence, we’re letting it slip through our fingers.”
The pessimism became acute when pollsters asked what people expected college students to face. “There won’t be jobs for them,” many feared. This anxiety connected to what two out of three respondents blamed as “a major cause” of America’s drug problem, “the breakdown of the family.” Here, as the 1990s began, Americans were still struggling with the sexual revolution that launched in the 1960s and went mainstream in the 1970s.
On New Year’s Day 1990, George H. W. Bush and the Republicans felt too confident for such worries. They were the party of peace and prosperity, taking credit for delivering 18.7 million new jobs and generating $30 trillion worth of wealth. That day, when an African American became mayor of America’s largest city, New York, people celebrated nationwide. An estimated 7,370 blacks were now officeholders. David Dinkins celebrated his personal achievement as reflecting his people’s progress. He remembered “when there were no African-Americans in government.”
By contrast, eighteen days later, Mayor Marion Barry’s cocaine bust in Washington, D.C., exposed the flip side of the African American experience. “King Nightowl” was making the majority-black city a laughingstock even before his arrest. “Know what the mayor’s answer is to a paralyzing snowstorm?” one comic joked: “Quick, gimme a straw!”
Foreshadowing Bill Clinton’s resurrection, Barry expressed outrage that his outrageous behavior outraged Americans. His mayoral mission was too important to indulge moral or legal trivialities. His defiant chutzpah exploited American moral confusion regarding drug use and adultery as lapses not sins. Many constituents remained loyal as he enrolled in the fashionable Hanley-Hazelden Center in West Palm Beach, Florida. “We love our mayor because he is a human being first before he is the mayor,” Gayle M. Petersen wrote to The Washington Post.
Playing the race card as O. J. Simpson would do after him, Barry denounced “The Plan,” the white establishment’s supposed plot to subdue him. Even as two-thirds of African Americans finally prospered economically, millions remained marginalized, ghettoized, consigned to America’s underclass, and open to Barry’s rhetoric. A court eventually convicted him on one of fourteen counts, fining him $5,000 and sentencing him to six months in jail. Still, Barry’s demagoguery further infected America’s great, ongoing racial wounds.
Meanwhile, that January 1, 1990, Americans’ pop culture remained more saccharine than sadistic. The top song was Chicago’s “Look Away,” asking an ex-wife who found a new love not to look “if you see me walking by. And the tears are in my eyes.” Bette Midler’s “Wind Beneath My Wings” asked mawkishly yet compellingly, “Did you ever know that you’re my hero?” Robert Fulghum’s collection of inspirational essays, All I Really Need to Know I Learned in Kindergarten, was starting its sixty-second week as a New York Times best seller. On the fiction list, Tom Clancy’s escapist adventure, Clear and Present Danger, introduced a new post–Cold War set of non-Russian villains, Colombian drug lords.
That television season, ABC’s crude, white, working-class feminist comedienne Roseanne Barr competed for primacy against NBC’s elegant, African American, upper-middle-class fatherly comedian Bill Cosby—both reflecting modern America’s mushrooming mishmash of identities. Along with CBS and ABC, NBC now faced an alphabet soup of cable competitors that emerged in the 1980s, including A&E, CNN, ESPN, HBO, LIFE, MTV, NICK, TBS, TMC, TNT, and USA. For Christmas, home theater systems proved popular, despite costing $3,000 and more. Camcorders also flooded the market, inspiring another top ten TV hit, America’s Funniest Home Videos.
On the Cusp of a Transformational Decade
There was much to celebrate and much to fear on that 1990 New Year’s Day. Many of the men and women who would define the 1990s were living their lives, with most seeming quite ordinary.
Rodney King was serving a two-year sentence for stealing $200 from a Los Angeles convenience store. Clarence Thomas was chairing the Equal Employment Opportunity Commission, awaiting what would be an easy Senate confirmation to the U.S. Court of Appeals for the District of Columbia Circuit. Anita Hill was a freshly tenured professor at the University of Oklahoma law school. Congressman Newt Gingrich was House Minority Whip, dreaming of a Republican takeover of the Congress—run by the Democrats since Dwight Eisenhower’s presidency. Rudy Giuliani remained dumbfounded that the mild-mannered David Dinkins beat him in November, in New York City’s closest mayoral race in more than eighty years.
In Seattle, Kurt Cobain, a fragile twenty-two-year-old guitarist, was back from an emotionally draining six-week European tour peddling Nirvana’s first album, Bleach. Howard Schultz, the one-time employee who had purchased Starbucks in 1987, was roasting over 2 million pounds of coffee annually for forty-six stores but had not yet turned a profit. Jeff Bezos, a young computer science major, felt restive working for Bankers Trust Company in New York. He wanted to work with computers more directly and creatively.
Larry Page was a sixteen-year-old high school student in Michigan. His future Google partner, Sergey Brin, five months younger, lived in Maryland. In 1979, Brin and his scientist parents had emigrated from Russia, seeking the freedom that Soviet Communism denied Jews. In the summer of 1990, Brin’s father would take Sergey and other students to the Soviet Union on a two-week study tour. Remembering his fear of authority growing up, Sergey would tell his father during the trip: “Thank you for taking us all out of Russia.”
O. J. Simpson, the retired NFL running back, was a celebrity pitchman. Despite his Nice Guy persona, O. J. had been arrested a year earlier on New Year’s Day, for beating his wife, Nicole Brown Simpson—one of eight times the police were called to the home. Timothy McVeigh was an infantry soldier stationed at Fort Riley, Kansas. Matthew Shepard was a thirteen-year-old junior high school student in Casper, Wyoming. J. K. Rowling, a former English major, was languishing as a secretary, working at Amnesty International, among other offices. A few months later, stuck on the Manchester train line waiting to get into London, she would start imagining a wizard named Harry Potter.
Monica Lewinsky was a student at Beverly Hills High School, living in the soon-to-be-iconic 90210 zip code. Her headmaster would later remember her as a “nice kid and pretty normal young lady.”
Al Gore, the forty-one-year-old Democratic senator from Tennessee, was completing the worst year of his life. His six-year-old son had been hit by a car after the Baltimore Orioles’ Opening Day game on April 2, 1989, catapulted thirty feet in the air, and landed badly injured but alive. When accepting the Democratic presidential nomination in 2000, Gore would recall that awful moment to humanize his image while emphasizing the seriousness of his mission.
By contrast, George W. Bush, the president’s forty-three-year-old eldest son, was finishing a very, very good year. Sober since 1986, in April 1989 Bush finalized a deal that would make him wealthy, satisfied, and famous on his own. His original 1.8 percent ownership stake in the Texas Rangers would grow, eventually yielding nearly $15 million in profits when Thomas O. Hicks bought the team in 1998. Wielding power privately in his father’s White House, the younger Bush was considering going public in Texas—and would run successfully for governor in 1994.
Over at Hilton Head Island, South Carolina, the governor of Arkansas, his First Lady, his only child, and hundreds of peers were enjoying yet another Baby Boomer bonding bash grandiosely called Renaissance Weekend. Bill and Hillary Clinton had attended since 1984. Bill Clinton would recall that during these three-day talkfests, “We revealed things about ourselves and learned things about other people that would never have come out under normal circumstances.” Many of the resulting friendships would be the basis of FOB, the Friends of Bill network that would campaign tirelessly in 1992, then populate his eventual administration.
Clinton was already planning a presidential run. He had planned a campaign the last election cycle, only to cancel abruptly, fearing “bimbo eruptions.” He would explain when he announced in October 1991 that he refused “to be part of a generation that celebrates the death of Communism abroad with the loss of the American Dream at home.… Our streets are meaner, our families are broken, our health care is the costliest in the world and we get less for it.” A decade into the Reagan revolution, he saw “No vision, no action. Just neglect, selfishness, and division.”
And in Saudi Arabia, a thirty-two-year-old rising jihadist, Osama bin Laden, heir to a construction fortune, was enjoying a hero’s welcome for his supposed valor in defeating the Soviets in Afghanistan.
Liquid Modernity: “All That Is Solid Melts into Air, All That Is Holy Is Profaned”
In 1990, 248.7 million Americans were poised to help make the United States undergo its greatest population growth spurt ever over the next ten years, with a 13.2 percent increase to 281.4 million. Globally, Hong Kong was still British. Macau was still Portuguese. Yugoslavia, Czechoslovakia, and the Soviet Union were still united; Germany was still divided, East and West. The “European Union” sounded like a powerful labor organization, not the twelve-country political and economic alliance that would form in 1993, and grow to twenty-eight members today. Nelson Mandela was imprisoned. The World Health Organization still listed “homosexuality” as a disease. The most famous Governor Clinton in American history was DeWitt of New York not Bill of Arkansas. And the most famous Hillary in the world was Sir Edmund, who climbed Mount Everest in 1953.
At the start of that transformational, final decade of the twentieth century, Amazon was only a river and a rain forest, Google was only a misspelling of googol, a very big number with lots of zeroes, and “pay, pal” was something you said to someone who owed you money. The World Wide Web sounded like something that might ensnare Spider-Man. People knew about “hi-fi,” hi-fidelity stereos, not Wi-Fi.
A PowerBook sounded like something from the comics. When we thought of a cell, we thought biology not telephones—which were never smart—while a PDA was an embarrassing public display of affection not a personal digital assistant. Cookies were something you gobbled but wouldn’t disable. Most mail came from the post office not that newfangled Internet.
“Props” were theatrical devices not compliments from “Friends,” who were real, not Joey, Chandler, Monica, Phoebe, and Rachel. People knew about sign language—not “Seinlanguage”—“not that there’s anything wrong with it.” Rent was something you paid not something you saw, while “The View” was something you saw not something you watched. A PlayStation was an area for games in a kindergarten. “Law and Order” was a political slogan not a blockbuster TV franchise. What was mocked as a “girl movie” had not yet been upgraded, slightly, depending on who was saying it, to a “chick flick.” When you said “edgy female duo,” you thought “Laverne and Shirley” not “Thelma and Louise.” TMI sounded like a multinational corporation not a plea for discretion after hearing too much information.
“Don’t ask, don’t tell,” sounded like a philanderer’s recipe for marital harmony not a military policy. A “Drudge Report” sounded boring not scandalous. “Newt” mostly conjured up thoughts of salamanders, not a Republican, while “Tiger” and “Woods” conjured up fears of dangerous animals in the jungle not on the golf course. There had never been a woman secretary of state, a woman national security adviser, a woman attorney general, or a woman senator from New York, and no First Lady had ever used the White House as a launching pad to electoral office. The Dow Jones stock average was at 2753.20, about to quadruple in a decade to over 11,000. America was at peace. And the world seemed to be changing for the better.
What the French expatriate Ted Morgan celebrated as America’s “opportunity quotient” and “anxiety quotient” were both spiking. Out of the ensuing collision of impulses, loyalties, commitments, values, and visions, Americans wove together the latest chapter in their extraordinary nation’s collective saga. The winners of the Cold War, that prolonged, often excruciating, rarely bloody, power struggle with the Soviet Union, would become the liberators of Kuwait, the pioneers of the Internet Age, the lost souls of the gay Nineties, and, on that awful September day, the victims of Osama bin Laden’s Jihadist delusions.
One of the anomalies of life in 1990s America would be the accelerating pace of change in one of the world’s most stable societies. In The Communist Manifesto, Karl Marx and Friedrich Engels recognized the destabilizing impact of capitalism’s perennial reinvention and renewal. By “constantly revolutionizing” the economy, the “whole relations of society” change, too, creating “everlasting uncertainty and agitation.” As a result, Marx and Engels argued, tradition, in all its forms, is perpetually threatened with being “swept away.” Anticipating modern America in mid-nineteenth-century Europe, they wrote: “All that is solid melts into air, all that is holy is profaned.” Technology and consumerism, both in a state of perpetual update (or obsolescence) and each enveloping in its own way, further accelerated the changes and the disorientation.
Working off that insight, the sociologist Zygmunt Bauman has identified liquidity as modernity’s defining characteristic. Flexibility, fluidity, immediacy, impulse, individuality, and consumerism all trump solidity, tradition, patience, responsibility, and communalism. Clintonites themselves would realize the value of the liquid-solid, sacred-profane framing, to explain Bill Clinton, his presidency, and the 1990s. While drafting the 1996 State of the Union address, Clinton’s chief speechwriter Michael Waldman, expanding on Clinton’s celebration of this “age of possibility” and “of great challenge and change,” would describe the 1990s as “An era in which things certain seem to melt into air.” In those days when Google was not yet a verb, the speech went through ten drafts before an intern discovered the source. This diligent researcher saved America’s Democratic president from quoting The Communist Manifesto in his State of the Union.
The “End of History” Euphoria—And Distraction
Back in 1980, few Americans would have predicted how many would begin the 1990s happy or confident, let alone victorious. Despite George H. W. Bush’s caution, Houstonians and their fellow Americans were proud of their bloodless Cold War victory. In the great twentieth-century lifestyle war, the Texas-tough Marlboro Man and his domestic-but-chic Martha Stewart wife had defeated Uncle Ivan and his dour Communist spouse. Pumped-up pundits misread the political scientist Francis Fukuyama’s 1989 essay, “The End of History?,” as declaring that ending the Cold War eliminated all of America’s troubles. Fukuyama emphasized the “universalization of Western liberal democracy as the final form of human government.” But, presciently, he saw the “broad unhappiness with the impersonality and spiritual vacuity of liberal consumerist societies” and feared ongoing “terrorism and wars of national liberation” rooted in “ethnic and nationalist violence.”
Those concerns, however, were eclipsed by the post–Cold War euphoria, the Texas-size surge in confidence because “we won.” There were no victory parades; it was not that kind of war. There was great faith in the “peace dividend” paying off existentially and not just financially at budget time. The Nineties would be fun, with Web surfing, rollerblading, Beanie Babies, Pogs, and the Super Nintendo Entertainment System, among the decade’s glorious distractions. Yet on September 12, 2001, with New York and Washington still smoldering, millions of Americans would look back on the gay Nineties and wonder, “What did we do with all that opportunity, why did we fritter away our chances to be great?”
Copyright © 2015 by Gil Troy