This Is the Dawning of the Age of Self-Reliance
The volatile political and economic conditions of the 1970s profoundly affected the way ordinary Americans thought of themselves and their country. A great many felt betrayed and forgotten. The country they were raised to believe in as the land of opportunity seemed to have dissolved into the land of dead ends. Worse, there was no familiar remedy. The government, exposed by Watergate and other scandals as corrupt, seemed to pursue policies more out of political expediency than out of concern for the average man or woman. As business exerted new influence, and as American cities catered more to investors than their own residents, many Americans with no prior activist experience instinctively turned to community organizing and to consumer and public interest advocacy—if only to try to regain control over their lives. On the national level, it was a lost opportunity for the two major parties. As Michael Kazin noted, the political allegiance of the “upset and forgotten” workingman “seemed up for grabs” in the 1970s. A candidate or party sounding like the movie Nashville’s Hal Phillip Walker or like Ralph Nader might have been able to form a New Majority that would have made Richard Nixon envious. But maybe it was not possible. More important, maybe it did not matter so much. What mattered was that Americans now felt that they were on their own, pushed out of their midcentury comfort and confidence to a new, unsparing frontier. Looking across the plains of American experience from the vantage point of their own front porches, they could see a variety of social, political, cultural, and environmental threats encroaching from different directions. They reacted passionately—sometimes out of outrage, or fear, or despair—and channeled that passion into the only avenue for redress that seemed promising: grassroots organizing.
Much of what one expert called the “government versus the people” culture of the 1970s was cultivated by Richard Nixon’s political operation. Certainly, Nixon’s handling of the war, of protesters, and of political opponents betrayed the public trust. After 1968, most Americans wanted the Vietnam War just to end, and Nixon won the presidency in part by pledging “peace with honor.” Instead, in his first year in office, he willfully deceived the public by creating the appearance of extricating the United States from Vietnam—via a policy of “Vietnamizing” (or de-Americanizing) the ground war and bringing GIs home—while secretly escalating the air war, expanding it beyond Vietnam into Cambodia. Not until the spring of 1970 did it emerge that American B-52s had dropped more than 108,000 tons of bombs on Cambodia over fourteen months, a campaign coordinated in the White House and with the knowledge of only a handful of National Security Council staff members, a few military commanders, and the pilots themselves. And then, of course, Nixon seemed to subvert his own carefully crafted de-escalation narrative by sending ground forces from South Vietnam into Cambodia. The nation erupted in fury, and not only on college campuses.
Just as important as Nixon’s secrecy and deception—at least in terms of the public’s trust in government—was the president’s treatment of his critics. In the same way that candidate Nixon had made political hay out of disparaging protesters of all types during the 1968 campaign, President Nixon wasted no time attacking anyone who dared to challenge his war policies, either by ridiculing them publicly or by spying on them illegally. As early as May 1969, when The New York Times reported that American planes were bombing targets in Cambodia, the White House used both the FBI and its own private surveillance team—the first “plumbers” hired to plug leaks—to place illegal wiretaps on the telephones of National Security Council staffers and several journalists. Five months later, following the October 15, 1969, Moratorium protest, when millions of Americans in at least two hundred cities across the country skipped work and school to participate in a wide variety of demonstrations against the war, Nixon famously dismissed them as a “vocal minority” whom he would ignore. Calling instead for “the great silent majority” of Americans to support his plans for winning the peace, the president questioned the patriotism of the Moratorium participants. “North Vietnam cannot defeat or humiliate the United States,” he said. “Only Americans can do that.”1 And in the wake of the Cambodian invasion, when a student strike spread to hundreds of colleges and universities across the country, Nixon again denounced the protesters, saying that even if he ended the war, these campus “bums” would find another issue to protest violently. Even after National Guard troops opened fire on unarmed Kent State University protesters, killing four and wounding nine, the White House blamed the protesters, sternly warning that “this should remind us all once again that when dissent turns to violence, it invites tragedy.”2
In 1971, Nixon’s assaults on the antiwar movement escalated. After a cascade of rallies and peaceful demonstrations brought hundreds of thousands of protesters to Washington in late April, the White House mobilized a joint military and D.C. police response to organizers’ threat to block streets and bridges all over Washington and prevent government employees from getting to work.
As the sun came up on May 3, 1971, military jeeps and other transports swarmed through Washington, tear gas greeted protesters and residents alike, and six Chinook helicopters landed on the Washington Monument grounds, dispatching 198 soldiers in what looked like a massive search and destroy mission. By 8:00 a.m., D.C. police had arrested two thousand protesters, and by noon they had rounded up seventy-five hundred. Police detained thousands in a fortified football field near RFK Stadium, where they “languished without benefit of arraignment.” Such tactics may have been illegal, but they helped keep Washington from being shut down.3 For mainstream America, however, the seemingly constant conflict grew tiring. The news never seemed good. The war dragged on, and so did the division in the country.
Six weeks later, the leaking of the Pentagon Papers sparked another high-profile suppression of dissent. Officially known as the “History of U.S. Decision Making Process on Vietnam Policy,” the forty-seven-volume “Top Secret” report had been commissioned by Defense Secretary Robert McNamara in 1967 and completed in January 1969, as the Johnson administration left office. One of the report’s authors was Daniel Ellsberg, a former Marine and Pentagon staffer under McNamara and an ex-student of Henry Kissinger’s. After the Cambodia invasion, he turned unequivocally against the war and made multiple copies of the report in hopes that he could convince certain congressional officials to hold hearings. When Ellsberg found little interest in Congress, he turned to the reporter Neil Sheehan and The New York Times, which began publishing revelatory excerpts on June 13. Over the next couple of weeks, the public watched a back-and-forth battle between the press and a Nixon administration that seemed determined to hide something. The White House turned to the courts to try to stop the Times (and later other newspapers) from publishing the Pentagon Papers, but the Supreme Court, on June 30, ruled six to three against the president.
Although the Pentagon Papers did not cover a single day’s history of Nixon’s handling of the war, the administration’s attempts to keep them secret only made many Americans think the president was continuing previous administrations’ deception. In the wake of the spring protests and Pentagon Papers revelations, one June public opinion poll showed that 61 percent of Americans now regarded the war as a mistake, and in July, 65 percent wanted the administration to withdraw “even if the government of South Vietnam collapsed.”4 So much for a silent majority of supporters.
By the time Americans learned that Nixon had directed that “plumbers” be sent into Daniel Ellsberg’s psychiatrist’s office—to find any dirt they could on Ellsberg—the president had bigger problems.5 The plumbers’ arrest during the June 17, 1972, break-in at the Democratic National Headquarters in the Watergate Hotel and Office Complex in Washington began a process that ultimately ended Nixon’s presidency. Not only did the Senate Watergate hearings (and the investigative journalism of The Washington Post’s Bob Woodward and Carl Bernstein) reveal the Committee to Re-Elect the President’s campaign of dirty tricks aimed at political opponents, as well as the administration’s illegal wiretapping of journalists and NSC staffers, but they also made the president of the United States look, once again, as though he had something serious to hide. At the end of October 1973, following the “Saturday Night Massacre,” when Nixon’s order to fire Watergate special prosecutor Archibald Cox led the attorney general and his deputy to resign (only to see Cox fired, in any case, by Solicitor General Robert Bork), the president’s approval rating had plummeted to 23 percent. He never recovered. On August 8, 1974, soon after subpoenaed Oval Office tapes revealed Nixon’s participation in the Watergate cover-up—ordering obstruction of the FBI investigation of the burglary and discussing hush money payments to the burglars—he resigned.
In the aftermath of Nixon’s fall from power, it became common to blame Watergate for the collapse of belief in government. Stanley Kutler, the dean of Watergate historians, writes that “Watergate transformed and reshaped American attitudes toward government, and especially the presidency, more than any single event since the Great Depression of the 1930s, when Americans looked to the President as a Moses to lead them out of the economic wilderness.” When President Gerald Ford gave Nixon a full and complete pardon a month after his resignation, it “added a new element of cynicism.”6 Maybe so, but to view Watergate in isolation is to miss a much bigger picture. Watergate, like the proverbial tip of the iceberg, only hinted at the widespread abuse of power that permeated public “service” by the early 1970s.
Nixon’s resignation was followed, in the public’s consciousness, by wave after wave of revelations about the illegal spying on American citizens by the CIA, FBI, NSA, IRS, and local police in cities across the country. To civil rights and antiwar activists, the exposure of domestic spying proved what they had suspected all along. To most Americans, however, the truth was shocking. In the two years after Nixon’s resignation, congressional investigations led by Senator Frank Church (D-ID) and Congressman Otis Pike (D-NY) revealed a long list of stunning CIA and NSA abuses, including opening hundreds of thousands of pieces of domestic mail since 1956, listening in on thousands of overseas phone calls, and acquiring copies of millions of international cables—to and from ordinary Americans, and all without court-authorized warrants.7 The FBI was no better, but since it had previously enjoyed tremendous popularity in Middle America, news of its COINTELPRO (counterintelligence program) operations hit hard. COINTELPRO resonated with the public in large part because the FBI seemed to employ so many of the same tactics used against Ellsberg and the Democratic Party by the Nixon White House, but used them against American citizens in the civil rights, New Left, and antiwar movements. And particularly because the antiwar movement eventually attracted the participation of many Middle Americans, most Americans found COINTELPRO abhorrent. What most infuriated so many citizens was not only that the FBI had abused its public trust, but that six presidents had known about it. Thus, the Church and Pike Committees went beyond making the connections in the public’s mind between Watergate and the FBI’s abuses; Americans now knew that from Franklin Roosevelt to Richard Nixon, the FBI consistently forwarded political intelligence to the White House, and no president had ever asked the Bureau to stop.8
To make matters worse, a series of revelations also demonstrated that the abuse of power extended to police departments across the country. In particular, police “red squads,” special police divisions that targeted political dissidents, had engaged in widespread illegal behavior. At the most extreme, COINTELPRO documents confirmed that the FBI’s campaign against the Black Panther Party included facilitating the Chicago Police Department’s assassination of the twenty-one-year-old party leader Fred Hampton in 1969.9 Other newly exposed police misdeeds shocked Americans because they were evidently so routine. New York City police officers had files on 1.2 million people and 125,000 organizations, including on Mayor John Lindsay; Representatives Charles Rangel, Herman Badillo, and Shirley Chisholm; actor Dustin Hoffman; and women’s rights activists Gloria Steinem and Betty Friedan—hardly threats to American national security; under pressure, the department cut the files back to 240,000 people and 25,000 organizations. A year later, in 1974, as Nixon’s presidency collapsed under the weight of Watergate, the Chicago Police Department admitted that public pressure had led it to destroy files on 105,000 individuals and 1,300 organizations. In 1975, Los Angeles followed suit, planning to destroy more than 2 million files on 55,000 people dating back to the 1920s.10
The government’s betrayal of the American people seemed to know no bounds. Americans saw deceit and illegal behavior at every level of government—from Nixon to the FBI to the local police force—and many suddenly found themselves feeling alienated from mainstream party politics.
* * *
Watergate, COINTELPRO, and the fall of Saigon in 1975 may have left Americans wondering what had happened to their country, but these high-profile news stories did not, on their own, affect most people’s sense of personal security. It took their inability to pay the bills to do that. In the early 1970s, the two conditions—shocking news of abuse of (and disrespect for) authority and economic decline—existed side by side and sometimes intertwined. While intelligence agencies and red squads battled radicals, and an administration unraveled, and the Vietnam War came to an ignoble end, the economy—for a generation, a steady source of confidence—imploded. And a government that seemed no longer trustworthy appeared incapable of curing the nation’s economic woes.
Prevailing Keynesian economic thinking suggested that unemployment and inflation would never rise simultaneously, but in the early 1970s they did, and a new phenomenon—“stagflation”—appeared. American production dropped, foreign competition swelled, companies laid off workers and closed or moved operations, and the prices of goods and services climbed and climbed.
All of this caught President Nixon and the nation off guard. Nixon had entered the White House in 1969 interested far more in foreign policy than in domestic issues or the economy and, like most Americans, he expected the economy to continue to thrive. In a Gallup poll taken shortly after his inauguration, 40 percent of Americans listed the war as the “most important problem facing the country,” while only 9 percent named inflation and the high cost of living.
But the country’s economic picture turned much gloomier in 1970, and critics blamed Nixon. At first, the president attributed the economic deterioration to his predecessors—the excesses of funding the war and Great Society programs—and used the bully pulpit to urge industry to curb prices voluntarily while publicly projecting confidence in the economy. But just as Herbert Hoover found in the early years of the Great Depression, it did not work. Congress, in its own smoke-and-mirrors game, passed the Economic Stabilization Act of 1970, authorizing the president to freeze prices, rents, wages, and salaries—though, as the historian Melvin Small notes, it never expected a Republican president to use such power.11 By the fall of 1971, only 25 percent of Americans answering the same Gallup poll question about the “most important problem facing the country” said the war; now, 45 percent said “economic problems.”
Over time, the Nixon administration adopted a series of price and wage freezes that, despite their initial popularity, ultimately made the president look as though he was careening from one unsuccessful policy to another. It did not help that the terminology was confusing (Freeze II, for example, was also referred to as Phase IV), but mostly, Americans grew frustrated that nothing the president did seemed to stop prices from rising. Time reported that “in the first full week” of a sixty-day freeze on the price of everything except agricultural products (the main inflationary culprit at the time)—and in the same month in which White House counsel John Dean testified about Watergate before the Senate—“each costly ring of the check-out cash register seemed to eat away at public patience with the Administration far more than the revelations of the Watergate scandal.”12
It only got worse as Nixon wrestled with responding to Egypt’s and Syria’s surprise attack on Israel during Yom Kippur. At first, Nixon held back, aware that openly aiding Israel would draw the ire of the Arab states on whom America relied so much for its oil. But when it became clear that without American aid, Israel might suffer complete destruction, the administration authorized a major airlift of war matériel that ultimately helped Israel prevail. As Nixon feared, America’s support came with a heavy price. The Organization of Petroleum Exporting Countries (OPEC), dominated by Arab states, quickly announced an oil embargo against the United States, Japan, the Netherlands, and other Western European nations. The oil embargo immediately prompted panic buying across the country and, perhaps more directly than anything else, confronted anxious Americans with the grim realities of the nation’s economic slide.
Nixon responded to the embargo in ways that may have made sense but that, to the American people, seemed like a Band-Aid approach. He ordered that thermostats be lowered to 68 degrees and speed limits to 55 miles per hour, that air travel be cut 10 percent, and that gas stations close on Sundays. In addition, he urged Americans to limit ornamental lighting for their homes during the upcoming holiday season and to use minimal lighting in commercial businesses. In spite of all of this, the image of huge lines forming at gas stations all over the country—with motorists sometimes waiting hours to buy a half tank of gas—endured as a testament to America’s weakened and vulnerable state. Only the American oil companies weathered the crisis well, with Exxon’s profits increasing 59 percent, Texaco’s 70 percent, and Mobil’s 68 percent. This news turned public suspicion toward the oil companies themselves, leading Senator Henry Jackson to question whether the oil shortage was even real. “The American people want to know whether oil tankers are anchored offshore waiting for a price increase or available storage before they unload. The American people want to know whether major oil companies are sitting on shut-in wells and hoarding production in hidden tanks and at abandoned service stations. The American people want to know why oil companies are making soaring profits.” In the end, OPEC lifted the embargo in April 1974 and the gas lines receded, but prices never recovered.13
Given the prevailing state of economic thinking, no president could have forestalled the slide, but that did not stop the public from expecting a competent response. According to one scholar, Nixon “tried and then discarded all the fashionable economic remedies of his time, unintentionally exposing the poverty of economic theory.” As Herbert Stein, chairman of the Council of Economic Advisers, said years later, “our experience really confirmed how little economists know.” Politically, however, this failure of economic theory mattered to the American people only insofar as they were coming to know their president not only as a crook but as a poor manager of the economy to boot.14
Nixon’s successor, Gerald Ford, arrived in office with no magic wand with which to wave away the nation’s economic woes. By the end of 1974, the unemployment rate had jumped from 5.4 percent in August, when Ford assumed the presidency, to 7 percent in December. Inflation stood at almost 14 percent and the Gross National Product (GNP) had fallen 9 percent. Not since the Great Depression had the economic indicators looked so grave. In January 1975, Ford proposed attacking the recession with a 12 percent tax cut for businesses and individuals on their 1974 income. Such a policy, the president thought, would put money in American pockets, increase the GNP, and put people back to work—though it ran the risk of generating higher inflation, too. The Democrats in Congress, operating with a significant majority thanks to the 1974 midterm elections that followed Nixon’s resignation and pardon, countered by advocating New Deal–like government spending as a way to lift the nation out of the recession. But Ford balked, and over his two-and-a-half-year presidency, he vetoed sixty-six bills. In fact, in his 1976 State of the Union address, the president blamed government spending for bringing on the recession in the first place, though mostly he made it clear how much he disliked federal social programs.15
Even though unemployment and inflation dropped for a time in 1976, the term “gridlock” dominated public discourse; for much of the public, the veto battles between Congress and the president only further confirmed that they could no longer count on government. Already, citizens were organizing on a number of fronts—some more prominent than others—and battles raged between labor and management, consumers and big business, and neighborhood groups and city hall. In the immediate post-Sixties period, when activism supposedly took a holiday, a new era of economic and civic engagement was born.
* * *
If most activists of the 1970s and 1980s learned strategies and tactics from the civil rights movement, those in the emerging “citizen movement” also took inspiration from Ralph Nader, a kind of nerdy superhero in a rumpled suit. Although Nader became a much more controversial leader—arguably a presidential election spoiler—in later years, he first rose to prominence as a consumer advocate in 1965, when he published the muckraking book Unsafe at Any Speed: The Designed-in Dangers of the American Automobile. Indicting the entire auto industry for institutional malfeasance in manufacturing automobiles that endangered the public, the book, and public reaction to it, led Congress to pass both the National Traffic and Motor Vehicle Safety Act and the National Highway Safety Act. In the meantime, General Motors helped elevate Nader’s status to folk hero proportions when the company hired private investigators to dig into his personal life—one GM handler was caught on tape instructing a detective to “get something on him … shut him up” and find out “who he is laying.” Nader sued the corporate giant for invasion of privacy and settled out of court for $425,000 and a public apology—in embarrassing testimony before a Senate subcommittee—from GM president James Roche.16
Nader soon expanded his consumer advocacy by attracting hundreds of college interns each summer—dubbed “Nader’s Raiders”—to investigate and uncover corporate irresponsibility, often prompting continued citizen organizing at the local level. Through the newly founded Center for the Study of Responsive Law (roughly modeled on the NAACP Legal Defense Fund), the Raiders then pressured government authorities to pass appropriate legislation. When the GM suit was settled, Nader used the proceeds to start a law firm, the Public Interest Research Group, expanding his mission beyond consumer advocacy to investigating anything that might damage the public interest. That meant that even the government that he expected to respond to his corporate exposés could also be the subject of investigations. Moreover, Nader effectively spread the Raiders concept nationwide when he moved to set up student PIRG chapters on college campuses all over the country to target local business and government.17
Perhaps because critics have usually directed their ire at Nader himself, the public interest advocate’s image has always been—in spite of the publicity surrounding the Raiders—that of a lone crusading maverick. But frequently, Nader and his colleagues would uncover a problem and, to their delight, see local organizations rise up spontaneously to tackle the issue. Following the passage of the Occupational Safety and Health Act, a health condition called “brown lung” (byssinosis)—caused by breathing cotton dust—became an issue thanks in part to Nader. Although the industry had been aware of brown lung since the 1940s, companies like J. P. Stevens had never warned their workers of the potential harm. When a doctor working for the North Carolina Board of Health tried to study the illness in 1966, owners did not let him near their mills and later used their political influence to get him fired. In the 1970s, more than a hundred thousand current and retired textile workers suffered from brown lung, but as one woman said, the first time she learned about it was when she heard Ralph Nader describe it on television. Soon locals formed the Carolina Brown Lung Association to provide health screenings for workers and retirees while also lobbying the state legislature for workplace legislation that would reduce dust levels.18
Grassroots consumer organizations—often inspired by Nader’s Raiders, but operating independently—likewise agitated at the state and local levels nationwide. The groups turned their attention not to how Americans worked but to how they shopped. “There is a quiet revolt in this country,” Utah senator Frank Moss noted at the time. “It is a revolt of people who are not violent, but who are angry—at incredibly shrinking packages and expanding prices and preposterous advertising. We in Congress, and they in the administration, ignore this revolt at our peril.” When President Nixon instituted his Phase III price controls and meat prices soared, consumers organized a nationwide boycott. Groups such as Operation Pocketbook, Fight Inflation Together, and Housewives Expect Lower Prices (HELP) launched their own price controls through a one-week nationwide meat boycott in which 25 percent of American consumers participated; meat sales dropped by as much as 80 percent.19 The boycott, wrote a New York Times reporter, “is made up mainly of groups of tenants in apartment buildings, neighbors who shop at the same markets in small towns, block associations, and—perhaps most typical—groups of women who meet every morning over coffee.” Conveying the grassroots nature of the boycott, one historian notes that “more than 500,000 consumers from New York, New Jersey, Connecticut, and Pennsylvania sent cash register tapes to President Nixon.” Nixon responded with price caps, but more important, such grassroots power led state governments to establish offices of consumer protection and public advocacy agencies to handle consumer complaints. This was a hint of things to come.20
In 1974, U.S. News & World Report named Ralph Nader the fourth most influential person in the country. And the vast proliferation of public interest groups his work spawned guaranteed what one expert called “an expansion in the scope of political conflict.” Nader called for an “initiatory democracy,” in which the people could “initiate actions to make sure public officials were acting responsibly,” he said. “I’m talking about rights plus remedies plus legal responsibilities so it can be a citizen versus the ICC or the FDA.” In his view, “a civil servant should be forced to make the law work, and if he won’t do it he should be censured or expelled from the Government.” To Americans living in economically desperate communities, struggling to survive, it sounded like a winning formula.21
In fact, Nader and those he influenced were just one component of a complex nationwide phenomenon. In the 1970s, by one expert’s estimate, some 20 million Americans got involved in some form of “citizen advocacy”—whether in unions, consumer activism, or community or neighborhood organizing. But this was no centralized movement. Advocacy groups sprang up in response to local conditions and evolved in response to both national trends and hometown idiosyncrasies. This sometimes made the activism difficult for pundits to see, appearing, as it did to the national media, as a “crazy-quilt array of protests, without apparent themes in common.”22
With elected officials seemingly overmatched by persistent economic decline, Americans began to organize, particularly at the local level. Such organizing would have been impossible, however, if not for the earlier example of the civil rights, peace, and women’s movements, as well as earlier experiments in community organizing. Harry Boyte, an activist and one of the most insightful writers on the citizen movement—what he called the “backyard revolution”23—argues that the citizen revolt originated with “an old American practice of cooperative group action by ordinary citizens motivated both by civic idealism and by specific grievances.” Like Nader, each group appealed to “some implicit popular conviction that there is a broad public good—a long term interest of the society as a whole which the contemporary generation is charged with guarding and preserving.”24
American cities, in particular, seemed in need of such attention in the mid-1970s. Urban America had fallen into such disrepair that Travis Bickle’s characterization of New York as “an open sewer” in Taxi Driver rang true to audiences everywhere. Like Nashville (1975) and Network (1976), two other films offering astute commentaries on the national temper, director Martin Scorsese’s Taxi Driver captures the desperate political mood afflicting America at the time.25 For Americans living outside the nation’s cities, Scorsese’s depiction of a deranged vigilante may have confirmed a growing sentiment that fixing America’s cities was hopeless in an era of government abandonment and social dislocation. By the mid-1970s, with American cities starved of revenue thanks to residents moving to the suburbs, industry following them out and beyond—to havens of cheap labor and lower taxes—and the dramatically increased cost of services, city dwellers could not count on government to help them unless, as Ralph Nader suggested, they took matters into their own hands. Whereas cities had once been sites of wonder and centers of commerce, one commentator noted, “financiers and federal officials” now saw cities “as mismanaged, overly generous, unwieldy bureaucratic messes which needed to be taught the lessons of fiscal responsibility.” Banks thus began withholding funding until cities got their finances in order. City administrators were left alone to come up with solutions. Indirectly, poor and working-class residents were left to fend for themselves, too.26
By and large, city administrations initiated an array of measures designed to lure business and investment back, and in so doing telegraphed their willingness to cut back on services. During the recession, cities offered businesses massive tax breaks even though doing so cut into the revenue that funded schools, police, fire, transportation, and sanitation. In the 1970s, urban “revitalization” focused on corporate development, tourism, and the drawing of entertainment attractions to newly built civic centers and arenas. The gentrification process, in which improvements drove up the prices of rental property—and drove out the poor and minorities (who were replaced by middle- and upper-class home buyers)—was aided significantly by a process called “redlining.” Banks “redlined” whole neighborhoods in the 1960s and 1970s, freezing loans to working-class neighborhoods, which sent those neighborhoods into decline and thus precipitated the suspension of city services. In 1976, Roger Starr, the New York City housing and development administrator, actually proposed closing schools and police and fire stations in “slum areas” as a way to speed up “population decline.” To do so would not only cut costs but also clear the way for new investment. But as one community organizer later commented, “faced with an eroding tax base, local governments initiated service cutbacks that detrimentally affected disparate elements in the urban core and made them ripe for organizing.” City residents prepared to fight.27
In 1975, as New York City faced bankruptcy and saw its appeals for federal help go unanswered, only a combination of creative borrowing from municipal union pension funds and service cuts finally earned some short-term loans from Washington and kept the city afloat. Although the city averted a possible revolution in its streets, the results of these drastic measures, one observer wrote, “could be seen on every street and in every institution of working-class New York.”28
The wealthy streets, naturally, did not have it as bad as the working-class and impoverished streets. People of means could afford private schools, colleges, and hospitals, and could absorb transit fare hikes (or simply drive), and they were less likely to see a firehouse closed in their neighborhood. The vast majority of New Yorkers, however, could claim no such immunity from the city’s deteriorating physical and psychological state. Many of them promptly banded together, with no help from activists or outsiders, in neighborhood and community groups to launch rearguard campaigns—“holding actions” in one scholar’s words—to get those in power to act on their behalf, regardless of what investors thought.29
Thanks to the flight of industry, population decline fueled by suburbanization, and redlining, huge swaths of New York fell into disrepair. Many landlords, unable to attract enough tenants and facing mounting costs, simply abandoned their buildings. In the mid-1970s, the city estimated the abandonment rate at twenty thousand to sixty thousand housing units a year. City banks all but stopped lending money to landlords and building owners in more and more city neighborhoods, and, thanks to recent deregulation, chose instead to invest money outside the city. Landlords were caught setting fire to their buildings (so they could collect insurance) or hiring kids to do it. Daily perils increased still further as economic austerity measures prevented the city from providing adequate safety and fire protection.30
In a white ethnic community like Greenpoint (in northern Brooklyn), working-class residents formed block associations and otherwise mobilized to protect their neighborhoods. As one scholar studying such communities noted, block associations led “persistent efforts to control the environment” and were a “significant feature of low-income life.” Not only did the Norman Street block association in Greenpoint serve as the locus of neighborhood socializing, but it also ran a summer lunch program for kids, built a children’s park, and pressed city officials to address unsafe, abandoned, and burned-out buildings in the neighborhood.31
Two interrelated problems, both stemming from the city’s 1975 fiscal crisis, made this kind of neighborhood advocacy necessary. First, in contrast to the 1960s, when the city had built thirteen new firehouses, between 1972 and 1975 it had closed seventeen and relocated seven others. In addition, in 1975 it capped the number of firefighters at 1939 levels (about ten thousand) even though the number of alarms sounded each year had increased tenfold (from about forty thousand in 1939 to four hundred thousand in 1975). Second, for most of 1975 and 1976, the city said it did not have the financial resources to seal abandoned buildings—the most frequent targets of arson. A Housing and Development Administration staffer told Greenpoint residents that instead of sealing buildings, it could demolish them “in an emergency”—which meant the building had to be collapsing before the agency would act. When a Greenpoint resident pointed out that “a building which is a fire hazard right next to our homes is just as much of an emergency,” the staff member said, “We agree, but if you’re talking about all the four thousand unsafe buildings, we haven’t got the money.” Eventually, the city cut services to deteriorating neighborhoods, so fewer subway trains and buses ran there; libraries and schools closed, and so did police precincts and firehouses.
When, in November 1975, the city announced the closure of eight engine companies and four firehouses—including one in Greenpoint—two hundred residents of that neighborhood, mostly Polish American, turned out and surrounded the firehouse before the firemen could remove the engine. Following a twenty-hour standoff, the residents finally agreed to let the firemen go, but the city left the engine. Protesters christened the building “People’s Firehouse No. 1” and occupied it in a sustained demonstration for the next eighteen months.
Most of the neighborhood’s protesters were white ethnics, the very people usually associated with the working-class “backlash” against liberalism, but to watch them in action, one might sooner have thought of the brash Sixties activists who protested on behalf of civil rights and peace. Indeed, the Greenpointers did not limit their protest to the sit-in at the firehouse; they organized targeted protests in other parts of the city, too, including picketing the fire commissioner’s house and a mayor’s dinner at one of the city’s Hilton hotels. And they seemed downright radical when they blocked traffic on the Brooklyn–Queens Expressway during morning rush hour. Such tactics eventually achieved victory when the firehouse was reopened in March 1977—though, as the rest of the nation would see a few months later, when the city’s blackout brought burning and looting, life in New York still looked like life in a combat zone.32
Another form of urban neighborhood group—the community development organization—gained considerable currency in the mid-1970s, too. Like the activists in Greenpoint, these organizations were neither left-wing insurgencies nor examples of right-wing backlash. Writing after several years of reflection, the sociologist Robert Fisher observed that community development organizations could typically be found in blue-collar, white ethnic neighborhoods where “the emphasis is on maintaining and strengthening neighborhood networks and organizations, not creating social change or a new political movement. They see neighborhood decline, especially physical deterioration, fostered by working-class powerlessness, as the problem.”33
In Southeast Baltimore, a working-class neighborhood of approximately ninety-five thousand second- and third-generation descendants of Eastern European immigrants, residents founded the Southeast Council Against the Road (SCAR) to prevent the demolition of hundreds of homes to make way for a six-lane highway. In the process, they realized that the city government, in tandem with its financial sector, had broader urban renewal plans about which they knew little, and that banks had already been redlining the neighborhood in hopes of fueling its deterioration and subsequent renewal. As a result, SCAR transformed itself into the Southeast Baltimore Community Organization (SECO), a coalition of more than ninety neighborhood groups that not only put a stop to bank redlining by shaming government officials into investigating it, but also succeeded, in a period of recession, in keeping a nursing home and a library open, obtaining two new schools, organizing a Youth Diversion Project, and, through a direct action protest featuring mothers with baby carriages filling the streets, stopping truck traffic through residential neighborhoods. Maybe most impressive, SECO established Neighborhood Housing Services as a community corporation—presided over by a board of residents and bankers—that helped 245 families become homeowners. Admirers attributed SECO’s success to its mainstream approach to political protest. “We knew we had to be tough and militant,” said Barbara Mikulski, the “godmother of SECO,” who later went on to Congress and the United States Senate. “But the ‘issue’ was always the enemy. And we knew the very institutions that we challenged were the ones we would have to work with when peace broke out.” In a time of severe economic anxiety, SECO spared Southeast Baltimore from the wrecking ball.34
* * *
Meanwhile, advocates of the poor lamented that blue-collar and middle-class organizing in American cities often happened at the expense of the most deprived. The National Welfare Rights Organization (NWRO) and regional welfare rights groups tried to fill the void left by Nixon’s dismantling of War on Poverty programs. In fact, the NWRO had been active in assailing welfare bureaucracy in American cities since its founding in 1966—at the height of the War on Poverty. In many cities, the duplication of bureaucratic effort—with payments coming from multiple agencies—created a jungle of exhausting hoops and red tape. Following a scheme first articulated by the sociologists Frances Fox Piven and Richard Cloward, the NWRO did its best to swamp the system in cities like New York by organizing the poor to join urban welfare rolls, thus adding to the city’s accumulating financial woes and moving it toward a crisis. In time, such tactics led the state legislature to move to a system of single payments (which the NWRO nevertheless protested as wildly inadequate).35
The NWRO spawned one of the most important advocacy organizations at work in the 1970s and 1980s: the Association of Community Organizations for Reform Now (ACORN). Founded in 1970 by NWRO veteran Wade Rathke in Little Rock, Arkansas (when the A in ACORN stood for the state’s name), the association focused on organizing the poor in what it called a “majoritarian” strategy. “If you can fashion a program that will attract people who earn, say, $8,000 or less [a year],” Rathke said, “then you’re appealing (in Arkansas) to a very large majority. You can develop an organization with real power.” ACORN estimated that low- and moderate-income people might amount to 85 percent of the American population. Rathke took inspiration from the legendary organizer Saul Alinsky, who had pioneered tactics for drawing citizens into political action aimed at gaining power in their own communities. In Little Rock, ACORN started mostly as an organization of welfare mothers. Rathke and another organizer, Gary Delgado, found, with a lawyer’s help, a little-known “minimum standard of need” regulation that said welfare recipients were entitled to furniture. Following a strategy that both Alinsky and NWRO would have recognized, Delgado and a group of local volunteers knocked on doors for weeks and spoke to church groups, ultimately bringing out hundreds of welfare recipients in waves of demonstrations demanding furniture from the county Welfare Department. Led by a twenty-three-year-old African American welfare mother named Gloria Wilson, ACORN took the fight to Governor Winthrop Rockefeller, holding demonstrations at his mansion. When the governor agreed to negotiate, Wilson attended the first meeting and, in a moment of frustration, lost her temper. “We have to choose between feeding our kids or buying a decent bed,” she said. She then removed her wig to reveal that the stress of life on welfare was causing her hair to fall out.
“Something has to be done!” she shouted at the governor and his aides. Spontaneous outbursts like Wilson’s showed the power of emotion, both to mobilize Americans around a variety of issues and to move one’s political opponents. In part because he had been shaken by Wilson’s desperate plea, Governor Rockefeller soon agreed to provide a furniture warehouse where donations could be made and furniture acquired (this later led to a new state agency, Furniture for Families, responsible for collecting and distributing used furniture). This victory led to more campaigns in Arkansas—focused on housing, schools, traffic lights, public transportation, electricity rates, consumer issues, environmental issues, property taxes—and eventually to the proliferation of ACORN chapters in twenty-six states; by 1982, the organization claimed more than fifty thousand dues-paying members waging multistate campaigns on the same issues.36
The idea of “building the majority’s power”—power for a true silent majority made up of poor, working-class, and middle-class Americans who shared a sense of powerlessness in the face of government and business might—seemed to offer a concrete plan that fictional populist heroes such as Network’s Howard Beale and Nashville’s Hal Phillip Walker did not. Beyond any particular issues, an organizing manual claimed, ACORN was concerned with a more fundamental problem: the distribution of power in America. “When ACORN attacks a public utility for raising rates, the attack is not based on an analysis which holds that ACORN members, and other low to moderate Arkansans (or Texans or South Dakotans) are, as consumers, getting nailed by high electric rates,” the manual stated. “Undoubtedly they are getting nailed … but even more important … they are getting boxed because … a bunch of corporate directors and New York bankers have the power to unilaterally make decisions that affect the lives of ACORN members.”37
Critics pointed out that professional staffers did the lion’s share of the organizing for ACORN and many of the “citizen action” groups that had sprung up like flowers around the country in the late 1960s and early 1970s. In fact, in Chicago, Heather Booth, a veteran of the antiwar, student, and women’s movements, founded the Midwest Academy in 1972 as a training institute for organizers. The need for an institution that could both teach the difficult, detailed work of organizing and also “serve as a link between different groups—between community organizers, labor unionists, environmentalists, feminists,” and other activists—seemed apparent to Booth and others. By that time, the “citizen revolt” included not only ACORN but organizations such as the Citizens Action Program (CAP) in Chicago, Communities Organized for Public Service (COPS) in San Antonio, the United Neighborhoods Organization (UNO) in East Los Angeles, the Oakland Community Organization (OCO), Massachusetts Fair Share, the Ohio Public Interest Campaign, Oregon Fair Share, the Illinois Public Action Council, and the Connecticut Citizen Action Group. In most of these organizations, professional activists charted strategy, but, as Booth taught at the Midwest Academy, community organizing could not be considered completely successful even if it won clear victories on issues that improved people’s lives. Community organizers also had to build organizations “through which people can gain a sense of their own power … and which contribute to the general change in power relations, democratizing the broader society.”38 It did not always work out as planned, but whether the front porch activists who mobilized across so many issues in the 1970s and 1980s knew it or not, the mostly local organizing model that they followed was pioneered by the Alinsky-inspired organizers of “citizen action” groups.
Of course, even when an organization did succeed in bringing together many rank-and-file community participants, the results could be uneven. Massachusetts Fair Share, for example, started in working-class Chelsea and East Boston in the early 1970s and won victories involving housing, playgrounds, and street repair. “Perhaps most remarkable,” Fair Share had, in one scholar’s view, “begun to forge alliances among blacks and working-class whites on issues that transcended race” despite working in a tense racial climate. But then it took on the utilities. In Massachusetts, as in the rest of the country, citizens had seen their utility rates driven up by inflation and the oil crisis. Fair Share, in a major referendum campaign against the utility companies, advocated uniform rates for residential users and the big industrial users, who were then paying much lower rates. Business did its own organizing, taking tips from Business Roundtable strategists, and even enlisting the support of Democratic governor Michael Dukakis and key unions. It also funded an Associated Industries of Massachusetts campaign that warned of a mass exodus of business and investment from the state if business had to pay the same rates as homeowners. Companies enclosed warnings in employees’ pay envelopes, and colleges did the same with tuition bills sent to parents. Fair Share had no chance; when voters went to the polls in 1976, the referendum lost two to one.39
In other locales, the business lobby’s attempts to defeat reform by stressing the need for a business-friendly environment backfired. In San Antonio, Communities Organized for Public Service (COPS) grew by 1976 to a convention of six thousand delegates from the city and surrounding areas, making it the largest urban community organization in the country. Its primary political objective was to challenge San Antonio’s long-range development plan, which, COPS believed, privileged suburban development and the real estate and construction contractors pushing it “while neglecting the city neighborhoods and the needs of the majority of its citizens.” So COPS moved into electoral politics by interviewing and analyzing city council candidates and choosing which ones to endorse; all of the COPS-endorsed candidates won. After the 1977 election, COPS, with advocates on the city council, mounted a campaign to change the city’s approach to recruiting industry. Instead of emphasizing cheap labor as San Antonio’s main attraction for industry—as a report commissioned by a Chamber of Commerce research group had done—COPS argued that the city should encourage only businesses prepared to pay employees a living wage (Harry Boyte said “it mentioned $15,000 as a reasonable figure”). As in the Massachusetts utility rate campaign, the Texas Industrial Commission predicted that business would be “scared off.” Others in the business lobby threatened a corporate boycott. This time, though, it did not work. The city went with the COPS proposal, and the Chamber of Commerce was shamed into admitting that its “cheap labor appeal was wrong.”40
In many ways, the United Neighborhoods Organization (UNO) of East Los Angeles grew from the same fertile soil that produced COPS in San Antonio: East L.A. was home to an immense illegal Mexican immigrant population. The area’s Catholic parishes became important organizing centers for residents who felt the city and state catered to business and developers to the detriment of their neighborhoods. Certainly, antipoverty efforts had been tried, but it seemed most government officials saw East L.A. as hopeless. Under UNO, however, priests and nuns (who had been trained by Ernie Cortes, a Mexican American organizer from San Antonio) conducted ten thousand interviews to gauge the most pressing issues facing residents and their families. One frequent response was “skyrocketing insurance rates,” which, UNO researchers found, stemmed from auto insurance company redlining practices that charged rates not by how well a customer drove but by where he or she lived. UNO brazenly demanded a meeting with the state Insurance Board chairman, Wesley Kinder, who was stunned when three thousand East L.A. residents showed up, prepared to argue forcefully, if politely. As one priest said of the residents after the meeting, “They’re changing, you know, you should see the people talk. Sweet old ladies. It’s tremendous. There’s dignity in the determination of their lives.”41
* * *
Despite major obstacles, the rural poor also joined the citizens’ revolt. As the late Senator Paul Wellstone pointed out, in the mid-1970s, more poverty existed in rural America, proportionately, than in urban America. “While less than one-third of the nation’s population reside in rural, small-town communities, 40% of the nation’s poor, 9.2 million rural people, live in poverty,” he wrote in 1975. Likewise, almost 40 percent of the total rural population had incomes below the poverty level. Wellstone, then a political science professor, saw much of this poverty near his place of employment, Carleton College in Minnesota. In the early 1970s, he became involved with the Organization for a Better Rice County (OBRC).
In Wellstone’s experience, rural people seemed more difficult to organize. That was in part because they were isolated, living farther apart than city dwellers and harboring “no expectation for social change.” In addition, the power brokers in Faribault, a town of sixteen thousand and the Rice County seat, were the business owners and the local newspaper that “religiously attack[ed] welfare programs … and almost all social programs.” It seemed an unlikely place for a poor people’s movement, and at first, OBRC made little headway.42
OBRC had more success, however, when it set out to reform the county food assistance program, attracting not only welfare recipients but working poor and senior citizens to its banner. OBRC’s research showed that many people who were eligible for the program did not know it; for those who did know, the process of receiving assistance discouraged participation (it could be an arduous, daylong undertaking that involved not only arranging transportation into Faribault, but often waiting hours in a room with no chairs for the elderly or infirm before receiving any food). Using census figures, OBRC showed that 4,300 people were eligible for food assistance, but only 1,500 participated in the program. The organization demanded a hearing with the county board of commissioners, and dozens of poor people attended. Like the UNO activists meeting with the state insurance commissioner, OBRC members demanded to be treated with dignity. They asked for another distribution center, more staff to speed distribution, chairs for those who needed them, and outreach to inform eligible families about the program. The commissioners agreed to all but the last demand. Charles Liverseed, a seventy-two-year-old handyman with a seventh-grade education who was so poor that Wellstone once observed only “some turtle meat and a little cheese” in his old refrigerator, acted as OBRC spokesman that day. He seemed to understand the gist of Ralph Nader’s “initiatory democracy” idea when he said at the meeting’s end, “We here at the OBRC have just made a contract with you the commissioners about improving the [food] commodities program. If you don’t live up to your word, we’re going to come back and make you live up to your promises.”43
With that victory behind them, poor people became much more active in running the OBRC, but they also became targets of retaliation. The Faribault Daily News routinely described poor people as lazy welfare cheats and attacked OBRC for grandstanding. Meanwhile, the state welfare department suspended benefits to the OBRC president, a married woman with eleven children to support who worked a farm that made roughly $1,500 a year. The department argued that the farm’s value had increased, making the family ineligible for benefits. The department then violated its own due process regulations in responding to her request for an appeal and sparked a welfare rights movement in the process.
The larger problem was that even as food prices kept going up and up, the welfare department made no adjustments to payments, so Minnesota’s poor could buy less and less. The OBRC appealed to the county welfare official for support, but he consistently ignored its request for a meeting—until twenty-five women showed up with their thirty-six children at the Rice County courthouse. “The children were running all over the place,” Wellstone recalled, “and the noise level was unbearable.” They got their meeting and the simple letter of support they had been seeking, and that led to benefits being raised from $105 to $118 a month. More important than the modest increase, the women gained a sense of their collective strength in taking on the system. As one later said, the one good thing about being poor is that “you can’t be sued—so you might as well speak up.”44
You can’t be sued, but you can be intimidated. Once again, OBRC suffered retaliation from the welfare department, which, without any prior warning, obtained a court order and sent a social worker and police to the home of Artis Fleischfresser, an outspoken welfare mother. The officials took her four children into custody as she and the kids begged and screamed in protest. The children returned home a few days later after a county attorney, suspecting a political vendetta, declined to hold the scheduled hearing to determine if Fleischfresser was a suitable mother. But even so, Fleischfresser ceased all activism. Thanks to the chilling effect this episode had on other welfare mothers, the OBRC nearly collapsed.45
OBRC did eventually close its doors, though not until a few years later, and not because of retaliation but because of organizational inefficiencies. In the meantime, it managed to win other significant victories: it secured from the county a share of federal block grant funds for a much-needed day care center for poor people; after a year of organizing, it launched a senior citizen “dial-a-bus” transit system after the county suspended public bus service; and when the state introduced food stamps to replace the inadequate food assistance program, OBRC played a part in ensuring that the distribution system was set up fairly and treated recipients with dignity. The most significant OBRC accomplishment, Wellstone said, came in the “dramatic change in the political consciousness among the poor.” Even after OBRC folded, plans quickly came together to start a new organization from scratch. At the first steering committee meeting, one woman said, “I still have that damn OBRC consciousness!”46
All over the United States, citizens faced with a new economic reality, notable for its effect on local services, saw their own collective consciousness raised. In turn, they channeled their emotional response into action. Perhaps without realizing it, they were both following earlier examples of grassroots organizing and modeling new ones. By the middle of the 1970s, these new forms of collective citizen action could be seen everywhere, from rural areas to cities and suburbs—and not only in response to questions arising from desperate new circumstances. In fact, at times the front porch impulse to mobilize came in response to familiar nagging issues.
Copyright © 2013 by Michael Stewart Foley