
This Chair Rocks

A Manifesto Against Ageism

Ashton Applewhite

Celadon Books



I’ve never lied about my age—I have no problem saying “I’m sixty-six” loud and clear—but I sure know a lot of people who do. People who’ve lied on résumés and on airplanes and on dates. There was the opera singer who fudged upward at the beginning of her career so she could get cast as Norma, but was holding at thirty-nine. And the woman who loved passing off her granddaughters as her kids, and who was regularly connected to her bank’s fraud department because she couldn’t remember what birth date she was using.

I would never have been able to keep my stories straight either—one reason I told the truth. Another was because the typical response wasn’t so terrible: “You look great for your age!” I inherited my mother’s no-gray-hair genes, I’ve always had plenty of energy and no plans to slow down, and I certainly never felt like any of the labels out there—“senior,” “cougar,” “woman of a certain age”—applied to me. But if I was so cool with it, why didn’t “You look great for your age” feel like a compliment? The fact was that the hazy prospect of growing old filled me with something between free-floating anxiety and stomach-churning dread. I didn’t want to think about it until I had to, and when it crossed my mind, I flipped the channel. Why not, as long as I could “pass” for younger, right?

That wasn’t a very solid strategy. Birthday cards circulated regularly around the charmless cubicles of the office at the American Museum of Natural History, where I worked with a guy named Ray for fifteen years. Ray and I didn’t have much in common. He handled the accounts; I wrote. He lived in the suburbs; I didn’t own a car. He was conservative; I’m progressive. With his fringe of snow-white hair, if he had gained weight and worn red, he would’ve made a perfect Santa Claus. He was proud of being cantankerous and was always muttering about his aches and pains, and he couldn’t wait to retire to Florida. So when I learned that Ray and I were exactly the same age, I panicked. I thought, “What if everyone finds out? They’ll think I’m old too.”

That wasn’t just condescending and mean-spirited of me; it was idiotic. My museum coworkers, Ray included, were an intelligent bunch. They didn’t have any trouble telling the two of us apart, and difficulties were unlikely to crop up. Why, then, was I so flipped out about landing with Ray on the same side of some hypothetical old/young divide? Why did I imagine that this would erase our individuality, and diminish me so frighteningly? Was I driven by fear of losing my looks? Of growing frail? Of my own mortality? Wouldn’t I be better off making my peace with the passage of time than waging a battle no one could ever win?

I wish I could report that I found the answers in one blinding epiphany. Instead, it’s been a gradual awakening over the past twelve years. There have been many glum days at the keyboard, and some sleepless nights dictating brilliant insights into my phone, most of which were a lot less brilliant in the morning light. I had the good fortune to be mentored by Dr. Robert Butler, coiner of the term “ageism,” before his death in 2010. I attended seminars for journalists who cover the “age beat,” inhaled countless books and articles, and started thinking out loud in blog form. I delved into a world of advertisements and movies, policies and bylaws, and products and promotions that had shaped my unconscious beliefs with one overarching message: old = no good. Or, as the Twitterverse might put it: It sucks to be old.

A chance dinner conversation in 2007 with my partner’s mother, Ruth Stein, got me started on this journey. In their eighties at the time, she and her husband, Bill, were booksellers, and that night she said, “I think you should write about something people ask us all the time: ‘So when are you going to retire?’” The idea was upbeat and sound bite–friendly. I started learning about longevity, interviewing people over eighty who work, and blogging about it.

I headed to Santa Fe, where I had family to stay with. My first interview was with eighty-eight-year-old folk artist Marcia Muth on the porch of her little adobe house, shaded by a tree festooned with shiny compact disks and surrounded by her hubcap collection. Muth had been raised in Fort Wayne, Indiana, by her grandparents, to whom she was “a disappointment, because I liked classical music, I liked Shakespeare, I loved poetry. To them, work was having a store.” She went on to become a law librarian, poet, publisher, and, in her fifties, a successful folk art painter and teacher. A newspaper clipping on the wall quoted Muth’s advice to her Elderhostel students: “You are never too old, and it’s never too late.”

Embarrassed by her lack of formal training, Muth painted in secret for ten years. When a local artist dropped by and caught her scurrying to hide her brushes, he offered her some advice: “Don’t take any lessons. Just keep going.” She did, and it became a way of life that she was grateful to have found in her middle years. Chronic bronchitis had ended her teaching career a few years earlier and tethered her to an oxygen tank, but “it doesn’t interfere with the painting, that’s the important thing,” she said. “Your life does change as you get older,” she told me. “You get into what’s important and what’s not.” She and her partner went out less and moved more slowly, but her work continued to improve as she got better at listening to herself. “Don’t fear old age,” she advised. “Your years can be just as wonderful as you get rid of some of the anxiety people suffer from. And I find my eighties have been even more fun than my seventies were.”

The possibility that life could become more fun in your eighties had never crossed my mind. Nor that growing a little shorter of breath each year would fail to terrify. Nor that an ever more circumscribed life could be an ever greater source of personal growth and specific pleasures. Nor that such joyful clarity would be rooted in awareness—not denial—that time was short and therefore to be savored. After this first jolt of fresh old air, I kept going. From pediatricians to park rangers, Americans of advanced years and all stripes told me about their work behind steering wheels and desks and band saws and television cameras and how they’d gotten there.

It came as no surprise that they were as different from one another as could be, not to mention from the stereotype of the doddering ancient. But something did surprise me: the discrepancy between what I’d simply assumed it was like to be eighty or ninety and what I was encountering firsthand. The more I read and the more experts I talked to, the clearer it became that these older workers were typical of a large and fast-growing cohort of older Americans. Why the disconnect between what I had imagined about old age and the reality that was coming into view? Had I bought into some kind of party line? What were some of my assumptions about what the future held?

My darkest nightmare was the possibility of ending my days drooling under a bad botanical print in some ghastly institutional hallway. If asked what percentage of Americans over sixty-five lived in nursing homes, I’d have ventured, “Maybe thirty?” I’d never have arrived at the actual number: 2.5 percent, down from 5 percent over the last decade! Even for people eighty-five and up, the number is only 9 percent.1


What about being sick, and helpless? It turns out that over three-quarters of the “oldest old”—ages eighty-five and up—can go about their everyday activities without any personal assistance.2 Probably not shoveling their driveways or doing Costco runs, but dressing, cooking, and wiping their own butts. People get chronic illnesses, but they learn to live with them. The vast majority of older Americans live interdependently until they come down with whatever kills them.

What about the specter of dementia? Everyone seemed to know a horror story. My memory’s never been any good, so maybe I won’t notice if I develop Alzheimer’s disease. It’s a terrifying prospect. But even as the population ages, dementia rates are dropping.3 The real epidemic is anxiety about memory loss. Remember the two and a half percent of people over sixty-five living in nursing homes? Ninety percent of the remainder can think just fine.4 Here too, the vast majority of older Americans will land in the rest of the pie chart, slowed somewhat but fully capable of finding their slippers sooner or later and making their way in the world.

How about my assumption that old people no longer have sex? It’s true that sexual activity tends to decrease with age. It’s equally true that retirement homes are hotbeds of lust and romance, as evidenced by skyrocketing sexually transmitted disease rates in people over fifty. Sex and arousal do change, but often for the better, especially for women.

I also figured that old people were depressed. After all, they were old, and they were going to die soon. Their droopy faces were all the evidence I needed. It turns out that older people enjoy better mental health than the young or middle-aged.5 Who knew? Here’s the kicker: People are happiest at the beginnings and the ends of their lives.6 If you don’t want to take my word for it, Google “U-curve of happiness.” Even as age strips us of things we cherished—physical strength, beloved friends, toned flesh—we grow more content.

The more I learned, the better I felt about the years ahead—no small accomplishment in itself. I had to acknowledge that the goalposts were shifting, but also that I remained very much in the game. I’ve always loved bicycling in the city, but now I wear a helmet and stay in the slow lane. I still barrel along the sidewalk, but recently had to slow down in order to keep pace with a seventy-four-year-old friend whose knees were killing her. Arthritis. She marveled that time and cartilage had waylaid her in this way, and I realized that I, too, will marvel when it happens to me. Like her, I’ll figure out how to deal with it. I’ll buy a cane just like she did, and keep on going. Just not as fast.

Specific concerns replaced nameless dread. I was onto something. Clearly, hitting ninety was going to be different—and way better—than the inexorable slide toward depression, diapers, and puffy white shoes I’d once envisioned, although I’m still worried about the shoes.

Things started looking so much rosier that I graduated to what I came to call I’m Not Ray–Stage Two: trumpet the fact that Ray and I are the same age, because see how much younger I look! Sliding happily to the other end of the spectrum, I spent several years chasing the idea that enough spinach, Sudoku, and positive thinking could “put old on hold.” This approach goes by all kinds of peppy names, like successful aging and productive aging, and it moves a lot of product aimed at keeping us “ageless.” It sounds comforting and it feels empowering.

But a question tugged at my sleeve. Was I actually coming to terms with what it meant to grow older? Or had I merely swapped my don’t-want-to-think-about-it foxhole for a hamster wheel to keep uncomfortable reckonings at bay—and Ray at a distance? Replaced dread with denial, in effect?

Hitting sixty felt just fine. I knew the years were bestowing more than they took away. I knew it from my own experience, and my research continued to confirm that I was no exception and that the years ahead had even more to offer. But I had yet to internalize that knowledge, to integrate it into my beliefs and attitudes, to embed it into my sense of self and my place in the world, to make it my own. I had to acknowledge and start letting go of the prejudices about aging that had been drummed into me since childhood by the media and popular culture. Wrinkles are ugly. Old people are incompetent. It’s sad to be old. Absorbing these fallacies had been effortless. Banishing them is unsettling, and infinitely harder. Present tense because I’m still at it, as I’m reminded on a regular basis.

What was the hardest prejudice to let go of? A prejudice against myself—my own future, older self—as inferior to my younger self. That’s the linchpin of age denial. Whatever form it takes, from a cutesy “Just say I’m over (fill in the number)” to faces frozen by needle and knife, denial creates an artificial, destructive, and unsustainable divide between who we are and who we will become. Concealing or disavowing our age gives the number power over us that it doesn’t deserve. Accepting our age, on the other hand, paves the way to acknowledging it with ease, even pride.

I am not saying that aging is easy. We’re all worried about some aspect of getting old, whether it’s running out of money or getting sick or ending up alone, and those fears are legitimate and real. But it never dawns on most of us that the experience of reaching old age—or middle age, or even just aging past youth—can be better or worse depending on the culture in which it takes place. And American culture is grotesquely youth-centric. Depictions of older people tend to be extreme. At one end of the spectrum, a silver-maned dude, beloved of marketers, surfs a turquoise wave. At the other end, beloved of the aging-industrial complex, a tiny woman withers in a hospital bed. These people exist, but they are hardly typical. The vast majority of us will end up in the middle, muscles and memory slowed, but out in the world and—with some help—able to enjoy our lives to the end.

I had to make my way to I’m Not Ray–Stage Three: I’m not Ray. Ray recently retired to Florida, where I bet he’s going to be happy as a clam; it’s the old age he wants. I’m making my way toward the old age I want, and it won’t look like his. I’m not planning to retire anytime soon, nor am I going to take up pole dancing or marathon running. I feel just fine about it. All aging is “successful”—not just the sporty version. Otherwise you’re dead.


A bunch of pieces fell into place with that realization, but a fundamental, underlying question remained: Why had my vision of late life been so out of sync with the lived reality? Why had I bought into an unexamined narrative for all these years instead of taking comfort and guidance in the evidence around me? These facts were easy to come by, so why didn’t more people know them? What Kool-Aid had we drunk? What was it in the culture that had me and so many others so freaked out about the prospect of living to eighty or ninety? The answer, which grew into an itch that I had to scratch and ultimately a book that I had to write, is ageism—the relegation of older people to second-class citizenship, along with the disrespecting of youth. Here’s the formal definition: discrimination and stereotyping on the basis of a person’s age. We’re ageist when we feel or behave differently toward a person or a group on the basis of how old we think they are. “Ageism” isn’t a household word yet, nor a sexy one, but neither was “sexism” until the women’s movement turned it into a howl for equal rights.

As with all “isms,” stereotyping lies at the heart of ageism: the assumption that all members of a group are the same. It’s why people think everyone in a retirement home is the same age—old—even though residents can range from fifty-year-olds to centenarians. (Can you imagine thinking the same way about a group of twenty- to seventy-year-olds?) And the longer we live, as more experiences inform our uniqueness, the more different from one another we become. Think about it: Which group is likely to have more in common, a bunch of seventeen-year-olds or a bunch of seventy-seven-year-olds? As doctors put it, “If you’ve seen one eighty-year-old, you’ve seen one eighty-year-old.”

All “isms”—ageism, racism, sexism—are socially constructed ideas. That means we make them up, and they change over time. Like all discrimination, ageism legitimizes and sustains inequalities between groups, in this case, between the young and the no-longer-young. Different kinds of discrimination—including racism, sexism, ageism, ableism, and homophobia—interact, creating layers of oppression in the lives of individuals and groups. The oppression is reflected in and reinforced by society through the economic, legal, medical, commercial, and other systems that each of us navigates in daily life. Unless we challenge stigma, we reproduce it.

Like racism and sexism, ageism is not about how we look. It’s about what people in power want our appearance to mean. Ageism occurs when a group, whether politicians or marketers or employment agencies, uses that power to oppress or exploit or silence or simply ignore people who are much younger or significantly older. We experience ageism any time someone assumes we’re “too old” for something—a task, a relationship, a haircut—instead of finding out who we are and what we’re capable of. Or if someone assumes that we’re “too young”: ageism cuts both ways, and young people experience a lot of it. That’s what’s going on when people grumble about lazy Millennials or complain that “kids are like that.”

Now I see ageism everywhere. When old pals cringe at public mention of how long they’ve known each other instead of savoring their shared history. When men and women feel compelled to lie about their age on online dating sites. When people bridle at being kindly offered a seat on the bus. On billboards and television, in hospitals and hotels, over dinner and on the subway. (“At age eighty, who doesn’t need a facelift?” a poster announcing a subway station renovation asks brightly.) In the incessant barrage of messages from every quarter that consigns the no-longer-young to the margins of society. In our mindless absorption of those messages and numb collusion in our own disenfranchisement.

I’ve learned that most of what I thought I knew about the aging process was wrong. That staying in the dark serves powerful commercial and political interests that don’t serve mine. And that seeing clearly is healthier and happier. Yet, despite the twentieth century’s unprecedented longevity boom, age bias is only beginning to bleep onto the cultural radar—it’s the last socially sanctioned prejudice. We know that diversity means including people of different races, genders, abilities, and sexual orientation; why is age typically omitted? Racist and sexist comments no longer get a pass, but who even blinks when older people are described as worthless? Or incompetent, or “out of it,” or boring, or even repulsive?

Suppose we could see these hurtful stereotypes for what they are—not to mention the external policies and procedures that put the “ism” after “age.” Suppose we could step off the treadmill of age denial and begin to see how ageism segregates and diminishes our prospects. Catch our breath, then start challenging the discriminatory structures and erroneous beliefs that attempt to shape our aging. Until then, ageism will pit us against each other; it will rob society of an immense accrual of knowledge and experience; and it will poison our futures by framing longer, healthier lives as problems instead of the remarkable achievements and opportunities they represent.

Age is a criterion of diversity.

A good place to start is by jettisoning some language. “The elderly”? Yuck, partly because I’ve never heard anyone use the word to describe themselves. Also because “elderly” comes paired with “the,” which implies membership in some homogeneous group. “Seniors”? Ugh. “Elders” works in some cultures but feels alien to me, and I don’t like the way it implies that people deserve respect simply by virtue of their age; children, too, deserve respect. Since the only unobjectionable term used to describe older people is “older people,” I’ve shortened the term to “olders” and use it, along with “youngers,” as a noun. It’s clear and value-neutral, and it emphasizes that age is a continuum. There is no old/young divide. We’re always older than some people and younger than others. Since no one on the planet is getting any younger, let’s stop using “aging” as a pejorative—“aging Boomers,” for example, as though it were yet another bit of self-indulgence on the part of that pesky generation, or “aging entertainers,” as though their fans were cryogenically preserved.

It always drove me nuts when some clown called me “young lady” and expected me to feel complimented, but I didn’t know why until I started thinking deeply about it. Made to our face, comments like these are disguised as praise. We tend to ignore them because the reference to being no-longer-young is embarrassing. And it’s embarrassing to be called out as older until we quit being embarrassed about it. Well, I’m not anymore. When someone says, “You look great for your age,” I no longer mutter an awkward thanks. I say brightly, “You look great for your age too!” When it dawned on me that one of the reasons older women are invisible is because so many dye their hair to cover their gray, I bleached mine white to see what it was like. When my back hurts, instead of automatically blaming it on my osteo-you-name-it, I stop to think whether shoveling or weeding could be to blame. I started a Q&A blog called Yo, Is This Ageist? where people can ask me whether something they’ve seen or heard or done is offensive or not. And I wrote this book.

Although we age in different ways and at different rates, everyone wakes up a day older. Aging is difficult, but few of us opt out, and the passage of time confers very real benefits upon us. By blinding us to those benefits and heightening our fears, ageism makes growing older in America far harder than it has to be. That’s why I’ve embarked on a crusade to overturn American culture’s dumb and destructive obsession with youth, and challenge the way people at both ends of the age spectrum are devalued and disrespected.


As I’ve gone on this journey from the personal to the political, it’s become clear that ageism is woven deeply into our capitalist system, and that upending it will involve social and political upheaval. Ageism, unlike aging, is not inevitable. In the twentieth century, the civil rights and women’s movements woke mainstream America up to entrenched systems of racism and sexism. More recently, disability rights and gay rights and trans rights activists have brought ableism and homophobia and transphobia to the streets and the courts of law. It is high time to add ageism to the roster, to include age in our criteria for diversity, and to mobilize against discrimination on the basis of age. It’s as unacceptable as discrimination on the basis of any aspect of ourselves other than our characters.

If marriage equality is here to stay, why not age equality? If gay pride has gone mainstream, and millions of Americans now proudly identify as disabled, why not age pride? The only reason that idea sounds outlandish is because this is the first time you’ve encountered it. It won’t be the last. Longevity is here to stay. Everyone is aging. Ending ageism benefits us all.

Why add another “ism” to the list when so many, racism in particular, call out for action? Here’s the thing: We don’t have to choose. When we make the world a better place to grow old in, we make it a better place in which to be from somewhere else, to have a disability or be queer or non-white or non-rich. Just as different forms of oppression reinforce and compound each other—that’s intersectionality, a term coined by feminist and civil rights activist Kimberlé Crenshaw—so do different forms of activism, because they chip away at the fear and ignorance that all prejudice relies upon. Ageism is the perfect target for compound advocacy because everyone experiences it. And when we show up at all ages for whatever cause tugs at our sleeve—save the whales, the clinic, the democracy—we not only make that effort more effective, we dismantle ageism in the process.

This book is a call to wake up to the ageism in and around us, embrace a more nuanced and accurate view of growing older, cheer up, and push back. What ideas about aging have each of us internalized without even realizing it? Where have those ideas come from, and what purpose do they serve? How do they play out across our lives, from office to bedroom, in muscle and memory, and what changes inside us once we perceive these destructive forces at work? What might an age-friendly world—friendly to all ages, that is—look like? What can we do, individually and collectively, to provoke the necessary shift in consciousness, and catalyze a radical age movement to make it happen?

Let’s find out.



When geriatrician Robert Butler coined the term “ageism” in 1969—not long after “sexism” made its debut—he defined it as a combination of prejudicial attitudes toward older people, old age, and aging itself; discriminatory practices against olders; and institutional practices and policies that perpetuate stereotypes about them. The term was quickly adopted by the media and added to the Oxford English Dictionary. Almost half a century later, it’s barely made inroads into public consciousness, not to mention provoked outcry.

Negative messages about aging cast a shadow across the entire life of every American, stunting our prospects, economy, and civic life. This is oppression: being controlled or treated unjustly. However, most Americans have yet to put their concerns about aging in a social or political context. When I ask people if they know what ageism is, most reflect for a moment, compare the word to other “isms,” and realize what it must mean. The concept rings true, and they nod. But it’s still a new idea to most. And unless social oppression is called out, we don’t see it as oppression. Perpetuating it doesn’t require conscious prejudice or deliberate discrimination. This lesser life is “just the way it is,” and the way it probably always will be.


In most prehistoric and agrarian societies, the few people who lived to old age were esteemed as teachers and custodians of culture. Religion gave older men power. History was a living thing passed down across generations. This oral tradition took a serious hit with the invention of the printing press, when books became alternative repositories of knowledge. As long as old age remained relatively rare, though, olders retained social standing as possessors of valuable skills and information. The young United States was a gerontocracy, which served the older men who held the reins; younger citizens had to age into positions of authority.

The nineteenth and twentieth centuries ushered in a reversal. Modernity brought massive transitions that reduced the visibility of older members of society, diminished their opportunities, and eroded their authority. Rapid social change made learning about the past seem less relevant. Aging turned from a natural process into a social problem to be “solved” by programs like Social Security and “retirement villages.” The nursing home, a “shotgun marriage of the poorhouse and the hospital” in geriatrician Bill Thomas’s memorable phrase, came into being and created a growth industry. The historians Thomas R. Cole and David Hackett Fischer have documented how, at the start of the nineteenth century, the idea of aging as part of the human condition, with its inevitable limits, increasingly gave way to a conception of old age as a biomedical problem to which there might be a scientific solution. What was lost was a sense of the life span, with each stage having value and meaning.

Propelled by postwar leisure and prosperity, the explosion of consumer culture, and research into a stage of life newly dubbed “adolescence,” youth culture emerged as a distinct twentieth-century phenomenon. As this “cult of youth” grew, gerontophobia—fear of aging and dislike, even hatred, of old people—gained traction. Those of us who grew up in the 1960s and ’70s were warned not to trust anyone over thirty, perhaps the first overt exhortation to take sides across a generational divide. The decades beyond thirty appeared ever less enviable. “Will you still need me, will you still feed me, when I’m sixty-four?” crooned the Beatles.


The status of older Americans is rooted not only in historic and economic circumstances but also in deeply human fears about the inherent vulnerabilities of old age: the loss of mobility, visibility, and autonomy. Not all of these transitions befall us all, and only two unwelcome ones are inevitable: We’ll lose people we’ve known all our lives, and some part of our bodies will fall apart. These changes are natural. But we live in a culture that has yet to develop the language and tools to help us deal with them. That’s partly because these changes make us feel vulnerable, partly because longer lives are such a new phenomenon, and partly because of ageism, both internalized and in the culture at large. As a result, all too often these transitions are characterized by shame and loss of self-esteem.

Internalized, these fears and anxieties pave the way for a host of unhealthy behaviors that include denial, overcompensation, and worse: actual contempt, which legitimizes stigma and discrimination. Two characteristics of marginalized populations are self-loathing and passivity—what my daughter tactfully dubbed the “yuck/pity factor” that the prospect of growing old invokes in so many.

As a friend who bought a house from a wheelchair user observed, “Damn, it’s nice to have wide doorways, and a toilet positioned this way—they should just do it for everyone.” That’s the premise of universal design—that products designed for older people and people with disabilities work great for everyone else too. Age-friendly products improve the built environment and make it more accessible, but stigma keeps them off the market. Realtors advise removing ramps and grip bars before putting a house on the market, as though no buyer could see accessibility as a bonus or aging into it as a necessity. Alas, thanks to internalized ageism, they’ve got a point.

Stigma trumps even the bottom line. There’s a fast-growing “silver market,” especially for products that promote “age-independence technology,” yet advertisers continue to pay a premium to target eighteen-to-thirty-five-year-olds. Despite the significant purchasing power of older buyers, retailers are uneasy about stocking products for them and companies are leery of investing. Unless they’re selling health aids, brands don’t want to be associated with the no-longer-young set either. Just as telling is the resistance of older consumers themselves to buying products that might telegraph poor eyesight or balance.

Instead we blame ourselves for a vast range of circumstances not of our making and over which we have no control. Difficulties turn us into “problem people.” When labels are hard to read or handrails missing or containers hard to open, we fault ourselves for not being more limber or dexterous or better prepared. Watching an older person struggling to heave herself out of a low chair, we assume her leg muscles are weak or her balance is shot, instead of considering the inadequacies of seating so deep or low to the ground. If we see a teenager perched on a kindergartener’s chair, we don’t bemoan the fact that his legs got so huge. Kiddie chairs aren’t designed for teenagers any more than armchairs are designed for ninety-year-olds.


The issue is not competence, or incompetence, but it’s hard to keep sight of that in an ageist world. These obstacles are less of a problem than the underlying policies and prejudices that reduce access and independence. We blame our own aging, instead of the ageism that renders these natural transitions shameful and these barriers acceptable. Discrimination—not aging—is the barrier to full participation in the world around us.


It doesn’t make much sense to discriminate against a group that we aspire to join. Or to rail about olders sucking up “entitlements”—which they earned—when both the need and the antagonism will come our way in turn. Ageism is a prejudice against our own future selves, as Todd Nelson and many other age scholars have observed, and has the dubious distinction of being the only “ism” related to a universal condition. It takes root in denial of the fact that we’re going to get old. That we are aging. Its hallmark is the irrational insistence that older people are Other, not Us—not even future us—and we go to great lengths to distance ourselves from that future state. “My mom is ninety, but she’s not old,” someone insisted to me not long ago, as though it were contagious. We exaggerate difference and overlook what we have in common, as with older people who spurn senior centers “full of old people in wheelchairs” lest they be tarnished by association.

In childhood we’re maddened when grown-ups don’t treat us with respect—that’s ageism too—but unable to imagine that our speech will someday quaver, skin crease, gait falter. Over time it gets harder to sustain that illusion, and a punitive psychological bind tightens its grip. Unless we come to terms with the transition, we hate what we are becoming. Historian David Hackett Fischer is blisteringly clear about the implications of this damaging divide: “destructive most of all to those who adopt it—for in the end it is always directed inward upon the mind it occupies.”1 That’s the nature of prejudice: always ignorant, usually hostile. It begins as a distaste for others, and in the case of age (as opposed to race or sex), it turns into distaste for oneself.

This self-hatred takes many forms. It’s manifest in the widespread effort to “pass” for younger, the way people of color have passed for white and gay people for straight; behavior spurred both by the desire to protect ourselves from discrimination and by internalized disgust. It underlies disparaging comments like, “I know that this isn’t true of anyone else in the room, but I’m not getting any younger” and “You don’t have to say when I graduated,” both of which I’ve heard verbatim from people on the front lines of aging policy. You’d think they’d be a little more self-aware, but many are invested in deficit models of aging. They’re experts in the important task of caring for the frailest and neediest—that’s how they get funded and promoted—and they have yet to reconcile that view of old age with what lies ahead for themselves. At the other end of the spectrum, many experts are proponents of the successful aging model, which holds that healthy behaviors and “can-do” strategies can hold aging at bay. That’s still denial, a high-end version that tends to overlook the very important role of socioeconomic class and potential disability in shaping how “successfully” we age.

We’re so busy feeling young that we stay blind to the ageism in and around us and never learn to defend ourselves against it. Older people tend to identify with younger ones as strongly as youngers themselves do. Other groups that experience prejudice, like gays or people with autism, develop buffers that can reinforce group identity, and even pride, at belonging to what sociologists call an out-group. Olders are apparently the only group whose attitudes about old age are as disparaging as those held by the in-group, the young.2 Talk about not wanting to belong to any club that would have you as a member! Which would be funnier, and a lot less ironic, if it weren’t the club that everyone is counting on getting into.


Why are stereotypes so insidious? Because when they apply to others, there’s no need to defend ourselves against them. They’re easily, often unconsciously, absorbed into our ways of thinking. Stereotyping obstructs empathy, cutting people off from the experience of others—even if, as is the case with ageism, those “others” are our own future selves. “Ageism allows the younger generations to see older people as different than themselves; thus they subtly cease to identify with their elders as human beings,” Robert Butler wrote in Why Survive? Being Old in America, which won him a Pulitzer Prize.3 When we see people as other than us—other color, other nationality, other religion—their welfare seems less of a human right. That’s why at least five out of six cases of elder abuse go unreported.4

Elder abuse can take many forms: neglect or abandonment; physical abuse (including the inappropriate use of drugs or confinement); emotional abuse such as intimidation or humiliation; sexual abuse; healthcare fraud; and financial exploitation. Because of ageism, elder abuse is less familiar to emergency room staff and law enforcement officers than other forms of domestic violence, and the public is less equipped to recognize it. “If nobody knows that I’m being abused, or I never hear about elder abuse and I think I’m the only one it’s happening to, I’m embarrassed and ashamed so I just keep my mouth shut,” explains Mary Anne Corasaniti, ex-director of New York State’s Onondaga County elder abuse program. It’s why some people rationalize exploiting olders with the repugnant excuse that the person is too old to notice.

Condescension alone actually shortens lives. What professionals call “elderspeak”—the belittling “sweeties” and “dearies” that people use to address older people—does more than rankle. It reinforces stereotypes of incapacity and incompetence, which leads to poorer health, including shorter life spans. People with positive perceptions of aging actually live longer—a whopping seven and a half years longer, on average—in part because they’re motivated to take better care of themselves.5 Dementia confers no immunity. Nursing home residents with severe Alzheimer’s have been shown to react aggressively to infantilizing language. Overaccommodation also harms—behavior like using simpler words and sentences or speaking louder and more slowly than we would to a younger person, instead of first ascertaining that the person is in fact confused or hard of hearing. Targets of this demeaning behavior appear to “instantly age,” speaking, moving, and thinking less capably.6

Internalized stereotypes also interfere with the value that people place on their own lives. Take the sad story of Bob Bergeron, a therapist in New York whose suicide at forty-seven took his friends by surprise. Described as “relentlessly cheery,” Bergeron had friends and family, financial security, and no history of depression. Extraordinarily beautiful as a young man, he was writing a self-help guide called The Right Side of Forty: The Complete Guide to Happiness for Gay Men at Midlife and Beyond. In Bergeron’s suicide note, next to an arrow pointing to the title page of his manuscript, he wrote, “It’s a lie based on bad information.” He was new to the struggles of the writing life and alone on New Year’s Eve; not a good combination. Belonging to a subculture that fetishizes youthful beauty and conventional sexual prowess did him no favors either. Bergeron’s greater tragedy, though, was to inhabit a world so bereft of alternative narratives that dread overtook him. That’s why we need more rich, complex stories that shrug off the mantle of decline and show there’s no “right” or “wrong” side to forty—or any other age.

In another study, people were exposed to negative or positive stereotypes of old age, then asked to request or reject life-prolonging medical treatment in a hypothetical situation. As expected, the negatively primed subjects were more likely to opt out.7 We see these values in the cultural controversy around assisted suicide, where the indignation index drops sharply when the population in question consists of the very old or severely disabled. These conversations need to factor in a cultural climate that barrages the old and disabled with the message that their lives are not worthwhile, nor worth paying for.

Euthanizing older people has a history in fiction that goes back at least as far as the Victorian-era novelist Anthony Trollope. Published in 1882, his novel The Fixed Period proposed mandatory euthanasia at age sixty-eight, ostensibly to relieve suffering. In satirist Christopher Buckley’s novel Boomsday, Millennials rise up. The movement’s prophetic leader urges folks to stop paying taxes that subsidize retirement, and to create financial incentives for Boomers to commit suicide. The description of a seminar hosted by New York University in June 2013 called “Love and Let Die: An All-Day Consideration of Ballooning Longevity, the Quality of Life, and the Coming Generational Smash-up” posited that “We may well be approaching a situation in which we as a society will have to choose between living in a world where an eighty-five-year-old is routinely granted five hip operations, or one in which we can still afford, say, primary school.”

If someone botched my first four hip operations, I’d like a crack at a fifth, thank you very much. It’s not as though funding for primary school comes out of the same bucket as funding for joint replacements (and universal healthcare and decent public education would render the example meaningless). It’s not a question of resources but of how they’re distributed. People at both ends of the age spectrum are least likely to be economically productive in a capitalist system, and therefore the most likely to be discriminated against. For all the “family values” rhetoric coming out of Capitol Hill, programs for kids are underfunded because kids don’t vote and because the kids whose parents have political influence need those programs less. As with other “isms,” ageism pits the disenfranchised against each other in order to maintain the power of the ruling class.

“Kids vs. canes” is a false dichotomy that gerontologists have debunked countless times, but it makes great headlines. As it is, older people are largely absent from the landscape, and pro-aging voices are rare. If ageism continues to go unchallenged, a dystopian future where they are missing entirely begins to seem conceivable. Given the remarkable set of achievements that longevity represents, that would be an ironic and tragic outcome.


Growing old isn’t new. What’s new is how many of us now routinely do so. The first leap in life span occurred some thirty thousand years ago, during the Paleolithic era, when people started living past the age of thirty. That’s when modern humans began flourishing, making art, using symbols, and thriving, despite the bitter cold of the last Ice Age. Why? Because thirty is old enough to be a grandparent, which conveys evolutionary advantages. Older people are repositories of knowledge, skilled at avoiding danger, storing food, and knowing who’s related to whom, and at passing along these complex skills.

The next big shift occurred some 150 years ago, propelled by the extraordinary scientific and technological advances that began with the Industrial Revolution. As more children survived to adulthood, women began having fewer of them. (Somewhat counterintuitively, the main determinant of population aging is dropping fertility rates, not rising life expectancy.) The proportion of older people increased, and the life span in the developed world has since doubled. In the twentieth century alone, the American life span increased by a staggering thirty years. This largely reflects the fact that more Americans are surviving to adulthood, but we’re living longer, too, gaining on average ten biological years since our grandparents’ era. In effect, thanks largely to clean water and antibiotics, we’ve redistributed death from the young to the old.

“It is, frankly, insane to look at an ageing population and not rejoice,” writes Guardian columnist Zoe Williams about the U.S. Census Bureau’s 2008 report on the unprecedented aging of the world population. “Why do we even have a concept of public health, of co-operation, of sharing knowledge, if not to extend life, wherever we find it?”8 A blue-chip roster of speakers at the 2012 Age Boom seminar for journalists in New York referred to the longevity revolution as “the most important phenomenon of our time in the world, more than the bomb, the Pill, or the Internet,” as “an extraordinary opportunity to solve almost all of our problems,” and as “potentially the biggest achievement in the history of the species.” Describing “a new stage of human history,” Linda Fried, Dean of Columbia University’s Mailman School of Public Health, referred to “the only natural resource that’s actually increasing: the social capital of millions more healthy, well-educated adults.” A growing body of knowledge from very different schools of thought—including the Rand Corporation, University of Chicago, Queen’s University Belfast, and Harvard and Yale Universities—now acknowledges that health and the longevity it brings are important economic drivers that generate wealth by affecting healthcare costs, labor-force participation rates (given the appropriate incentives), worker productivity, and the financing of pension systems.9


A little less worried about the tug of time on your own prospects? It’s no time to relax! Journalist Paul Kleyman’s witty coinage—“global wrinkling”—evokes both the scale of this massive demographic shift and the free-floating anxiety that accompanies it. Global wrinkling is typically portrayed as a social problem, even a disaster in the making. Anxious times feed what Fried called “a deficit accounting of what it means to be an aging society.” When times are tough, we look for scapegoats. We project our personal worries about getting older onto the demographic phenomenon of population aging. And indeed, unless we prepare wisely for this demographic shift, we could turn feat into fiasco.

In October 2010, demographer Philip Longman warned of a “‘gray tsunami’ sweeping the planet.”10 The phrase summons a frankly terrifying vision of a giant wave of old people looming on the horizon, poised to drain the public coffers, swamp the healthcare system, and suck the wealth of future generations out to sea. Journalists jumped on it, and “gray tsunami” has since become widely adopted shorthand for the socioeconomic threat posed by an aging population.

“Is the progressive aging of society really equivalent to the instantaneous devastation of cities?” asks University of Toronto Assistant Professor Andrea Charise. As she notes, this language divides society into two opposing groups, the “needy old” and everyone else, and “traffics in the politics of panic”11 so successfully that all other narratives are effectively pushed aside. It’s not the first politically charged use of this kind of language. In the late nineteenth century, the influx of Asian immigrants was referred to as the “yellow peril.” A “rising tide” has been used to describe a whole host of diseases deemed threatening to society, from tuberculosis and syphilis in the nineteenth century to HIV/AIDS in the twentieth century and Alzheimer’s disease in the twenty-first.

Talk of plague and poverty justifies prejudice against older people, legitimizes their abandonment, and fans the flames of intergenerational conflict. It also obscures the fact that what we’re facing is no tsunami. It’s a demographic wave that scientists have been tracking for decades, and it’s washing over a floodplain, not crashing without warning on a defenseless shore. The wave has been on the horizon since the 1950s, so why is society so ill-prepared? Why not conceive of it as a “silver reservoir,” as social gerontologist Jeannette Leardi proposes? The “tsunami” could fill that reservoir.

Part of our ambivalence about aging is just human. Nobody wants to die young, but concerns about scaling up the financial and physical support that long lives require are widespread and legitimate. Things are changing so fast that we’re carving out entirely new biological and social turf. Roles for this new cohort of older people have yet to evolve. The institutions around us were created when lives were shorter. For example, the notion that education is for the young, employment for people in middle age, and leisure for the old is clearly obsolete, but we have yet to revise these structures in substantial ways, or invent new ones. Science has leapfrogged culture, and society hasn’t had time to catch up. Humans are notoriously slow to reframe perception and behavior.

Sociologists call this “structural lag.” It happens when elements of a social system change at different rates and get out of sync. Small wonder that so many of our attitudes toward old age are irrational or downright contradictory. Americans over fifty control approximately 70 percent of the country’s disposable income, yet we are ignored by marketers. How can age be a burden, as the headlines insist, and also the gift that a thousand cloying affirmations rightly declare it to be?


Operating in the global economy means competing intensely for any kind of economic edge. The difference between success and failure hinges on slim and fast-moving margins. As the quest for wealth and power has gone global, the people who inhabit that globe are rapidly growing older. Those trends are colliding.

According to the “gray tsunami” narrative, an aging population makes it impossible to compete in the global economy. A young labor force, on the other hand, attracts global businesses and investors. Call it “global age arbitrage,”12 a term coined by business reporter Ted C. Fishman, author of The Shock of Gray, a preview of the global effects of the longevity boom. (Arbitrage means buying an asset cheaply and promptly selling it elsewhere at a profit.)

Global competition for “economic youth” is driving political and institutional ambivalence about the longevity boom, and the interests of the very young and very old—the most exploitable labor force, and therefore least valuable—are linked. Writing about how the government defines poverty, journalism professor Thomas B. Edsall observed that “both the beginning and the end of life are becoming increasingly subject to market decisions, cost-benefit analyses, and bottom-line considerations that had not been so glaringly explicit in the past.”13 Olders are perceived as a drag on the economy because of the way the economy is structured, and the structure has yet to be revised in order to take advantage of the vast new untapped resource we represent.

The language is cold-blooded, the trajectory is evident, and the culprit is clear. As Fishman put it, “The high costs of keeping our aging population healthy and out of poverty has caused the United States and other rich democracies to lose their economic and political footing.”14 In other words, according to this school of thought, Western imperialism is in decline not because of the accumulation of toxic debt that threatened the global banking system, or the effects of climate change, or the stagnation of real wages, or high youth unemployment rates, or crumbling public infrastructures, or a workforce left behind by automation and the information economy, or because the middle class is under siege and wealth is being concentrated in ever fewer hands. The problem is too many old people!

This is hogwash. In The Imaginary Time Bomb, British economist Phil Mullan exposes the reactionary analyses of people like Fishman and makes a persuasive case that the modern world’s growing preoccupation with aging has little or nothing to do with demography. Instead, it is used to justify further reductions in the role of government in the economy and the curbing of the welfare state. “Often what is presented as a population problem is better understood as a moral or ideological problem which assumes a demographic form,” writes sociologist Frank Furedi in the preface.15 This justification for austerity “lies in the socially constructed notion that federal spending on the elderly and the poor is the cause of the problems of the US economy,” writes Mullan.16 Blaming aging for the problems that afflict the U.S. economy—the way Fishman, Longman, and so many alarmist demographers do—obscures their origins in global capitalism.


• Society will be swamped by all these old people!

Consider the oft-repeated statistic that as of 2015 there were more Americans over sixty than under fifteen. Yes, there are a ton more old people in the boat, but there are also a lot fewer kids. Will we be drowned in a glut of olders, or starved by a dearth of youngers? Another way to look at it is that by 2020 there’ll be one older adult for every child—far better for the children’s welfare than the inverse, when birth rates and infant mortality were high.

It’s helpful to keep in mind that the projections that have people so worked up are largely the result of a specific historical phenomenon: the cohort effect of the postwar generation growing old—the proverbial bulge in the python. Tellingly, relatively few U.S. population graphs extend past midcentury, by which time the proportion of people over sixty-five will be in decline. Even countries that are rapidly aging can produce “youth bulges,” Longman points out, describing them as looming disasters “with all the attendant social consequences, from more violence to economic dislocation.”17 Can’t win for losing.

• An older population will bog everyone else down in caring for the sick and the frail.

The longevity boom does indeed call for massive investment in the biology of aging and related medical issues. New and often expensive medical treatments make it possible to prevent or treat many more conditions than fifty years ago. The caregiver crisis is real and growing more acute. But the assumption that older people are inevitable money pits for health dollars is incorrect. Medical expenses are highest in the period just before we die, but that’s true whether we die at eighteen or eighty, and evidence suggests that how long we’re sick affects spending more than how old we are.18 The postwar generation is the healthiest one in history. One study of twenty-two wealthy countries (including the U.S.) actually found population aging negatively correlated with health expenditures.19 Rather, it was people with debilitating illnesses or injuries—regardless of age—who used the most resources. According to the World Health Organization, aging has far less influence on healthcare expenditures than several other factors. For example, between 1940 and 1990, when the U.S. population aged most rapidly, aging appears to have contributed only around 2 percent to the increase in health expenditures. Technology-related changes were responsible for between 38 and 65 percent of that growth.20

People aren’t just living longer; they’re healthier and are disabled for fewer years of their lives than older people of decades ago. According to the U.S. Department of Health and Human Services, the share of U.S. healthcare spending going toward nursing and retirement homes has declined since 2000 and been flat since 2006.21 The ten-year MacArthur Foundation Study of Aging in America concluded that once people reach sixty-five, their added years don’t have a major impact on Medicare costs, although this may change as the number of people living with Alzheimer’s disease increases.22 People over eighty actually cost less to care for at the end of life than people in their sixties and seventies, possibly because aggressive interventions become less common. Chronic conditions pile up, but they don’t keep most older Americans from functioning in the world, helping their neighbors, and enjoying their lives.

• Olders are a drag on the economy.

Absolutely not. People fifty and up fuel the significant, fast-growing, and often overlooked “longevity economy.” According to AARP, spending by the fifty-plus population amounted to $5.6 trillion in 2015. Factor in the effects of this direct spending as it circulates through the economy and the contribution to GDP amounted to $7.6 trillion. Overall, this spending supported more than 89.4 million jobs in 2015—61 percent of all U.S. jobs.23 By 2032 the fifty-plus age-group is projected to drive more than half of U.S. economic activity, as their spending fuels industries that include apparel, healthcare, education, leisure, and entertainment.24 The trend is global. As Joseph Coughlin, director of the MIT AgeLab, writes in The Longevity Economy, “It’s no exaggeration to say that the world’s most advanced economies will soon revolve around the needs, wants, and whims of grandparents.”25

The U.S. also has more older workers than ever before. By staying employed longer, generating tax revenue, and continuing to earn and to spend, olders are fueling economic growth far longer than previous generations did. As they age and transition out of the workforce, people will need more help with tasks like home maintenance, driving, and downsizing, all of which generate jobs. Older people also drive investment in a multitude of new products and services, especially technologically innovative ones. And while “entrepreneur” might conjure up an image of a kid in that proverbial garage, twice as many successful American entrepreneurs are over age fifty as in their early twenties.26 Labor statistics capture only part of the economic contribution of older Americans, whose unpaid volunteer work in 2015 was valued at $75 billion. As Baby Boomers transition out of the workforce, this figure will rise.27

• One generation benefits at the expense of another.

For starters, the common—and intuitively attractive—perception that distinct “generations” share and represent a set of experiences and characteristics has no scientific basis. The variation among members of a given group—people born between 1980 and 2000, for example—is greater than the variation between generations (as is also the case with people of a given race or ethnicity). Most studies that claim to demonstrate generational differences show something else instead, as with the ageist trope that Millennials in the workplace are spoiled and dissatisfied. The same was true of GenXers and Baby Boomers at that age; as we get older we tend to move into jobs that suit us better. It’s an age effect, not a generational effect.28

The tension between generations is indeed worth studying, but mostly as a red herring and a symptom of how aging has been reframed as a problem. The postwar generation in the U.S. had the good fortune to come of age in an era of unparalleled peace and prosperity. It’s understandable for younger people to resent that good fortune, and to feel as though the Boomers have pulled up the drawbridge after themselves. But pitting groups against each other—old against young or, in this case, vice versa—is a time-honored tactic used by the wealthy and powerful to divide those who might otherwise unite against them in pursuit of a fairer world for all. It’s like setting groups of low-wage workers against each other, or the interests of stay-at-home moms against women in the paid workforce. The underlying issue is a living wage for all, and redress requires collective action. When issues are instead framed as zero-sum—more for “them” means less for “us”—it’s harder to see that the public good is at stake and the issue affects everyone.

Because conflict sells papers, the media perpetuate the myth that intergenerational competition is inevitable, and people readily buy into it. Barricades are easier to build than bridges. But they’re a lot less useful, and at a minimum this kind of thinking is shortsighted, as when older people complain about school taxes. Don’t they want the guy delivering their oxygen tank to be able to read the instructions? Having an educated workforce is better for everyone: individuals, families, communities, and society as a whole.

Pitting the generations against each other also obscures the key fact that income inequality does not discriminate by age. The wealthiest 1 percent consists of people of all ages, just like the ninety-nine. As leading economists have been arguing for years, growing wealth disparity within different age cohorts (not between them) underlies the shrinking prospects of ordinary Americans. Much of the intergenerational angst centers on the loaded term “old-age dependency ratio,” which compares the number of people over sixty-five to those aged fifteen to sixty-four, typically framed as the ratio of “dependents” to those of “working age.” With the number of people over sixty-five growing and the number in the workforce shrinking, the reasoning is that outnumbered youngers—often referred to as Gen Xers, Millennials, and Generation Z—will be left to shoulder enormous burdens. In fact, the ratio of workers to retirees has been falling pretty steadily for over one hundred years. As Mullan observed about the United Kingdom, “The number of working people ‘supporting’ each pensioner has fallen from fourteen to one in 1900 to four to one in 1990, and hardly anybody noticed.”29

In recent years, many scholars have criticized this dependency ratio because of its crudeness as a measure and because of its blatant anti-olders ideology.30 It’s based on the assumption that people become economic deadweight as soon as they hit sixty-five, when the reality is far more nuanced. Older Americans draw heavily on their own resources in retirement, and many never become wholly dependent on government support. Many people require benefits well before age sixty-five, and a growing proportion remain employed long after it.

Fortunately the World Bank has developed a long-overdue alternative formula, called the adult dependency ratio, which takes these trends into account. Instead of comparing the number of people over sixty-five to those of working age, it compares the number of economically inactive adults to the number who remain economically productive. This ratio stays more or less constant until it eventually declines, and depicts a far more reassuring economic forecast.31
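The arithmetic behind the two measures can be sketched in a few lines. The population figures below are hypothetical, chosen only to show how the ratios can diverge as a population ages but more olders keep working:

```python
# A minimal sketch of the two dependency ratios, using made-up numbers.

def old_age_dependency_ratio(over_65, working_age):
    """Conventional measure: everyone over 65 counts as a dependent."""
    return over_65 / working_age

def adult_dependency_ratio(inactive_adults, active_adults):
    """World Bank-style alternative: only economically inactive adults
    count as dependents, regardless of their age."""
    return inactive_adults / active_adults

# Hypothetical adult population of 100 million, "then" vs. "now" (millions):
#   then: 15M over 65 (3M still working), 85M aged 15-64 (17M not working)
#   now:  25M over 65 (10M still working), 75M aged 15-64 (10M not working)
for label, over_65, working_age, olders_working, youngers_inactive in [
    ("then", 15.0, 85.0, 3.0, 17.0),
    ("now", 25.0, 75.0, 10.0, 10.0),
]:
    active = (working_age - youngers_inactive) + olders_working
    inactive = (over_65 + working_age) - active
    print(label,
          round(old_age_dependency_ratio(over_65, working_age), 2),
          round(adult_dependency_ratio(inactive, active), 2))
# → then 0.18 0.41
# → now 0.33 0.33
```

In this toy example the age-based ratio nearly doubles while the activity-based ratio actually falls, because more of the olders are still economically active, which is exactly the trend the alternative measure is designed to capture.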

Economic dependence is hardly a one-way proposition. More resources have always flowed from older generations to younger ones than the reverse. Older people provide as much or more care than they receive, and people over seventy-five spend more time looking after someone, usually a partner, than young people do.32 In 2012 the Pew Research Center reported that one out of every ten children in the U.S. lived with a grandparent, most often in the grandparent’s house under the grandparent’s care.33 Many programs that benefit olders benefit youngers, too, like Social Security and Medicare payments that keep olders self-sufficient while their kids are busy raising their kids. In an era of stagnant wages and rising tuitions, more and more olders are helping grandchildren pay for college.

Families are multigenerational, after all. When fundamental problems with the housing and job markets go unaddressed, everyone suffers: the jobless “boomerang generation” of Millennials living with their parents; their parents—the “sandwich” generation—who are doing the lion’s share of wage-earning and caregiving for both younger and older family members; and, in a longer-lived world, even a “club sandwich” generation of adults who look after grandchildren, help their own offspring, and tend to their own nonagenarian relatives. Although much of this assistance goes unpaid, it has economic value, and it allows others to do paid work. That’s why the MacArthur Foundation paper Facts & Fictions about an Aging America concluded that there’s no evidence of “significant intergenerational conflict over old-age entitlements. In fact, quite the opposite appears to be true.”34

Another false dichotomy is that older workers take jobs away from younger ones. When jobs are scarce, this is true in the narrowest sense, but people seldom compete for the same jobs across generations. A different 2012 Pew Research Center study of employment rates over the last forty years found that rates for younger and older workers are actually positively correlated.35 In other words, as more olders stayed on the job, the employment rate and number of hours worked also improved for younger people. A 2015 study in the UK also found that higher employment rates among older workers benefited youngers, as older workers had more money to spend, thus creating more jobs.36 The challenge is to create enough jobs to prevent resentment and envy from affecting relations between generations—to create a world, in the words of historian David Hackett Fischer, “in which the deep eternal differences between age and youth are recognized and respected without being organized into a system of social inequality.”37

• Social Security bankrupted! Medicare exhausted!

A large population over the age of sixty-five will strain federal programs, and the government’s future financial obligations, currently underfunded, are indeed significant. Social Security, which is notably well-managed at low cost, can be fixed with relatively small adjustments, such as raising the cutoff point for taxing earnings. (Because high-end earnings have grown faster than average, today only about 83.5 percent of earnings are taxed, as compared to 90 percent in 1983.)38 Nor is the longevity boom to blame for the mess American healthcare is in. The failure lies in the way the system is organized. Designed as an acute care program, Medicare needs to be overhauled to deliver care management to people with disabilities and chronic illness—an aging America, in other words.

Compare the U.S. to Canada, whose citizens have free universal healthcare. In 2012 the Canadian Institute for Health Information (CIHI) reviewed thirty-five years of healthcare costs with a focus on the effect of an aging population. Contrary to the conventional belief that an aging population will overrun hospitals and drain healthcare budgets, the CIHI reported that elderly-related care actually added less than 1 percent to public-sector health spending each year, despite the fact that olders are proportionately higher users of hospital and physician services, home care, and prescription drugs.39 In other words, spending on seniors is not growing faster than spending on the population at large. A lifetime of governmental assistance has reduced the vulnerability of older Canadians to illness and disability in old age.

Their American counterparts, on the other hand, still evince the lifelong effects of a system that leaves the poor in the lurch, whether old or young—the people who need healthcare the most—so they get sick and stay sick. Note: lifelong. The cumulative effects of poverty, stress, and harsh work environments manifest over time in illnesses that are often attributed to aging but actually reflect persistent disadvantage. As they mount, the personal and financial consequences reflect the growing cost of gross social inequality.

• We can’t afford longevity.

We can if we want to. A blue-chip panel of experts convened in 2015 by the Leadership Council of Aging Organizations concurred that we can provide for the healthcare and retirement income security needs of older Americans by using existing resources more efficiently.40 It’s the right move economically as well as ethically. Over the last century, national GDPs around the world, along with life spans, have rapidly increased.41 Health and longevity generate wealth.

Spending money on older people is often portrayed as a cost. It is an investment, not just for ethical reasons and not only because everyone will benefit down the line. Better health systems would cost less and keep people healthier, enabling them to work longer and contribute more to the economy. A sustainable long-term care system would allow the women who currently perform the bulk of this unpaid labor to stay in the workforce, enable people with significant disabilities to continue to live the way they would like to, and encourage risk-sharing and bonding across communities. Supporting engagement for olders in the arts and education improves cognition, bolsters social ties, and benefits the quality of life of everyone involved.

Older people do indeed receive a disproportionate amount of government and welfare spending: more healthcare, more personal social services, and of course, by definition, all retirement benefits. Is this really outrageous, or even surprising? Isn’t this what the system was designed for—to provide for those who can’t provide for themselves anymore? Between one-fifth and two-thirds of today’s older Americans haven’t saved enough for retirement.42 As author Susan Jacoby observed in Never Say Die, “a decent life for the old old cannot, in most cases, be financed by individuals.”43 Providing them with a modicum of financial security into their eighties and nineties will require significant government support.

A big GDP is less important than political will and long-term planning. Resources are not inherently scarce; the United States spends almost as much on its military as all the other nations of the world combined.44 This “scarcity” is the result of policy decisions in a society whose oldest—and youngest—citizens are demeaned and disregarded.

Life spans are our most basic measure of well-being. For the best crack at reaching ninety, become a wealthy Asian-American man. Better yet, decamp to Anguilla, Austria, Australia, or any of the forty-two other countries that ranked above the United States in global life expectancy in 2017.45 (Monaco tops the list with a life expectancy of 89.40; the U.S. is in forty-third place at 80; Chad has the lowest at a shocking 50.60.)

Shamefully, in a historic reversal, the life expectancy of the poorest Americans is falling. Studies describe a society in which socioeconomic status is destiny. In 2016, economists at the Brookings Institution found that for men born in 1920, there was a six-year difference in life expectancy between the top 10 percent of earners and the bottom 10 percent. For men born in 1950, that difference had more than doubled to fourteen years. For women, the gap grew from 4.7 years to thirteen years.46 In regions with higher income and education levels, life spans rose. Thus, even as the gap in life expectancy narrows between men and women and between blacks and whites, it widens between the haves and the have-nots.


In late life, almost everyone finds out what it’s like to be excluded from mainstream discourse and possibility. As writer Walter Mosley put it, “When you become old, you become black … anybody that’s poor, who gets really old, anybody who suffers some kind of traumatic physical ailment, they realize what it is to be pushed aside by a society that’s moving ahead only with what they believe is good—the experience that black people have had in America forever. If you’re old, you’re not good; if you’re paraplegic, you’re not good; if you’re black, you’re not good.”47

These lines are never more clearly drawn than during natural disasters, when poor people, people of color, and older people die in disproportionate numbers. Chicago’s 1995 heat wave claimed 739 lives. Most of the victims were olders living in the heart of the city, isolated by urban decay, afraid to open doors and windows and unable to afford air-conditioning. Blacks were more likely to die than whites, who were more likely to die than Latinos, who tend to live in densely populated neighborhoods and face less isolation.

Of the nearly one thousand people who died when Hurricane Katrina hit the Gulf Coast in August 2005, almost half were seventy-five or older, and more than half of those were black. New Orleans is both a poor city and a segregated one. Hardest hit were the immobile and impoverished. The deaths of many more older residents can be attributed to the stress of being evacuated and losing their homes.

Almost half of those who died when Superstorm Sandy blew into the mid-Atlantic coast in October 2012 were over sixty-five. As with Katrina, most died on the day of the storm, and most drowned alone. Some were homebound; others chose to stay put. Those in institutions also suffered. Evacuating more than forty nursing homes and adult homes in low-lying areas for Tropical Storm Irene a year earlier had cost millions of dollars. As Sandy approached, officials recommended against evacuation. The hurricane severely flooded at least twenty-nine facilities in Queens and Brooklyn. Over four thousand nursing home residents and fifteen hundred adult home residents sat in the cold and dark for at least three days before being transported through debris-filled floodwaters to crowded, ill-equipped shelters and homes as far away as Albany. Many low-income olders were trapped for days without power in housing projects as well, left behind in every sense by an ageist world.


At one end of the spectrum is “shortgevity,” a term coined by Dr. Robert Butler to describe countries where people don’t live long and healthily enough to be productive. The United States is not among them—far from it—but both temperament and circumstance stand between many Americans and old age, not to mention a good old age. Not everyone ages well, because of who they are (depressed, reckless, extremely self-involved) or what they are (poor, frail, isolated, African American, Native American, female), and many don’t live long enough to grow old.

For those of us with access to healthcare and education, however, for the first time in human history four living generations will become commonplace. We’re going to have more time to figure out what we want to do with our lives, more time to accomplish it and share what we know, and more time to wind down with those we love.

To take advantage of this “longevity dividend,” we need to quit the reflexive hand-wringing, challenge the ageist assumptions that underlie it, and think realistically and imaginatively about the kinds of intergenerational contracts an equitable future will require—a task all ages would do well to engage in together. Stripping older people of their ability to contribute places a true burden on young adults, who are supposed to mate, breed, establish careers, and start saving for retirement by age thirty-five or so—another example of how ageism affects everyone. The mutually advantageous alternative is to see age as an asset. Exploit the “experience dividend” that this new cohort embodies. Acknowledge that olders are not mere burdens but contribute to society, and that their value as human beings is independent of conventional economic productivity. As Marina Gorbis, executive director of the Institute for the Future, puts it, “Productivity is for machines.”

That cultural shift is within our grasp. Allocating resources according to whiteness or maleness now seems unthinkable, but it went unquestioned until not long ago (and in many arenas still does). Slavery was fundamental to the American economy until the abolitionist movement turned it into a crisis. Brutal segregation was a reality for black South Africans until the anti-apartheid movement rose up against it. Not until the women’s movement emerged did women challenge their second-class status. All these struggles are ongoing, and none are easy. It took nearly a century for American women to win the right to vote, a struggle tainted by racism on the part of white suffragettes, and the ugly legacy of slavery continues to blight the lives of African Americans. It’ll take time to develop a culture that acknowledges and reflects on the emerging meanings of longevity, but the conversation is beginning. Let’s flip it, as Laura Carstensen, director of the Stanford Center on Longevity, suggests, “from one about growing old to one about living long.”

Copyright © 2016 by Ashton Applewhite.

Excerpt from “Here” from Begin Again: Collected Poems by Grace Paley. Copyright © 2000 by Grace Paley. Reprinted by permission of Farrar, Straus, and Giroux.