1
EARLY LIFE STRESS:
The Biological Impact of Rising Inequality
THE BATTLE OVER NATURE VERSUS nurture—over the importance of qualities we inherit as a result of genetics versus those that come from things we are exposed to after we are conceived—has raged for more than a century. And at the time I entered the field of psychology, that battle had entered a particularly nasty phase. Little did anyone know that a new science was taking shape that would forever transform this battle—in fact, it would render it obsolete. The result would directly impact my chosen field.
* * *
In the late 1980s and early 1990s, headlines like “Murders Surge As Crack Spreads” and “Race, Genes and IQ” screamed from newsstands. The social safety net was rapidly fraying, with proposals to roll back the Great Society emanating from the White House. The liberal efforts of the 1960s were widely viewed as having failed, and even if this wasn’t entirely accurate, the political direction seemed to be taking a different tack. Doubts about the costs of social programs, their effectiveness, and even their fairness were rising fast.
From this swell of anxiety and uncertainty emerged the outspoken political scientist Charles Murray, who argued that welfare and the social safety net had done little more than encourage dependency. Murray launched a project designed to test the idea that racial inequality was based on, as he put it, “intractable race differences” in intelligence. When The Bell Curve, the bestselling book he coauthored with Harvard psychology professor Richard Herrnstein, hit the bookstores in 1994, Murray pushed this stance even further, claiming that African Americans and the poor could not succeed because “what’s holding them back is that they’re not bright enough,” and that welfare along with remedial education efforts should be tossed overboard. “For many people, there is nothing they can learn that will repay the cost of teaching,” was his dire conclusion (New York Times, October 9, 1994, “Daring Research or ‘Social Science Pornography’?”). The book took a clear stance on the nature-nurture debate, and it spawned a rancorous public controversy, with other scholars attacking the credibility of its statistical analyses, its unstated assumptions, and its failure to consider alternative explanations.
Nurture wouldn’t go down without a fight, though. Taking an opposite tack, but with equally dire implications, the “culture of poverty” arguments began to make a comeback to redirect blame away from ineffective antipoverty programs and place it on those who were the most vulnerable. Harking back to Daniel Patrick Moynihan’s famous and controversial 1965 report, “The Negro Family: The Case for National Action,” there was a resurgent belief that the ills of the “underclass” arose from cultural deficiencies. In Code of the Street, the sociologist Elijah Anderson argued, more persuasively, that the claim of presumed deficiencies misunderstood the real cultural imperatives at work. Because inner-city youth were blocked from more traditional ways to succeed and achieve a meaningful identity, the violence and criminality that they engaged in were in fact adaptive in their circumstances—even if they ran against mainstream norms.
In part, both of these schools of thought rose in reaction to the claims of the Great Society and War on Poverty initiatives, which implied—or said outright—that increased attention and funding would soon take care of persistent problems by improving the social environment. But with violent crime rising and progress on child poverty limited, these initiatives clearly hadn’t succeeded in the public eye. The argument that more spending was the answer, that we hadn’t done nearly enough to implement promising programs like Head Start, was met with increasing skepticism, even when there was evidence to support this unpopular rebuttal.
So we basically had a replay of the nineteenth century, when the social Darwinist belief in the survival of the fittest—as shown in the “natural superiority” of white Europeans—led directly to the eugenics movement in the early twentieth century. This supremacy of the nature view resonated clearly in the work of Murray and others, like the psychologist Arthur Jensen, who explained the better performance of whites in the United States by claiming that they had a superior genetic endowment. And yet the counterargument—that genes played no role and that good nurturing could conquer all—wasn’t viable either, as we uncovered genetic vulnerabilities that created particular challenges for some children—like dyslexia, attention disorders, and other learning disabilities.
This nature-nurture game had been going on for so long, and so fiercely, it almost required picking a team—Team Nature or Team Nurture—not only among culture critics, but among researchers as well. Even those who acknowledged that both nature and nurture mattered couldn’t move the ball much—no one could say exactly how they mattered or what to do about it.
Meanwhile, these extreme positions exerted powerful influences on key policy decisions that were being made based on faulty science: sharp reductions in education for low-income populations, deep cuts in the social safety net, and “superpredator” laws that removed judicial discretion in favor of harsh mandatory sentencing. All of these created consequences that bedevil us to this day, like mass incarceration, which rose dramatically over twenty years, from an already shockingly high rate in 1990 (1,860 out of every 100,000) to over 2,200 in 2010—and almost twice that for African American men. The statistics alone fail to capture the disruption created in individual lives, families, and entire communities. Although most striking for African Americans, the pattern was similar for white Americans, as the overall rate of imprisonment continued to rise, long after the drop in the violent-crime rate that began in the mid-1990s and still continues.
In short, we were stuck. We were stuck in policy debates based on stubbornly misguided science. Researchers, including me, were stuck in the middle of an either-or dichotomy that was increasingly sterile. And we were all stuck in a nature-nurture debate that had gone nowhere for more than a century.
DISRUPTION: A NEW APPROACH
In the spring of 1992, out of the blue, I got a call from someone asking if I might have time to speak with a man named Fraser Mustard. I had no reason to suspect that this call would not only change my research career dramatically but also lead to a new way of looking at nature and nurture, eventually breaking us out of the box we’d been mired in for decades—and in the process offering insights that would affect not just the debate on poverty but other social ills, including chronic stress, social inequality, health and health care, and mortality.
I’d heard the name before—I knew his team had played a significant role in the discovery that aspirin could help to prevent heart attacks and strokes. As I soon learned, Fraser Mustard was nothing if not ambitious in his goals. Just as with the work that led to the now-common use of low-dose aspirin to prevent initial or subsequent heart attacks, which has extended the lives of thousands, he was deeply interested in new areas of science that could make a difference for society as a whole. Quite soon, this man with the unusual name would come to mean much more to me, drawing me into the heated nature-nurture dispute in an entirely unexpected way and leading me to a completely new way of looking at why some kids struggle while others thrive.
* * *
A decade earlier, Mustard had founded the Canadian Institute for Advanced Research, an international think tank that was driven by the simple insight that the best way to tackle really complex problems with multiple causes was to assemble a diverse team of the very best experts capable of addressing each cause. With very little by way of introduction, and no explanation of why he was calling, Mustard asked me if I’d heard about three recent studies that, he said in his probing way, had intriguing implications individually and perhaps even more intriguing implications when considered together.
He started by talking about something called the Whitehall Study. This now-famous study of the health of British civil service employees in the Whitehall district of London, led by British epidemiologist Michael Marmot, was conducted twice, once in the 1960s and again starting in the late 1980s. Marmot found that lower-ranking employees were four times as likely as higher-level administrators to suffer serious illness (such as heart disease, chronic lung problems, and depression) and to die earlier. Interestingly, because researchers had narrowed their observation to these British civil servants, they were able to rule out the usual reasons offered to explain such a phenomenon: the physical demands of a job (pushing paper is, after all, not arduous labor), limited access to health care (there is universal health care in England), and lifestyle (smoking, diet, and exercise), which they measured through questionnaires filled out by participants. And yet, even with these factors removed from the equation, between 65 percent and 75 percent of the health differences associated with social status were left unexplained. This mystery caught Fraser’s interest—what else could be causing those lower on the totem pole to fall ill? But he was also taken by the fact that these weren’t the usual victims either. Until this point, research had been centered on the health impacts of being poor, but here, the sufferers were clerks in the British civil service. They were lower in the pecking order, to be sure—but they were hardly poverty-stricken.
Fraser then moved on to the work of an American psychologist named Emmy Werner, who focused on resilience—the reality that some children who experience early life adversity, indexed by various risk factors like economic or social disadvantages, nevertheless enjoy substantial success in later life. She had recently published a book, Overcoming the Odds: High Risk Children from Birth to Adulthood, which followed a group of 505 men and women on the Hawaiian island of Kauai—many of whom didn’t graduate from high school and went on to work as unskilled laborers—and tracked risk and stress as they played out in their lives. Ultimately, Werner found that those who were able to rebound from setbacks and troubles in early life had benefitted from a close nurturing relationship in childhood or adolescence, either with a parent or with someone who stepped in to fill that role. This success against the odds set by early life adversity was not a widespread phenomenon—it benefitted around 10 percent of those she’d followed—but it suggested an intriguing possibility: that the right kind of nurturing might repair some of the damage low social status could inflict.
Lastly, Mustard brought up a theory put forward by the British medical researcher David Barker suggesting that a baby’s nourishment in utero was a crucial factor in predicting aspects of adult health decades later, including blood pressure and heart disease. This so-called Barker hypothesis, which had emerged two years earlier, in 1990, represented a revolution in the way we thought about how social factors impact adult health. Until this work appeared, most of the focus was on stressors that happened during adulthood, like difficult working conditions. This research showed that the more important link was between conditions in utero and later adult health. Barker had examined the detailed medical records of 449 men and women born in Preston, England, between 1935 and 1943, following them from birth until their late forties and early fifties. Those most likely to suffer heart disease were the group whose birth weight was substantially less than expected based on placental weight—indicating the fetus had not grown to optimum weight. Although Barker made the link between fetal growth and adult heart disease clear, he did not offer the cause.
After reeling these three studies off, rat-a-tat-tat, Mustard now paused for breath. And then: did I see the connection? He could see from Marmot’s study that there was clearly something else—something linked to our socioeconomic circumstances—affecting our health that had not yet been revealed. Werner’s work on resilience offered the glimpse of a solution, or at least a way to ease the harm our early life circumstances might inflict on us. And yet Barker raised the disquieting possibility that there might be factors, in addition to the social elements Marmot had introduced, influencing our health before we’d even entered the world.
What was really at the heart of these seemingly disparate studies, however, was the notion that one could not simply point a finger at nature or nurture. These findings were beacons signaling that there was something larger and more complicated going on than the intellectual leaders of our day would have us believe. Mustard didn’t yet know what that something was, but it seemed clear that venturing further down this road might offer an alternative argument to the sterile debate raging around us.
I was intrigued. My own work to that point had focused on trying to find the hidden forces driving otherwise bright kids—the Jasons and Davids of the world—to lose control when push came to shove in the classroom or at home. And I, too, had been working to escape the confines of the national conversation. In speaking with Mustard, I saw that by closely studying the interplay between the two—nature and nurture—we might uncover just how they influence our lives. And perhaps this would offer real help for those whose fates were being inexplicably altered by their circumstances and their genes.
Within days, Mustard and I were meeting in his CIFAR office in Toronto. (I was working as a professor of applied cognitive science at the University of Toronto at the time.) When I encountered him in person, I glimpsed the U of T football player he’d been long ago—he was a towering figure with a sturdy build. And yet his white ring of hair, like a halo around his otherwise bald head, combined with his frank open-mindedness, made him seem more like a Benedictine abbot from the tenth century.
He got right to the point: he wanted to launch a new CIFAR program that would examine the developmental reasons behind why some people remain healthy while others do not. He had an instinct that the research we’d discussed was only the beginning of a larger conversation. Clearly, the classic accounts for the ways in which social status—whether measured as income, school achievement, or career standing—affected our health were not enough. He wanted to find the missing piece. He also believed that more than just our cardiovascular health could be determined in our early lives. Barker’s work may have only opened a door. In the end, it boiled down to this question: what else could be radically affecting the health of certain adults—and when did it take hold?
MAKING THE CONNECTIONS
A year later, we launched a new program—the Human Development Program (HDP)—with me at the helm. We assembled a diverse group of experts, which included two epidemiologists, a couple of neuroscientists, a child psychiatrist, two developmental psychologists, and a primatologist, who worked with monkeys to explore the behavioral and physiological consequences of early nurturing deprivation. The studies that Mustard had cited—by Marmot, Werner, and Barker—were like three pieces of a puzzle, offering an early, albeit incomplete, picture, and now we had to fill in the rest.
We decided to start at the beginning—that is, the beginning of life. We would seek out research expanding on Barker’s theory that our experience in utero affected our adult health. Barker’s work had been very specific, looking mainly at heart disease; we wanted to explore other problems that might have their origins in utero. And we also wanted to follow through on how environment might continue to affect a person after birth.
Soon, the two epidemiologists on our team found a way to delve into this. They had been working with data from a government study called the 1958 UK Birth Cohort, which followed the lives of 17,000 people born in England, Scotland, and Wales over the course of a single week in 1958. Researchers had tracked these people from birth forward—checking back in on them at ages seven, eleven, sixteen, twenty-three, and thirty-three (they would carry on from there, but at that time, participants were in their thirties). At each meeting, our colleagues presented new analyses of participants’ health, education, career, and general life circumstances. They quickly discovered that individuals’ socioeconomic standing at the time of birth correlated with a wide range of difficulties throughout life: lack of success in school, delinquency, low career achievement, and health problems. This bolstered and, more importantly, broadened the work done by Marmot and Barker: socioeconomic circumstances at birth had an impact throughout life. And the consequent problems ranged from academic difficulty at age eleven to poorer general health at age thirty-three.
In addition, we discovered that this was true across the classes; socioeconomic status had what we called a “gradient effect,” another way of saying that these health and behavioral consequences were strongest at the lowest end of the scale. They continued to play out, though with decreasing impact, all the way through the middle and upper classes. Not surprisingly, children born into families with the lowest socioeconomic status (SES) struggled academically at the start of school and continued to struggle academically. But just as in Marmot’s Whitehall studies, this pattern was repeated at each higher level of SES—children at each level performed worse than those at the next higher level, right to the top.
Still, SES was basically a stand-in for what we were truly looking for, which was the factor—the why—leading to these debilitating changes that were taking place early on and unfurling over a lifetime. We still did not know why social status would cause a teenager to falter in school or a forty-year-old man to have a heart attack. Given that we had an inkling from Barker that whatever was happening could occur even in utero, it seemed there was something that was “getting under the skin” or, as we came to describe it, becoming biologically embedded. But what could this mystery mechanism be?
FALLING INTO PLACE
Monkeys, as it turned out, would offer us our first clue. One of our members, Stephen Suomi, a primatologist and scientist at the National Institutes of Health (NIH), had been studying the effect of early adversity on rhesus macaques, a species of Old World monkey. Several of his studies involved housing a small group of newborn monkeys together without an adult in order to see what would happen in the absence of nurturing. Sure enough, after a year, when Steve put these monkeys back with the larger troop, they experienced an array of challenges, from behavioral struggles to addiction.
I recall watching a video that Steve showed us of one of these “peer-reared” monkeys roughhousing with another monkey. At first, it seemed like normal horseplay, but it soon became apparent that the troubled monkey didn’t know when to stop. He continued to pummel his friend, even after that friend had curled up into a ball, his head in his hands. In fact, no amount of signaling came through—not lying on the ground, not trying to walk away. The troubled monkey just kept right on hitting.
It was around this time that we first began to see a link with stress. A number of studies had shown that the level of the stress hormone cortisol could be reduced by nurturing as simple as mere physical contact. One experiment discovered this by accident: simply picking up rats from their cages in a lab in order to perform research inadvertently reduced their cortisol activity. Guessing that cortisol might be at the root of the monkeys’ behavior, Steve tested them for the opposite problem: a rise in cortisol that he thought might have resulted from their harsh upbringing.
In a series of studies, newborn monkeys were grouped together and supplied with ample food but with no adult monkey mothers. Sure enough, his peer-reared monkeys had an amped-up stress response system: their hypothalamic-pituitary-adrenal (HPA) axis, the system that controls the release of the stress hormone cortisol, was in overdrive. It activated with less provocation and kept going long after others would stand down. When Steve presented us with his findings, the glint in his eye revealed what we also quickly grasped: the link from hyper-HPA activity to these monkeys’ hyperaggression and inability to control their own behavior was a breakthrough discovery. It gave us our first hint as to what might be going on in humans—what was getting under the skin—early on and generating difficulties throughout life.
Steve then presented us with a second important finding: when offered the choice of water, juice, or a cocktail of juice and alcohol, the deprived monkeys were far more likely to choose the alcoholic drink. And they were far more likely to become addicted. Although Steve warned us not to look at the monkeys as directly related to humans, the parallels in the data were striking—addiction was one of the key factors associated with high stress reactivity in humans. We felt strongly we were on the right path.
Not long after Steve had discussed his illuminating research with us, the psychologist Megan Gunnar, who came to give a guest lecture at CIFAR (and soon afterward joined our network), provided us with the next crucial piece of the puzzle. She was in the early days of what would become a landmark multidecade study of the lives of a group of orphans who’d grown up under devastating circumstances. In the 1980s, Nicolae Ceausescu, the Romanian dictator, had concluded (incorrectly, alas) that his country required large-scale population growth for economic health. As a result, people were prohibited from using birth control. While all large-scale policies have unintended consequences, in this case they were particularly brutal—a vast number of unplanned children wound up in Dickensian orphanages, warehoused in large rooms with row upon row of cribs, and only a small group of adults to supervise them. These caretakers had little time for—and weren’t expected to give—even the most basic human contact. These orphanages were basically run as if they were factories with an assembly line of babies to be perfunctorily changed and fed.
Soon after this tragedy came to light, with the collapse of Ceausescu’s regime, a significant number of these children were adopted into well-off families in Europe and the United States, where Megan and other researchers had begun to study their development. The findings were striking: virtually all of the children who were not taken into adoptive homes before the age of one showed the same patterns that the monkeys had shown, from a dysregulated stress response system to difficulty forming relationships.
Steve’s and Megan’s research, in both monkeys and children, clearly showed the impact of deficient nurturing on the stress system and on problem behavior later on. And so we now decided to focus our attention on stress, both on how it is shaped by experience and on how it changes behavior and health.
Enter Michael Meaney, a professor at McGill University who specialized in neurology, stress, maternal care, and gene expression. Like Suomi, he had been studying animals displaying SDR (stress dysregulation), but he was working with rodents rather than monkeys. Michael, too, had discovered similar physiological differences and behavioral problems in rats who’d been deprived of maternal nurturing—specifically, the “arched-back licking and grooming” of the newborn pup—but he also arrived with a brand-new and as yet unpublished finding. He had actually found a biological mechanism—a process that seemed to explain why those who experienced stress early in life had so much trouble thereafter. As he explained what he had learned, we suddenly realized that this was the missing piece of our puzzle.
* * *
Meaney’s lab had been studying the link from deficiencies in early nurturing to SDR for some time and had been seeking the underlying biology of why this happened. As Michael recounted the story, a chance meeting at a conference with a McGill colleague, Moshe Szyf, provided the inspiration. Szyf, a pioneer in the growing field of epigenetics, hearing about the work of Michael’s group, suggested that an epigenetic change to genes that control the stress system might be worth exploring. Up to this point, nearly all the work on epigenetics had looked at it in terms of normal fetal development—where it plays a major role in controlling how and when genes work—or in response to physical inputs throughout life, like the effects of smoking as it leads to cancer. The spark here was to explore whether social experiences—in this case, early nurturing—could have a similar effect.
Ordinarily, our stress response system amps up or powers down proportionate to threats we face. If there’s a lion about to pounce, or a man with a gun walking our way, the system releases cortisol, which puts us on high alert. When the threat passes, the cortisol is shut off. Well, it turned out that when Michael’s newborn rats experienced the stress of poor or missing maternal nurturance at a high enough level, something happened that prevented the cortisol from being shut off.
This process is what’s known as an epigenetic change: a gene’s function is altered—either switched on or switched off—by an external factor. In this case, the external factor was extreme childhood stress without comfort, and it caused an epigenetic change called “stress methylation.” Methylation means that a methyl group—a specific type of chemical molecule—has attached itself to the on-off switch that is a part of every gene. In the particular case of stress methylation, the gene whose job it is to tell the HPA axis to stand down—to shut off the flow of cortisol—is silenced. High levels of stress experienced in early life can methylate the key gene that controls this stress system. When this happens, we live as if constantly facing the pouncing lion or the man with the gun.
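The logic of this feedback loop can be illustrated with a deliberately simple toy simulation: cortisol spikes when a threat appears and is normally shut back down by the regulatory gene, but when that shutoff gene is silenced, as in stress methylation, the level stays high. To be clear, the numbers and update rule below are invented for illustration only; this is a sketch of the logic, not a biological model.

```python
# Toy illustration of the cortisol feedback loop described above.
# All parameters are invented for demonstration purposes.

def simulate_cortisol(steps, shutoff_gene_active, threat_at=2):
    """Return cortisol levels over time after a single threat.

    shutoff_gene_active=False mimics the 'stress methylation' case,
    in which the gene that tells the HPA axis to stand down is silenced.
    """
    cortisol = 0.0
    levels = []
    for t in range(steps):
        if t == threat_at:
            cortisol += 10.0   # threat appears: HPA axis releases cortisol
        if shutoff_gene_active:
            cortisol *= 0.5    # intact feedback gene damps cortisol back down
        levels.append(round(cortisol, 2))
    return levels

normal = simulate_cortisol(8, shutoff_gene_active=True)
methylated = simulate_cortisol(8, shutoff_gene_active=False)
```

Run side by side, the "normal" trace decays back toward zero after the threat passes, while the "methylated" trace stays elevated indefinitely: the toy analogue of living as if the lion never leaves.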
There it is, I thought, there’s our answer: stress can get under the skin, changing the very way our genes function. I was far from alone in recognizing how this changed the landscape of how to look at early life stress. As my colleague, Clyde Hertzman, who had a penchant for pithy and pointed conclusions, remarked on first hearing about this new social epigenetic story of the biological embedding of SDR in early life: “It’s a way of getting a message to a newborn that it’s a dangerous world out there, so you’d better live hard, live fast, and, very probably, die young.”
* * *
The minute we learned of this epigenetic effect, we realized the potential implications for understanding not just people at the lower end of the social scale, but all of us. Clearly, it fit with the social-inequality story we had been pursuing for a decade. Low socioeconomic status as a marker of early life adversity, with the lifelong consequences we had come to understand, was a natural fit for this new story. But it went well beyond that. Difficulties in early nurturing arise from many sources other than economic and social disadvantage. In the modern world, the stresses of managing dual careers or the worries about the hypercompetitive world that one’s children may face can interfere with the kind of nurturing that infants need. At a later point, we learned that this epigenetic change could follow another social pathway: if the mother is hyperstressed during pregnancy, the same stress methylation can follow. Parents from lower SES groups may face a greater risk of stressful pregnancies or stressed-out early parenting, but—consistent with our findings on how stress shows up at every level of society—it can happen to anyone.
Our team was not alone in grasping the profound implications of this dramatic new science; soon researchers around the world joined this exploration, revolutionizing the way we look at child development. Starting immediately after the publication of several seminal papers by Meaney’s group in the early 2000s, we have seen a massive undertaking to extend and understand the social epigenetics of early life adversity. And from all we’ve learned, what we know about early adversity altering our genetic functioning is likely to at least double in the next few years. For now, however, we can say with certainty that stress can change the way our genes work, with consequences across the lifespan. And beyond. It turns out that this epigenetic change—which doesn’t affect the DNA at all—can be passed down to the next generation; this has been found in animal studies, but there is recent evidence that it happens to us, too. This remarkable finding means that the social experience of early adversity can make a change that becomes part of our biology—and part of our biological inheritance: nurture becoming nature.
Once this was clear, I began to give talks—to psychologists, teachers, social workers, community health groups, bankers (who were helping to fund CIFAR), legislators, and policy makers, essentially anyone whose own work would be enriched by this remarkable news. Naturally, everyone also had a personal interest; inevitably, there were many questions to answer with respect to their own families and their concerns about the effect this was having on our society at large. How much stress is too much? What does stress methylation look like? What can I do if it seems that I or my child is already living with a disrupted stress system? If I can’t find affordable day care for my baby, is there a greater chance my baby will experience stress in a less desirable setting? How can I get my teenager the help she needs when she is so withdrawn? How do I support my student who is struggling with anxiety, anger, or behavioral problems if the school doesn’t have a framework in place to address mental health issues? How can we decrease the stress in our family when we cannot afford to step off the treadmill even for a vacation? Is it too late to do something about the stress I’ve already experienced, potentially putting me at risk for a heart attack? What are the broader consequences of this epidemic? Is stress changing our basic nature, making it ever more difficult to mend?
In the coming chapters, I will go through the individual stages—from baby to toddler to adolescent to adult life—explaining the current science on stress and SDR for each age, as well as the best ways to integrate and respond to it in a variety of circumstances. With each chapter, the perspective will broaden until finally, in the last chapters, we will view our struggle with stress from a societal standpoint, giving us a firm sense of how we can change social policy to break this dangerous cycle that threatens us all.
Copyright © 2017 by Daniel P. Keating