To live the considered life is to dwell in an enigma. Nothing is truly as it seems. The certainties we hold when we are twenty-five have become absurdities by fifty. In the process, we have to nose along through the clamor and smoke of existence, trying to understand what is really going on as opposed to what only seems to be going on, struggling to separate the vagaries of the moment from the constants of existence, to eliminate the obvious irrelevancies that so many people get hung up on, like fashion and, oh, I don’t know . . . dogs. Dogs are a good example. They are there, for some reason, and can be enjoyable creatures, but the why of it is not worth our time and energy. Alexander Pope said as much: "The proper study of mankind is man." Man, not cats or rabbits or hyenas or aardvarks. Or dogs.
Eliminating the trivia, clearing our minds of chaff, allows our attention to fix on the things that really matter. Then sometimes those things, things we otherwise might not even notice, can stun us with their relevancy.
The element of surprise adds to the power of such moments. As scientists like to say, serendipity is often the crux of discovery. In my case it all started thirty years ago or so, in an instant burned forever into my mind.
I was sitting in the northeast corner of the Baltimore Sun newsroom, feet propped on my desk and a cold cup of coffee by my elbow. I was opening mail, which, for me, was like a hound hunting rabbits. I would patrol the mail, rooting my nose through the news, never sure what would emerge from the next envelope I slit open. So it was that one day in the late 1970s, I opened an envelope and pulled out a photograph that would forever change my perception of myself and set me on a journey that has consumed much of the rest of my life.
They called me a science writer, but I thought of my beat as Truth and Beauty; and, yes, in my heart of hearts, I looked down on other reporters. All they got to cover was the everyday trivia of city hall, elections, the machinations of administrators and officialdom, fires and floods and homicides and growth plans and school budgets and cats stuck in trees—this and all the other superficial stuff that seemed relevant to the dullards on the city desk. Everything else—the whole, magnificent, unfolding panorama of scientific endeavor, from the moot to the apocalyptic—belonged to the science writer. Me!
Think of the great imponderables of existence. The mystery of the quasars, shining so brightly at the far horizon of time. The enigma of what makes a volcano explode, or a tornado germinate. Why butterflies flutter and fireflies blink and black carbon fuses together to make glittering diamonds. Fruit fly genetics. The denning behavior of Asian bears. The molecular machinery of the red blood cell, the burial preferences of ancient Micronesians, the tidal motions of the Bay of Fundy, the sex life of the lesser kudu, the quixotic search for the ivory-billed woodpecker . . . what is science, anyway, if not a living compendium of ballads, mystery yarns, and shaggy dog stories?
And there was always more, and more, and more. The forces of history shifted with the avalanche of discovery and invention, and the hard realities that shaped life came less and less from the political process and more and more from the laboratory. Everywhere you look, you see the results of science: the hydrogen bomb, the birth control pill, the computer, the Internet. Science has become the primary driving force of modern existence. And politics? In modern times, politicians are almost always behind the game. Science acts. Politicians scramble to react, but by the time they do, the science is usually a fait accompli.
Drama, meanwhile, does just what it always did, which is follow reality. When the stirrup was invented, allowing armored warriors to balance themselves on horses, the balladeers strummed their lutes and sang of knight and fair damsel. A few centuries later, when a spacecraft blew up or a researcher died in pursuit of knowledge, you had a tragedy with Shakespearean possibilities. When biologists tracked down the cause of a disease (or when one of them phonied up a journal article) you had a detective story Sir Arthur Conan Doyle would have killed for. When a team from France and another from the United States raced to find the cause of AIDS, it was mano a mano à la Ernest Hemingway, even if the fight was set in the barroom of the mind. Sometimes, as when ulcers turned out to be caused by bacteria that thrive in the hot acid of the stomach, you had a delicious surprise ending. Other times, as when beasties were found living happily in the throats of undersea volcanoes, it was pure science fiction—a bad phrase, probably, considering that the science had long surpassed most of the fiction built on it.
It isn’t all that much of a stretch to connect science with art. If science is based on process and obscured with unfamiliar words, it nonetheless grew out of a fundamentally human, childlike curiosity. What makes the sky blue, why does ice float, what is "blood," how does the mind work? What child fails to ask those questions? And what child fails to try to draw what she sees, or sing what he knows?
As curiosity morphs into science and powerful new instruments come into play, the questions go deeper and the fascination quotient goes up. Look close enough into the cell and the gooey protoplasm turns into a churning mix of little gizmos. Suddenly the living cell is crisscrossed with highways, all crowded with trucking engines carrying supplies in and out of the industrial areas near the center of the cell. There are factories and recycling plants, entry ports and guarded secure zones. Or you can look through the other end of the microscope and focus your mind on the realities of black holes, supergalaxies, and parallel universes that go on and on beyond the meager limits of our comprehension.
In the middle decades of the twentieth century, physics and astronomy were the real sciences. But as the century proceeded the best stories were happening in biology. Once it was the purview of rich men with butterfly nets, physicians with leeches. By the 1970s, when I was earning my spurs as a science writer, biology had turned into Big Science, complete with million-dollar budgets, interdisciplinary research teams, and instruments that examined molecules as small as the galaxy was large. Medicine followed right behind, beginning its historic transformation from art to science. Brain scanners were on the horizon. Psychology, which I had long considered only slightly more credible than voodoo, would evolve into a hard science. I was in the audience at a press conference Johns Hopkins held in 1973 to publicize a paper about a breakthrough made by one of its professors, Dr. Solomon Snyder, and his postdoctoral partner, Candace Pert. They’d discovered that the human brain contained receptors—tiny ports on the surfaces of brain cells—that were built to attract and hold opium molecules. I thought that sounded bizarre. If the receptors were there, did that mean the brain made its own opium? Was that why people became addicted to opiates like morphine and heroin? What on earth was the purpose of that?
Trying to write the story of the discovery for the newspaper, I read the scientific paper again and again and reviewed my notes. Piece by piece, it came together. If the brain had opium receptors, that must mean it made its own opium. If so, I could only conclude that these powerful secretions were the source of good feelings, and that we were hooked on the behaviors that produced those feelings. You didn’t have to reach far to conclude that the human mind was therefore a chemical process, explainable in chemical terms—really, in terms of natural addictions.
Motherly love, for instance, would be a woman’s addiction to her own opiates. Her baby smiles; she gets a fix.
The more I puzzled over the implications, the more it seemed to me that Snyder and Pert had just handed us the keys to the kingdom. We had a way, now, to truly understand ourselves. A few steps down the scientific road and we’d have a new and very specific grasp of the mechanism of yearning, the metabolism of love, the chemistry of all human compulsion. It was a glimmer of a different future.
There is nothing more fundamental to our worldview than our attitude toward the mind, the seat of the self. Now, in a couple of hours, the foundations of how I saw myself and the human race had been deconstructed and restructured. I was left in the state of shock and confusion that follows any major psychological realignment. The enigma had rolled over on me; the world was different from what I had thought it was. The only thing I understood clearly was that I was in on the ground floor of a revolution in biopsychology that would ultimately change everything from the depredations of mental illness and addiction to our view of our own history.
Much has been written about the explosive chain reaction of knowledge that happened in the following decades. It started in brain chemistry and spread rapidly to neurophysiology. New generations of brain scanners appeared. Now, with our first glimpses of the thinking human mind, the revolution spread further: to psychiatry, to social science, and to anthropology. Every new discovery in brain science seemed to reinforce the thinking in anthropology, which fed back to suggest new experiments in the flux rate of neurotransmitters.
It stands to reason that all the really important scientific papers on the brain went back, one way or another, to Darwin. The premise of the new brain science was that the brain had evolved to produce thought, emotion, and behavior—and in those terms, the discoveries were often understandable even to the layperson.
Why did we have opium receptors in the brain? Obviously, to make us feel good. When you were a student, and you did a math problem properly, you felt good. So you did it again, and again, and again. Enough of that and you’d be addicted to numbers and formulas for life.
The brain had other uses for opium as well. As the story unfolded the "runner’s high" was also laid to natural opium. Other receptors and messenger molecules were discovered and linked with everything from anger to self- hypnotic states.
With specifics such as these, the ancient philosophical questions of existence seemed to pass from philosopher to scientist. For the first time, we could focus the armamentarium of science—scanners, giant computers, statistical models, and large-scale collaboration—on the ancient questions of who we are and where we came from. There was an intellectual juggernaut rolling straight toward the center of the human universe and that inner singularity for which we had no other word but "soul."
Even the non-laboratory sciences were coming along, albeit much more slowly. Anthropology had grown into a sort of science (and never mind the pseudoscience around the edges). As a result we now knew some of our evolutionary history. We knew, for example, where the road had forked and our ancestors began making stone tools. That change separated them from the rest of the apes and sent them down the evolutionary road that would lead to modern humans. Half a million years ago, they mastered fire, and the tree branched again. By a hundred thousand years ago, having spread across the face of the Old World, they were recording the birth of human imagination in rock art. They did not yet produce what we would call architecture, but they had definitely learned to impose their will on the most fundamental and resistant material in their world: rock. Their flint edges were as sharp as scalpels. I happen to know this because I once wrote a story about a surgeon who made and used stone knives to operate. He claimed they cut cleaner than steel, though they dulled faster.
By some point late in the last ice age our ancestors had become anatomically modern. They looked like us. Put them in neckties or pantyhose and you wouldn’t have noticed them on the subway—at least, not to look at them. But however modern they might have appeared, they remained among the animals. Their numbers remained small and the populations of competing animals did not shrink. Humans remained part of the ecological balance. They built shelters but they did not settle in them, and they did not even dream of cities. They harvested, as the other animals did, only what nature planted. They hunted and killed, but did not domesticate. Depending on which archaeological calculation you believe, they remained that way, fairly static, for a hundred thousand years or even longer.
Then, twelve thousand years ago, as the ice age ended, something truly extraordinary happened and the human race simply . . . changed. Exploded. Blossomed. Suddenly and inexplicably we began to herd, dig, build, draw, plan, and invent on a scale only hinted at by our earlier existence. In an evolutionary heartbeat we became the uncontested masters of the planet. Twelve thousand years Before Present is a date that evokes the human enigma itself.
The changing perception of human evolution is itself a study in the evolution of ideas. Archaeologists have been working on the problem for centuries, and much of what they concluded early on was so wrong as to be humorous. For the better part of a century humans were thought to be descended from knuckle-dragging Neanderthals. We weren’t, of course, and in any event Neanderthals weren’t knuckle-draggers. They looked very much like us—but that’s a part of a different enigma.
When I went to college in the 1960s, we thought the birth of modern humans occurred at twelve thousand years BP, and was surely explained in terms of brain size. Anatomically modern human beings, I was taught, did not actually appear until about that time. Our brains got bigger, we got smarter, and finally our brains reached a critical mass and we took over. The logic was impeccable. Any child could see the progression. End of story.
But when archaeologists pushed the existence of modern humans back many tens and then hundreds of thousands of years, the whole narrative linking brain size and human evolution began to come apart. Worst of all, to those still defending the old view, the human brain didn’t suddenly get bigger at the end of the last ice age. Instead, at the precise moment of human ascendancy, when the human animal stepped up to rule the earth, its brain got smaller. Average cranial capacity shrank by 5 to 10 percent.
This was stunning. Wasn’t our big brain the thing that made us smarter than the rest of the animal world, and therefore superior? Didn’t we assume, in our science fiction stories and films, that as we continued to evolve, our brains would expand until our descendants, somewhere down the generations, would have heads the size of basketballs? Wasn’t cranial capacity the obvious and inarguable mechanism of advancement? Apparently not.
Scientists measured, re-measured, and re-re-measured fossil crania. They subjected their data to rigorous statistical analysis. They argued the question to a virtual standstill in the scientific literature. But still, the fact stood. The human animal, generally, was distinguished by its big brain, yet the moment the species embarked on the road to civilization was marked by a significant diminution of brain size.
Anthropology being what it is, conjecture grew to fill in the holes left by discredited facts. What was the link between brain shrinkage and the explosion of human influence? Not only was there no shortage of explanations, there were so many conflicting conjectures, so many mutually exclusive scenarios, that they added up to no explanation at all. The whole messy situation just hung there in the back of the anthropological mind, another reminder (if one was needed) of how little we really know about ourselves.
And finally there is yet another issue, equally puzzling, related and yet not related.
A science writer’s livelihood is based on the assumption that people are interested in science. There is plenty of evidence for this. They watch the Discovery Channel. They buy books about science. They are fascinated by the drifting continents, the mysterious dark matter that more and more astronomers think holds together the universe, the stunningly beautiful lemurs of Madagascar.
People want to know, intellectually as well as practically, why arteries harden. They want to know what causes schizophrenia, why salamanders can grow new legs and we can’t. Our interest in biology is intensified by our relentless self-absorption, our abiding conviction that we, and we alone, are at the very center of the universe. If human beings have a creed, it would have to be that nothing matters except me—and "me" was turning out to be an essentially biological concept.
So why, even at the height of the brain science revolution, did we spend less money on the combined disciplines of archaeology, sociology, psychology, and neuroanatomy than we spent each year on, say, football?
Fortunately, that wasn’t my problem. For whatever reason, the human race existed. It was what it was, and it was nothing if not specialized. Everyone had their little niche, their own little piece of the machinery to run. One person did my taxes, another fixed my car, another filled my teeth. As a journalist, I was no less specialized. In fact, I was so specialized that I didn’t actually do anything. I just watched the world of science and translated it for anyone who cared to watch it with me. I did what I did because I loved it. I loved the considered life, and even developed a certain fondness for the enigma. I loved to watch the great curling wave front of discovery. I loved to talk to the scientists. I loved to write stories that fascinated people, that lodged in their minds and wouldn’t go away.
I loved it so much that I developed my own private game about it. I’d try to outguess the scientists, figure out where all this new information was going to take them. If that sounds arrogant it probably is, but I had an advantage because my work gave me a broad view of emerging science. Working scientists are so specialized that they often can’t see much beyond the minutiae that shape their own lives and work.
Excerpted from The Wolf in the Parlor by Jon Franklin.
Copyright 2009 by Jon Franklin.
Published in First Edition 2009 by Henry Holt and Company.
All rights reserved. This work is protected under copyright laws and reproduction is strictly prohibited. Permission to reproduce the material in any manner or medium must be secured from the Publisher.
Jon Franklin is the winner of the Pulitzer Prize for Explanatory Journalism and the Pulitzer Prize for Feature Writing, among numerous other awards. He was a science writer for The Baltimore Evening Sun and is now a journalism professor at the University of Maryland. He is also the author of The Molecules of the Mind, a New York Times Book of the Year, and Writing for Story.