Are We Scaring Ourselves to Death?
TO YOUR HEALTH!
HEALTHY, WEALTHY, AND WISE (?)
Tylenol poisonings. AIDS. Rare killer viruses with no cures. Hamburgers that kill children and old people. Beef that makes you mad. Salmonella-contaminated chicken in the local grocery store. Pesticides on fruits and vegetables. Toxic shellfish. Milk pumped up with bovine growth hormones. Mexican food, Italian food, Chinese food, movie theater popcorn. According to media stories over the past few years, all of these are "harmful to your health"!
To add to the confusion, sometimes one piece of health news is followed within weeks by another that directly contradicts the first.
• In February 1995, the Centers for Disease Control told us that any kind of exercise improved life expectancy, even sporadic modest exercise. Then, two months later, a Harvard University study announced that only people who exercised strenuously and regularly enjoyed longer lives.
• A Danish study, published in the 6 May 1995 issue of the British Medical Journal, encourages the drinking of wine. According to the ten-year study, people who drink three to five glasses of wine a day live longer. However, other studies point out that the benefits of alcohol might not outweigh the risks.
• One study finds that eating fish doesn't prevent heart disease, as it is widely believed to do. Another study reports that reducing dietary fat to 30 percent of calorie intake doesn't help reduce the risk of heart disease--in spite of dozens of news stories to the contrary.
It's difficult to know what to embrace and what to fear. Questions cloud our attempts to apply the results of scientific research to our lives. We learn that food irradiation can kill potentially harmful bacteria, such as E. coli in beef and pork, and salmonella in chicken. But does irradiation carry unforeseen cancer risks?
Another example is caffeine. Eighty percent of Americans consume caffeine daily. However, a Johns Hopkins University report published in the Journal of the American Medical Association in October 1994 found that caffeine causes birth defects and is an addictive drug that produces classic psychoactive dependence. Other studies tell us that caffeine magnifies the risk of heart attacks and the effects of stress. Conversely, different researchers and food professionals assert that caffeine can protect against colon and rectal cancer, increase alertness and productivity, reduce suicidal tendencies, and increase sexual activity in the elderly. So what do we do, have that second cup of coffee or not?
And what is the difference between saturated fat, unsaturated fat, and polyunsaturated fat? Does fluoride help our teeth, or hurt our kidneys?
Help! We're continually bombarded with advice on health issues. A recent New York Times headline (10 May 1995) read, "Amid Inconclusive Health Studies, Some Experts Advise Less Advice"!
The overabundance of health advice can, itself, be harmful to our health. Dr. Donald Louria, chairman of preventive medicine and community health at the New Jersey Medical School in Newark, told the New York Times (10 May 1995) that "the danger of [overselling health advice to the public] is that they will not believe the stuff we have that's documented." In other words, the bona fide, scientifically sound health advice gets lost in the crowd. Another physician, Dr. Walter Willett of the Harvard School of Public Health, worries about the "lack of distinction between wishful thinking and solid facts in health advice. When a new study contradicts conventional wisdom, people throw up their hands and decide not to believe anything scientists say" (New York Times, 10 May 1995).
The good news is that our health has never been better in the history of mankind! According to the National Center for Health Statistics, as recently as 1890 the average life expectancy for Americans was just over 31 years. By 1930, the average life lasted just under 60 years--almost doubling in only forty years. By 1990, the average life expectancy had reached 75.4 years, the highest in history. Infant deaths have dropped by half over the past twenty years, to just over 9 babies in every 1,000 born today (Los Angeles Times, 11 September 1994).
In addition, the death rate from heart disease (the No. 1 killer) has dropped 27 percent since 1970. Deaths from stroke (the No. 3 killer) are down 44 percent. Smallpox is gone, and polio is almost gone. We can cure leprosy. Cancer deaths (the No. 2 killer) have increased, but this rise is mostly due to smoking and to the fact that we're living longer; other diseases don't get us first.
Nevertheless, in spite of this overwhelmingly good news, 78 percent of Americans still think that they face more risk than their parents did twenty years ago, according to researchers Marsh and McLennan in the Los Angeles Times (11 September 1994). Only 6 percent think that we face less risk. Why is this? It seems that we're scaring ourselves to death. The following information can help us sort through some of the many "scare stories" and the conflicting advice about our health.
BREAST CANCER SCARE
U.S. News and World Report describes "a breast-cancer epidemic": "Every three minutes, a woman is diagnosed, and every 11 minutes, a woman dies from the disease. It strikes one in eight women and kills 46,000 a year" (23 November 1992).
The National Breast Cancer Coalition sends a fundraising letter to millions of Americans. The letter reads, "We are at war. And our enemy is breast cancer. It attacks women brutally and indiscriminately. It has reached epidemic proportions in America ... . And if you think you're safe because breast cancer doesn't run in your family, think again. Three-quarters of new cases are diagnosed in women who have no family history of the disease."
The concerns about breast cancer being reported in the media are important. This dreaded disease is deadly, disfiguring, and physically and emotionally painful even if "cured." It's true that breast cancer is on the rise. At the same time, women can't walk around in terror of breast cancer. In fact, many studies have shown that anxiety predisposes people to diseases such as heart disease and even cancer.
Let's try to sort through the rhetoric on breast cancer to identify the real risks and arrive at an intelligent approach for women to prevent the disease.
Outlook (6 May 1995) reports on how the Women's Health Letter breaks down that famous "one in eight women" statistic. Those 1-in-8 odds refer only to women who live to be ninety-five. Three-quarters of diagnosed breast cancers occur in women over fifty. Breast cancer is primarily a disease of elderly women. Thanks to medical advances, we have more elderly people alive today in our society than ever before in history. If we have more cases of breast cancer, it's because we have more older women. That's the "epidemic."
It's true that there are more cases of breast cancer in young women than there were sixty years ago, because there are more younger women today, especially the baby boomers. Our population has increased. The number of cases has grown, but the rate of cases--the percentage of people in that group who get breast cancer--has not changed. The actual risk itself hasn't increased.
The Women's Health Letter (6 May 1995) stated that the 1-in-8 statistic doesn't describe a woman's immediate risk. Rather, women have different risks at different ages, with the risk increasing as a woman ages. Specifically, according to a chart developed by the National Cancer Institute, "A 20-year-old woman has a 1-in-2,500 chance of developing breast cancer. At age 30, it's 1 in 233. At age 40, it's 1 in 63. At age 50, it's 1 in 41. At age 60, it's 1 in 28. At age 70, it's 1 in 24. At age 80, it's 1 in 16. And at age 95, it's 1 in 8."
Another reason why we might seem to be having a breast cancer epidemic is related to the sophistication of our medical technology. A specialist in breast cancer at the University of Southern California, Dr. Malcolm Pike, told the Los Angeles Times (13 September 1994) that "there's been an enormous increase in screening, in early detection, an enormous increase in the number of mammographic machines around." Dr. Pike believes that most of the increase in breast cancer cases is due to the increase in detection. We can't name something and turn it into a statistic if we don't know it's there. Sixty years ago, if a woman died of breast cancer, we might not have known what caused her death. Now we know.
This same medical technology that increases the statistics on cases of breast cancer also decreases the statistics on deaths from breast cancer. The National Cancer Institute (Los Angeles Times, 11 January 1995) reports that breast cancer deaths dropped 6 percent from 1989 to 1992 (only three years) because more women were screened and treated. Women whose breast cancer is detected early have up to a 93 percent chance of survival.
One of the problems in coping with the fear of breast cancer has been mixed messages from the experts. Just as a woman is about to rush out to get a mammogram to assure herself that she doesn't have breast cancer, she hears that mammograms might increase the risk of cancer because of the radiation in the X ray. In fact, the National Cancer Institute argues against women under fifty having mammograms. The woman rushes back home, only to read in the newspaper the next day that the American Medical Association and the American Cancer Society say that women in their forties should have mammograms.
Why so much media attention and so many mixed messages about breast cancer? Consider who is reporting the news. Four out of 10 newsroom workers are women, and many women feel (perhaps correctly) that women's health issues haven't received a fair shake from the media and thus from government funding. Breast cancer has become more than a health issue that reflects the actual incidence of the disease. It is a political issue. For that reason it gets more coverage than it otherwise would.
Breast cancer is also what journalists call a "sexy" issue--one that grabs attention. For example, a movie about a woman fighting breast cancer would receive more attention from the general public than a movie about a woman fighting heart problems, even though heart disease kills more women each year than breast cancer. Breasts are sexual; they have deep reproductive significance; they are central to a woman's own sense of her attractiveness; they flounce through men's fantasies. In American culture (unlike many others) women cover their breasts, implicitly making them even more special, not to be revealed to just anyone. Breasts are special to us; breast cancer, therefore, is also special (and especially frightening).
How about the allegation from environmentalists that pesticides have contributed to the so-called breast cancer epidemic? Research answers with a resounding no. The largest study ever on breast cancer was conducted by the Mayo Clinic in 1994. This study found no evidence that pesticides in our food cause breast cancer. Still, the proposition that pesticides cause breast cancer is another of the many scares put forth by environmentalists.
In fact, we might argue that indirectly, at least, pesticides help prevent breast cancer. Pesticides help farmers produce low-cost and plentiful fruits and vegetables. Many studies have shown that eating vegetables and fruit protects people from cancer. The Los Angeles Times (19 January 1995) reported on a recent Greek study that found that eating vegetables reduced a woman's risk of breast cancer by 12 percent, and eating fruit reduced it by 8 percent. If a woman avoids fruits and vegetables because of the media's reports linking pesticides and cancer, she may be truly "scaring herself to death."
Let's get back to the quandary faced by the forty-year-old woman considering a mammogram. Should women under fifty have regular mammograms? Breast cancer might not be an epidemic, but it's still a risk, right? That's true--a risk that varies in seriousness for different women. If a younger woman has a family history (i.e., "high risk") of breast cancer, she should have mammograms. If she doesn't have this history and is under fifty, experts argue that she can probably go either way, depending on how "safe" she wants to feel and how much money she wants to spend. Mammograms cost money. Generally speaking, mammograms for low-risk women under forty don't make economic sense because mammography for women in this age group has not been shown to save any lives, according to a recent study by RAND, the think tank based in Santa Monica, California. Also, the composition of younger women's breasts makes it harder to see tiny tumors.
Women should not avoid mammograms because of fear of the radiation in X rays. The risk of the X ray pales in comparison to the benefit of the mammogram, even for women under forty. So if the X ray is the only thing coming between you and a mammogram that you should have, you're hurting not helping yourself.
For young women who are afraid to do breast self-examinations in the shower and in front of the mirror, take courage: you're very unlikely to find a tumor. But you will become more familiar with your breasts so that you won't panic one morning when you snap on your bra and think you feel a little knot that, in reality, has been there all the time. You'll also know what your breasts are like so that as you age, entering into higher risk groups, you can (quite literally) save yourself from invasive breast cancer.
AMPLE ENDOWMENT: THE PUMPED-UP HYSTERIA OVER BREAST IMPLANTS
The largest product-liability settlement in American history--$4.25 billion--was reached in September 1994. Over ninety thousand women have filed claims to receive compensation awards ranging from $105,000 to $1.4 million, depending on their medical condition and their age when their symptoms first appeared. While the final disposition of this case is still uncertain--plaintiffs, claims administrators, and lawyers are still haggling over many parts of it--it is clear at this point that the companies involved in making breast implants will be shelling out at least the $4.25 billion already agreed to in the settlement.
Why are these women being awarded such ample endowments? Because, they claim, their health has been placed at severe risk by the breast implants they received. They claim that their implants cause cancer, rheumatoid arthritis, short-term memory disorders, lupus, other immune disorders, and fevers, aches, and chills. They assert that the main manufacturer of breast implants, Dow Corning, hid scientific evidence for twenty years that silicone, the most important ingredient in implants, suppresses the immune system.
It would seem that a settlement of $4.25 billion implies an admission of guilt by Dow Corning. But this isn't the case. Dow Corning (and other manufacturers of breast implants and their components) agreed to the settlement to avoid the legal costs of individual lawsuits, but they do not admit that breast implants cause harm. On the contrary. The companies insist that breast implants don't cause cancer, immune system disorders, or any of the other diseases claimed by the women.
The implant manufacturers aren't simply trying to deny guilt. There is strong medical evidence that implants don't cause the symptoms claimed by the women. This should come as good news to the two million women who have implants, and who have been unnecessarily terrified by the hype against implants.
Doctors have also contributed to the hype. Many of them have their own agendas behind either banning or promoting implants. For example, The Los Angeles Times (9 September 1994) ran an op-ed piece by Dr. Samuel Epstein, a public health professor at the University of Illinois at Chicago and chairman of the Cancer Prevention Coalition. Dr. Epstein claims that breast implants, particularly polyurethane-wrapped implants, are horribly dangerous and that Dow Corning is a first-class villain. As Dr. Epstein put it, "Polyurethane-wrapped implants are ... carcinogenic sponges." That's a highly politicized way of discussing risks, even if it were true.
To the Los Angeles Times' (30 September 1994) credit, a couple of weeks later they printed a reply from Dr. Dennis Deapen, a researcher from the University of Southern California. Dr. Deapen insisted that Dr. Epstein blatantly ignored "the strong and consistent evidence that breast implants do not cause cancer in humans." Dr. Deapen pointed out that no breast cancers have been reported among the tens of thousands of women who have received polyurethane-covered implants--supposedly the most carcinogenic kind of implant. In fact, he says that breast implant patients in general develop "significantly fewer breast cancers than expected." Moreover, despite the FDA's decision to ban implants in 1992, an FDA panel of experts concluded that the risk of implants was negligible and recommended that implants remain available to women.
It's important to recognize that the FDA is currently extremely conservative and eager to ban anything that poses any potential risk at all. The FDA has also become so ultraconservative about approving anything without extensive testing that it's almost impossible to bring new products to market. In fact, Republicans in Congress, including House Speaker Newt Gingrich, want to do away with the FDA because (as they claim) the FDA prevents useful devices from being made available to the public. Therefore, just because the FDA bans something, such as breast implants, that doesn't mean the product is necessarily dangerous. It simply means that the FDA is being careful.
More and more evidence shows that the FDA might have overreacted. In 1994, researchers at Massachusetts General Hospital conducted a preliminary study of 32 women (26 with breast implants) that found that silicone seeping into the body from implants might actually help combat breast cancer. A second study of 5 women with implants found that the plasma from 3 of the women who had had their implants for over a decade was able to kill breast cancer cells. These studies and others like them are leading scientists to consider how silicone might be used to help fight cancer in general (Los Angeles Times, 10 August 1994).
Researchers at Harvard University recently published the results of the largest study ever on the relationship between breast implants and immune disorders such as those claimed by the plaintiffs in the Dow Corning lawsuit. While an initial look at the numbers might be shocking (a 24 percent increase in the risk of disease), these statistics need closer scrutiny. Although a 24 percent increase seems an astoundingly high number, one must be aware that in the general population the risk of these diseases is extremely low, only 0.114 percent. This means that among women with breast implants, the increased incidence of these diseases amounts to only about 1 extra case per 3,000 women (Wall Street Journal, 28 February 1996).
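The arithmetic here is easy to check for yourself. As a rough sketch, using the figures as reported in the article (a 0.114 percent baseline risk and a 24 percent relative increase), the absolute extra risk works out to roughly one additional case for every few thousand women:

```python
# Rough check of the relative-risk arithmetic reported for the Harvard
# implant study. The figures below come from the article, not the study itself.
baseline = 0.00114        # 0.114 percent: background risk in the general population
relative_increase = 0.24  # the reported 24 percent relative increase

# Extra (absolute) risk attributable to implants
absolute_increase = baseline * relative_increase

# Number of women with implants per one additional case
women_per_extra_case = 1 / absolute_increase

print(f"Absolute increase in risk: {absolute_increase:.4%}")
print(f"Roughly 1 extra case per {women_per_extra_case:,.0f} women")
```

The calculation lands at about 1 extra case per 3,700 women, the same ballpark as the article's "1 case per 3,000"--a reminder that a large relative increase on a tiny baseline is still a tiny absolute risk.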
Furthermore, the Mayo Clinic conducted another large study of women with breast implants. This study found absolutely no evidence that implants cause any of the diseases claimed by the women. (Several other large studies also showed no association between implants and the diseases.) The Mayo Clinic researchers claim that the diseases contracted by the women who won the $4.25 billion could be completely unrelated to their implants. Indeed, until the Mayo Clinic study, no one had ever conducted a scientific comparison of the general health of women with implants versus women without implants (Los Angeles Times, 16 June 1994)!
Then why did Dow Corning and the other manufacturers agree to pay women with breast implants all this money? Do these women have a case?
Their case rests on the fact that Dow Corning didn't reveal a research study, conducted at their laboratories in 1975, that showed a relationship between silicone and cancer. This study matters, the women say, because silicone breast implants sometimes leak or rupture. Silicone can then travel through the body and cause all sorts of health problems, including cancer.
It's true that Dow Corning didn't exactly highlight this study when the FDA conducted an earlier examination of breast implants in 1987. It did mention the study, but only briefly, explaining that it didn't pertain to breast implants.
A look at the study itself will show why Dow Corning said that it wasn't relevant. In 1975, Dow Corning conducted a study of the effects of silicone injected into mice. The purpose of the study was to see if silicone could be used as a medicine. The mice in the study developed malignant tumors. This seems to establish a strong connection between leaking implants and cancer, doesn't it?
Not necessarily. The mice received D4, a purified form of silicone found in very tiny amounts in the implants. By itself, D4 didn't hurt the mice at all. Only when D4 was combined with another chemical, which is not included in implants, did it harm the mice. Even then, the harm was only temporary. Still, an independent evaluation of the study concluded that high levels of D4 were toxic, and this was sufficient to provide a basis for the lawsuit.
The final argument used by the women is that the carcinogenic effects of implants might require thirty years or more to show up, like the effects of asbestos and dioxin. That may be the case; we just don't know yet. However, hundreds of thousands of women have received breast implants in the last thirty years. Thus far, there is no direct proof that breast implants cause any disease.
Who really benefits in the breast implant battle? Not the women, even though those who truly are ill will receive the largest amounts of money, whether or not their illnesses were caused by the implants. Money can never compensate for what they've suffered.
Rather, it's the lawyers who are going to become truly "amply endowed." Right after the settlement was approved, the big topic of discussion became how much of that money the lawyers would get. The women are trying to limit it to $1 billion of the $4.25 billion. Overall, the breast implant litigation and settlement took two years to resolve, and it may take another year for the claims administrator to evaluate the validity of all the claims. A billion dollars for a few years' work isn't bad.
AN APPLE A DAY ...
... causes cancer?
According to a now-famous 60 Minutes broadcast in February 1989, the "most potent cancer-causing agent in our food supply is a substance sprayed on apples ... ." 60 Minutes warned that this substance--Alar--was especially dangerous to children.
Hearing those words, mothers across the country snatched the apples out of their kids' lunch boxes. Grocery stores around the nation pulled apples off the shelves. Bachelors threw cans of apple juice concentrate out of their freezers. School boards banned apples from their school lunch programs. Health-conscious organic types, who really did eat an apple a day, felt suddenly nauseated at the thought of what they'd done.
What 60 Minutes didn't say was that, although apple growers had in fact already moved away from using Alar, it had once been used to keep apples longer on the tree and make them look prettier. The program also didn't say that scientists were uncertain about any health hazards caused by Alar, and that scientific studies contradicted each other. But that didn't stop the media machine from revving into high gear.
The Alar scare is a classic case of how the media can create mass hysteria and cause far more damage than the alleged culprit ever could.
Why was the Alar story such a media gold mine? It had two of the media's favorite attention-grabber words--children and cancer. These were coupled with that most American of concepts, the apple--part of apple pie, Johnny Appleseed, Washington and Vermont autumns, pork chops and applesauce.
Who actually suffered most as a result of the Alar scare? To begin with, we did. Most of us became unnecessarily scared of a healthy, nutritious food. Second, we allowed the media to tell us what was dangerous, rather than science. That's more dangerous as a precedent than any Alar-sprayed apple. And finally, the apple industry lost more than $100 million and dumped billions of apples. Some smaller apple growers were even forced out of business.
It's more than just a shame that all this had to happen, especially because the evidence against Alar wasn't absolutely sound. The U.S. Environmental Protection Agency had been investigating Alar, but the evidence was so incomplete that they had decided there was no rush to take any action.
Even the U.S. surgeon general, C. Everett Koop, whom the public viewed with esteem and trusted as the protector of the public health, wasn't worried about Alar. Dr. Koop explained in 1992 why he didn't jump onto the Alar-scare bandwagon: "If Alar ever posed a health hazard, I would have said so then and would say so now. When used in the regulated, approved manner, as Alar was before it was withdrawn in 1989, Alar-treated apple products posed no hazards to the health of children or adults."
According to biochemist Bruce Ames, as reported in FDA Consumer (June 1990), the risk of cancer from Alar is about the same as from tap water (which contains chloroform)--and, in fact, is about thirty times lower than the risk from peanut butter (which contains aflatoxin, a natural carcinogen).
So there we have it. As one of the directors at the National Cancer Institute, Dr. Richard Adamson, put it, the risk of eating an apple sprayed with Alar was "certainly less than the risk of eating a well-done hamburger."
In 1989, during the Alar scare, the manufacturer pulled Alar off the market. But it wasn't until 1992--three years later--that the EPA officially banned it (a bit of a moot point, wasn't it?). The EPA proclaimed that Alar "posed an unreasonable risk of cancer to the public." Realistically, despite the opinions of experts such as the surgeon general, the EPA had to condemn Alar--or look foolish, given the public uproar that had occurred. Worse yet, the EPA would look like a puppet of pesticide companies, which would play right into the hands of environmental groups.
Ultimately, what really happened in the Alar wars is that one special interest group "won": the environmentalists. They played on the media's favorite buzzwords (children and cancer), manipulated the EPA into taking a public stance against Alar, and struck the fear of pesticides into our hearts. The Alar scare made us more aware of the possible dangers of pesticides than ever before. That's just what the environmentalists wanted. They dangled bait--an Alar-sprayed apple--in front of the media, and the media bit.
In the end, the advice still holds true: "An apple a day keeps the doctor away." Apples are an excellent source of fiber and many vitamins. The benefits of eating apples far, far outweigh any possible risk you might incur. So pack those apples in kids' school lunches, bring an apple to work, bake apple pies, drink apple juice, and enjoy.
SOUR GRAPES
How many cyanide-laced grapes does it take to kill someone? A bunch? Ten? How about if the amount of cyanide isn't even close to lethal? How many grapes does it take then?
In 1989, two grapes--yes, two--from Chile were found to contain nonlethal amounts of cyanide. As a result, the U.S. Food and Drug Administration pulled all fruit from Chile off supermarket shelves and stopped all imports. The cost to the U.S. economy was at least $20 million, according to the New Republic (1 May 1989). That's $10 million per grape--and these two grapes weren't going to hurt anyone even if they were plucked and eaten.
Why did the government overreact so much? Well, imagine the accusations that would be flung at government officials if they knowingly let those two grapes show up in a store: "The U.S. government doesn't care about the public's health." The president and his cabinet officials might be blamed for being "unresponsive to public needs." Various members of Congress would make soapbox speeches about the imperative to protect the public good ... .
All over two little grapes.
Maybe some judicious testing of samples of the fruit at different locations around the country would have been a more appropriate approach for the FDA to take.
But consumers also reacted. Grape sales were down for months. What could happen to two grapes could happen to any grape, right? Even out of millions of grapes. Why did we overreact when we heard that two grapes contained trace amounts of cyanide?
We fear that the two-grapes-out-of-a-million could end up on our plate. And what if there are more than two? How do we know there aren't others? What if there's a conspiracy by Chilean extremists to kill us? Maybe we also should give up Chilean black beans. How about Chilean sea bass? And orange juice, too; some of the oranges in that concentrate in your freezer come from South America ... .
Whoa! Look at it this way. You don't quit your job and stay home for the rest of your life even though people are killed on their way to work in car accidents every day. You don't stop eating meat because someone was once poisoned by it--or avoid wheat because once there was a bug in the flour--or fruit because you once bought a peach that had a worm. Of course not. Life is risk. And when the risk is 1 grape in 1 million, the odds aren't bad.
On a larger scale, think about the possible repercussions of our overreactions. Aren't we giving a little too much power to terrorists and maniacs? If one person tampers with Tylenol and we pull Tylenol off the shelves, we are being careful with the public's health, but we're also giving tremendous power to that terrorist. And all the people who need Tylenol because they can't take aspirin are out of luck.
It's true that many things can kill us. Every object and experience harbors hidden danger. Will that danger step out of the shadows and strike you? Will the two grapes in a million end up on your plate? Maybe. Does that mean that you'll never eat another grape for fear of cyanide, or never go outside for fear of murder, or never touch another person for fear of contracting a rare infectious disease? We hope not. Helen Keller put it best: "Life is either a daring adventure, or it is nothing."
FEAR OF FAT
One of the worst fears that individuals have in our modern society is FAT. We dread becoming fat. Diet books sell by the millions of copies as the public eagerly reads how to eat and cook in low-fat ways. Thousands of young women become anorexic each year because they fear becoming fat. People pay hundreds of dollars to visit "fat farms," where they can exercise strenuously, eat low-fat or no-fat foods, receive nutritional advice, and (hopefully) lose a few pounds. New Year's resolutions typically involve a pledge to lose weight. Thousands of television programs, magazine articles, and newspaper stories tell us how to avoid the "f" word: FAT.
"Fear of fat" truly illustrates how we're scaring ourselves to death--literally, in the case of anorexic and bulimic young women. But Americans' fear of fat isn't shared around the world. In other countries, particularly in the Third World, chubbiness is seen as a sign of wealth (indicating the ability to grow or purchase food). Only in countries of wealth and abundance is fat feared.
How bad is fat really?
Media reports make fat sound really, really bad. Fear of fat hogged the headlines when the Center for Science in the Public Interest (CSPI) slammed Chinese, Italian, and Mexican food for its high fat content. Fettuccine Alfredo was called "A HEART ATTACK ON A PLATE" (Center for Science in the Public Interest as quoted in the Los Angeles Times, 11 September 1994). The CSPI equated a dinner-size serving of fettuccine Alfredo, at 97 grams of fat, with eating a full stick of butter. Two chile rellenos with beans and rice are only a little less bad--about a quarter-pound stick of butter, or 92 grams of fat. Chimichanga: 86 grams. Kung pao chicken: 76 grams, or the equivalent of four McDonald's Quarter Pounders.
Worst of all was the CSPI's popcorn study. A medium tub of buttered popcorn at movie theaters, they reported, has "more fat than a bacon-and-egg breakfast, a Big Mac-with-fries and a steak dinner with all the trimmings ... combined." As columnist and humorist Dave Barry wrote, "You got the impression from the wildly excited press coverage that after a movie ends the ushers have to use forklifts to clear the bloated corpses out of the theater."
It's true that excessive fat intake is linked to cancer and heart disease. The non-fat and low-fat food trend is positive in that it does help to improve our health. The media blitzes by the Center for Science in the Public Interest have led movie theaters to switch from saturated-fat coconut oil to healthier alternatives in making popcorn. Supermarkets have introduced health foods. McDonald's and other fast-food chains now present lower-fat alternatives, including low-fat burgers and fat-free muffins.
However, this dietary craze against fat can go too far. We need fat. Women need body fat in order to have children, and our skin needs body fat to retain elasticity. There are other benefits to fat as well. How have we come to see fat as an enemy to be vanquished at all costs?
The no-fat/low-fat trend has the same impetus as other food trends throughout history: the human desire to figure out ways to live longer, to be healthier, and to meet with social approval. It can be amusing to look at dietary advice of the past to see how wrong such advice later proved to be. For example, it was once believed that salt and other condiments cause insanity. Cooked vegetables were "against God's law." Food gurus of the past once believed that eating fourteen pounds of strawberries and grapes a day lowers blood pressure. Slow, methodical chewing lowers blood pressure. Grape Nuts cures malaria. Fasting and then eating as many grapes as possible will cure cancer. Eating starches with anything else (especially protein with carbohydrates) could lead to a "digestive explosion." Honey and meat served rare can prevent aging, loss of sexual potency, and premature death. Vitamin C cures back pain and viruses. Drinking a quart of milk a day prevents cancer. Bone meal retards aging.
And so on--all the way up to the present. We continue to seek the dietary key to longevity and health. Today the trend is toward no-fat or low-fat foods.
The contemporary wisdom--and culture--condemning fat began in the 1970s. (Remember the model Twiggy?) In 1977, George McGovern's Select Committee on Nutrition and Human Needs announced that Americans should cut back on fat. The committee's report, called "Dietary Goals for the United States," told us to cut down on cholesterol (a fatlike substance), other fats, and sugar. We were told to eat fiber-rich and cruciferous vegetables and fruit (such as broccoli, cabbage, and apples). Within ten years, at least ten more health agencies had provided similar advice.
At the same time, conflicting advice on the same subject began to proliferate. Dr. Robert Atkins, author of many successful diet books in the 1970s, told people they could eat as much high-fat food as they wanted as long as they avoided carbohydrates, fruits, and vegetables. Then in 1981 Nathan Pritikin told consumers to cut fat to 10 percent of calories, even though the government recommended 30 percent. Recently Dr. Walter Willett of the Harvard School of Public Health told the New York Times (10 May 1995) that the famous government recommendation to limit fat intake to 30 percent in order to prevent heart disease isn't based on facts. Dr. Willett claimed that research hasn't identified an ideal amount of fat for our diets.
The "Seven Countries Study" of 1980, which has been considered the pivotal study for dietary recommendations, compared heart disease rates and dietary fat intake among different countries. The heart disease rate was lower where less fat was eaten. However, the lowest heart disease rate was found in Crete, where 40 percent of calories came from fat! A Harvard study showed that certain types of fat, such as olive oil, can help ward off cancer. It seems that the type of fat consumed, not simply the amount, is what matters most.
No wonder we're confused! What do we do? Should we simply eat, drink, and be merry, for tomorrow we die?
Virtually all ancient wisdom has prescribed moderation as an ideal path toward health. Fat isn't evil. We shouldn't be morbidly scared of it. Rather, a diet composed only of fat, or a diet without any fat at all, is what's dangerous. Our culture's obsession with fat has led to healthful changes and greater awareness in eating habits. There are areas of strong consensus among scientific studies, such as the benefits of eating fruits and vegetables, and the risks of eating too much fatty meat. Let's use our knowledge and also enjoy our lives: intelligently eating, drinking, and being merry.
And if you really crave a Big Mac (or movie theater popcorn fried in coconut oil), buy it, bite into it, and savor it. Maybe what we really need is research studies on the health benefits of enjoying food--the good, the bad, the fattening, and the low-fat alike.
AIDS, FEAR, AND THE MIDDLE-CLASS HETEROSEXUAL
Heterosexuals have become so afraid of contracting AIDS that they lie awake at night in bed with their long-term heterosexual monogamous partner and worry whether that partner was monogamous five years earlier. A heterosexual hesitates to shake hands with an old friend who is now HIV-positive, even though HIV can't be transmitted by shaking hands, and the HIV-positive friend, who has already faced so much anguish, feels even more of it at being ostracized. A woman dies when she refuses a blood transfusion because she is afraid--unnecessarily--of tainted blood.
Why have we focused so much attention on the risk of heterosexual AIDS? Because the media told us we were in impending danger. As summarized in the American Spectator (July 1994), CNN proclaimed in 1994 that new AIDS cases rose 111 percent in 1993 "because of a sharp increase in infections among heterosexuals ... . AIDS resulting from heterosexual contact in 1993 rose 130%." Time and Newsweek, always jumping at a sensational story, reported that AIDS cases had doubled in 1993 largely because of infections in heterosexuals. Even former surgeon general Antonia Novello said in the Los Angeles Times (as quoted in an article from American Legion magazine, September 1993) that "AIDS in homosexual men will be surpassed by AIDS in heterosexuals in some parts of the population."
The increased rate of heterosexual AIDS was reported in 1993, when the Centers for Disease Control expanded the definition of AIDS. Before 1993, an official diagnosis of the disease required that a person (1) was HIV-positive and (2) had symptoms typical of homosexual men in the later stages of AIDS. But in 1993, the CDC said that an official diagnosis of AIDS required only HIV-positive status and a blood T-cell count below two hundred. AIDS also could be officially diagnosed if an HIV-positive person had pulmonary tuberculosis, recurring pneumonia, or invasive cervical cancer--diseases much more common among heterosexuals than homosexuals.
Why is this important? The change in definition increased the number of people who would be counted as having AIDS. People who were HIV-positive but hadn't been officially diagnosed--such as women--became part of the AIDS statistics. The change also meant that people who, under the earlier, narrower definition, wouldn't have joined the statistics for several more years were suddenly diagnosed with AIDS. In fact, as Michael Fumento reported in his book The Myth of Heterosexual AIDS (1993), if we compare 1993 to 1992 using the original definition, there was a 2 percent decrease in AIDS overall. Heterosexual cases were slowing, not increasing, by the original CDC definition. That's great news, but it was ignored by the media.
But why would the CDC change its definition, causing more grief and fear in everyone? The CDC changed its definition for two good reasons. The new definition meant that nonhomosexual AIDS sufferers could get the federal support and other benefits (such as free medication) that an official diagnosis of AIDS would bring them. Second, interest groups such as ACT UP had been pressuring the CDC to broaden the definition, both to help people receive benefits and also to keep the statistics high. These groups wanted to keep AIDS a "hot issue" so that research funding would continue. So for these reasons, the CDC expanded the definition of AIDS. It was great for treating sufferers of AIDS, but it seriously skewed the statistics for heterosexuals.
The reported rate of increase of AIDS among heterosexuals is also misleading. The horrified cries that we've been hearing about a "130% increase!" among heterosexuals make the spread of infection sound much worse than it truly is. Look more carefully at the numbers behind that "rate." We might hear that AIDS is increasing among heterosexuals at a rate of 130 percent, which is higher than the current rate among homosexuals. However, what that statement doesn't include is the fact that the heterosexual group is only a tiny fraction, in terms of actual size, of the other group (homosexuals). Let's simplify the math to see the problem more clearly. If there are 10 heterosexuals who have AIDS, and 10 more then come down with it, that's a 100 percent increase. But if there are 10,000 homosexuals who have AIDS, and then 1,000 more contract it, that's an increase of only 10 percent. The "100% increase among heterosexuals" sounds a lot worse than the "10% increase," doesn't it? But actually we're talking about 10 people versus 1,000.
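The base-rate arithmetic above can be sketched in a few lines. This is a minimal illustration using the text's deliberately simplified numbers (10 and 10,000 existing cases), not real CDC counts:

```python
def percent_increase(existing_cases, new_cases):
    """Percentage increase when new_cases are added to existing_cases."""
    return 100 * new_cases / existing_cases

# Small group: 10 existing cases, 10 new ones
hetero_rate = percent_increase(10, 10)       # 100.0 percent, but only 10 people
# Large group: 10,000 existing cases, 1,000 new ones
homo_rate = percent_increase(10_000, 1_000)  # 10.0 percent, yet 1,000 people

print(hetero_rate, homo_rate)  # 100.0 10.0
```

The "100% increase" headline describes one-hundredth as many new cases as the "10% increase"; a growth rate says nothing about absolute numbers unless the base is reported alongside it.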
All this attention to AIDS in the heterosexual middle class doesn't begin to describe who really has the disease! In 1994, the Centers for Disease Control released numbers on who really has AIDS, and how they got it:
• 54 percent were homosexual
• 24 percent were heterosexual but used drugs intravenously
• 2 percent were recipients of blood transfusions
• 1 percent were hemophiliacs
• 7 percent were homosexual and used drugs intravenously
• 6 percent didn't know how they got AIDS
• only 6 percent (or 23,069 people) contracted AIDS through heterosexual contact
As for that 6 percent who contracted AIDS through heterosexual contact, most had slept with an intravenous drug user (Los Angeles Times, 17 September 1994).
In spite of this reassurance, AIDS is nothing to scoff at. It deserves the personal concern and research dollars devoted to it. At the same time, worrying too much about AIDS--especially if you are not in a high-risk group--can cause more harm than good.
One of the worst results of the heterosexual AIDS paranoia is homophobia. When we become really frightened, we tend to run from the people who make us face those fears. Now the media have been screaming at us that heterosexuals are not immune to a disease many heterosexuals thought could strike only "them." For some heterosexuals, that reality threatens to obscure sexual differences and evokes fears of death. So what do some heterosexuals do when they feel the fear of AIDS? They blame, hate, slander, or fear homosexuals.
With AIDS, we must understand the real, and yet quite limited, risks for heterosexuals, and intelligently channel our energy and federal dollars toward the groups that are really suffering the most.
THE SCARLET "H"
Joan and Kevin have been dating casually for a few weeks. They have a lot in common, including important values. Tonight they've had a romantic dinner at Joan's apartment. They each think that "tonight's the night" for their first time making love.
Still fully clothed, Kevin is on top of Joan on the couch, or is it Joan on top of Kevin? It's hard to tell because they're having so much fun. Suddenly Kevin stops, sits up, and says, "We should talk about our sexual histories." His words put a damper on their ardor, but Joan appreciates his desire to talk before they sleep together.
"Well," she starts, "I had genital warts a couple of years ago. And I had chlamydia once. They're both treated now." Both of these sexually transmitted diseases, she knows, are extremely common. The warts can recur anytime but haven't so far.
"I've never had those," Kevin says. "But I have herpes." Joan pulls away. She remembers reading a Time magazine story back in the mid-'80s about "the scarlet letter," H for herpes.
"It's not really a problem," Kevin is continuing. "I have an outbreak about twice a year, and I just don't have sex then. Pretty much the only way to pass herpes is during an outbreak."
"'Pretty much'?" she thinks. How does either of them know she wouldn't get it?
"There's a tiny chance--maybe a few days a year--that you can get it through what they call 'asymptomatic shedding.' That's if the virus is on my skin without my knowing it. But if we use condoms, that shouldn't be a problem."
"Always use condoms?" Joan had been dreaming about marrying Kevin, and she wanted to have children. How could she have kids if they had to use condoms every time they had sex?
"Even if you do get it someday, it's not the end of the world. A herpes sore is just an annoyance. That's all. It doesn't lead anywhere else or cause other problems, like chlamydia or warts."
Joan stands up. "What do you mean 'it's not the end of the world'?! Speak for yourself!"
Should Joan refuse to sleep with Kevin because he has herpes? Should she be so afraid of this disease?
Herpes has received more overblown media attention than any other sexually transmitted disease until the discovery of AIDS. That media attention--and the panicked responses it evoked--have been responsible for countless stories of heartbreak. People who have herpes--and that's 1 in 4 people in the United States--can feel like no one will want them simply because they have this sexually transmitted disease.
Herpes is not actually one disease; it's a family of viruses, including herpes zoster (the varicella-zoster virus, which causes chicken pox and shingles) and the herpes simplex virus (HSV). HSV comes in two forms: HSV1 and HSV2. HSV1 typically produces cold sores around the mouth. HSV2 usually causes genital herpes. Both HSV types, however, can be spread from the mouth to the genitals.
Herpes does carry risks. An infected mother can transmit the infection to her infant. But that risk can be virtually eliminated if a pregnant woman lets her doctor know she has the disease. The doctor can then monitor the mother's condition and, if an outbreak occurs late in the pregnancy, prevent transmission during delivery by performing a cesarean section. Studies are not yet conclusive, but at least one suggests that HIV may be transmitted through genital blisters caused by HSV2.
Nevertheless, of all sexually transmitted diseases--gonorrhea, syphilis, chlamydia, genital warts (HPV, or human papillomavirus), and more--herpes is the most benign. All of the other diseases can lead to infertility and, in the case of genital warts, to cancer. Herpes leads to nothing else. The herpes viruses remain dormant in the body, like thousands of other viruses that we harbor, and travel to the infection site (such as the genitals) when the body's immune system is lower for whatever reason. The viruses cause a small, painful blister on the skin, but sometimes it is hardly noticeable. After a few days, even without treatment, the blister goes away. That's it.
Herpes is also incredibly common. Eighty percent of us suffer occasionally from cold sores on the lips and/or nose, caused by HSV1. We don't socially ostracize people if they have a cold sore on their lips (which is passed the same way that genital herpes is passed, through direct contact). But we have ostracized people who happen to have these sores on their genitals.
Long-term research studies, such as the UCLA "Couples Study" in the 1980s, have shown that a person with herpes doesn't always transmit the disease to his or her partner, even if the couple isn't using condoms. Couples sleep together for years without the noninfected partner ever being infected with herpes.
Despite these facts, herpes has been elevated from the minor inconvenience it is into a nightmarish, raging disease. The media needed some dread disease to focus on in the mid-'80s (note that herpes coverage declined dramatically once the media had something really worth focusing on: AIDS). Herpes is more a media construction than a serious public health problem. And it is a "social disease" in the truest sense of the term: the primary harm is social, not physical. Some people who contract herpes withdraw from dating altogether because they are afraid of rejection if they tell someone that they have the infection.
What people like this might not know is how many other people have herpes. The Herpes Resource Center of the American Social Health Association estimates that about 25 percent of the people in the United States (or sixty-five million) have genital herpes. That's far more than the entire population of Canada (which is twenty-eight million). Each year in the United States we have an estimated 500,000 new herpes infections.
Not all of these people know that they have herpes, and having herpes causes no problem for two-thirds of them. One-third of those who are infected with the virus never have outbreaks (they have built up strong antibodies). Another third have tiny outbreaks that they don't really notice. The remaining third have noticeable outbreaks like Kevin described.
Unlike some sexually transmitted diseases, herpes can be treated and even suppressed. Even the third of herpes carriers who do have outbreaks can prevent them by taking a drug called acyclovir. Acyclovir disables the herpes virus and for most people carries no side effects.
But herpes is incurable-- right? That's right: in the same way that the common cold is incurable. Viruses continue to live in your body, and you develop antibodies to them. Unlike bacterial infections, which generally respond to antibiotics, viral infections are "incurable."
It's certainly wise to take precautions against the possibility of getting herpes. But it's certainly not worth breaking off a relationship with someone you love. And if you already have herpes (one-fourth of you do), your primary concern should be to educate other people to the reality of this "disease," rather than the horrors fabricated by the media blitz.
EVERYTHING CAUSES CANCER
Sometimes it really does seem like everything causes cancer. A computerized search through various news media databases identifies literally thousands of stories on cancer and cancer-causing agents each year. We're all only too familiar with the headline: " ... Causes Cancer." (You fill in the blank.)
It's true that cancer is the second-biggest killer in the United States, after heart disease, and it makes sense to wonder what has caused the rise in cancer deaths. Cancer kills more than half a million Americans annually. Heart disease, though, kills 40 percent more--700,000 people--yet gets much less press.
Here is a litany of only some of the many carcinogens presented to us as explanations for the rise in cancer: smoking, secondhand smoke, dioxin, Alar, DDT, asbestos, nuclear power, electromagnetic fields, radon, benzene, air pollution, radiation and X rays, ultraviolet rays, food dyes, tanning beds, breast implants, red meat, saccharine, coffee, insecticides sprayed by crop dusters in Third World countries, water, chinaware, paint, video display terminals, cell phones, any packaged food, pesticides in general ... .
Notice that all of these alleged carcinogens are products of industrialization, or at least weren't identified as causing cancer until we had developed sophisticated technology (in the case of radon and other gases, for example). The arrival of these products and elements of industrialization into our lives has coincided with the increase in cancer. We have connected the two: the incidence of cancer has risen along with industrialization. Therefore, many have concluded, the products of industrialization cause cancer. This is a convenient explanation.
What is usually not presented to us, though, is the No. 1 cause of cancer: old age. Old age has also "increased," so to speak, throughout the twentieth century. We can now expect to live past the age of seventy-five. In 1900, we lived only half that long. As Bruce Ames, chief of the Microbial Genetics Section at the prestigious National Institutes of Health, pointed out in an interview with the Los Angeles Times (11 August 1994), "The incidence of cancer goes up very sharply with age. Mice live two years; by the end of their life span about a third of them have cancer. Monkeys live twenty or thirty years; by the end of their life span about a third of them have cancer. People live eighty or ninety years; by the end of our life span, about a third of us have cancer." In other words, cancer in old age is not abnormal at all. There's nothing to blame but old age itself.
Columnist Robert Scheer explains that "longevity is the real epidemic, the greatest danger to our health. Because we live longer, we are more likely to develop health problems associated with aging that our shorter-lived great-grandparents didn't face" (Playboy, March 1995). Ames phrases the same idea slightly differently: "Now that we've conquered all those diseases, we're living long enough to get cancer."
Ames explains why old age predisposes a person to cancer. Our metabolism is based upon oxygen, and the by-products of oxygen cause infinitesimal "lesions" in each of our cells each day. These lesions also can harm the DNA in cells. When those cells divide, as our cells do continuously, any harm to the DNA is passed on. Ames says that "by the time you're old, we find a few million oxygen lesions per cell. And when the cell divides, those turn into mutations, or some percentage of them does. Most mutations don't matter, but some are in key genes and then you have cancer."
There's not a lot we can do about our need for oxygen. But by increasing our dietary intake of antioxidants--such as Vitamin C, carotenoids, and Vitamin E--we can combat some of the negative effects of oxygen on our cells. Dr. Gladys Block of the University of California at Berkeley has found strong evidence that eating fruits and vegetables (the best sources of antioxidants) protects us from cancer. Dr. Block analyzed 172 epidemiological studies from around the world to arrive at this conclusion.
Perhaps our improved diets--more fruits and veggies--can account for the lower cancer rate that we've seen recently. (Yes, lower, despite the media hype about carcinogens everywhere.) In fact, as our population is aging, we actually have less cancer than we would expect. People live longer, so we should see more cases of cancer, right? Wrong.
The Centers for Disease Control tell us that death rates for most forms of cancer have been declining for decades. Sixty years ago, cancer of the stomach killed just over 31 people out of every 100,000. Today cancer of the stomach kills fewer than 5 in 100,000. Because Pap smears and other gynecological cancer-screening tests are so widespread, uterine cancer has declined from 31 deaths per 100,000 to 6. Liver cancer has seen a similar striking drop (Los Angeles Times, 11 September 1994).
The exceptions to this general decline are lung cancer and leukemia (cancer of the blood). Twice as many people today (6 in 100,000) die from leukemia as in the 1930s, and sixteen times as many die from lung cancer. Sixty years ago, lung cancer killed only 3 people in every 100,000; now it kills about 48. Lung cancer is believed to be caused almost entirely by smoking.
The media furor over preventing cancer is useful in some respects. News reports tell us that smoking does indeed cause cancer. Living directly underneath a high-power electrical line could cause cancer. Pesticides do cause cancer, at least in the high quantities given to laboratory animals. High levels of radon in a closed room can cause cancer. And so on.
More than anything else, cancer has become the symbol of our mortality. No wonder, then, that we obsess over it, seek to explain it away, try to overcome it, hate it, and fear it. Amazingly, our advancing technology is managing to understand and control it, at least somewhat. We shouldn't be scaring ourselves to death over cancer, but thanking our scientists and physicians for allowing us to live longer and for making inroads in prevention and treatment of this dreaded disease.
"PESTICIDES PREVENT CANCER" (NO, THAT'S NOT A TYPO)
What? Isn't this title wrong? Shouldn't it read "Pesticides Cause Cancer"? That's what headlines of news stories across the nation have shouted at us for years now. "Pesticide Levels Unsafe for Children." "Toxic Strawberries." "Is Your Family Safe?"
No, the title "Pesticides Prevent Cancer" is correct. This chapter will show you not only that synthetic pesticides are safe (as currently regulated by the government) in our diets, but that their existence actually does help keep us healthier.
We can discover some of the reasons for optimism by following the story of one of the world's most famous and respected biochemists, Dr. Bruce Ames. Dr. Ames started as a critic of pesticides and, through his illustrious career, has become a champion of the benefits of pesticides. First, here are his credentials. Ames is chief of the Microbial Genetics Section of the National Institutes of Health, chairman of the Biochemistry Department at the University of California, Berkeley, and a member of the National Academy of Sciences. Other honors include service on the Board of Directors of the National Cancer Institute, numerous scientific awards, and over three hundred scientific papers in prestigious journals. He is currently the director of the National Institute of Environmental Health Sciences Center. He does no work for private industry that could lead him to make "scientific" pronouncements that support a commercial purpose more than a scientific one.
Early in his career, Dr. Ames was the darling of environmentalists and an outspoken critic of pesticides and man-made food additives. Today, fifteen years later, Dr. Ames says, "Unless you wade around in the stuff, pesticides don't cause cancer. That's the bottom line!" (Forbes, 25 October 1993) In fact, he thinks that pesticides help prevent cancer.
What happened during those fifteen years? Dr. Ames invented a sophisticated test to determine the cancer-causing potential of various substances--from chemical pesticides to dyes to foods. The test identified the presence of "mutagens," which cause cancer. As a result of his tests, many synthetics were indeed identified as carcinogenic and were pulled off the market.
However, what Dr. Ames and his colleagues also found was that mutagens are everywhere--in supposedly benign, "natural" foods as well as in synthetic pesticides. As Dr. Ames explained, "There were mutagens in celery and in a cup of coffee. The natural world is full of mutagens." Peanut butter contains aflatoxin, a known carcinogen. Many spices, smoked or salted fish, corn, pickled dishes, and broiled or fried beef, pork, eggs, and chicken are only a few other "natural" foods that contain cancer-causing mutagens.
As a result of these findings, Dr. Ames began to look more critically at the antipesticide environmental movement. He saw that this movement tended to divide all foods and substances into two camps: "'If it's man-made, it's bad; if it's natural, it's fine.'" He explains, "That didn't fit with anything I knew about toxicology, so I became increasingly suspicious of this kind of thing."
Just because a substance is synthetic or man-made doesn't mean that it will cause cancer. Conversely, just because a substance is "natural" doesn't mean that it's not a carcinogen. Dr. Ames is fond of pointing out that a cup of coffee contains ten milligrams of natural carcinogens, which is about how much pesticide residue the average person ingests in a year. Eating a full year's worth of nonorganic produce, then, is as dangerous as drinking one cup of coffee.
So maybe buying fruits and vegetables at the nonorganic market isn't so dangerous after all. Actually, quite the opposite. Hundreds of studies have shown that eating fresh fruits and vegetables, with their antioxidant vitamins A, C, and E, is one of our best defenses against cancer. Dr. Ames claims today that "pesticides lower cancer rates because they make fruits and vegetables cheaper and people buy more of them."
But what do we read about in the news? That synthetic carcinogens are all around us. In 1992, researchers Robert Lichter, Stanley Rothman, and Mark Mills analyzed 1,147 news stories on cancer that had been published over the previous twenty years. These researchers found that the media paid the most attention to artificial carcinogens, most of which scientists consider low-risk. At the same time, the media tended to ignore the many "natural" carcinogens around us.
Dr. Ames also presents us with a different way of looking at pesticides. He points out that we eat "pesticides," in the literal sense, in every bite we take of any food that comes from the earth. "Pesticides" are like plants' immune systems. Plants don't have weapons to ward off insects, and if plants hadn't developed natural chemical "pesticides," they would have been devoured by insects and become extinct millennia ago. Dr. Ames said that "99.99% of the pesticides we consume are naturally present in plants to ward off insects and other predators." We ingest ten thousand times more natural pesticides than artificial pesticide residues every day.
The real danger of all the media attention paid to synthetic pesticides is that more serious problems are ignored. Daniel Puzo, who covers food safety for the Los Angeles Times, has pointed out that all the attention to pesticides--and the research money thrown at them--has meant that government and the public haven't focused on more dangerous risks, such as bacteria-borne food illnesses (including salmonella and E. coli). According to the Centers for Disease Control, the United States reports 6.5 million food-borne illnesses per year, with nine thousand deaths from food poisoning. That toll is far higher than the cancers allegedly caused by pesticides, but we hear far less about it.
Elizabeth Whelan, president of the American Council on Science and Health (a consumer-education organization), thinks that consumers who choose organic produce make this choice based more on fear than on reason. She states (New York Times, 7 July 1995) that people get confused over the pesticide risk issue, merging "real concerns with hypothetical ones." If the pesticide risk is 1 in 1 million, while the risk of bacterial contamination in food is 1 in 100,000, people might still focus on pesticides.
There are dozens more encouraging testimonials about pesticide safety that we could report here. We'll briefly mention a couple more to encourage you to eat fruits and vegetables in peace and confidence.
The University of California at Davis FoodSafe Program provides research-based information concerning food safety to the public, educators, and regulatory agencies. Dr. Carl Winter, the program's director, cites the Food and Drug Administration's "Total Diet Study," in which researchers purchased food at retail outlets, cooked or prepared it as we normally do, and then analyzed the pesticide levels. The FDA found that our average exposure to pesticide residues is less than 1 percent of the level that is "allowable"--that is, safe levels "based on the results of long-term animal toxicology studies and conservative 'safety factors.'"
In a report titled "Pesticides and Food Safety: Doing the Right Thing for the Wrong Reasons," Dr. Winter reviewed a National Academy of Sciences study that, he says, "concluded that the significant health benefits from consumption of a diet rich in fruits and vegetables (such as decreases in heart disease and in certain types of cancer) far outweigh any potential risks from pesticide residues in the diet. In a recent California consumer survey performed by a UC Davis colleague, however, 8 percent of consumers surveyed made the unhealthy choice to decrease their consumption of produce due to pesticide concerns." This fear-based behavior isn't protecting us; rather, we're scaring ourselves to death.
What about the impact of pesticides on children and infants, whose bodies are still immature and more vulnerable? Dr. Winter points out another National Academy of Sciences report, titled "Pesticides in the Diets of Infants and Children" (1993). Although the Academy found serious deficiencies in the methods used to assess risks and regulate pesticide use for children and infants, the report nevertheless told parents to continue to feed children lots of fruits and vegetables. The benefits of fresh produce far outweighed the risks.
The American Cancer Society, the American Medical Association, the World Health Organization, and many other scientific groups have gone on record that they aren't worried about the risks from pesticide residues in our diet. What we should be concerned about, these organizations all say, is problems of microbial contamination (such as E. coli or salmonella), and most of all the need to eat a nutritionally balanced diet, including fruits and vegetables, which the careful use of pesticides has made more readily available to the public.
All of the media attention paid to pesticides has proven to be a double-edged sword. It has been bad because fear of cancer-causing pesticides has kept people from eating fresh produce, the very thing that helps prevent cancer. On the other hand, the attention has been good because pesticides in high enough concentrations on food can indeed cause cancer. The media hype has led to tighter regulations on pesticide use, which affects not only our food, but our groundwater, our atmosphere, and the health of farm workers.
Several federal agencies are intensely involved in pesticide regulation, including the Environmental Protection Agency, the Food and Drug Administration, and the Department of Agriculture, all the way up to the president's office. For example, the EPA is busy considering how to improve the measurement of pesticides in our food. Procedures are being developed to check not how much pesticide is on the produce as it leaves the farm (which is how pesticides are currently measured) but how much actually makes it to the dinner plate. Pesticide residue can sometimes be reduced by washing the produce; other times the pesticides become more concentrated by the methods used in preparing processed foods. The plan would also consider the impact of pesticides on children, who have immature bodies and consume more fruits and vegetables in proportion to body weight than adults do.
The trend is toward toughening pesticide regulations. In 1949, the FDA set the first safety standard for food additives, which was tightened in 1958 with the "Delaney Clause," which requires a "zero-risk" standard. Because today we have sensitive chemical methods for identifying truly minuscule quantities of carcinogens--in parts per billion or trillion--the EPA has been allowing "negligible risk" for small amounts of pesticide. However, in October 1994 the Clinton administration signed an agreement to uphold the Delaney Clause. This agreement means that up to eighty-five popular pesticides could be banned, but with the ban phased in over five years so that farmers can find substitutes.
Weighing the pros and cons, clearly pesticides are a positive result of chemical technology. They make fresh fruits and vegetables plentiful and readily available. Other biochemical technology allows government and private-interest agencies to become "watchdogs" to make sure that pesticides don't go awry. Bite into the apple, make a strawberry pie, steam some zucchini, cut into a cantaloupe, toss a salad, eat your brussels sprouts--because you can hardly do anything better for your health.
HAMBURGERS: THE ALL-AMERICAN ... POISON?
In 1994, the average American ate an estimated sixty-seven pounds of beef. That means that America consumed almost seventeen billion pounds of beef in 1994 alone. And hamburgers are about as all-American a form of beef as you can get. They're the centerpieces of barbecues all across the United States, shared by people in New England, the Old South, the Midwest, the Great Plains, sunny California, the western deserts, and even the Pacific Northwest. All over the world people are eating Big Macs. Almost everyone loves hamburgers.
So what happens when hamburgers kill people?
In the fall of 1993, four children died and over seven hundred people became seriously ill from eating hamburgers at Jack in the Box fast-food restaurants along the Pacific coast. The burgers, it turned out, had been made with meat contaminated with "E. coli O157:H7" bacteria. At the time this strain was little known, but we soon learned that it is extremely aggressive. Individuals who are already somewhat vulnerable to disease, such as the elderly, children, and pregnant women, are its favorite victims. The infection slowly shuts down all of a person's vital organs, one after another, until the person dies. That's what happened to the four children.
Later, it was discovered that the contaminated meat had not been heated to the recommended internal temperature of 155 degrees Fahrenheit. But federal officials, such as experts from the Centers for Disease Control, the Food and Drug Administration, and the Department of Agriculture, never could figure out exactly where the E. coli bacteria had entered the system. Was it in the slaughterhouse? During the grinding process? In transport? In the Jack in the Box outlet, where food workers may have left the beef unrefrigerated, or perhaps hadn't cooked it long enough?
Experts say this crisis could happen again. As the Institute for Science in Society said in a study reported in the Los Angeles Times (22 September 1994), "Bacterial contamination of meat and poultry is a time bomb waiting to go off." In fact, E. coli outbreaks occurred again in California in 1994, although not with the severity of 1993. This time, three children were hospitalized.
Should you be scared about eating a hamburger at a fast-food restaurant or at a baseball game? How about the meat at the grocery store? How do we know what's safe?
The answer, happily, is that you can still enjoy hamburgers, even at fast-food restaurants. Sometimes a company will use a crisis to examine its operations and make meaningful improvements. That's what Jack in the Box did--and they created a new standard and system for food safety that has been copied around the world.
Jack in the Box and most large restaurant chains have adopted a program called the Hazard Analysis Critical Control Point (HACCP) program, which locates and monitors all of the critical points in the food system where the integrity of their products might be compromised, including processing, transportation, cooking, and serving. This program was originally created in the 1960s for the space program to ensure that the astronauts' food was sterile. It exceeds the standards set by the U.S. Food and Drug Administration for safe food processing, transport, preparation, and service.
Let's assume that one-third of the beef consumed in the United States in 1994 went into hamburgers--about 5.3 billion pounds--and that hamburgers are made with a quarter-pound of beef each. That year alone, then, Americans consumed over twenty-one billion hamburgers. In 1993, fewer than eight hundred people were made sick by contaminated burgers. That is still too many, but with tighter inspections of beef processing plants, better testing methods, and higher standards in the fast-food industry, even these minimal risks should decrease.
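For readers who like to check the arithmetic, the hamburger estimate above can be sketched as a quick back-of-the-envelope calculation. This is only an illustrative check using the chapter's own rough figures (a U.S. population of about 258 million, sixty-seven pounds of beef per person, one-third of it in quarter-pound burgers); the exact totals depend on which round numbers you start from.

```python
# Back-of-the-envelope check of the hamburger figures, using the
# chapter's assumed inputs (all approximate).
POPULATION = 258_245_000          # U.S. population cited later in the chapter
BEEF_PER_PERSON_LB = 67           # estimated beef eaten per person in 1994

total_beef_lb = POPULATION * BEEF_PER_PERSON_LB   # ~17 billion pounds
burger_beef_lb = total_beef_lb / 3                # one-third into hamburgers
burgers = burger_beef_lb / 0.25                   # quarter-pound per burger

cases = 800                       # upper bound on the 1993 outbreak illnesses
print(f"{burgers / 1e9:.1f} billion burgers; "
      f"outbreak illnesses roughly 1 per {burgers / cases:,.0f} burgers")
```

However you round, the result is tens of billions of burgers a year against a few hundred illnesses--the scale that makes the risk per burger vanishingly small.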
MAD AS A ... COW?!?
"Killed by Mad Cow Disease"
--Headline in the New Statesman & Society, 29 March 1996
"Silence of the Calves"
--Headline in the Economist, 6 April 1996
"Mad Cow Epidemic Puts Spotlight on Puzzling Human Brain Disease"
--Headline in the New York Times, 2 April 1996
"Fatal Case in France"
--Headline in the New York Times, 27 April 1996
Judging from these and other headlines in the early part of 1996, you would think that the human race faces a devastating epidemic. True, it is a devastating disease in cows, and thus affects the beef and cattle industry financially, but to worry that "mad cow" disease is a real threat to the human population is a bit of a stretch.
What needs to be cleared up, though, is that bovine spongiform encephalopathy (BSE)--commonly labeled "mad cow" disease--is a cattle disease. The observable symptoms during the final stages of the disease are easily recognized. These include abnormal gait, increased sensitivity, loss of coordination, and in a minority of cattle the aggression that earned the disease its popular name (Science, 247:523, 1990).
BSE, scrapie, and Creutzfeldt-Jakob disease (CJD) are neurodegenerative diseases of cattle, sheep, and humans respectively. All are characterized by long incubation periods, over ten years in many cases, before onset of any clinical symptoms. Unlike typical viral and bacterial infections, the infecting "agent" in these diseases is a protein (called a "prion"), but a small, still-undiscovered helper-virus might also be involved (Scientific American, August 1990). In about 15 percent of all CJD cases a specific, inherited mutation is to blame for the disease (Science, 271:1798, 1996). Kuru, which is similar to CJD and was first discovered in New Guinea, was shown to be transmitted by ritualistic handling and eating of human brain, an activity few modern people are likely to practice.
BSE was first identified in November 1986 in Britain. Most scientists believe that the spread of the disease was facilitated by the common practice of feeding meat and bone meal to cattle as a source of protein. This practice, however, became widespread in the 1920s, so why did BSE appear only in the mid-1980s? The reason may lie in the fact that up until the early '80s, an organic solvent extraction step (using solvents such as chloroform or acetone, the latter familiar as nail polish remover) was used to extract tallow from sheep and other animal carcasses in the production of cattle feed; that step was then dropped and replaced by heat treatment. Ironically, the "infectious agent" in BSE is destroyed by organic solvents, but can withstand temperatures above that of boiling water (New Scientist, October 1993). Eight years later nearly 150,000 animals on over 30,000 British farms had the disease (New Scientist, February 1995).
So what has all this to do with humans, and how come people seem to be catching a cattle disease? To start with, no one has shown conclusively that BSE can be transmitted to humans through eating beef. However, an article in the highly regarded journal Lancet (6 April 1996) presents data that show a possible link with BSE in ten patients with CJD-like symptoms. Worldwide, CJD occurs in about 1 person per 1 million per year. In Western countries many of those cases can be traced to tissues taken from (infected) cadavers and used in specialized surgery, or to hormones produced from the pituitary glands of deceased donors (Scientific American, August 1990).
Mass hysteria has decimated the British beef industry to the point where some are calling for the slaughtering of all cattle in the United Kingdom. Only a few BSE cases have been reported in Europe outside the United Kingdom, and inspectors in the United States are tracking down a handful of cattle that were imported before BSE was commonly identified. Over the next decade or so new BSE cases are certain to decrease dramatically, due to the new regulations that now prohibit feeding meat and bone meal to cattle. For those still worried, eating beef is still safe; just stay away from the offal from cattle, such as the brain, thymus, and spleen (BSE target organs). Keep in mind that you are more likely to choke on a bite of beef than to catch mad cow disease from it. So enjoy your juicy steak ... but chew carefully!
NIGHT OF THE LIVING FLESH-EATING SLIME ...
When a newspaper headline sounds like the title of a horror movie, it's time to take a skeptical look at the claims of the story. A good example is the recent "flesh-eating bacteria" scare.
Headlines in Great Britain read "Killer Bug Ate My Face," "Eaten Alive," "Curse of the Killer Bacteria," and "Dither--and You Die." In the United States, Time magazine (12 September 1994) ran an eight-page article with two-inch type screaming, "The Killers All Around." Time's story began, "They can strike anywhere, anytime. On a cruise ship, in the corner restaurant, in the grass just outside the back door. And anyone can be a carrier ... even the sweetheart who seems perfect in every way ... ."
Reading those words, did you run to the pharmacy to stock up on bandages and antibiotic ointment? Did you call the pediatrician to ask if your child could play outside with a scraped knee? Did you stay home from work with a minor sore throat, just in case? In other words, did you not live your life as fully as you could because of a negligible risk sensationalized by the media?
No, you might say, pointing to the article in the newspaper: this disease is for real. It's common--a cousin to the bacteria that causes strep throat--and it kills. The British "epidemic" of early 1994 killed a whopping eleven people (Newsweek, 20 June 1994). A few cases cropped up over summer 1994 across the United States, with several more reported in Great Britain. A college president in California was eaten alive by the bacteria. It struck him down within two days following a "misdiagnosis" from a doctor! The bacteria often infects a minor wound, which causes pain and fever. Then it starts killing, or "eating," the skin, penetrating through a sore or cut, and begins to gorge on the body. It leaves a trail of blackened dead tissue or, even worse, a gaping hole. It moves as fast as one inch per hour. It can kill in one day.
All this is true. It is also true that truth taken out of context, or reported only partially, is a lie.
The Centers for Disease Control in Atlanta report about five hundred to fifteen hundred cases of severe streptococcus infection in the United States annually. Of these, fewer than 10 percent develop into a case of flesh-eating bacteria, or "necrotising fasciitis." (The term means literally "the dying of fascia," which is the tissue that connects muscles and skin.) The population of the United States is 258,245,000--making the incidence of flesh-eating bacteria less than 1 in 2.5 million, or about 0.00004 percent.
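The incidence figure above can be spelled out step by step. As a hypothetical worked example using the chapter's own numbers (taking roughly 1,000 severe strep cases a year as the midpoint of the 500-to-1,500 estimate, with fewer than 10 percent progressing), the arithmetic runs like this:

```python
# Worked version of the incidence arithmetic, using the chapter's
# cited figures (all approximate, midpoint assumptions).
US_POPULATION = 258_245_000
severe_strep_per_year = 1_000     # midpoint of the 500-1,500 estimate
nf_fraction = 0.10                # fewer than 10 percent progress

nf_cases = severe_strep_per_year * nf_fraction    # ~100 cases a year
one_in = US_POPULATION / nf_cases
print(f"about 1 in {one_in / 1e6:.1f} million, "
      f"or {100 * nf_cases / US_POPULATION:.5f} percent")
```

That works out to roughly 1 in 2.5 million people per year--a few hundred-thousandths of one percent of the population.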
And why is it that our well-paid researchers and doctors don't know much about this deadly disease? Is it because it's so mysterious and defies medical technology? No. Even though the disease has a long history--descriptions of it date back to 1783 in France--it occurs quite rarely. Scientists just don't have much chance to study it.
Necrotising fasciitis is caused by a bacteria in the same family as the one that causes strep throat. You've had strep throat. Is the next logical step developing necrotising fasciitis? No. Just because you've harbored the bacteria that can potentially cause the disease does not mean that you will come down with it.
While the bacteria behind necrotising fasciitis is technically the same species that causes strep throat, the two strains are different in the way that two types of apples might be different--a Granny Smith and a Golden Delicious, for example. "Group A beta-hemolytic streptococcus" is the deadlier, invasive, and genetically different strain of plain old strep A. Necrotising fasciitis occurs in 5 to 10 percent of cases of this more-severe form of strep. But strep bacteria, like other organisms, appear in a number of slightly different forms. In fact, friendly forms of streptococcus live happily in the bodies of about 25 percent of us, and we don't even know that they're permanent house guests. Will these house guests turn into murderers of their host? Not likely.
If a house guest wanted to murder you, he or she would need a moment of opportunity. With necrotising fasciitis, the first condition is the presence of the extremely rare bacterial strain to begin with. Then you need an open cut or sore to allow the killer bug to enter. (That's why children with chicken pox are more likely to contract it--although the risk is still extremely tiny. The bug can enter through the open sores caused by chicken pox.) If you don't have a cut or sore, or if you keep your wounds very clean, the bug is turned away at the door.
What's more, even if you are the 1 in 2.5 million who does contract the disease, it won't necessarily kill or maim you. If a person seeks medical attention within a couple of hours after developing symptoms, neither life nor limbs will be lost. The symptoms are strong: severe soreness, redness, and spreading swelling; high fever; and blistering of the skin.
Therefore, from without and within--in terms of the incidence of the disease in the overall population, and the precise conditions required inside the body--the probability that you will witness the spectacle of flesh-eating bacteria devouring your belly is about the same as the chance that you will star in a hit horror flick and win an Academy Award.
Dr. Edward Kaplan of the University of Minnesota, who runs a World Health Organization laboratory responsible for tracking the strep germ, admits that we have an incomplete understanding of the strep bacteria, but still feels angry at the overblown media coverage of this bacteria. "If I lived in your part of the country," he told a reporter at the Los Angeles Times (15 June 1994), "I would be much more worried about an earthquake than I would about getting flesh-eating bacteria."
We love to fear flesh-eating bacteria for the same reasons that we thrill to the terror of horror movies. The British medical journal Lancet (19 November 1994) "reviewed" the latest horror queen star--necrotising fasciitis--in its article "Superbug Stars in Media-Made Epidemic." If this flesh-eating superstar enters your nightmares, just remember that she's much smaller in reality than she looks on the silver screen or in blown-up tabloid photos.
HAUNTED BY HANTA
Killer rats! Mysterious plague spreads across the West!
Are these headlines from medieval Europe? The bubonic plague wiped out so many people that there weren't enough left to bury the dead. This plague was carried by rats. The now-famous "hantavirus" of 1993 brings up echoes of the plague, even though only thirty-two people were killed (versus millions in the bubonic plague) (Morbidity and Mortality Weekly Report, 13 May 1994). But as we've seen with other "scares," numbers and probability don't usually keep us from becoming scared anyway.
We all know that rodents carry disease, not only rabies, but a host of fearsome horrors. We scurry away from mice and rats with good reason. One of the latest of these horrors to enter our awareness was the hantavirus "plague" during the summer of 1993. When the hantavirus hit the headlines, we scurried far, far away from the area hardest hit--the "Four Corners" region of the United States, where New Mexico, Colorado, Arizona, and Utah meet.
Of course, people don't run away from a region of the country that has an unusually high traffic accident rate, even though traffic accidents claim far more lives than the hantavirus ever will. People don't even stay away from San Francisco or New York City because their AIDS rates are higher than those of the rest of the nation. So why all the uproar over hantavirus?
As the Los Angeles Times stated in an 11 June 1993 story on hantavirus, the disease "illustrates the potential for the emergence of deadly new infectious agents." That certainly grabs our attention, even though the hantavirus had existed a long time before being identified and named. Europe and Asia have experienced hantavirus outbreaks since the 1930s. Some United Nations troops were infected during the Korean War, and U.S. military personnel in Korea have also battled the disease. Cases of hantavirus have been reported sporadically for years in the United States: a case in 1980 in California, a single case each year in 1990 and 1991, and six cases in 1992 before the outbreak in 1993.
The hantavirus is frightening because it's a mystery illness that seems at first like a bad flu--fever, muscle aches, headache, red eyes--and then, sometimes within a few hours, causes death as the lungs fill with fluid. From New Mexico, the hantavirus "spread" to fourteen states.
The hantavirus, which takes its name from Korea's Hantaan River, near which an early strain was first identified, is also scary because scientists have a hard time isolating the strains of this virus. We're accustomed to scientists "mapping" a virus quickly, but this one is difficult to grow in a laboratory. In fact, one of the earlier strains of hantavirus took thirty years to grow in a lab.
If this virus is so tough to grow, how does it spread? According to the federal government's Morbidity and Mortality Weekly Report (30 July 1993), people contract the virus by breathing dust and other airborne particles from the saliva, feces, and urine of infected rodents. A bite from a rodent carrier can also lead to hantavirus infection. Eating food or drinking water contaminated with the virus can also bring it on.
The outbreak in 1993 occurred because of an explosion in the rodent population in the rural areas that were affected. Because of heavy rains in the previous year, the number of deer mice, the virus's favorite carrier, grew by ten times the normal population. Of the fifty-one cases reported, half were American Indians, mostly from the Navajo reservation. As a result of the outbreak, Congress authorized $6 million to keep the virus from spreading further. This money was spent mostly on informing people about ways to keep rodents out of their houses.
The hantavirus is relatively easy to prevent by proper sanitation and cleanliness. The hantavirus dies when it comes into contact with soap or household disinfectants, so rural dwellers also can avoid breathing hantavirus-contaminated dust by washing dishes immediately and by keeping surfaces clean. Other precautions include storing food (including pet food) in tightly sealed containers, keeping garbage in rodent-proof containers, sealing all openings through which rodents can enter the home, and setting rodent traps in woodpiles and unoccupied structures such as barns and storage sheds.
For most of us, and especially urbanites, the dangers of hantavirus are minuscule. (No, cats and dogs don't carry hantavirus.) For those living in rural areas, keeping the mouse out of the house is the best protection. Hantavirus is not the next bubonic plague.
JUST A SMOKE SCREEN?
NO SMOKING ALLOWED
The signs are everywhere: on the job, in public buildings, and increasingly in restaurants and even bars in many parts of the country. Nonsmokers have become increasingly intolerant of their nicotine-loving brethren. In many places, smokers have become near-pariahs. But is the danger from "secondhand smoke" real?
The courts and public officials seem to think so. Parents have lost child custody cases because they smoke. No one is allowed to smoke on domestic airline flights in the United States (there are no more "smoking sections"). Smoking is not allowed in over 30 percent of American workplaces. Nor is it allowed in restaurants in Los Angeles, California. Hillary Rodham Clinton has insisted that no one smoke in the White House. Even fast-food restaurants, including McDonald's, Taco Bell, and Jack in the Box, have prohibited smoking. The federal Occupational Safety and Health Administration is considering banning smoking in all workplaces.
Nevertheless, some people say that all the hoopla over secondhand smoke is just a smoke screen. Let's investigate the supposed dangers of secondhand smoke to see if this one, like many of the "risks" in our society, is more hype than truth.
Secondhand smoke wasn't seen as much of a risk until an Environmental Protection Agency report in 1993 declared it a "Group A" human carcinogen. The report said that secondhand smoke accounts for fifty-three thousand deaths each year, including three thousand from lung cancer. Actually, this piece of data was buried in the middle of the report, but the media made it a headline. U.S. News and World Report demanded on its cover, "Should Cigarettes Be Outlawed?" Time posed a daring question on its cover: "Is It All Over for Smokers?"
Good questions. Unfortunately, the answers weren't explored fully in the media--because the answers aren't all that clear. The media, particularly television, like clear-cut stories. This one is still shrouded in behind-the-scenes controversy.
Several hundred scientific studies have linked heavy secondhand smoke to diseases such as cancer, heart disease, respiratory infections, asthma, and sudden infant death syndrome (SIDS). The strongest link by far is to lung cancer. However, the relationship between secondhand smoke and the other diseases is far from established. Of the 53,000 deaths supposedly caused by secondhand smoke, most (37,000) were related to heart disease, and 12,000 to cancers other than lung cancer. Scientists do not agree about secondhand smoke and heart disease. Out of these several hundred studies, only fourteen were able to conclude that the link exists (Los Angeles Times, 26 May 1994).
Another question is how much smoke is "heavy" smoke. A third murky area is just how much secondhand smoke actually contributes to these diseases. And does it cause the disease, worsen a condition that was already brewing, or neither of the above?
Uncertainty among specialists is the only thing that is really clear about the majority of alleged deaths from secondhand smoke. The Los Angeles Times quotes one of the world's leading epidemiologists, Sir Richard Doll of Oxford University, who found that "there is a real problem estimating the quantitative effect of environmental tobacco smoke." Sheryl Stolberg, the Los Angeles Times (26 May 1994) medical writer, assessed the secondhand smoke issue this way: "A little bit of science--still emerging, not all of it conclusive --shaping a lot of public policy."
So why has secondhand smoke become such a captivating public issue? It has the word cancer associated with it, and that always gets lots of attention. The tobacco lobby and its opponents have both invested so much money and energy in the issue that they've heightened the controversy beyond where it might otherwise be. And the issue allows us to vent anger at the people we love to hate: the big tobacco companies. The tobacco companies are believed to have lied about the risks of smoking--risks they still deny--for so long that vengeance is sweet.
It seems quite clear that "heavy" secondhand smoke (define that however you want) can possibly lead to lung cancer. Is living with a smoker a ticket to chemotherapy? Maybe, but a lot of people have lived with heavy smokers and not had problems. Will a whiff from someone's Marlboro down the hall give you cancer--or pose enough of a risk that you should deny that person the right to smoke in the hall? Probably not. It may be sensible to reduce or even prohibit smoking in offices and other enclosed spaces. However, the effort to vilify smokers and prevent virtually all smoking everywhere does seem to be an overreaction to the threats of secondhand smoke as we currently understand them.
Copyright © 1997 by Affinity Communications Corp. All rights reserved. No part of this book may be used or reproduced in any manner whatsoever without written permission except in the case of brief quotations embodied in critical articles or reviews. For information, address St. Martin's Press, 175 Fifth Avenue, New York, N.Y. 10010.