Survival of the Richest
Looking at how social factors affect human health confirms some conventional wisdom—and yields more surprising insights into disease and prevention.
“Them that’s got shall get,” goes the old Billie Holiday song, challenging the accepted notion of the United States as an exceptionally mobile society. “Them that’s not,” the lyrics continue, “shall lose. So the Bible”—plus, it appears, mountains of demographic data—“said, and it still is news.” Also still news is a growing recognition of the connections between “getting” and “losing” and health and disease and the relentless discovery of ways in which deprivation, of any sort, can impair well-being.
Extreme poverty has long been associated with reduced lifespan, but now studies are revealing that not just the very poor—the malnourished or the homeless—but people in each socioeconomic category have worse health than those a notch above them. Similar links between resources and risk have been found in every modern industrial society, human societies throughout history, and even among nonhuman primates and other social animals.
Many of these ideas can be traced to a few pioneers, among them Alvin R. Tarlov, MD’56. Three decades ago Tarlov—chair of Chicago’s department of medicine from 1968 to 1985 and now executive director of the Texas Program for Society and Health, a consortium based at Rice University—began to study how health is affected by social factors.
Such study of social determinants has been slow to take off, however, because the topic is inherently so wide ranging. Doctors don’t often chat with economists, who seldom work with sociologists, who rarely seek out lawyers. So Tarlov protégé Mark Siegler, MD’67, a physician with a penchant for ethics, and his Chicago colleague Richard Epstein, a lawyer with a taste for economics, pulled together a conference this past November at the Law School—“Social Determinants of Health and Disease: Recognizing the Contributions of Dr. Alvin R. Tarlov.”
Practitioners of the social-determinants approach vary widely but agree that those concerned with public health should no longer focus simply on biology, on germs and genes, but should shift more attention to such variables as financial resources and social status, cognitive skills and educational background, racial attitudes and ethnic practices, personal behavior and lifestyle, even a person’s neighborhood and friends.
A key factor in the growth of modern industrialized societies is also a fundamental component of good health: “physiological capital,” or the accumulation of health resources. In 1750, noted Robert W. Fogel, a Chicago Nobel laureate who applies the tools of economics to historical problems, one-fifth of the English population was kept out of the labor force because of poor health, largely chronic malnutrition. These down-and-outs were smaller, frailer, sicker, and died younger than working people. Since about 1800, however, increased access to food has meant a dramatic improvement in public health—from 1750 to 1975 the average Englishman’s body size increased by nearly 50 percent—and a more capable workforce.
Particularly important were the improved well-being and nutrition of pregnant women. Well-fed moms gave birth to bigger, healthier children. This initial investment, Fogel explained, “reduced the rate of depreciation,” meaning that newborns who acquired more health resources early in life could fight off the diseases of old age longer. The result was a kind of “biological but not genetic evolution,” survival not quite of the fattest but of those with a sufficient nest egg of calories.
One good measure of this progress is the decline in infant mortality. In 1800 some 17 percent of English children died in infancy and a whopping 79 percent of the children born to poor mothers weighed less than five pounds. Today England’s infant mortality has fallen to less than 1 percent, and fewer than 8 percent of newborns weigh under five pounds.
While some may explain these health-status improvements by citing better medical care, Fogel attributed them to better nutrition and an improved environment. Physician intervention could “slow the rate of depreciation of physiological capital,” he admitted, but the real gains in public health came from better diets, access to clean water, and better sanitation, which have delayed the onset of the chronic diseases of aging by five to ten years. As for exact figures on the salutary role of the environment versus health care, Fogel was “in the middle of applying for a grant” to determine precisely that.
Yet it isn’t only the impoverished whose health is affected by socioeconomic status. While Fogel has focused on the English poor, Sir Michael Marmot, director of the Whitehall I and II studies of civil servants, researches the British middle class. These long-term studies—the first, which began in 1967 and followed participants for 25 years, and Whitehall II, which began in 1985 and is still under way—have shown, said Marmot, head of epidemiology and public health at University College, London, that the social gradient in health extends from the bottom to the top of society. Among civil servants, “none of whom is poor,” the first study found that the least well off had mortality rates nearly eight times as high as the wealthiest. More important, there was a significant gap between each step in the hierarchy. In other words, “The problem is not confined to the high risk at the bottom.”
Nor is the problem purely economic. In the United States, Marmot added, 77 percent of whites live to at least age 65, but only 61 percent of blacks live that long. Even more revealing, 65 percent of poor whites live to 65 but only 30 percent of poor blacks do.
What causes such discrepancies? Is it access to health care? No, Marmot said. Is it genetics—do better genes lead to better health and higher socioeconomic status? No. Is it primarily income, which is closely tied to education? Again, no, Marmot answered. On an international scale U.S. blacks are far from poor, yet their life expectancy is comparable to that of residents of Kazakhstan, where income is measured in goats. More significant than actual income is relative deprivation—where one lies in the local hierarchy, a notion that applies not simply to finances but also to power and independence at work, levels of social participation, education, and early life experiences, all of which can influence behaviors, such as smoking and drinking, that have a health impact. “Control over life,” he said, “and opportunities to participate fully in society are powerful determinants of health.”
Yet despair, emphasized Marmot, “is not warranted. Health for everyone can improve.” Lifespan and well-being for all social classes rose dramatically during the 20th century, and the “gap between rich and poor, between top and bottom, can change. The slope of the social gradient in mortality is not fixed.”
Where you find yourself in the social hierarchy may not be as important as who’s there with you. Arguing for the need to integrate sexuality into considerations of social factors affecting health and medicine was Chicago sociologist Edward O. Laumann, the world’s authority on sexual practices. His large-scale surveys of sexual behavior in the United States, and more recently around the world, have found high rates of sexual dysfunction in all 32 countries studied, with about 40 percent of women and 30 percent of men acknowledging sexual problems. “Is that a medical problem?” he asked. “Or is it inherent in the nature of sexual expression? Are there too many other things vying for our attention?”
In contrast, sexually transmitted diseases are a crucial part of human health status and can only be understood in a sociocultural context. At first glance, Laumann said, it appears that STDs, like many health problems, are concentrated among the disadvantaged, with the highest documented rates found among inner-city African Americans. But that could be a function of reporting: the poor go to public clinics that notify the authorities, while the wealthy choose private physicians who are more discreet.
Indeed, Laumann’s recent research in China found just the opposite. The sexual revolution that swept the United States in the 1960s has had different effects behind the Great Wall. More than 80 percent of Chinese women, and 60 percent of men, have had only one sexual partner. Although few poor women catch sexually transmitted diseases, 38 percent of the wealthiest Chinese women have had chlamydia. What makes this infection so elitist? It turns out that the use of prostitutes is more socially acceptable in China, but costly. So wealthy men visit commercial sex workers, then bring the disease home to their wives.
Cutting across economic lines are social isolation and loneliness, factors that Chicago psychologist John T. Cacioppo called key determinants of health. In all age groups, he has found, loneliness predicts mortality, with increased rates of cardiovascular disease, stroke, and cancer occurring in those who are socially isolated. Efforts to intervene, such as providing short-term social supports for people recovering from a heart attack, have been “mostly unsuccessful,” largely because “we don’t understand the process” that creates such isolation: “There’s no particular pathophysiology.”
Nor are there obvious differences between the lonely and the non-lonely. They look the same. Their personalities are similar. They face the same difficulties in life. But lonely people, Cacioppo’s research has found, don’t cope as well with stressful events; they tend to withdraw rather than confront problems, perhaps because they have fewer opportunities to share their successes and frustrations or conspire with colleagues. Hit harder by stress, as they age the lonely develop more peripheral resistance—a sort of vascular teeth clenching—which reduces cardiac output, increasing the demands on the heart. They also have higher levels of the hormones associated with chronic stress. And they complain about sleeping poorly, a problem confirmed by studies reporting more microawakenings among lonely sleepers. An estimated 31 million people, most 65 or older, will be living alone by the year 2020. Cacioppo urged, “We need to find ways to support these people.”
Sociologist Robert J. Sampson, who recently left the University to take a post at Harvard, pushed the scope out beyond the individual to look at the health-related effects of neighborhoods. Since the 1920s poor, inner-city neighborhoods have been known as “hot spots” for unhealthy statistics: higher rates of violent crimes, child abuse, infant mortality, suicide, and accidental injuries. Clustering people by class, race, and health, Sampson noted, “is a robust and apparently increasing occurrence.”
But Sampson and colleagues have begun to uncover factors that can improve the health of even poor city neighborhoods. The Project on Human Development in Chicago Neighborhoods is a massive effort to assess community ecologies, a process the researchers have dubbed “ecometrics.” The project team began by dividing the city into 343 neighborhood clusters, each relatively homogeneous in racial or ethnic mix, socioeconomic status, population density, and family structure. Then the researchers interviewed 8,782 residents and 2,900 business, law-enforcement, educational, religious, political, and community-organization leaders. They also videotaped 28,000 “micro-community environments” (such as street blocks) looking for health risks such as garbage in the streets, public intoxication, or unsafe housing.
They found several traits linked to better community health. Most important is what sociologists call “collective efficacy”—residents’ willingness to work together to solve a neighborhood problem. How eager, the researchers asked, are community members to step in if children are skipping school and hanging out on a streetcorner, or spray-painting graffiti on buildings? It turns out that higher levels of collective efficacy are associated with lower rates of current and future violence: “Social ties create the capacity for informal social control.” Just being next to a neighborhood with high collective efficacy, Sampson said, “is one of the best predictors of lower homicide” rates.
That statistic bolsters his overarching contention that “community-level prevention that attempts to change social environments” may prove an effective complement to traditional thinking about disease and its “individual and disease-specific approach.”
One health factor that traditionally lands squarely in the lap of the individual sinner is obesity. What was once decried as a personal weakness, however, has become a national trend, perhaps “the greatest change in an important variable across the entire population that the nation has ever seen,” according to Arthur H. Rubenstein, dean of the University of Pennsylvania’s medical school, who recited well-known but alarming statistics on obesity. Forty years ago, being fat was a bit odd, something to be teased about. In 1960 only one in four adults was overweight and about one in nine was considered obese, with a body mass index (BMI) of 30 or more. (Normal BMI is 18.5 to 24.9; overweight is 25 to 29.9.) Today two in three adults are overweight, and nearly one in three is obese. Still more troubling is the emergence of obesity at younger ages. In 1970, for example, less than 5 percent of teenaged boys were obese; now more than 15 percent are. Although the waistline explosion has hit every societal level, it has had the greatest impact on those with lower socioeconomic status and on disadvantaged ethnicities, such as Mexican and African Americans. Half of all adult black women are now obese, Rubenstein pointed out, many of them morbidly obese, with a BMI over 40.
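The BMI figures Rubenstein cited follow the standard clinical formula—weight in kilograms divided by the square of height in meters—with conventional cutoffs at 25, 30, and 40. A minimal sketch (the function and category names are illustrative, not from the conference):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def classify(bmi_value: float) -> str:
    """Map a BMI value to the standard clinical categories."""
    if bmi_value < 18.5:
        return "underweight"
    if bmi_value < 25:
        return "normal"
    if bmi_value < 30:
        return "overweight"
    if bmi_value < 40:
        return "obese"
    return "morbidly obese"

# A 100 kg (220 lb) person at 1.75 m (5'9") has a BMI of about 32.7 -- obese.
print(round(bmi(100, 1.75), 1), classify(bmi(100, 1.75)))
```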
“So what?” asked Rubenstein, admitting that the medical profession has been slow to address the trend despite a series of studies since the 1970s clearly linking excess weight to poor health, including arthritis, asthma, birth defects, breast cancer, gall-bladder disease, hypertension, and more—especially heart disease and diabetes.
It usually takes about 20 years, said Rubenstein, to progress from obesity to heart failure, an ailment that has increased dramatically in the past decade and continues to rise. But as a specialist in diabetes who has watched that disease ravage thousands of patients, Rubenstein is particularly troubled by the rapid rise of type-2 diabetes, up 61 percent since 1990, including a jump of 8.2 percent from 2000 to 2001, the last year measured. The highest rates of diagnosed diabetes are seen in two already disadvantaged groups: African Americans and adults with less than a high-school education. “Next,” he warned, we can “expect to see more and more people who were obese as children dying of type-2 diabetes in their 30s or 40s,” which used to be considered the age of onset. “This is unprecedented. If any of these predictions are right, we have a serious problem.”
“Just to focus on the costs of obesity seems a little narrow-minded,” responded the Harris School’s Tomas Philipson, who noted that obesity can be seen as a result of technological and social progress. First—and hardly a bad thing—the cost of calories has plummeted. “The price of food has dropped, and the percentage of our income that we spend on food has dropped,” said the economist. For the first time, the poor in the United States and increasingly in less developed countries can afford all the food they want and have more food choices than ever before. The bad news, of course, is that many less expensive choices are highly processed, calorie-dense, low-nutrient foods, flavored with cheap sweeteners and artery-clogging fats.
Second—you want fries with that?—people now eat more calorie-dense fast foods because the value of women’s time has gone up, which means wives and mothers no longer have as much time to cook. Between 1970 and 1996 the portion of food dollars spent on meals away from home jumped from 25 percent to 40 percent. Americans now spend more than $100 billion a year on fast food. Such restaurants compete on price and volume, which drives them to offer ever-larger portions of emptier calories. In 1960 an order of McDonald’s French fries, for example, contained 200 calories; now the popular supersize order has 610.
Third—supersize meets Super Mario—people no longer burn calories as rapidly. “We used to get paid to exercise, to perform manual labor for up to ten hours a day,” Philipson said. Now most people work at a desk, accessed by elevators and e-mail. They pay for the privilege of exercising at a gym three hours a week. “Or they spend that time on something else,” he said, “maybe with their kids”—who often spend their spare time playing video games rather than sports.
In the 1950s, well before the dawn of the social-determinants field, noted James House, director of the Survey Research Center at the University of Michigan, it appeared that antibiotics had conquered infectious diseases. It seemed a triumph for undiluted, laboratory-based biomedicine, which was “openly hostile” to the psychosocial aspects of health.
That triumph was short-lived. The nature of disease was changing. Those who survived infectious diseases began to die, somewhat later in life, from heart disease and cancer. The quick killers, germs, gave way to something slower but harder to stop. Microbes had passed the baton to “risk factors,” like high cholesterol, elevated blood pressure, smoking, and something vaguer still, labeled “stress.” There were initially no drugs to combat these disorders, only protective behaviors. By the 1970s, said House, “risk-factor epidemiology had become a growing concern,” followed by social epidemiology, which implicated things like limited education, lack of social support, and personal and occupational stress as contributors to disease.
Compared to the complex epidemiology of interrelated risk factors, germs began to look delightfully simple. A microbe launched the battle. Either your immune system, with support from antibiotics, beat it back and you lived, or it didn’t and you died. But as scientists tried to unravel the social and behavioral components of disease, they increasingly realized that there were too many risk factors; they were too mixed up, too complex, and too interconnected to be easily deciphered.
A 1975 study by Graduate School of Business economist Sam Peltzman, PhD’65, highlights the problem. Long before New York passed the nation’s first mandatory seatbelt law in 1984, Peltzman found that even a simple behavior like buckling up can produce unforeseen consequences, with real health risks. Once drivers fastened their safety belts they felt snug, safe and secure, drove faster, and thus had more and nastier accidents. So what starts as a simple, mechanical solution can create a new social problem.
At the conference Peltzman expanded this theory, suggesting similar responses are likely to attenuate the benefits of any medical breakthroughs. Risk taking increased after the introduction of antibiotics, he argued, because people stopped dreading infectious diseases; as a result, deaths from infectious disease regained some lost ground. More recently, interventions such as cardiac-bypass surgery and angioplasty may have reduced the fear of heart attacks, removing a powerful disincentive to obesity. Similarly, the development of effective AIDS drugs appears to have undercut the public-health push for safe sex.
But medical science needn’t give up just yet. GSB economists Kevin M. Murphy, PhD’86, and Robert H. Topel have spent the past few years trying to measure the financial benefits of medical research, what they call the economics of well-being. They began by asking how much people would pay for an extra year of life. Unfortunately, you can’t just buy a few months at a time, like topping off your gas tank, so they had to calculate how much people would value a little extra time, based on the decisions they made about what health risks they would accept in exchange for a better-paying job, or how much they were willing to invest in a safer car. It turns out that people value an entire lifetime at about $5 million, more than most taxpayers earn over their three score and ten but less than a malpractice lawyer would request for a client’s loss of a single year. The early years are worth more than later ones—for example, 12 months of being 30 and healthy is worth more than 52 weeks at 50 with angina, but even the senior moments of life are worth quite a bit, said Murphy. “No one wants to live to 65, then die the day they retire.”
Most U.S. citizens, of course, now live well past retirement age, to something approaching four score. From 1970 to 1998, the period Murphy and Topel studied, the average life expectancy for a 65-year-old increased by about 2.6 years, from 15.2 to 17.8. If each bonus year is valued at about $75,000, then multiplied by the U.S. population, around 280 million, the result is a staggeringly large number.
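That multiplication can be checked on the back of an envelope, using only the figures quoted above. It yields a rough figure somewhat below Murphy and Topel’s full estimate, which counts mortality gains at every age, not just the gain in life expectancy at 65—but it lands in the same staggering territory:

```python
# Back-of-envelope check of the life-expectancy valuation quoted above.
# These are the article's figures, not Murphy and Topel's full model.
value_per_year = 75_000        # dollars per extra life-year
extra_years = 17.8 - 15.2      # gain in life expectancy at 65, 1970-1998
population = 280_000_000       # approximate U.S. population

total = value_per_year * extra_years * population
print(f"${total / 1e12:.1f} trillion")  # prints roughly $54.6 trillion
```

The gap between this rough $55 trillion and the $73 trillion figure below reflects survival improvements among the young and middle-aged that this one-line estimate leaves out.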
Murphy and Topel were stunned to find that the economics of well-being added up to about $73 trillion for the 28-year study period, or about $2.6 trillion per year—about half of the nation’s gross domestic product. The survival gains for men are greater than those for women, and almost half of the profits are reinvested in medical care, not a bad thing since it provides steady jobs in a nonpolluting industry. But the overall benefits, the authors emphasize, are enormous, staggering, and increasing. “As the U.S. population grows, as lifetime incomes grow, as health levels improve, and as the baby-boom generation ages toward the primary ages of disease-related death, the economic reward to improvements in health will continue to increase.”
This suggests that medical research is vastly underfunded. Currently the United States invests about $40 billion annually in medical research. But the value of the lives extended by reducing the annual death rate from either of the leading killers, heart disease or cancer, by one-tenth of a percent would add up to nearly $50 billion.
So—increased risk taking aside—biomedical research is worth every penny, health care extends life, friendship and social support and collegial communities bring comfort and safety and stability, education enhances existence. Yet the people who most need these things get the least. What should we do? When Richard Epstein organizes a conference, the lawyers get the last word. And when Epstein speaks the message is, by and large, hands off.
There is an inherent tension, Epstein said, like fire and ice, between regulation and freedom, and from his experience with regulation he holds with those who favor freedom. Regulation is the realm of lawyers, and “when push comes to shove lawyers are the most powerful and the most dangerous” people on earth. He traced the history of public-health legislation from its early days, when it sought only to contain communicable diseases and ensure proper sanitation, to the emergence of a more inclusive, modern version—dominant after 1937—that covers “any and all matters that relate to the distribution of health care and health-care services.”
Such broad and meddlesome definitions of public health “will in all likelihood be conducive to the ill-health of the very individual it seeks to protect,” Epstein argued, citing several examples of regulatory failure (cases that protected the powerful and hurt the vulnerable, such as a quarantine affecting a specific ethnic group, or mandatory vaccination—with the option for the wealthy of buying their way out, thus posing a risk to others) traceable to equal parts venality and incompetence. Legislatures, he said, have “every incentive to get it wrong, and they will succeed.”
Wake Forest law professor Mark Hall offered his own example of government overzealousness. When his “germophobic” 14-year-old daughter was nipped by her new puppy, she felt anxious enough to phone their veterinarian and ask about rabies. Although the dog had already received his shots, the vet was legally obliged to notify the local health department, setting in motion a process that culminated in a playful and perfectly healthy puppy being quarantined for three months.
Hall was troubled by the notion of siccing that same process on obesity, for example, by classifying overeating as a public-health problem. Once you identify a cause, such as cheap and tasty fast food, he said, then action becomes necessary. “Public-health law confers tremendous authority on government officials,” Hall said, “allowing measures that are justified only in situations of extreme emergency.” That makes sense in the battle against a pathogen like cholera or rabies, or even against a harmful behavior like smoking perhaps, but not for an “ecological problem” such as obesity, which is immersed in social, economic, cultural, and political considerations. At least one member of Congress has already begun to discuss launching a war on fat, said Hall, a new battle of the bulge. This is not “a rhetoric of prudence, balance, and restraint.”
So far the legal community agrees but has left room to change its mind. In January a judge dismissed a potential class-action suit blaming the McDonald’s Corporation for obesity in teenagers. The decision to dismiss was guided by the principle that “legal consequences should not attach to the consumption of hamburgers” unless consumers are unaware of the dangers of eating such food. If a person knows that eating copious orders of supersized McDonald’s products is unhealthy and may result in weight gain, “it’s not the place of the law to protect them from their own excesses.”
The real challenge confronting any attempt—whether legal, educational, or biomedical—to alter the social gradient of health was perhaps best summarized nearly 50 years ago by socialite Babe Paley, who was born rich, grew up thin, and married two rich, thin men. “You can’t be too rich,” she supposedly said, “or too thin.” Paley’s key insight was to connect the two. Marmot’s pioneering Whitehall study, for instance, showed the correlation between socioeconomic status and lifespan, but it also revealed that low status was associated with obesity, smoking, less leisure-time physical activity, higher blood pressure, shorter height, and coronary heart disease.
A glance at the conference speakers and taller-and-thinner-than-average audience revealed many of the same connections. While not exactly rich by Paley’s standards, they all had advanced degrees from elite institutions and the enhanced career paths that follow. Most had a BMI within spitting distance of 25. No one deserted the lectures to smoke. Most, if not all, reside in neighborhoods of high, even obsessive, collective efficacy, primarily Hyde Park, and all were deeply concerned about matters of public health. They were uniformly numbered among the “them that’s got,” and they were go-getters.
They apparently have the added advantage of being persuasive. Three months after the Chicago conference the president of the American Association for the Advancement of Science opened that organization’s annual meeting in Denver with a plea for more socially focused research, specifically citing Marmot’s Whitehall Studies. Floyd Bloom, chair of neuropharmacology at the Scripps Research Institute and former editor-in-chief of Science, author of more than 600 papers and the text The Biochemical Basis of Neuropharmacology—in short as hard-core a molecular biologist as they come—kicked off the meeting by telling his audience that genomic-based health care, though often described as a miracle on the horizon, is likely to be expensive and require many more years of research before new options are available to patients. “The puzzles of better health promotion and disease prevention may be approached more rapidly and effectively through intensified social-science research,” he concluded, “rather than by awaiting the expected evolution of gene-based explanations and interventions based on future genetic discoveries.” No longer a new kid on the block, the social-determinants field has finally been blessed and is coming into its own.
©2003 The University of Chicago Magazine, 5801 South Ellis Ave., Chicago, IL 60637