One
health factor that traditionally lands squarely in
the lap of the individual sinner is obesity. What was once
decried as a personal weakness, however, has become a national
trend, perhaps “the greatest change in an important
variable across the entire population that the nation has
ever seen,” according to Arthur H. Rubenstein, dean
of the University of Pennsylvania’s medical school,
who recited well-known but alarming statistics on obesity.
Forty years ago, being fat was a bit odd, something to be
teased about. In 1960 only one in four adults was overweight
and about one in nine was considered obese, with a body mass
index (BMI) of 30 or more. (Normal BMI is 18.5 to 24.9; overweight
is 25 to 29.9.) Today two in three adults are overweight,
and nearly one in three is obese. Still more troubling is
the emergence of obesity at younger ages. In 1970, for example,
less than 5 percent of teenaged boys were obese; now more
than 15 percent are. Although the waistline explosion has
hit every societal level, it has had the greatest impact
on those with lower socioeconomic status and on disadvantaged
ethnic groups, such as Mexican Americans and African Americans. Half
of all adult black women are now obese, Rubenstein pointed
out, many of them morbidly obese, with a BMI over 40.
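Those thresholds come from a simple formula: BMI is weight scaled by the square of height, in metric units,

$$\mathrm{BMI} = \frac{\text{weight in kilograms}}{(\text{height in meters})^2}.$$

By that measure a 5-foot-9 adult (1.75 meters) weighing 203 pounds (92 kilograms) has a BMI of $92 / 1.75^2 \approx 30$, just crossing the obesity threshold.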
“So what?” asked
Rubenstein, admitting that the medical profession has been
slow to address the trend despite a series of studies since
the 1970s clearly linking excess weight to poor health,
including arthritis, asthma, birth defects, breast cancer,
gallbladder disease, hypertension, and more—especially
heart disease and diabetes.
It usually takes about 20
years, said Rubenstein, to progress from obesity to heart
failure, an ailment that has increased dramatically in the
past decade and continues to rise. But as a specialist in
diabetes who has watched that disease ravage thousands of
patients, Rubenstein is particularly troubled by the rapid
rise of type-2 diabetes, up 61 percent since 1990, including
a jump of 8.2 percent from 2000 to 2001, the last year measured.
The highest rates of diagnosed diabetes are seen in two
already disadvantaged groups: African Americans and adults
with less than a high-school education. “Next,”
he warned, we can “expect to see more and more people
who were obese as children dying of type-2 diabetes in their
30s or 40s,” which used to be considered the age of
onset. “This is unprecedented. If any of these predictions
are right, we have a serious problem.”
“Just to focus on
the costs of obesity seems a little narrow-minded,”
responded the Harris School’s Tomas Philipson, who
noted that obesity can be seen as a result of technological
and social progress. First—and hardly a bad thing—the
cost of calories has plummeted. “The price of food
has dropped, and the percentage of our income that we spend
on food has dropped,” said the economist. For the
first time, the poor in the United States and increasingly
in less developed countries can afford all the food they
want and have more food choices than ever before. The bad
news, of course, is that many less expensive choices are
highly processed, calorie-dense, low-nutrient foods, flavored
with cheap sweeteners and artery-clogging fats.
Second—you want fries
with that?—people now eat more calorie-dense fast
foods because the value of women’s time has gone up,
which means wives and mothers no longer have as much time
to cook. Between 1970 and 1996 the portion of food dollars
spent on meals away from home jumped from 25 percent to
40 percent. Americans now spend more than $100 billion a
year on fast food. Such restaurants compete on price and
volume, which drives them to offer ever-larger portions
of emptier calories. In 1960 an order of McDonald’s
French fries, for example, contained 200 calories; now a
popular supersize order has 610.
Third—supersize meets
Super Mario—people no longer burn calories as rapidly.
“We used to get paid to exercise, to perform manual
labor for up to ten hours a day,” Philipson said.
Now most people work at a desk, accessed by elevators and
e-mail. They pay for the privilege of exercising at a gym
three hours a week. “Or they spend that time on something
else,” he said, “maybe with their kids”—who
often spend their spare time playing video games rather
than sports.
In
the 1950s, well before the dawn of the social-determinants
field, noted James House, director of the Survey Research
Center at the University of Michigan, it appeared that antibiotics
had conquered infectious diseases. It seemed a triumph for
undiluted, laboratory-based biomedicine, which was “openly
hostile” to the psychosocial aspects of health.
That triumph was short-lived.
The nature of disease was changing. Those who survived infectious
diseases began to die, somewhat later in life, from heart
disease and cancer. The quick killers, germs, gave way to
something slower but harder to stop. Microbes had passed
the baton to “risk factors,” like high cholesterol,
elevated blood pressure, smoking, and something vaguer still,
labeled “stress.” There were initially no drugs
to combat these disorders, only protective behaviors. By
the 1970s, said House, “risk-factor epidemiology had
become a growing concern,” followed by social epidemiology,
which implicated things like limited education, lack of
social support, and personal and occupational stress as
contributors to disease.
Compared to the complex
epidemiology of interrelated risk factors, germs began to
look delightfully simple. A microbe launched the battle.
Either your immune system, with support from antibiotics,
beat it back and you lived, or it didn’t and you died.
But as scientists tried to unravel the social and behavioral
components of disease, they increasingly realized that there
were too many risk factors; they were too mixed up, too
complex, and too interconnected to be easily deciphered.
A 1975 study by Graduate
School of Business economist Sam Peltzman, PhD’65,
highlights the problem. Long before New York passed the
nation’s first mandatory seatbelt law in 1984, Peltzman
found that even a simple behavior like buckling up can produce
unforeseen consequences, with real health risks. Once drivers
fastened their safety belts they felt snug, safe and secure,
drove faster, and thus had more and nastier accidents. So
what starts as a simple, mechanical solution can create
a new social problem.
At the conference Peltzman
expanded this theory, suggesting similar responses are likely
to attenuate the benefits of any medical breakthroughs.
Risk taking increased after the introduction of antibiotics,
he argued, because people stopped dreading infectious diseases;
as a result, deaths from infectious disease regained some
lost ground. More recently, interventions such as cardiac-bypass
surgery and angioplasty may have reduced the fear of heart
attacks, removing a powerful disincentive to obesity. Similarly,
the development of effective AIDS drugs appears to have
undercut the public-health push for safe sex.
But medical science needn’t
give up just yet. GSB economists Kevin M. Murphy, PhD’86,
and Robert H. Topel have spent the past few years trying
to measure the financial benefits of medical research, what
they call the economics of well-being. They began by asking
how much people would pay for an extra year of life. Unfortunately,
you can’t just buy a few months at a time,
like topping off your gas tank, so they had to calculate
how much people would value a little extra time, based on
the decisions they made about what health risks they would
accept in exchange for a better paying job, or how much
they were willing to invest in a safer car. It turns out
that people value an entire lifetime at about $5 million,
more than most taxpayers earn over their three score and
ten but less than a malpractice lawyer would request for
a client’s loss of a single year. The early years
are worth more than later ones—for example, 12 months
of being 30 and healthy is worth more than 52 weeks at 50
with angina, but even the senior moments of life are worth
quite a bit, said Murphy. “No one wants to live to
65, then die the day they retire.”
Most U.S. citizens, of course,
now live well past retirement age, to something approaching
four score. From 1970 to 1998, the period Murphy and Topel
studied, the average life expectancy for a 65-year-old increased
by about 2.6 years, from 15.2 to 17.8. If each bonus year
is valued at about $75,000, then multiplied by the U.S.
population, around 280 million, the result is a vast
number.
Murphy and Topel were stunned
to find that the economics of well-being added up to about
$73 trillion for the 28-year study period, or about $2.6
trillion per year—about half of the nation’s
gross domestic product. The survival gains for men are greater
than those for women, and almost half of the gains are
reinvested in medical care, not a bad thing since it provides
steady jobs in a nonpolluting industry. But the overall
benefits, the authors emphasize, are enormous, staggering,
and increasing. “As the U.S. population grows, as
lifetime incomes grow, as health levels improve, and as
the baby-boom generation ages toward the primary ages of
disease-related death, the economic reward to improvements
in health will continue to increase.”
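The rough arithmetic is easy to follow (a back-of-the-envelope sketch; Murphy and Topel's published calculation weights the gains at every age and discounts them properly). Using only the gain at age 65,

$$2.6 \text{ years} \times \$75{,}000 \text{ per year} \times 280 \text{ million people} \approx \$55 \text{ trillion},$$

and counting the larger gains accruing at younger ages brings the total toward the $73 trillion reported; $73 trillion spread over the 28-year study period is the roughly $2.6 trillion per year cited above.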
This suggests that medical
research is vastly underfunded. Currently the United States
invests about $40 billion annually in medical research.
But the value of the lives extended by reducing the annual
death rate from either of the leading killers, heart disease
or cancer, by one-tenth of a percent would add up to nearly
$50 billion.
So—increased
risk taking aside—biomedical research is worth
every penny, health care extends life, friendship and social
support and collegial communities bring comfort and safety
and stability, education enhances existence. Yet the people
who most need these things get the least. What should we
do? When Richard Epstein organizes a conference, the lawyers
get the last word. And when Epstein speaks the message is,
by and large, hands off.
There is an inherent tension,
Epstein said, like fire and ice, between regulation and
freedom, and from his experience with regulation he holds
with those who favor freedom. Regulation is the realm of
lawyers, and “when push comes to shove lawyers are
the most powerful and the most dangerous” people on
earth. He traced the history of public-health legislation
from its early days, when it sought only to contain communicable
diseases and ensure proper sanitation, to the emergence
of a more inclusive, modern version—dominant after
1937—that covers “any and all matters that relate
to the distribution of health care and health-care services.”
Such broad and meddlesome
definitions of public health “will in all likelihood
be conducive to the ill-health of the very individual it
seeks to protect,” Epstein argued, citing several
examples of regulatory failure (cases that protected the
powerful and hurt the vulnerable, such as a quarantine affecting
a specific ethnic group, or mandatory vaccination—with
the option for the wealthy of buying their way out, thus
posing a risk to others) traceable to equal parts venality
and incompetence. Legislatures, he said, have “every
incentive to get it wrong, and they will succeed.”
Wake Forest law professor
Mark Hall offered his own example of government overzealousness.
When his “germophobic” 14-year-old daughter
was nipped by her new puppy, she felt anxious enough to phone
their veterinarian and ask about rabies. Although the dog
had already received his shots, the vet was legally obliged
to notify the local health department, setting in motion
a process that culminated in a playful and perfectly healthy
puppy being quarantined for three months.
Hall was troubled by the
notion of siccing that same process on obesity, for example,
by classifying overeating as a public-health problem. Once
you identify a cause, such as cheap and tasty fast food,
he said, then action becomes necessary. “Public-health
law confers tremendous authority on government officials,”
Hall said, “allowing measures that are justified only
in situations of extreme emergency.” That makes sense
in the battle against a pathogen like cholera or rabies,
or even against a harmful behavior like smoking perhaps,
but not for an “ecological problem” such as
obesity, which is immersed in social, economic, cultural,
and political considerations. At least one member of Congress
has already begun to discuss launching a war on fat, said
Hall, a new battle of the bulge. This is not “a rhetoric
of prudence, balance, and restraint.”
So far the legal community
agrees but has left room to change its mind. In January
a judge dismissed a potential class-action suit blaming
the McDonald’s Corporation for obesity in teenagers.
The decision to dismiss was guided by the principle that
“legal consequences should not attach to the consumption
of hamburgers” unless consumers are unaware of the
dangers of eating such food. If a person knows that eating
copious orders of supersized McDonald’s products is
unhealthy and may result in weight gain, “it’s
not the place of the law to protect them from their own
excesses.”
The real challenge confronting
any attempt—whether legal, educational, or biomedical—to
alter the social gradient of health was perhaps best summarized
nearly 50 years ago by socialite Babe Paley, who was born
rich, grew up thin, and married two rich, thin men. “You
can’t be too rich,” she supposedly said, “or
too thin.” Paley’s key insight was to connect
the two. Marmot’s pioneering Whitehall study, for
instance, showed the correlation between socioeconomic status
and lifespan, but it also revealed that low status was associated
with obesity, smoking, less leisure-time physical activity,
higher blood pressure, shorter height, and coronary heart
disease.
A glance at the conference
speakers and taller-and-thinner-than-average audience revealed
many of the same connections. While not exactly rich by
Paley’s standards, they all had advanced degrees from
elite institutions and the enhanced career paths that follow.
Most had a BMI within spitting distance of 25. No one deserted
the lectures to smoke. Most, if not all, reside in neighborhoods
of high, even obsessive, collective efficacy, primarily
Hyde Park, and all were deeply concerned about matters of
public health. They were uniformly numbered among the “them
that’s got,” and they were go-getters.
They apparently have the
added advantage of being persuasive. Three months after
the Chicago conference the president of the American Association
for the Advancement of Science opened that organization’s
annual meeting in Denver with a plea for more socially focused
research, specifically citing Marmot’s Whitehall Studies.
Floyd Bloom, chair of neuropharmacology at the Scripps Research
Institute and former editor-in-chief of Science, author
of more than 600 papers and the text The Biochemical
Basis of Neuropharmacology—in short, as hard-core
a molecular biologist as they come—kicked off the
meeting by telling his audience that genomic-based health
care, though often described as a miracle on the horizon,
is likely to be expensive and require many more years of
research before new options are available to patients. “The
puzzles of better health promotion and disease prevention
may be approached more rapidly and effectively through intensified
social-science research,” he concluded, “rather
than by awaiting the expected evolution of gene-based explanations
and interventions based on future genetic discoveries.”
No longer a new kid on the block, the social-determinants
field has finally been blessed and is coming into its own.