Ebola and the Epidemics of the Past
Just a few generations ago, progress against
infectious disease convinced Americans that modern medicine had won the battle
against microbes. Why is the public so skeptical today?
By David Oshinsky in the Wall Street Journal
In the winter of 1947, an American tourist arrived in New York City on a bus from
Mexico, feeling feverish and stiff. He checked into a hotel and did some
sightseeing before his condition worsened. A red rash now covered his body. He
went to a local hospital, which monitored his vital signs and transferred him
to a contagious disease facility, where he was incorrectly diagnosed with a
mild drug reaction. He died a few days later of smallpox.
By this point, the man had infected at least a dozen New Yorkers, one of whom
died. Taking no chances, city officials began a massive but voluntary
vaccination campaign against a disease that had killed more people than any
other in history. Within weeks, several million New Yorkers took the vaccine.
Though health experts still disagree about the danger posed by these isolated
smallpox cases, one point remains clear: There was precious little panic.
Outside schools, fire stations and hospitals, the vaccination lines snaked for
blocks. People didn’t worry about the vaccine’s safety; they feared that there
might not be enough vaccine to go around.
Sound familiar? Parts of the 1947 smallpox scare—the sick traveler harboring a deadly
disease, the missed hospital diagnosis, the quickly spreading infection—strike
a disturbing chord. A key difference between that crisis and our current one
with Ebola is, of course, the absence of an effective vaccine—and the fact that
Ebola is usually transmitted through close, direct physical contact with the
bodily fluids of someone infected.
But Americans in the 1940s had a different mind-set as well. Today many Americans
doubt that health authorities can handle the crisis. Back then, by contrast,
there was a growing confidence in the power of medical research to solve any
problem, tame any epidemic, conquer any disease. It was a confidence grounded
in the miracle drugs and vaccines beginning to emerge from university and
pharmaceutical laboratories, and in the public health apparatus that had served
the nation and its troops so well during World War II.
It hadn’t always been this way. What is truly remarkable about the march of modern
medicine is how slow the progress was in the preceding centuries. Though the
vaccine for smallpox was discovered by the British doctor Edward Jenner in the
1790s, it didn’t trigger a revolution in medical thinking. Until well into the
1850s, the onset of disease was still attributed to foul-smelling clouds of
decomposed matter known as “miasmas,” and the most common remedy was to purge
ill patients of supposed impurities until the body’s equilibrium was restored.
It’s hard today to imagine such dangerous foolery passing for mainstream medicine,
but let one example suffice. In 1799, a Virginia gentleman suffering from a
severe throat infection “procured a bleeder in the neighborhood, who took from
his arm, in the night, twelve or fourteen ounces of blood.” Feeling no better,
the man sent for his doctors. The first to arrive prescribed an enema and then
“two copious bleedings.” Seeing no improvement, a second doctor ordered “ten
grains of calomel [a devastating mercury-based drug] succeeded by repeated
doses of emetic tartar,” causing a massive discharge “from the bowels.”
Then the real bleeding began. Thirty-two ounces were drawn by lancet, while blisters
were applied “to the extremities.” (A person giving eight ounces of blood today
must wait two months before donating again.) The man finally told his doctors
to stop. “Let me go quietly,” George Washington pleaded, and he did.
The great medical breakthroughs in the mid-19th century came mainly from Europe.
Among these was the concept of germ theory proposed by Louis Pasteur, Robert
Koch and Joseph Lister. Germ theory linked specific germs to specific diseases,
like rabies, cholera and tuberculosis. It taught people to accept the peculiar
idea that humans shared their communities, their homes, even their bodies with
invisible, often dangerous microorganisms. Put simply, what you didn’t see
could make you very ill.
Germ theory spurred the development of modern laboratory research. Its impact on
pathology and bacteriology can hardly be overstated. In 1900, the life
expectancy for an American man was 46, and for an American woman 48. By 1950,
the figures had jumped to 65 and 72 respectively.
Some of this increase can be explained by factors such as better nutrition, cleaner
water and the passage of pure food and drug laws. But much of it was due to the
vaccines, sulfa drugs and antibiotics aimed at the deadly infections that put
children at special risk. In the 1870s, one infant in five born in New York
City died in the first year of life. Among those fortunate enough to reach
adulthood, a quarter did not live to see 30.
Progress came in fits and starts, with devastating setbacks along the way. The influenza
pandemic of 1918-1919 killed tens of millions around the globe. Approximately
one in four Americans took sick, and a half million died. The number of U.S.
soldiers lost to influenza during World War I (44,000) rivaled the number
killed by enemy fire (50,000). Army virologists waged an all-out (and
moderately successful) campaign to develop an influenza vaccine and began to
vaccinate GIs for a host of diseases.
In terms of public confidence, America’s golden age of medicine reached its peak
in the 1950s. It was here that the miracle of the laboratory routed the terror
of infectious disease in the most dramatic imaginable way. The disease was
polio—also known as infantile paralysis—which descended like a plague upon
Americans each summer, killing thousands of children and leaving thousands more
in leg braces, wheelchairs and iron lungs. Polio in the 1950s, like Ebola
today, put everyone at risk. The fear was palpable. Newspapers kept daily box
scores of those admitted to hospital polio wards. Beaches, swimming pools, movie
theaters and bowling alleys were closed. Rumors abounded that one could get
polio from an unguarded sneeze, handling paper money or talking on the
telephone. “We got to the point that no one could comprehend,” a pediatrician
recalled, “when people would not even shake hands.”
But Americans channeled these fears into a common purpose, much like the smallpox
episode of 1947. Uniting behind Franklin D. Roosevelt’s March of Dimes, they
raised hundreds of millions of dollars to find an effective polio vaccine. In a
move probably incomprehensible to most parents today, they volunteered their
children—almost two million of them—for the massive public trials in 1954 that
tested Dr. Jonas Salk’s killed-virus injected polio vaccine. When the results
came in, showing the vaccine to be “safe, effective, and potent,” the nation
celebrated. At a White House ceremony honoring Salk, President Eisenhower
fought back tears as he told the young researcher: “I have no words to thank
you. I am very, very happy.”
Salk’s triumph was followed, in short order, by Albert Sabin’s equally effective
live-virus oral polio vaccine (given on a sugar cube or in a medicine dropper)
as well as vaccines for measles, mumps, chickenpox and whooping cough.
Meanwhile, the remarkable success of penicillin and other antibiotics in
destroying harmful bacteria led some researchers to declare victory in the war
against infectious disease. Medical students in the 1960s were warned away from
the field and encouraged to study chronic disorders like cancer and heart
disease, where the real action—and the research money—would be found.
Humanity appeared to be on the verge of a most improbable goal: eliminating the threat
of deadly infectious disease. “Will such a world exist?” a prominent researcher
asked at midcentury. “We believe so.”
Rarely has a scientific prediction been so thoroughly shredded. The hubris of that era
collapsed under the combined weight of HIV/AIDS, SARS, Ebola, avian flu and
deadly drug-resistant bacterial infections. And let’s not forget Enterovirus
D68, a pathogen that has sickened more than 1,000 American children this year
and likely killed at least six. In the so-called war between “man and
microbes,” there is never a truce.
Ebola is currently dominating the news, and for good reason. Part of an entire
continent is at risk. Named for the Ebola River in Central Africa, where it
first emerged in 1976, the Ebola virus, like polio and influenza, has several
different strains. The reservoir for the virus is uncertain, though bats—the flying
mammals that harbor dozens of viruses perilous to humans—are the leading
suspects. A bat takes a bite of fruit; it falls to the ground; a primate eats
the remains; a villager slaughters the primate—there are multiple variations.
The first outbreaks of Ebola occurred in rural African villages, but rarely
traveled far. Unlike bacteria, viruses cannot live long on their own. They
depend on the cells of the host they invade to reproduce. When the host dies,
the virus does, too. Having killed off so many villagers, Ebola simply burned
itself out.
The difference in 2014 is that Ebola no longer haunts just the rural countryside.
Its reach now extends into densely populated cities, where there is no shortage
of human hosts. The current outbreak has already caused 10 times more deaths than any previous one, and that number is climbing fast. Now the virus has reached the U.S.; in our interconnected world, disease is just a plane ride away.
What seems most apparent at this early point is the yawning chasm between public
health officials and the public at large. We live in a post-Vietnam,
post-Watergate, Internet-obsessed culture, where respect for government
pronouncements and expert opinion has dramatically eroded. Distrust is now
endemic, and a crisis like Ebola, which few saw coming, much less planned for,
only fuels this divide.
Health officials strongly believe that the chances of a major outbreak occurring in
the U.S. are slim to none. The disease is not transmitted when the carrier is
asymptomatic, and can only be passed from person to person through the exchange
of bodily fluids. A robust public health system—unlike those in West
Africa—should easily contain its spread.
But the public sees something quite different. A single traveler arrives in Texas
from Liberia. He quickly takes ill with a high fever, visits a hospital and is
sent home. Feeling worse by the hour, he returns to the hospital, where he
dies. When two nurses who treated the man test positive for the disease, it
becomes clear that the hospital had no effective plan in place to deal with the
situation. To compound matters, one of the nurses had boarded a plane to visit
relatives in Ohio. The possible ring of contamination now extends well beyond
Dallas, showing the lightning speed with which an infectious disease can spread
in the modern world.
On no issue is the public more at odds with health experts than on the question of
a temporary travel ban on West Africans coming to the U.S. Opinion polls show a
clear majority in favor of the ban, which public health officials
overwhelmingly oppose. The issue has become a centerpiece of the approaching
midterm elections, with Republicans bashing President Obama and the CDC for
their supposed negligence, while many liberals portray supporters of the ban as
racists, xenophobes and imbeciles.
In truth, Americans who favor the ban appear quite sympathetic to sending
doctors, soldiers and medical supplies to combat Ebola in West Africa. But many
Americans simply doubt the ability of our government to carefully screen
travelers from the affected areas. Thus far, public health officials have done
little to allay these fears.
History assures us that Ebola will be conquered. It also tells us that the next “fatal
strain” is likely bubbling up somewhere right now—in a bat cave, a pig farm or
an open-air poultry market. That’s the nature of these microbial beasts, and we
may not be spending enough now to understand these threats. But public trust in
dealing with future crises is perhaps the dearest resource of all.
Next week marks the 100th birthday of Jonas Salk. Shortly after his vaccine was
declared successful, he gave a nationally televised interview with Edward R.
Murrow. “Who owns the patent on this vaccine?” Murrow asked. “Well the people,
I would say,” Salk replied. “There is no patent. Could you patent the sun?”
For Dr. Salk, the whole endeavor was a gift from science to humanity, nurtured by
the goodness of the American people. We must find ways to keep that spirit
alive—winning back for modern medicine and public health the full confidence of
the world’s most generous nation.
Prof. Oshinsky is a member of the history department at New York
University and director of the Division of Medical Humanities at the NYU School
of Medicine. His book, “Polio: An American Story,” won the 2006 Pulitzer Prize
for history.