

POSTED: Monday, April 21, 2014, 6:30 AM
A GI and his family seeking assistance through EMIC during World War II. (National Archives)
In 1943, the United States government began paying for medical, nursing, and hospital maternity and infant care provided to the wives of enlisted men in the lowest four military pay grades. The Emergency Maternity and Infant Care program, known as EMIC, funded the care of about 1.5 million women and infants from 1943 to 1949. Although opponents questioned whether EMIC was a dangerous form of “socialized medicine,” patriotism—and the fact that this was presented as an emergency measure—overrode their opposition, and the measure won widespread legislative support.

EMIC answered a demonstrated need. After the attack on Pearl Harbor and the United States’ entrance into World War II, the size of the military expanded rapidly. Over 16 million Americans saw service during the war. The wives of soldiers and sailors moved with them to military bases and lived far from home and family on low pay. Men in grade 7, the lowest pay grade, for example, earned only $50 a month, although base pay increased annually. Lacking the means to pay for medical care during and after their pregnancies, or to cover the costs when their babies fell ill and needed services, families turned to state programs supported by the Social Security Administration and to local charities, but these did not cover the full costs of care.

EMIC was the answer. Created and run by the United States Children’s Bureau, the program sent funds to the states to pay physicians and hospitals for the services provided. Reimbursement for complete maternity services—at least five prenatal visits, delivery, care of the newborn, and postpartum examinations—ranged from $35 to a maximum of $50, depending on the state reimbursement rate. Cesarean deliveries were paid at the same rate, and there were no co-pays for any EMIC-supported services! Hospital stays—a minimum of 10 days after delivery—averaged $5.38 per day in 1944 and rose to $6.58 by 1946, again with variations by state. When the figures were tallied at the close of the program, the cost of maternity care (doctor and hospital combined) averaged $92.49 per case.

POSTED: Wednesday, April 16, 2014, 6:30 AM
Filed Under: History
Visiting Nurse Society of Philadelphia nurse, circa 1947. (Copyright Barbara Bates Center for the Study of the History of Nursing, University of Pennsylvania School of Nursing.)

For more than a century, nurses have served as the cornerstone of public health efforts in this country. Most Americans have had contact with a public health nurse at some point in their lives; it’s hard to live for very long without meeting one. Remember the school nurse who made sure your vaccinations were up to date and bandaged the cuts and bruises you suffered during recess? She was a public health nurse. Have you or a member of your family ever needed the services of a visiting nurse to care for someone at home? She—and it was almost always a woman—was a public health nurse, too.

Public health nursing traces its roots back to the late 19th century, when huge numbers of immigrants came to America seeking a better life but failed to find an environment conducive to healthy living. Nurse Lillian Wald receives the main credit for establishing public health nursing. Wald was inspired by a visit to the Lower East Side of New York City, where she witnessed residents living in incredible poverty and unhealthy conditions. She reasoned that those in ill health had limited chances to achieve their full potential, and she believed that nursing offered a means of improving the lot of the poor. Wald opened the Henry Street Settlement House, which quickly became a major center of health care services for those without access to care. Henry Street nurses visited the sick in their homes, identified health hazards in the community, staffed clinics and schools, advocated for better housing conditions, and worked in a multitude of other jobs that promoted the health of the community in general.

Wald’s efforts were impressive. Yet she was not the first to organize public health nurses. Other large cities also set up public health and visiting nurse agencies similar to Henry Street. These agencies, often the only health care services available to members of impoverished neighborhoods, employed public health nurses eager to take on the job of caring for those unable to afford or access services. In 1886, a group of benevolent Philadelphia women established the District Nurse Society, later renamed the Visiting Nurse Society of Philadelphia. Much of the society’s early work revolved around maternal and infant services, with nurses assisting at home births and providing follow-up care after delivery. The nurses also offered a variety of other services, including care of the sick, clinic and school health work, and the maintenance and promotion of health in community settings. For many, public health nurses were the connecting link to the health care system, a position they still hold today.

POSTED: Wednesday, April 9, 2014, 6:30 AM
Filed Under: History | Jonathan Purtle
In mid-17th-century London, Bills of Mortality, the precursor to the modern death certificate, were simply lists of the dead. (University of London-Institute of Historical Research)

Why do we die? The question is existential, scientific, and spiritual at the very least. It’s also bureaucratic.  Like voter registration cards and driver’s licenses, death certificates relegate the cause of our physical demise to a discrete category that becomes a single data point in a sea of statistical information. And they are vitally important to the public’s health.

In “Final Forms,” an excellent article in the April 7 issue of The New Yorker, Kathryn Schulz tells the story of the death certificate: its history, its public health significance, and its shortcomings.

Schulz traces the origins of the modern death certificate back to 1512. In London, Bills of Mortality were first issued to track the number of people who died from the plague, in addition to the number succumbing to non-plague causes. The Bills included no information about individual decedents, such as their names or what exactly killed them (other than the plague), and were issued only sporadically after the terror of the Black Death subsided. Things changed, however, in 1629, when King James I mandated that the Bills be issued on a regular basis and capture every death and its cause. Years later, an actuarially oriented haberdasher named John Graunt reviewed 20 years’ worth of Bills and teased out 81 distinct causes of death, sorted into four categories: chronic diseases, epidemic diseases, conditions that killed children, and injuries.

POSTED: Sunday, April 6, 2014, 6:30 AM
Dr. Bernard Rollin, a Colorado State University professor and leading scholar in animal rights and animal consciousness. (William A. Cotton)

There are few issues in the public sector today that affect us all in the way that industrial animal agriculture does. We all eat, and almost all of the food we consume is produced by this system. Most of us are not aware of the nature of the system that provides our food (for most Americans, it is as if food appears magically on our plates every day), and most of us certainly are not aware of the impact that the system has on the public’s health. From the pesticides that affect us and our environment, to the concentrated animal feeding operations (CAFOs) that house many of the animals we eat, to the overuse of antibiotics throughout agriculture, our health and environment are ever at risk.

On Tuesday, renowned philosopher and ethicist Dr. Bernard Rollin of Colorado State University will give a lecture at the Academy of Natural Sciences on the history, ethics, and public health impact of industrial animal agriculture. Dr. Rollin's free public lecture, which begins at 6 p.m., is entitled "This Ain't Agriculture: How Industrial Agriculture Hurts Animals and the Public's Health." The talk will examine the impact of industrial animal agriculture on animals, humans, and the environment, and will propose ways to improve the system and make it more sustainable. The event is co-sponsored by the Program for Public Health Ethics & History at the Drexel University School of Public Health (I am director of this program), the Center for Science, Technology and Society at Drexel, and the Academy of Natural Sciences.

Dr. Rollin is an expert in this area and has worked closely with both government and corporate interests toward improving the current agricultural system. His 1982 book, Animal Rights and Human Morality, now in its third edition, is a classic in the field, and he has authored over 500 papers and 17 books, the most recent of which is the autobiographical Putting the Horse Before Descartes: My Life's Work on Behalf of Animals. He also served on the Pew National Commission on Industrial Farm Animal Production, which in 2008 released a series of landmark reports on the public health, environmental, social, and animal welfare issues implicated in industrial animal agriculture.

POSTED: Wednesday, April 2, 2014, 6:30 AM
The Plague marched across Europe in the mid-1300s; death followed quickly.

On television, forensic scientists can solve the mystery of someone’s death in an hour. In reality, uncovering the facts can take a lot longer. As an anthropologist leading the investigation of skeletons dug up in England last year put it, the discovery “solves a 660-year-old mystery.” DNA tests on the skeletons revealed that the victims didn’t die of bubonic plague; they died of pneumonic plague.

Workers extending the London railway line unearthed 25 skeletons. They were victims of the Black Death that ravaged the world from 1348 to 1350, killing at least 75 million people. Scientists examining the bones not only confirmed the cause of death but also uncovered details about the lives of those who died. The bones reveal lives marred by violence and characterized by heavy work and malnutrition. The Black Death carried them away quickly. Untreated, it can kill in a few days. With no understanding of the cause of the disease, 14th-century Europeans often blamed Jews and foreigners for the disastrous epidemic that transformed life around the globe.

Bubonic plague is spread by fleas from infected rodents and is now easily cured by antibiotics. If the infection reaches the lungs and becomes pneumonic plague, however, it can be transmitted from person to person via infected droplets in cough. Victims must be treated promptly; mortality rates from this form of the disease are high. There is a third form of plague, septicemic plague, also spread by fleas.

POSTED: Wednesday, March 26, 2014, 6:30 AM
Filed Under: Food | History | Janet Golden | Nutrition
Ahhh, the quest for the perfect weight-loss diet—the one that lets you eat and shed pounds. With so many Americans obese or overweight, the marketplace is full of diet books and over-the-counter drugs. There’s the Paleo diet—eat meat like a caveman! And the Mediterranean diet—eat vegetables like a peasant! And the grapefruit diet—eat like a Florida farmer!

There used to be more daring choices. Like the tapeworm egg diet. That’s right, a program that told you to swallow tapeworm eggs and lose weight.

In the early 20th century, marketers began selling this program to what were then called “fleshy people” under brand names like “Lard-B-Gone.” Sanitized tapeworm eggs delivered what they promised. You got rid of pounds without exercise, dieting, surgery, or dangerous drugs like arsenic pills, which were once a popular means of weight loss because they allegedly cut the appetite. With the tapeworm diet you swallowed the eggs and the tapeworm did all the work—consuming your meal while living in your digestive tract. Meanwhile, the tapeworm produced and shed millions of eggs in your intestine and grew up to 20 feet long.

POSTED: Wednesday, March 19, 2014, 5:30 AM
Filed Under: Addiction | Ethics | History | Medication
Shortly after a 48-hour bout of immobilizing back pain and a visit to the emergency room where he received Percocet, my husband went to his primary care doctor to discuss managing the continuing pain and numbness. What he encountered took him aback. Perhaps concerned about “drug-seeking behavior,” the primary care physician commented offhandedly that back pain eventually goes away; the physician failed to do a physical examination, asked no questions about his level of pain or work situation, and offered no suggestions for dealing with the numbness or a recurrence. When asked about pain medication, his physician gave him a bottle of naproxen (which he was already taking) with no instructions regarding appropriate dosage. After two weeks and no further treatment, the pain began to subside; nine months later the numbness and tingling had diminished.

My husband’s experience could have simply been an unfortunate encounter with a busy physician. Or it could be symptomatic of a new attitude towards pain patients and prescription opioids by primary care and public health practitioners. With increasing deaths from overdoses since the early 2000s, public health concern over the abuse of prescription opioids for pain management has mounted. Last month's tragic death of actor Philip Seymour Hoffman from an overdose has heightened awareness of the social burden of addiction and raised anxieties about prescription opioids as “gateway” drugs to heroin.

In 1996, the Food and Drug Administration approved OxyContin, a potent pain reliever marketed by Purdue Pharma in a long-acting, time-release formulation that industry promotional materials claimed would minimize abuse. Within a few years, however, reports of abuse, addiction, and death from overdose of OxyContin surfaced, and the Drug Enforcement Administration kicked into gear, lobbying for more intensive regulation and surveillance of prescription opioids. Public health and medical experts joined the DEA and the FDA in promoting heightened state oversight of medical providers, pharmacies, and manufacturers. Primary care practices revised their policies toward chronic pain patients to include pain contracts (written agreements intended to discourage abuse by patients or others), profiling of “drug-seeking behavior,” frequent office visits, calls for “evidence-based” practice, and urine toxicology screening. The discussion soon moved from concerns over the addictive properties of OxyContin to calls to dramatically limit such widely prescribed opioids as Vicodin and Percocet.

POSTED: Monday, March 17, 2014, 10:53 AM
Filed Under: History | Janet Golden | Kids
"Have you a Red Cross service flag?" by Jessie Willcox Smith, 1918

March is Red Cross Month and Women’s History Month—a perfect time to celebrate Red Cross founder Clara Barton, who began the organization in 1881 and served as its leader until 1904. Barton provided humanitarian aid to soldiers during the Civil War. During a visit to Europe she learned about the activities of the International Committee of the Red Cross and worked to create the American National Red Cross, which was awarded a federal charter in 1900. Today the Red Cross is an international organization best known in the United States for its work in disaster relief, support for military families, and the collection of blood and blood products. (If you’d like to celebrate women’s history and the Red Cross by making a donation, you can find a local blood donor drive here.)

Few Americans are aware of the contribution of children to the work of the Red Cross. A Junior Red Cross began shortly after the United States' entry into World War I. On Sept. 15, 1917, President Woodrow Wilson issued a proclamation that announced this new effort to encourage school children to “work in the great cause of freedom to which we have all pledged ourselves.” It went on to promise that joining the organization “will teach you how to save in order that suffering children elsewhere may have the chance to live. It will teach you how to prepare some of the supplies which wounded soldiers and homeless families lack.” 

A wartime theme presented in educational materials was the need to sacrifice at the dinner table. An educational conference in 1918 stated this aim as teaching “the boys and girls of this country to eat less candy and give up sweet drinks,” so that the sugar could be shipped to the Allies and used in food for soldiers. Anyone who has ever heard about cleaning their plate because children overseas were starving should know that this lesson goes back to World War I and the Junior Red Cross, when American children were also taught to “think of the hundreds of thousands, in fact millions of people who have been scourged by the German Army, and who have from two to three years been suffering not only the pangs of hunger but actual starvation.”

About this blog

What is public health — and why does it matter?

Through prevention, education, and intervention, public health practitioners - epidemiologists, health policy experts, municipal workers, environmental health scientists - work to keep us healthy.

It’s not always easy. Michael Yudell, Jonathan Purtle, and other contributors tell you why.

Michael Yudell, PhD, MPH, Associate Professor, Drexel University School of Public Health
Jonathan Purtle, MPH, Doctoral Candidate and Research Associate, Center for Nonviolence and Social Justice, Drexel University
Janet Golden, PhD, Professor of History, Rutgers University-Camden