MILWAUKEE — Hospitals and family doctors, the mainstays of health care, are pulling out of poor city neighborhoods, where the sickest populations live.

A Pittsburgh Post-Gazette/Milwaukee Journal Sentinel analysis of data from the largest U.S. metropolitan areas shows that people in poor neighborhoods are less healthy than their more affluent neighbors but more likely to live in areas with physician shortages and closed hospitals.

At a time when research shows that being poor is highly correlated with poor health, hospitals and doctors are following privately insured patients to more affluent areas rather than remaining anchored in communities with the greatest health care needs.

The Post-Gazette/Journal Sentinel analysis shows that nearly two-thirds of the roughly 230 hospitals opened since 2000 are in wealthier, often suburban areas.

As health systems open those facilities, they have been closing their urban counterparts. The number of hospitals in 52 major U.S. cities has fallen from its peak of 781 in 1970 to 426 in 2010, a drop of 46 percent.

Most of the facilities closed were small to mid-size community hospitals in poor urban neighborhoods and public hospitals, leaving many low-income neighborhoods with no safety-net hospital.


New York City’s boroughs have lost more than 20 hospitals since 1990. Detroit has gone from dozens in the 1960s to four.

Since 1988, Milwaukee County has lost its public hospital and five city hospitals.

Between 1990 and 2010 alone, 148 nonprofit hospitals closed in the largest cities, along with 53 for-profit hospitals.

In addition, five public hospitals closed, according to Boston University’s Alan Sager, whose research shows it’s not just poor-performing hospitals being closed; the ones that shut down often are rated as being more efficient than those that remain.

“In a competitive free market, efficient hospitals would be likelier to survive,” he wrote. “That hasn’t happened, providing evidence that no such market is present.”

When communities lose hospitals, they lose doctors, too.

And poverty itself can make people sick.

Think of a child with asthma living in a moldy apartment, a man with an infected foot living on the streets or a diabetic without a refrigerator to store insulin.

Mary Mazul, director of population health management and integration at Wheaton Franciscan Healthcare in Milwaukee, described a conversation with a doctor who works with mostly poor patients, who told her: “Mary, I’m a good doctor but my outcomes aren’t so good.”

“Health care can’t fix poverty, homelessness, racism,” Mazul said.

Early death is the simplest measure of compromised health.

A Centers for Disease Control and Prevention study of U.S. counties conducted by University of Wisconsin researchers showed that the premature death rate was 39 percent higher in the poor counties that have been losing hospitals and doctors.

The past few decades of closures completed a chapter in which the founding principle of hospitals in the United States was turned on its head.

Most hospitals began as charitable institutions dedicated to the poor, often started by religious groups or social reformers.

In Milwaukee, Lutheran deaconesses converted a farmhouse into a hospital for the poor in August 1863. Originally named Milwaukee Hospital, it was often called “The Passavant” after William A. Passavant, a Lutheran minister who founded hospitals, orphanages, seminaries and colleges across the country. Passavant brought German deaconesses to the U.S. to replicate the medical training and hospital model established in Dusseldorf, spreading the best care practices of the day. Individuals and congregations contributed to the effort.


Around the country, dozens of similar institutions were founded. In the 19th and early 20th centuries, local governments – often counties – began to open public hospitals, designed to serve the poor.

Most hospitals remained charitable institutions, but many began to accept paying patients as well. At Milwaukee Hospital, the first non-charity patients were admitted in 1873, at the rate of $5 per week. By the 1920s, hospitals served affluent as well as poor patients; medical schools became well established; and the American medical profession grew more powerful and prestigious.

The advent of private and public insurance reshaped the economics of health care.

During the Great Depression, administrators at Baylor Hospital in Dallas created the “Baylor Plan” – the first prepaid hospital insurance plan in the United States and predecessor of Blue Cross. Insurance for physicians’ services was also developed in the 1930s.

The success of these programs encouraged more insurers to enter the health care market, and a labor shortage during World War II led to health insurance becoming part of many benefits packages. Employers found that offering a health insurance package was a way to attract workers, and the federal tax write-off companies got for providing the benefit gave an incentive to make it a regular practice.

The government's entry into the health insurance market with Medicare and Medicaid in the 1960s put many more Americans on an insurance plan, and a growing share of workers had private coverage through their employers.

By 1968, 80.8 percent of Americans had coverage, according to the U.S. Centers for Disease Control and Prevention.

“Our whole pricing system is illogical and unnecessarily complex,” said George Brown, CEO of Legacy Health Care in Oregon. “It’s a system created by the bright idea during World War II of giving employers tax breaks.”

That created new incentives and erased others.

Because a third party was paying, patients weren’t deterred by the normal market mechanism – cost. And because health care providers were being reimbursed on a fee-for-service basis, they had little incentive to keep those costs down. Health care got more and more expensive. It also became lucrative for many providers and insurers. In the 1960s, the number of hospitals climbed, reaching its peak in 1970 as insurance reimbursement rewarded tests, treatments and hospitalization.

Hospitals competed with one another to get patients and began to acquire or merge with other hospitals and to buy physician practices to get their patients.

But even as hospitals added beds, technology and changes in Medicare were reducing the need for them. In the 1980s, Medicare switched to paying a fixed amount for specific services. This gave hospitals an incentive to send patients home as quickly as possible.

Advances in technology made it possible for outpatient care to replace inpatient care for many procedures. The demand for health care services continued to grow, but many hospitals struggled to fill their beds. That led to chaotic patterns of growth and retraction.

Health care experts recognized that competing hospitals were overbuilding and duplicating services and expensive equipment, which increased the costs of care. Nevertheless, expansion continued across the nation. When the supply exceeded demand, mergers, acquisitions and closures resulted.

Meanwhile, medical students facing soaring education debt are more likely to enter higher-paid specialties, such as anesthesiology or oncology.

The number of medical students entering family practice training dropped by 50 percent between 1997 and 2005, according to American Academy of Family Physicians data. Most now become specialists, who are more likely to be affiliated with large hospitals where most patients are privately insured.