Saturday, December 31, 2011

Avastin®, Plan B®, and Magical Thinking


“Magical thinking” is believing something is true because you want it to be true even when there is strong evidence that it is not. It is normal in young children. They believe in Santa Claus and the Easter Bunny and conjurer’s tricks. This is in part because adults encourage them to, and because they do not know the evidence and they haven’t enough brain maturity to make the connections. Beyond a certain age, however, it is not normal. Yet we do it all the time.

It is common enough in politics, for sure. A wise expert (OK, me) once said, “Data is only useful if it confirms your preconceived notions.” Otherwise, hearing data that should demonstrate you are wrong only reinforces your pre-existing beliefs, because it reminds you of why you hold them. The evidence is the evidence, and sometimes it is inconclusive and subject to different interpretations depending upon one’s perspective. That’s what makes horse races. Sometimes it is conclusive, but leads to a different conclusion than the one you want to hear.

Religion is different; it is, by definition, based on faith. It becomes confusing, for me, when this is complicated by searching for evidence (e.g., the Catholic Church searching for evidence of a miracle in order to canonize someone), but at bottom it is about faith. Some people have lost their faith in the religion in which they were brought up because of seeing contradictory evidence in the world, others have reconciled that evidence with their beliefs, others manage to separate the evidence from their faith, and still others reject all the evidence of their senses if it contradicts their faith. We have classic examples of this last, with lecturers in the early European medical schools reading from Aristotle on anatomy, ignoring the visual evidence provided by the cadavers being dissected in front of them that demonstrated that what Aristotle described was wrong. Luckily for anatomy and medicine, the schools were able to move on from this, in part because Aristotle, while revered, was not a Christian authority. It was rougher for Galileo when he demonstrated that the earth revolves around the sun.

I understand people’s desire to believe things to be true that the evidence demonstrates are not. It is comforting, it offers hope, and it can offer consistency. I wish, sometimes, I had more of it. My son died by suicide 9 years ago. If I believed that there was an afterlife, and that he was somewhere happily being cared for by my mother, who died over 30 years ago, it would make me feel better. After all, she was a wonderful, nurturing person, a kindergarten teacher who loved children, and she died just after he turned 2, so she never got to see him grow up. It would be great to believe that they were getting to know and enjoy each other now. But I don’t.

Nonetheless, I am sure there are things that I believe that are contrary to the evidence. Certainly, things I believe that have conflicting evidence. Like that people are good, that the world can be a better place, that the ‘better angels’ of our nature may overcome selfishness and greed and hypocrisy and meanness. Sometimes that belief is sorely tried. It has been a particularly hard couple of years as the perpetrators of the greatest worldwide financial crisis have gotten off and maintained and increased their wealth while hundreds of millions of their victims have had their lives ruined, with no end in sight. And with whole cohorts of politicians and pundits advocating that these perpetrators be spared any penalty while slashing any programs that benefit their victims.

For most of us, and in most societies, there are limits to what we tolerate because of people’s beliefs. We do not, as a rule, accept that a false belief, a delusion, about another is an excuse for murder. Of course, if that false belief is on the part of the government that sends young people to war and to kill, it is accepted. And for many zealots, of many beliefs and causes, whether Islamic terrorists or anti-abortion murderers, there is a portion of the population who will accept it.

One group that has good reason to want to believe in things for which there is no evidence is those who are threatened with death from a disease for which there is no effective, “approved”, treatment. Cancer, for instance, or AIDS. In the 1980s and 1990s, AIDS advocacy groups pushed for quick FDA approval for drugs to fight a disease that was killing lots of people. To some degree it happened, and luckily those drugs were effective, and better drugs were developed, and today AIDS is most often a chronic disease. When a study showed that bevacizumab (Avastin®), an anti-cancer drug created through recombinant DNA that had a positive effect in some other cancers, such as colorectal cancer, was also effective in prolonging the lives of women with metastatic breast cancer for a few months (not curing them), the large breast-cancer advocacy community pushed the FDA for early approval. It was approved. But then more studies appeared that showed it was not effective. Several of them. And the FDA, appropriately based upon the evidence, withdrew its approval. Blue Cross/Blue Shield of California then decided it wouldn’t pay for it. Yes, much of the motivation was financial – it costs $90,000 per year to treat a patient (except less, really, because few last a year) – but it was based on the evidence. Would you pay $90,000 for a drug that didn’t work? How about spending that on treating someone else with a drug that doesn’t work? But having someone else pay for it for you (your insurance company and those other people who are paying premiums) is less painful. There was a big uproar. BC/BS (and Medicare) are now again paying $90,000 a year for treatment of breast cancer with a drug that doesn’t work.

On the other hand, kowtowing to true believers can have the opposite effect. It can lead to restricting access to a drug that does work. This has occurred recently with Plan B One-Step®, “the morning-after pill”, which effectively provides emergency contraception if taken within 72 hours (maybe more) of unprotected intercourse. Approved for sale without a prescription to women 17 and over, this form of the hormone levonorgestrel is kept “behind the counter” so that those under 17 cannot get it. It doesn’t make sense, since girls under 17 can and do have unprotected sex and get pregnant. It is also safe. So, recently, the FDA, examining all the evidence, recommended that it be made available “over the counter” to all ages, without a prescription. Then Secretary of HHS Kathleen Sebelius, in an almost unprecedented action, overruled the FDA’s recommendation. There was no science or evidence behind the Secretary’s action. Her stated reason, that younger women cannot understand the instructions, would be an unreasonable standard even if one wanted to believe it. Can they understand the instructions to prevent adverse effects from ibuprofen or acetaminophen? Is the risk of pregnancy in these girls less than the risk from taking Plan B incorrectly? Nonsense. It is a political judgment, pandering to those who magically believe that, because they don’t want young girls to have sex, girls won’t have sex as long as contraception is not available to “encourage” it.

People read and support things that agree with what they think. I do not delude myself into thinking that what I write in this blog “converts” people; I recognize that people who read and like it probably already agree with me. But I do try to present evidence. And sometimes readers challenge me on my interpretation of the evidence (see, for example, the comments on Fluoridation: Dental health for all, October 26, 2011). One of the hardest things for physicians to do is to “un-learn”, to change the beliefs that they have held for years or decades when new information shows that what they believed is wrong. It is hard for them, and harder for the lay public, to understand that doing something was the right thing in the past, because of the best evidence at the time, but is the wrong thing now. And what we think is the right thing now, based on the best evidence available, may not be true in the future. That is how science evolves.

But magical thinking should have nothing to do with it.

-------------------------------------------------------------------------------------

Oh, yes. And in support of a tradition which Dilbert correctly points out is only a random point in time (and despite his use of "oxytocin" when he may have meant "oxycodone"):
HAPPY NEW YEAR!

Sunday, December 25, 2011

Index to Medicine and Social Justice year 3, 12/2010-11/2011


Saturday, October 8, 2011, Healthful Behaviors: Why do people adopt them? Or not?
Friday, September 16, 2011, Unintended pregnancy and health disparities
Wednesday, October 26, 2011, Fluoridation: Dental health for all

Health Policy
Saturday, December 18, 2010, ACA, ACOs, and Meaningful Competition
Thursday, September 22, 2011, Legislating Public Health and Medical Care


Primary Care
Saturday, February 5, 2011, AMA response to "Outing the RUC"

Medical Education
Wednesday, March 9, 2011, The Education of Health Professionals and Prospects for Transformation (Guest post by Seiji Yamada, MD)
Monday, June 6, 2011, Comments on Free Medical Schools


Family Medicine in the Era of Health Reform (3 part series)

Medical Ethics
Friday, April 1, 2011, Conflict of interest reporting

Social Justice
Thursday, December 30, 2010, Immigration and the US: Happy New Year


The Tucson shootings
Thursday, January 27, 2011, The Devil Inside: Access to Mental Health Care in the United States (Guest post by Robyn R. Liu, MD)
Friday, January 21, 2011, Tucson is worth struggling for...(Guest post by William Bemis)
Sunday, January 9, 2011, The Arizona shootings: When will we ever learn?
Links to other blogs

Other
Tuesday, June 14, 2011, Barbara Starfield
Wednesday, July 6, 2011, New blog and post

Sunday, December 18, 2011

To improve health the US must spend more on social services


That the US spends far more, in total and per capita, on health care than any other country is a well-established fact which no one bothers to deny. That this expenditure has not brought us greater health is also an established fact, although many still find this hard to believe, or don’t want to believe it. That we do not have the “best health care system in the world”, or even close, or even, actually, a health care system at all, is also demonstrably true. This does not stop a large percentage of the population, and particularly the very privileged sector represented by politicians, from maintaining that untruth.

However, in a provocative op-ed in the New York Times (“To fix health care, help the poor”), Elizabeth H. Bradley and Lauren Taylor argue that it is only when health care is viewed in its most narrow sense that the US spends more than other countries. Their study of 30 countries’ expenditures, “Health and social services expenditures: associations with health outcomes”[1], “…broadened the scope of traditional health care industry analyses to include spending on social services, like rent subsidies, employment-training programs, unemployment benefits, old-age pensions, family support and other services that can extend and improve life.”

Essentially, their data shows that having services available to people that improve the quality of their lives, or, more important, decrease the negative health impact of the adverse circumstances into which they are born, develop, and live, lessens disease burden and improves health. This then decreases the costs of providing medical care to them. For example, they note, “The Boston Health Care for the Homeless Program tracked the medical expenses of 119 chronically homeless people for several years. In one five-year period, the group accounted for 18,834 emergency room visits estimated to cost $12.7 million.”
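To make the scale of those numbers concrete, here is a quick back-of-the-envelope calculation (the figures are the article’s; the arithmetic below is mine, offered only as illustration):

    # Figures quoted above from the Boston Health Care for the Homeless Program:
    # 119 chronically homeless people, 18,834 ER visits costing $12.7 million,
    # over one five-year period.
    people, visits, cost_usd, years = 119, 18_834, 12_700_000, 5

    print(f"ER visits per person per year: {visits / people / years:.0f}")      # ~32 visits
    print(f"ER cost per person per year:   ${cost_usd / people / years:,.0f}")  # ~$21,000
    print(f"Average cost per ER visit:     ${cost_usd / visits:,.0f}")          # ~$674

Roughly $21,000 per person per year in emergency care alone; that is the kind of figure that makes spending on social services look like an investment rather than an expense.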

Bradley and Taylor indicate that among industrialized countries, the US ranks #10 in total health + social service spending, and is one of only 3 that spend more on health care than on all other social services. This means that, in addition to not getting the preventive or early-intervention health care that they need, Americans are at higher risk of illness and more ill when they come to medical attention. They may not be homeless, although obviously homelessness dramatically increases their risk. People may not have adequate food, may not have adequate warmth (see the discussion of “excess winter deaths” in Michael Marmot, the British Medical Association, and the Social Determinants of Health, November 1, 2011), may not have had a safe environment. They likely had far too little income. Many of them are children, and many of those, and often their parents before them, have had an inadequate education. A large number of the determinants of health are antenatal, and many more are in the early years of life. The other group at high risk of both adverse health outcomes and the poverty-related social deficits that influence them is the elderly. So what do we see in the US? Threats to cut Medicare, cut Social Security, cut education.


This wouldn’t affect everyone equally, of course. Only the most vulnerable. Or, at least, the more vulnerable. The wealthy are unlikely to be inadequately housed, inadequately nourished, inadequately educated, and, in a tautology, inadequately employed. Another recent study, from the Organization for Economic Cooperation and Development (OECD), called “Divided we stand: why economic inequality keeps rising”, demonstrates rising income inequality, as indicated by the ratio of the incomes of the top 10% and the bottom 10%. “The income gap has risen even in traditionally egalitarian countries, such as Germany, Denmark and Sweden, from 5 to 1 in the 1980s to 6 to 1 today. The gap is 10 to 1 in Italy, Japan, Korea and the United Kingdom, and higher still, at 14 to 1 in Israel, Turkey and the United States. In Chile and Mexico, the incomes of the richest are still more than 25 times those of the poorest, the highest in the OECD, but have finally started dropping. Income inequality is much higher in some major emerging economies outside the OECD area. At 50 to 1, Brazil's income gap remains much higher than in many other countries, although it has been falling significantly over the past decade.”

 

In the report’s “country note” on the US, it observes that “The United States has the fourth-highest inequality level in the OECD, after Chile, Mexico and Turkey. Inequality among working-age people has risen steadily since 1980, in total by 25%. In 2008, the average income of the top 10% of Americans was 114 000 USD, nearly 15 times higher than that of the bottom 10%, who had an average income of 7 800 USD. This is up from 12 to 1 in the mid 1990s, and 10 to 1 in the mid 1980s….Income taxes and cash benefits play a small role in redistributing income in the United States, reducing inequality by less than a fifth – in a typical OECD country, it is a quarter. Only in Korea, Chile and Switzerland is the effect still smaller.” Of course, comparing deciles is deceiving; as the Occupy Wall Street movement emphasizes, the concentration of wealth is in the top 1%, and economist and NY Times columnist Paul Krugman (“We are the 99.9%”, November 24, 2011) and others point out that most of that wealth in the US is in the top 0.1%! The wealthiest 400 families in the US own as much as the bottom 50% of the population.
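As a quick arithmetic check (mine, not the OECD’s), the quoted decile averages do reproduce the “nearly 15 times” figure:

    # Decile averages quoted in the OECD country note for the US, 2008.
    top_10_avg_usd, bottom_10_avg_usd = 114_000, 7_800

    ratio = top_10_avg_usd / bottom_10_avg_usd
    print(f"Top-to-bottom decile ratio: {ratio:.1f} to 1")  # 14.6 to 1
    # Compare the quoted historical ratios: 10 to 1 (mid-1980s), 12 to 1 (mid-1990s).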

 

One obvious result of the rising inequality in the US is the increase in the overt control that this wealthy class exerts over the political process, through direct lobbying, political contributions, employment after and between stints of government service, and control of media. The “corporate personhood” decision by the US Supreme Court in Citizens United simply codified and protected this inequality. But income inequality in itself is not sufficient to lead to the destruction of the social safety net that exposes increasing numbers and percentages of people to ravages that adversely affect their health. It also requires extreme selfishness and disrespect, so that billionaires and corporations pay little in tax, and governments are purposely squeezed so that they have neither the will nor the resources to provide services.

 

The findings of Bradley and Taylor are not news to the public health community, of course, which is very familiar with the social determinants of health and the positive impact that investment in basic social supports has on the health outcomes of both populations and individual people. Investment is required to see future benefit, and the investment that we need, and are not making, is in education, is in nutrition, is in housing. It is far more than a shame. It is shameful.  



[1] Bradley EH, Elkins BR, Herrin J, Elbel B. Health and social services expenditures: associations with health outcomes. BMJ Qual Saf. 2011 Oct;20(10):826-31. Epub 2011 Mar 29.

Saturday, December 10, 2011

GME funding must be targeted to Primary Care



Much of the cost of training physicians is currently borne by Medicare (and, to a lesser extent, Medicaid). This is known as Graduate Medical Education, or GME, funding, and it pays some, all, or more than all (depending upon the hospital and based upon a complicated formula discussed on May 25, 2009, Funding Graduate Medical Education) of the cost of training residents in the various specialties that comprise medicine. For those unfamiliar with medical education, graduation from medical school, while it confers the MD (or DO, Doctor of Osteopathy) degree and the title “doctor”, no longer permits practice in any of the US states. At least one year of residency (“GME”), and in some states two, is required for licensure, and most doctors complete an entire residency of 3 or more years to become eligible for certification as a specialist in a field (e.g., family medicine, general surgery, internal medicine, psychiatry, etc.). Fellowship training requires additional years beyond the core residency to become a sub-specialist – for example, those who complete an internal medicine residency can then do additional years to become a cardiologist, gastroenterologist, endocrinologist, etc.

Medicare augments its payments to institutions (usually hospitals, although there are a few consortia and federally-qualified health centers) with two types of payments: Direct GME, which is intended to pay residents’ salaries and the cost of teaching, and Indirect Medical Education (IME), which is for the additional costs that training hospitals bear for a variety of reasons. (In addition to the piece linked above, see also Training rural family doctors, Nov 5, 2010; PPACA, The New Health Reform Law: How will it affect the public's health and primary care?, Apr 22, 2010; Primary Care and Residency Expansion, Jan 7, 2010.) These payments have been the cornerstones of funding for residency education. Because the amount is tied to the percent of Medicare patients in a hospital, rather than the total number of patients cared for in hospitals or outpatient settings, it could be (and has been) argued that GME should be funded comprehensively and separately from Medicare. The most persuasive argument is that private insurers should also contribute to GME (they don’t, although Medicaid does in some, but not all, states). On the other side, many fear that uncoupling GME funds from Medicare would make it easier for a Congress looking for ways to cut the budget to cut GME than it would be if GME remained part of Medicare.

But this year, with exceptionally high pressure to cut the budget, even Medicare is not sacrosanct, although, as I have recently argued (Medicare: A lifeline, not a Ponzi scheme, Dec 2, 2011), most of the proposals to cut it across the board by tactics such as raising the age of eligibility are poorly conceived. So there are now proposals to cut the funding from Medicare for GME. Unsurprisingly, this has created great anxiety in the community of academic health centers, and the Association of American Medical Colleges (AAMC), which has strongly supported expansion of GME residency slots, is quite alarmed (Preserve Medicare support for physician training, revised Oct 2, 2011). The Accreditation Council for Graduate Medical Education (ACGME), which accredits institutions that sponsor residency programs and, through its subsidiary Review Committees (RCs), each individual specialty and subspecialty program, has done a study showing that cuts in residency positions have already occurred and that more major cuts are threatened if Medicare decreases its funding ("The Potential Impact of Reduction in Federal GME Funding in the United States: A Study of the Estimates of Designated Institutional Officials”). ACGME CEO Thomas Nasca, MD, is quoted by AAFP News Now as saying “We will actually reduce the number of physicians who are trained in the United States at a time when all workforce studies are demonstrating a mounting deficit of physicians….That will place us in a position where our physician-to-population ratio in 2020 and beyond is below (that of) most of the developed countries in the world." The study found that “With a 33 percent reduction in GME funding
  • 68.3 percent of responders said they would reduce the number of core residency positions,
  • 60.3 percent would reduce the number of subspecialty fellowship positions,
  • 4.3 percent would close all core residency programs, and
  • 7.8 percent would close all subspecialty programs.”

Because there are many more “core” residency positions than subspecialty fellowship positions, these would be disproportionately affected by across-the-board cuts. In addition, residency programs in primary care, which are not as profitable to the sponsoring institution, are even more likely to be cut despite the service that they provide to patients, especially those most in need. Perry Pugno, M.D., M.P.H., AAFP vice president for education, notes in that same article that "…any cuts to GME that go across the board are going to hurt primary care -- especially those of us who disproportionately take care of adults with chronic illnesses….In communities where primary care residency programs are present, those programs become the access point for the poor and disenfranchised of the area.” He says that it's not unusual for family medicine residency programs to see patients who live both in poverty and with numerous chronic illnesses. "The payment for taking care of those patients is so low that the local medical community often doesn't want to provide that care…But residency programs take all comers."

The key issue that Pugno is addressing is very important and is not usually made explicit in national policy discussions: our current method of allocating Medicare GME funds to institutions (hospitals), rather than to individual residency programs, tends to encourage the funding of positions in specialties that most profit those hospitals. The interests of the American people, in regard to the kinds of specialists they need, are not necessarily (and I would argue in fact are not) the same as the interests of the hospitals that sponsor residencies. Hospitals like to fund specialties whose trainees’ work enhances their revenue (e.g., cardiology fellows, who can increase the number of profitable procedures that are done) or at least decreases their losses (e.g., emergency medicine residents, who can fill gaps in seeing patients in emergency departments). Indeed, when hospitals can afford to, they often augment Medicare GME with their own funds to create more such positions. This is about their own financial interest, and does not take into account whether or not the US needs more cardiologists or ER docs, or more family physicians and general surgeons.

This contrast between the interests of the hospital (what kind of residency positions are most beneficial to its bottom line) and the needs of the population is, of course, a subset of a larger tension: we train doctors in highly-specialized tertiary care academic health centers, while they will mostly practice in the community. There are a number of reasons that this is not brought up more often. For the general lay public, including most members of Congress and their staffs, it seems like a subtle difference. For experts, such as the AAMC, the issue is that they represent the interests of the medical schools, and want those interests to be seen as also representing the interests of the US population. Of course, they do not always coincide, especially with the interests of the most rural, poor, minority and other underserved portions of that population.

I think we need to use every opportunity to make this issue clearer and more open. While it is probably true that it is a mistake to decrease federal funding for GME, it is absolutely necessary to increase the support for primary care and, in particular, family medicine. And this will only happen if GME funding is explicitly required to be spent on primary care programs, with rules that “prevent substitutions”.


Friday, December 2, 2011

Medicare: A lifeline, not a Ponzi scheme

In an earlier post (Medicare: We need to expand it, not cut it!, July 1, 2011), I commented on the proposals from politicians such as Wisconsin Representative Paul Ryan and Connecticut Senator Joseph Lieberman to limit Medicare. I quoted economists Austin Frakt and Aaron Carroll, as cited by Paul Krugman (“Medicare saves money”, NY Times, June 12, 2011), from their post on The Incidental Economist: “…right now Americans in their early 60s without health insurance routinely delay needed care, only to become very expensive Medicare recipients once they reach 65. This pattern would be even stronger and more destructive if Medicare eligibility were delayed.” It is a stupid idea, designed more to engender the political support of people who do not think the issue through than actually to save money.

There are other similar proposals to “fix” Medicare that fit the same pattern: they superficially seem to make sense, but are actually nonsense. One of the most popular is the idea that we exclude “wealthy” seniors from Medicare, or, at least, require them to make a significant financial contribution. This contribution could consist of premiums paid to Medicare that were tied to income (or wealth, more relevant for retired people but much harder to assess accurately) or co-payments for services, again tiered to income. This seems to make sense – why not? There are many well-to-do elderly; why should currently-working people, who are struggling to make ends meet, have to pay for their care?

One reason is that Medicare is an “entitlement” because these people have paid for it in advance through their taxes during their working lives. Some of this comes from the specific Medicare deduction taken from each of our paychecks, which supports only “Part A” (coverage for hospital care), and some from the general income tax revenue that pays for “Part B” (doctors) and “Part D” (drugs). People pay into these plans during their working lives, and draw the benefits when they need them, when they are older. This is, in principle, what “saving” is about, but it goes beyond an individual retirement plan to cover everyone. This is the nature of social insurance.
Governor Perry of Texas, a Republican candidate for the presidential nomination (perhaps, if we are lucky, soon to be a former candidate), called Medicare (and Social Security, vide infra) “Ponzi schemes”: “Perry: I think every program needs to stand the sunshine of righteous scrutiny. Whether it’s Social Security, whether it’s Medicaid, whether it’s Medicare. You’ve got $115 trillion worth of unfunded liability in those three. They’re bankrupt. They’re a Ponzi scheme.” They are not. A “Ponzi” scheme involves taking one person’s property (money) and using it to pay off previous investors, who are seeking to make money on their investments. Medicare and Social Security are social insurance programs where the benefit is understood to be care (in the case of Medicare) or [minimal] income (in the case of Social Security). The entire beauty of both of these programs is that they involve everyone. Thus the well-to-do, as well as the poor and the people in the middle, have a stake in keeping the programs running and effective.
If we were to exclude certain sectors of the population from receiving benefits from either of these programs, it would undermine the collective investment that we as a society have in each other. The better off, better educated, and more empowered now fight for these programs because they are beneficiaries, and this keeps the programs in place for those who are not so privileged. It is probably this very sense of mutual interdependence that makes ideological conservatives oppose them, but such opposition is short-sighted. The reason for having social insurance programs that make us interdependent is that we are interdependent. Society, in the US (and, arguably, worldwide), requires not only healthy, educated, productive workers but also consumers who are able to purchase goods and services. Billionaires like Warren Buffett call for higher taxes on the wealthy (an idea picked up on by President Obama) because they understand that a prosperous society requires contributions from everyone. We ARE in it together.
If we were to exclude only the very wealthy from benefits under these programs (say, the top 1%), it would not hurt them financially, but it would hurt the rest of us, because these very powerful people would no longer have a personal stake in supporting such programs. And, of course, it would save essentially no money; the corollary of the enormous concentration of wealth in a small number of people is that there are not very many of them. Thus, if they never drew a single dollar of benefit from Medicare (or Social Security), the programs would not be any better off. In order to save money, we would have to exclude a lot of people beyond the very wealthy (10%? 20%? 30%? of the population), and that would exclude a large section of the population and truly reduce support.
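To see why, consider a deliberately simplified sketch. The numbers below are round, hypothetical figures chosen only for illustration (they are not actual Medicare data); the point is the structure of the arithmetic, which holds whatever the real values are:

    # Hypothetical, round numbers for illustration only -- NOT actual Medicare data.
    beneficiaries = 48_000_000      # assumed number of beneficiaries
    avg_benefit_usd = 10_000        # assumed average annual benefit per person

    total_spending = beneficiaries * avg_benefit_usd
    excluded_share = 0.01           # exclude the wealthiest 1% of beneficiaries
    # Upper bound on savings: assumes the excluded 1% would have drawn
    # average benefits, since health benefits are drawn roughly per person.
    max_savings = total_spending * excluded_share

    print(f"Total annual spending: ${total_spending / 1e9:.0f} billion")
    print(f"Maximum savings:       ${max_savings / 1e9:.1f} billion "
          f"({excluded_share:.0%} of spending)")

Because benefits are drawn per person rather than in proportion to wealth, excluding the top 1% can never save more than about 1% of spending; meaningful savings would require excluding the broad swath of the population described above.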

More recently, Jane Gross writes in the NY Times about “How Medicare fails the elderly” (October 16, 2011). Her emphasis is not on excluding people from coverage, but rather on not covering services that do not enhance, and often decrease, recipients’ quality of life. Medicare pays for many services that fall into this area, and the reason rarely has to do with the desires of the patients themselves. “Of course, some may actually want everything medical science has to offer. But overwhelmingly, I’ve concluded in a decade of studying America’s elderly, it is fee-for-service doctors and Big Pharma who stand to gain the most, and adult children, with too much emotion and too little information, driving those decisions.” Among the treatments that she notes Medicare pays for but are usually not medically indicated (especially in the old, debilitated, and demented) are feeding tubes, many forms of surgery (particularly abdominal and joint replacement), and “tight” control of Type II diabetes. All of these treatments have high risks and rarely prolong life while significantly decreasing its quality.

Gross notes that when these complications arise, patients often need long-term, very expensive care (she cites costs for her mother 8 years ago of $14,000 a month!) in nursing homes, which Medicare does NOT pay for. Medicaid will, but only after the senior has exhausted all their resources (savings, house, etc.), and then only in some nursing homes that are willing to take Medicaid reimbursement, which are often not those of highest quality. Thus, by paying for the performance of procedures that do not help, Medicare leads patients into worse quality of life at high cost.

Clearly, the motivations of the drug and device makers, hospitals and physicians and nursing homes are often (in some cases usually or always) financial, but this is not the case for the family members, who mostly want to “do the best” for their parent or relative. However, given unclear guidance by their physicians, or incorrect information from any source, they may associate “doing something” with “doing the best thing”; often “doing the best thing” is not doing “something”. If Medicare did not pay for unnecessary and potentially harmful procedures, there would be little motivation among providers to do them, and it would not only save money but, more important, improve the health care and preserve the dignity and quality of life of people in their last years.
There is a solution to the potential bankrupting of Medicare. One: pay only for medically necessary and indicated services. Two: revise the Medicare fee schedule to maintain the payment for primary care services but decrease excessive payment for high-cost specialty services. Three: expand Medicare to include everyone. Then we all have a stake, right now.

Friday, November 25, 2011

Veterans Day, the “Bonus Army”, and honoring veterans by actions, not words


We recently celebrated Veterans Day, an opportunity to honor the men and women who have served the rest of us, putting their lives on the line, in the wars that our nation has fought. It was a numerologically special day this year: November 11, 2011, or 11/11/11. While I have opposed almost all of the wars fought in my lifetime, as stupid and often motivated by the same greed on the part of the wealthiest that so clearly determines the behavior of our nation, I have only admiration and respect for those who put their lives on the line. The history of the world is often the history of wars, usually one more senseless than the last, and it is the history of the regular people who serve, and are killed, or wounded, or mutilated, or survive apparently intact.

Veterans Day began as Armistice Day, with the signing of the peace after WW I, the model of a brutal war that slaughtered millions for no good reason. I live in Kansas City, home of the nation’s WW I Museum, and it is a must-see for anyone who has not studied this first modern war, with millions of soldiers dying in trenches, with the first large-scale wartime use of airplanes, with poison gas, with all the other viciousness that people were able to devise. There are some who prefer the name “Armistice Day” because it signifies “peace”; I am willing to celebrate our veterans without celebrating, or even condoning, the wars that took the lives of so many of their comrades.

We have not always honored veterans, and we do not do so now. “Honored” in words, sure; honored in deeds, in providing services for them to re-integrate into civilian society and find jobs, even to provide the health care that they need to treat the wounds, physical and mental, that they suffered in battle, not so much. Perhaps the most ignominious and dishonorable treatment of veterans was the attack on the “Bonus Army” of 1932. In 1924, Congress had issued “bonus certificates” to these veterans, but there was a catch – they were not redeemable until 1945. This was not of much help to the men who had “won the war” but were suffering unemployment during the depths of the Great Depression. Over 43,000 people, as many as 20,000 veterans plus members of their families, were camped in Washington DC parks, to demand payment of these bonuses. (It is of interest that President Coolidge had vetoed the bonuses in 1924 with the statement that "patriotism... bought and paid for is not patriotism," before Congress overrode his veto!) Tiring of all these dirty and ragtag families camped on public property (and, of course, the reminder that they brought of the broken promise), on July 28, 1932, President Hoover sent the army to break up the encampment and rout them.

That is correct. The President of the United States sent active duty army troops, under the command of General Douglas MacArthur and assisted by Majors Dwight Eisenhower and George Patton, to attack its own veterans.  You didn’t learn this in school? Maybe it wasn’t really that important. Right. It happened. From Wikipedia:

“At 4:45 p.m., commanded by Gen. Douglas MacArthur, the 12th Infantry Regiment, Fort Howard, Maryland, and the 3rd Cavalry Regiment, supported by six battle tanks commanded by Maj. George S. Patton, formed in Pennsylvania Avenue while thousands of civil service employees left work to line the street and watch. The Bonus Marchers, believing the troops were marching in their honor, cheered the troops until Patton ordered the cavalry to charge them—an action which prompted the spectators to yell, "Shame! Shame!"

After the cavalry charged, the infantry, with fixed bayonets and adamsite gas, an arsenical vomiting agent, entered the camps, evicting veterans, families, and camp followers. The veterans fled across the Anacostia River to their largest camp and President Hoover ordered the assault stopped. However Gen. MacArthur, feeling the Bonus March was a Communist attempt to overthrow the U.S. government, ignored the President and ordered a new attack. Fifty-five veterans were injured and 135 arrested….During the military operation, Major Dwight D. Eisenhower, later President of the United States, served as one of MacArthur's junior aides. Believing it wrong for the Army's highest-ranking officer to lead an action against fellow American war veterans, he strongly advised MacArthur against taking any public role: "I told that dumb son-of-a-bitch not to go down there," he said later. "I told him it was no place for the Chief of Staff." Despite his misgivings, Eisenhower later wrote the Army's official incident report which endorsed MacArthur's conduct.”

That’s right. They used poison gas on WW I veterans, many of whom were suffering the effects of gas attacks during the war. Eisenhower, who may look like the “good guy”, was mainly concerned about the seemliness of the army’s Chief of Staff (MacArthur) leading the attack on Anacostia, not the attack itself.

The country was in a Depression. The more than $3 billion that was owed these veterans was a lot of money for the government during the Depression. Not a good reason not to pay it. Just as it is not a good reason for us to cut back benefits for veterans today, in our own “recession”. In 1930, the Veterans Administration was created, combining several “veterans’ homes” and hospitals. After WW II, when the bonus certificates would have come due for the WW I veterans, the GI Bill was passed, granting veterans the opportunity to get needed benefits, including an education delayed by the war. These benefits are regularly eroded by Congressmen who give fine speeches on November 11 but care about the actual people who fought our wars about as much as Presidents Hoover and Coolidge did. In fact, President Coolidge’s statement that “patriotism” justified not paying the bonuses would never be uttered by a current-day politician; but neither would we see the action of the Congress that overrode his veto. We do not have enough money in the US, the story goes. We need to work down the deficit. By taking the money from the most needy, from the poor and the working class and the middle class, including our veterans; certainly not from the wealthiest.

The deficit was created by politicians doing the bidding of the <0.1% of the population who control most of our wealth, cutting their taxes to increase their wealth. And, oh yes, by fighting two wars, in Iraq and Afghanistan, killing and maiming and creating new veterans who can barely get the help that they need. And, of course, ensuring that the 0.1% have every dollar that they lost for us replaced (replaced to them, not to us; we pay the bill), and more, is far more important than providing health services and education and jobs for the veterans, or for anyone else.

We would (I think) not send the Army to attack a veterans’ encampment today, but who knows? The people who fought WW I were honored in their day as heroes as much as or more than our current veterans, and yet our President sent the Army to attack them with cavalry, tanks, and poison gas. Recent history shows us there is no depth of calumny and duplicity to which defenders of the status quo will not go to achieve their ends; remember the military history of #1 hawk Richard Cheney (he had none; he was doing “more important” things during the Vietnam war). Remember the defeat of Senator Max Cleland of Georgia by an opponent who questioned his patriotism and toughness because the Senator had raised questions about the war in Iraq? Sen. Cleland was a decorated Vietnam veteran who had lost both legs and an arm in that war; his opponent had not served.

And, unlike after WW I or WW II, without a draft, and with a large group of young people who can find no other jobs, most of us are no longer involved in paying the human price of war. This is the focus of “As Fewer Americans Serve, Growing Gap Is Found Between Civilians and Military” by Sabrina Tavernise in the NY Times, November 25, 2011. “‘What we have is an armed services that’s at war and a public that’s not very engaged,’ said Paul Taylor, executive vice president of the Pew Research Center. ‘Typically when our nation is at war, it’s a front-burner issue for the public. But with these post-9/11 wars, which are now past the 10-year mark, the public has been paying less and less attention.’”

This separation means that, while politicians laud their service on Veterans Day, the actual veterans, after serving and suffering real wounds both physical and mental, are returning to a society that has no jobs for them and is investing less and less in their care. What we need to see is more action on behalf of veterans, and on behalf of the American people. Instead, what we see from too many of our hypocritical Congressmen and “leaders”, who sing the praises of our veterans while cutting their benefits, are actions that would make Calvin Coolidge proud.

Tuesday, November 15, 2011

Troubled hospitals, troubled health care system: Not just in Brooklyn




In Seeking a Cure for Troubled Hospitals in Brooklyn, NY Times, November 10, 2011, Nina Bernstein reports on the challenges faced by not-for-profit hospitals in that part of New York City. In 1980, she notes, Brooklyn had 26 hospitals, while now it has 15. It has 41% fewer acute-care beds, with a ratio of 2.1/1000 people (national average: 2.6, NY State 3.1, Manhattan 4.7). Five of the largest remaining hospitals are in danger of closing; these hospitals account for 83,000 admissions, 325,000 emergency room visits, and 760,000 clinic visits per year. There is no way, the article makes clear, that the 3 public (2 city and 1 state) hospitals in the borough can come close to making up this deficit should those hospitals close. But they may, because they are running in the red, and there is no reason to think that, even if President Obama’s Health Reform stays intact, this will change. They largely care for Medicaid patients, and Medicaid both doesn’t pay enough to cover a hospital’s costs, and is targeted for cuts because it accounts for such a large portion of state budgets.

The reason is that these hospitals care for poor people, as the original title of the article in the Times’ print edition, “Brooklyn’s ailing hospitals and care for the poor”, made clear. The problem, however, is not unique to Brooklyn; it confronts hospitals all over the country. “Brooklyn shows the acute stage of a problem that has vexed the nation for years: how to sustain delivery of major medical care to the poor.” Even more, given that increasing portions of the population are uninsured or poorly insured, and that the focus of the federal deficit-reduction process is to further cut payments for Medicare as well as Medicaid, the trend is likely to continue and to increase. From the point of view of hospitals, the issue is whether they will survive or not, largely dependent upon where they are located and their ability to attract the decreasing number of well-insured patients. While those who run successful hospitals like to congratulate themselves on being such good managers, the article notes the observation of Alan Sager of Boston University, a long-time student of hospital closings across the country, that “what best predicted that a hospital would be closed was not inefficiency, but location in a minority neighborhood.”

This is not at all surprising; indeed, it tracks with everything else that has been going on in our society: services for the most needy are cut back and ultimately disappear, while services for the least needy become more and more available, marketed, extensive (and expensive) as providers of those services seek to make themselves attractive to a shrinking, privileged market. The problem is that for this to be OK, one has to accept the idea that healthcare access should be determined by the market, rather than available to everyone in the society. This means that hospitals will close, and providers will not practice, in areas that have high concentrations of people who are poor, uninsured, underinsured, and members of minority groups. But those hospitals that do survive, in higher-income neighborhoods, will compete in the areas that are high-profit “product lines” so that they, and not their competitors, will attract that market segment. Such product lines can include elective and cosmetic surgery, but they also include areas such as heart disease and cancer care, because payers (driven by the federal payer, Medicare) reimburse hospitals at rates far above their costs for providing care for these conditions, but not for others. Thus capacity is overbuilt, resulting in more capacity (for example, for cancer treatment for the insured) than the population needs, because each hospital wants to be the one that makes the big markup on chemotherapy drugs.

But, of course, there is much less access to care for the same conditions for people without insurance, or even for those whose conditions are not in the “high profit” group. And the lack of access to preventive care, to primary care, to care of conditions in their more treatable stages, means that the people who enter the “ailing hospitals” of Brooklyn or elsewhere are farther along in their diseases and more expensive to treat, so that caring for them drives the hospitals deeper into debt. And this creates a downward spiral. For a patient described in the article, “Surgery revealed a strangulated hernia so far gone that cutting out life-threatening infected tissue left an open wound…but before Mr. Hutchins could be released, the hospital had to get him a portable wound pump. At hospitals that pay suppliers promptly, administrators say, the device typically gets same-day delivery. At Wyckoff, it took a week.” And, since “…last year, Medicaid cut by 31 percent what it would pay for a case like his,” the hospital loses even more money providing his care for an extra week.

In poor neighborhoods, almost all services have more limited availability. This may make sense, say, for upscale restaurants, or clothing stores. It is much more problematic when those communities do not have food stores. Or healthcare. It is, however, the result of applying a competitive market model to healthcare, leading to overcapacity for a portion of the population and a deficit or absence of care for another part of the population (based on wealth, location, and type of condition). This is why most other countries with the resources have made the decision to provide access to health care to all their people, rather than ration based on the market, which by definition leaves out the people at the bottom who cannot pay. Our healthcare nonsystem reinforces these inequities, which are more than unfair: they sap the ability of our country to have a healthy and productive workforce.

There are solutions, but not the ones being suggested by some for Brooklyn (“…expunge the hospitals’ debt of more than $1 billion, partly at taxpayer expense, and then let large for-profit companies take over the facilities and restructure patients’ care”), which sounds an awful lot like “bail out the bankers and financiers with public funds”. The solution is to create a national health system, a system which guarantees healthcare access for everyone; most cost-effectively, a single-payer system. It can be done, for not much more than we now spend, because of the excess waste and profit built into our reimbursement methodology. It can be driven by the federal government, because the federal government is the largest payer for health care.

In an article on the reopening of the national physician database (After protests, national doctor database reopens — with a catch), Alan Bavley of the Kansas City Star quotes Senator Chuck Grassley, an Iowa Republican, as saying “This agency needs to remember that half of all health care dollars in the United States comes from taxpayers, so the interpretation of the law ought to be for public benefit.” That half of all healthcare dollars is as much, on a per-capita basis, as most other OECD countries spend altogether, and it is what drives reimbursement (for cancer chemotherapy or diabetes or asthma) in this country. It would be great if Sen. Grassley would take the lead in ensuring that not only the physician database, but all of healthcare services provided with dollars from taxpayers, is “for public benefit” and not private profit.

The general counsel for one of the threatened Brooklyn hospitals is quoted by Bernstein as saying, “We stay open at the grace and generosity of our vendors. They know it will eventually get better, because we have to have hospitals. Otherwise, we’ll have sick and dying people lying in the streets, and nobody wants that.” But the solution is not just to patch up Brooklyn’s, or anywhere else’s, acute problems; it is to fix the broken system and its perverted incentives.