Gout: an unwelcome and surprising visitor

April 12th, 2016

A few months ago I rather suddenly developed an acute, painful swelling at the base of my left big toe. "I think I'm having an attack of gout, but why me?" I wondered.

Of course this had to happen on a weekend, when the Family Practice Clinic I'd usually go to was closed, as was my Podiatrist's office.

I wondered if I could have septic arthritis, an infected joint, but had no obvious reason for this much more frightening diagnosis. I remembered there was also an entity called pseudogout, where the crystals were calcium pyrophosphate, not uric acid.

So I went to the Urgent Care Center our local hospital established a few blocks away from my house. A Family Practice physician examined me and said, "I think you have gout, but you'll need to go to the hospital's Emergency Department (ED) so they can get some fluid from your joint and decide if it's really gout. We don't do that test here." She was also concerned about the rather slim possibility of septic arthritis.

At the ED I was triaged as someone who could wait a while and eventually seen by a Physician's Assistant (PA) who said, "I think you have gout!"

Now I trained at Duke, the home of the original PA program, and one of the second generation of PAs taught me how to do dialysis in the ICU for patients with acute kidney failure, so I have no problem at all being seen by a PA.

"What do we do to make sure it's gout and not septic arthritis or pseudogout?" I asked.

"I really don't think you have an infected joint. We'd have to 'tap your toe' (i.e., aspirate some fluid from the joint) to make sure it's gout, but that's a tricky procedure, not one that I would do. You should see a podiatrist. In the meantime, take big-dose Ibuprofen."

I had a bottle of that at home and knew I could take 800 milligrams three times a day as long as I ate a sandwich or a meal first. I wasn't going to bother the on-call Podiatrist that weekend unless the pills didn't help.

When I did see my regular Podiatrist, two days later, the clinical signs of acute inflammation that I had learned in Latin and English during my first year of medical school: rubor (redness), calor (increased heat), tumor (swelling), and dolor (pain), had all diminished markedly.

"I don't stick joints unless I have to," said my highly experienced Podiatrist. "The PA you saw at the hospital was right; it's not something we do routinely. You're better and this was a classical gout attack, so keep taking the NSAID for a few more days."

When I was in medical school, half a century ago, I thought of gout as a kind of acute arthritis that affected corpulent, older men who ate too much red meat and drank port wine. They then deposited uric acid crystals in joints and sometimes developed a chronic form of the disease called tophaceous gout, where nodular masses of the crystals (tophi) are deposited in different soft tissue areas of the body. Tophi are most commonly found as hard nodules around the fingers, at the tips of the elbows, and around the big toe, but they can appear anywhere in the body: in the ears, on the vocal cords, or even against the spinal cord! As a Nephrologist, I was also aware that some people develop uric acid kidney stones.

Here's a link to an article with photos of acute gout in a toe and of Henry VIII, who suffered from the disease: http://www.dailymail.co.uk/health/article-2210797/Disease-kings-rise-people-gout-increase-obesity.html

I didn't fit the image I had of someone who'd have a gouty attack. I wasn't overweight, didn't drink much (I usually have a glass of wine or whiskey three times a week), didn't have a family history of gout and hadn't over-indulged in the high-purine foods that can increase uric acid levels.

Purines are natural substances found in all of the body's cells, and in virtually all foods. A relatively small number of foods, however, contain concentrated amounts of purines. For the most part, these high-purine foods are also high-protein foods, and they include organ meats like kidney, fish like mackerel, herring, sardines and mussels, and also yeast.

When cells die and get recycled, the purines in their genetic material also get broken down. Uric acid is the chemical formed when purines have been broken down completely. Low-purine diets are often used to help treat severe gout in which excessive uric acid is deposited in the tissues of the body. Purines from meat and fish clearly increase our risk of gout, while purines from vegetables fail to change our risk. Dairy foods (which can contain purines) actually appear to lower our risk of gout.

I had been taking a baby aspirin a day for reasons that were "iffy," and some recent research had implicated even low-dose aspirin as a possible risk factor for gout. Here's a link to that summary online: http://www.ncbi.nlm.nih.gov/pubmed/23345599. So I stopped taking aspirin.

I had no personal history of heart disease, although my brother, who had many risk factors, died at 57 of a heart attack and my mother had one at age 74. Aspirin for secondary prevention of cardiovascular events seems to make sense to me (always discuss this with your own physician before you start taking long-term aspirin or any other drug). But primary prevention, that is, taking aspirin if you haven't had a heart attack or angina, is a different matter, one that is being hashed over in the medical literature.

I found an article on pseudogout on the WebMD website http://www.webmd.com/osteoarthritis/arthritis-pseudogout and wondered if that's what I had had an episode of. Time would tell.

Then recently I flew to the DC area to visit family and friends, but mostly to hear and see Jordi, my sixteen-year-old grandson, who had the male lead role in "High School Musical" at HB Woodlawn High.

I normally drink three very large glasses of water a day and my family had purchased the limes I squeeze into the water (It tastes better, so I drink more.) But on the flight home I didn't drink much at all and I was aware of pain in my right big toe as I came off the plane.

I woke up at 4:15 a.m. the next morning with fairly severe pain in that toe, took 800 milligrams of Ibuprofen after eating a thick slice of bread abundantly smeared with cream cheese and jam, and went back to sleep.

I was fortunate; my Podiatrist had an appointment cancellation the next morning and he said, "Clinically this is gout. I'm going to give you a 'script for another NSAID that I find works better for acute gouty arthritis. And drink lots of fluids."

I went to the pharmacy and got a bottle of indomethacin, an NSAID I hadn't used for many years. I had asked him if he wanted me to get a uric acid level blood test and he said, "You can have gout without having an elevated uric acid, but if you do have one, we can treat it. I think the time to get blood tests is after the acute attack resolves."

I was aware that another old drug used in gout, Allopurinol, was still around, but had to look up its mechanism of action. What I found was that Allopurinol reduces the production of uric acid in the body, so if I did have an elevated blood level of uric acid, it could potentially be used to prevent me from having gouty attacks or forming uric acid kidney stones.

I hadn't been aware that an elevated uric acid level might go down during an acute gouty attack; I visualized uric acid crystals migrating to my big toe, but was unsure if that's what the Podiatrist meant.

I'd be very happy if the next episode waits a long time to happen, but I'm not betting on that being the case.


Hospice: costs and your own choices

February 26th, 2016

The Wall Street Journal had a February 19th front-page article on how much Hospice is costing Medicare, with the emphasis on those patients who go into home-based Hospice care and stay there for prolonged periods. While many of that group have Alzheimer's or other dementing diseases, others have chronic obstructive pulmonary disease (COPD, e.g., emphysema), heart failure or cancer. The average time spent on Hospice in 2013 was 93.2 days for people with Alzheimer's and similar dementias, and that group consumed 22% of the total Medicare spent on end-of-life (EOL) care that year.

The statistics in the article were striking: over an eight-year period (2005-2013), 107,000 patients had been on Hospice for much longer, close to a thousand days. That's a very small subset of all those who had sought out Hospice care during that timeframe, 1.3%, but that relatively small number of patients cost Medicare a huge amount, 14% of its total dollars spent on Hospice in that timeframe.

The program itself was originally set up for those whom physicians could certify were terminal, i.e., within six months of dying. The overall Medicare Hospice expense total in 2013 was roughly $15 billion, and the WSJ's data references a study stating that that care, at least for those who did not have cancer, actually cost Medicare 19% more than care for similar patients who did not seek out Hospice.

That's especially true for those who have dementia, but other chronic conditions (where the time to death is less predictable than it often is for patients with cancer) clearly play a role also.

As I was not surprised to find, there's often another side to the picture. I have had two examples in my extended family: one involving Paul, my son-in-law, and the other Lynnette, my wife. My son-in-law's father had Lyme disease with significant brain involvement. He went from being a distinguished systems engineer to needing help with many of the activities of daily life. It was a long time before his family was able to get him on home Hospice; when they finally did, my son-in-law said to me, "It was the difference between day and night." The family really appreciated the care given to him during his final weeks.

Recently my wife had a major stroke and lived one week, one hour and seven minutes after the onset of the bleed into her brain. I had seen her stumble in our bedroom, realized she was unable to get up, carried her to our bed and called 911. We went by ambulance to a local hospital, where a CT scan showed a considerable area of bleeding in her brain. She was immediately moved to a bigger hospital where a neurosurgeon was available. He ordered a second CT scan which showed even more bleeding. All the things he could do would come with consequences she had always said she would not want to live with.

I was her health care surrogate and carried out her oft-expressed wishes, rejecting the surgery, so she went, several days later, to Hospice at a third hospital. The care there was exemplary, but this was clearly to be short-term, inpatient Hospice.

So when I read the WSJ article, I did so with a jaundiced eye, not from the viewpoint of a physician or a taxpayer, but from that of a family member who had experienced the positive side of Hospice.

The issue for me, as it was for my wife, is quality of life. That's a hard subject for many of us to discuss, but I think it's crucial. We had had those talks repeatedly over the past five or six years and both of us knew what the other wanted.

I don't have an answer for others, but I urge you to think about the subject and talk about it with your spouse, significant other, and children: whoever might be asked to make a choice for you if you are unable to make it yourself.

There was another aspect we hadn't fully explored. Neither my daughter, who would have been the one to make choices for me if my wife was unable to do so, nor a nephew, who would have been my wife's secondary decision maker, had been party to our discussions.

So what would have happened if Lynn and I had been in a car crash, had both been severely injured and our alternate health care surrogates had to make those tough choices?

I don't know the answer, but it's clear to me now that EOL discussions should involve more than you and your spouse or significant other.

It's also clear that neither Lynn nor I would have wanted prolonged home Hospice, but would have (I did at least) really appreciated its availability for relatively short-term care, whether in our home or, as it turned out in a hospital setting. You may or may not feel the same way.

Please think of having those talks now with whomever in your family might be called upon to choose for you.


Even the best of us...smallpox, anthrax, influenza and the CDC

July 16th, 2014
This is our premier laboratory

The Centers for Disease Control and Prevention, AKA the CDC, America's central medical laboratory, has recently had multiple problematic episodes. I was trying to follow up on the vials of smallpox virus that were found in an old refrigerator the FDA apparently had forgotten. The question, of course, was whether the virus samples were long dead or still viable. They had been sent to the CDC to have that highly significant issue resolved.

Since then there has been a follow-up announcement, but also several articles on significant issues with procedures and safety at the CDC itself. The first was published in The New York Times, AKA NYT (as well as in other papers, but I get the NYT daily on my iPad, so I saw it there first). The startling title was "C.D.C. Closes Anthrax and Flu Labs after Accidents." The current director of the CDC, Dr. Thomas Frieden, called the lab/agency "the reference laboratory to the world," but admitted there had been a series of accidents (actually lapses in set safety procedures) in the recent past that were quite frightening.

A month ago potentially infectious samples of anthrax, a bacterium found naturally in soil that commonly affects wild and domesticated animals worldwide and causes an extremely serious but rare illness in people, were sent to labs that were not equipped to deal with them (anthrax would normally be handled only with the highest level of protective biosafety gear and procedures, BSL-4). The CDC also has a rather simplistic YouTube video discussing anthrax's use as a potential bioterrorism weapon, but in this case 62 or more CDC employees were potentially exposed to the bacterium in the course of their work.

The good news is it appeared nobody was in danger; all those employees were given the anthrax vaccine and also begun on antibiotics. The background information available online says there has never been person to person spread of the disease.

It appears that it's exceedingly tough to get rid of anthrax in the environment; I'll go over the classic historical example of how careful government researchers have been with its spores.

In the 1940s, British scientists used a small Scottish island (Gruinard) for germ warfare research. That island, thoroughly contaminated with anthrax spores, remained off-limits for forty+ years before extraordinary efforts, begun in 1986, rendered it safe for ordinary use. The surface of the island was only 484 acres; it was sprayed with a herbicide, then all dead vegetation was burned off. Next 200 tons of formaldehyde solution was diluted in 2,000 tons of seawater and sprayed over the entire island. Perforated tubing was used to ensure that 50 liters of solution were applied to every square meter being treated.
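As a back-of-the-envelope check (my own arithmetic, not from the source, and assuming dilute aqueous formaldehyde weighs roughly a metric ton per 1,000 liters, close to water), the quoted figures imply the full 50-liters-per-square-meter dose covered about 44,000 square meters, only a small fraction of the 484-acre island, which suggests that rate was applied to the contaminated zones rather than literally every square meter of ground:

```python
# Rough cross-check of the Gruinard decontamination figures quoted above.
# Assumption: ~1,000 L of dilute formaldehyde solution per metric ton.
LITERS_PER_TON = 1_000
SQ_M_PER_ACRE = 4_046.86

solution_tons = 200 + 2_000               # formaldehyde + seawater, in tons
solution_liters = solution_tons * LITERS_PER_TON

application_rate = 50                     # liters per square meter treated
treated_sq_m = solution_liters / application_rate

island_sq_m = 484 * SQ_M_PER_ACRE
treated_fraction = treated_sq_m / island_sq_m

print(f"Solution volume: {solution_liters:,.0f} L")
print(f"Area treated at 50 L/m^2: {treated_sq_m:,.0f} m^2")
print(f"Fraction of the 484-acre island: {treated_fraction:.1%}")
```

The arithmetic is consistent with the follow-up soil sampling described below, which focused on specific areas rather than the whole island.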

Later the effectiveness of the decontamination process was assessed by taking two duplicate sets of soil samples. Each was tested at two major government labs. Anthrax spores were detected only in "small quantities in a few places." These specific areas were treated in July 1987, followed by further soil sampling in October 1987. No further traces of anthrax spores were found.

Blood samples from local rabbits were also tested for anthrax antibodies. No such antibodies were found.

Following these measures, a farmer grazed part of his flock of sheep on the island for six months. The sheep were inspected monthly by the District Veterinary Officer, and returned to the mainland in October 1987 in excellent condition.

On April 24, 1990, 4 years after the decontamination works had been completed, a Defense Minister visited the island and removed the safety signs, indicating that the island had finally been rendered safe. Then, per agreement, the island was sold back to the heirs of the original owner for the WWII sale price of £500.

But a senior British archeologist said he still wouldn't set foot on the island; he was concerned because of potentially infectious particles found in some of his digs.

Yet another NYT piece, "Ticking Viral Bombs, Left in Boxes," this one written by a distinguished physician, Lawrence K. Altman, M.D., recalls the irony of the outcry for mass smallpox vaccination of the entire U.S. population after 9/11 (when no Iraqi supply of the deadly virus was ever located), contrasted with the recent finding of six vials, two containing live smallpox virus, in Bethesda, almost within "spitting distance" of our center of government.

In 2011 the Birmingham Mail reviewed a tragic lab accident which led to the last known smallpox death. The city, now England's second largest, was the site of a medical research laboratory associated with the local medical school. Viral particles got into an air duct, and a photographer whose studio was one story up from the lab became the last known case of active smallpox and died from the disease in spite of having been vaccinated twelve years before.

Dr. Altman discusses the pros and cons of eradicating the last two known stocks of the virus, one at the CDC, the other in a Russian lab in Siberia. Even if the natural virus is finally and totally eliminated, a rogue group may well be able to re-establish its own supply from the known genetic sequence of smallpox.

Lastly I saw a NYT article with an even more disturbing title, "After Lapses, C.D.C. Admits a Lax Culture at Labs." CDC workers had somehow shipped a dangerous strain of avian influenza to a poultry research lab run by the Department of Agriculture. Known as H5N1, the virus had killed more than half of the 650 people who had been infected with it since 2003. Again there were no deaths from this mistake.

After all of this recent furor, plus the historical examples, I'm heartily in favor of the idea that's been broached: such dangerous organisms should be confined to a minimal number of labs, and even those clearly need to tighten up their standards.


Smallpox: vials found in NIH lab

July 9th, 2014

I was glancing through The Wall Street Journal. this morning (that period is intentional, as I found out recently in their 150th anniversary issue) and saw an article about smallpox, that old enemy of mankind. The CDC issued a media statement saying six vials labeled with the technical name of the disease, variola, had been found in an old storage room belonging to an FDA lab on the NIH campus in Bethesda, Maryland. Forty-two years ago the FDA took over that lab, among others, and only now were those labs being moved to the main FDA location in the DC area. The vials themselves date back ~60 years and will now be tested to see if the material in them is viable (i.e., live smallpox virus).

I reviewed the CDC's Biosafety Levels; they range from 1 to 4, with the more serious infectious agents occupying the higher levels. A BSL-3 agent can cause serious or deadly disease, but either doesn't spread from person to person (at least not easily) and/or has a known preventive or treatment. Plague, rabies virus, and West Nile fit into this category. Smallpox is obviously a BSL-4 bug, the most dangerous kind, in the company of Ebola virus.

A February 15, 2012 Reuters article, "How secure are labs handling the world's deadliest pathogens?" talked about the precautions used in such a lab in Galveston, Texas. The boss there gained entry by swiping a key card and was scanned by 100+ closed-circuit cameras as he opened seven additional locked doors before he reached the lab, where another card swipe and a fingerprint scan were necessary for entry. The Washington Post article on the recently found vials has a six-minute video on BSL-4 procedures, with a comment that there are three overlapping types of safety precautions: those for containment of the hazardous material, those for personal protection, and overall administrative factors.

And this may get you into BSL-3

The air flow and exhaust systems used in Galveston, the full-body suits with their own air supply, and the intruder drills that are conducted all made me somewhat more comfortable. But that's in a government-funded laboratory. Even in the United States, a privately funded lab may not be subject to the same rules and regulations. Elsewhere, the procedures that must be followed vary. In 2011 there were 24 known BSL-4 labs in the world, with six in the U.S. (the GAO said we had considerably more). In 2013 there was considerable protest in Boston over the proposed BSL-3 and BSL-4 lab there.

We don't see these anymore.

I've written about smallpox before, but a brief history, available online on a dermatology website, was worth reviewing. The disease likely originated in Africa about 12,000 years ago, caused a major epidemic during an Egyptian-Hittite war in 1350 B.C.E., and left typical scars on the mummy of Pharaoh Ramses V, who died in 1157 B.C.E. It got to Europe somewhere between the 5th and 7th centuries C.E.; millions died in Europe and the Western Hemisphere before Edward Jenner developed vaccination in 1796. The term came from the Latin word for cow (vacca), as Jenner used fluid from a cowpox-infected dairymaid's hand to inoculate an eight-year-old boy. In 1967 the WHO estimated there were 15 million cases of smallpox and 2 million deaths from the disease. Total smallpox deaths over the past 12,000 years have been estimated at 300-500 million, more than all the world wars combined.

By 1980 the World Health Organization declared the globe smallpox-free. In this country, we quit vaccinating the general population in 1972 and stopped giving the inoculation to new military personnel in 1990.

My wife's old shot record shows she got her first vaccination against smallpox in 1956 and the last booster in 1980. We were both assigned to bases in the Far East in the early and mid '80s. I can't find my military vaccination record from that time frame, but logically wouldn't have had a booster after 1986, when I got back to a base in Texas. Since immunity is unlikely to last more than ten years, at this stage we'd both be vulnerable to smallpox, like most everyone else.

The only known supplies of the virus remain in government laboratories in the United States and Russia. There has been considerable international protest against keeping those specimens alive, starting in the early 1990s, but thus far neither country wants to give in to that pressure. One rationale is that the genetic structure of the virus is known, so it could conceivably be recreated by terrorists anyway.

In 2004 the CDC said it had stockpiled enough smallpox vaccine to vaccinate everyone in the United States. I haven't found any updates on that statement. But the U.S. military was still giving those shots to personnel going to USCENTCOM areas (the Middle East and the "stans") until the middle of May 2014, to Korea for another two weeks, and to some special-mission troops after that, with an end date unspecified.

So now it's the middle of 2014 and, in one manner or another, smallpox is still lingering, fortunately not as an active disease. The CDC is testing those re-found vials of the virus, and we'll hear in a couple of weeks if they're still viable.


Long-term acute care hospitals

June 30th, 2014

An article in The New York Times (NYT) several days ago opened a new topic and re-visited an old discussion in our household. The title was telling, "At These Hospitals Recovery is Rare, but Comfort is Not," and talked about what Medicare calls long-term care hospitals (LTCHs). I had never heard of the term.

The article said there were 400 of these facilities in the United States, but lots of practicing physicians are unaware of them. I did an online search and found a 20-bed facility in this category about 15 miles from where we live and a 63-bed hospital in Denver, roughly 65 miles away. I wasn't sure of any in the southeastern part of Wyoming, 40-50 miles north of us.

The Medicare website on Long-Term Care hospitals says they focus on those whose inpatient stay is likely to be more than 25 days. The contrast is stark as this is an age when many surgeries are done in a technically "outpatient" fashion (the current definition of an inpatient says you're in the hospital at least two midnights). Medicare says LTCHs specialize in treating patients, even those with more than one serious medical condition, who may improve with time and eventually return home.

Yet the NYT piece talks of patients who are critically ill, may be unresponsive, even comatose and, except for those who are younger and have been in an accident, may stay for months, years, or the rest of their lives. In 2010 another NYT article discussed significant issues with some LTCHs.

At that point my wife and I both said, "Don't put me in a LTCH!" We are 73 years old, relatively healthy at the present time, and enjoy life. We have living wills and medical durable power of attorney documents naming each other as first decision makers if we can't choose for ourselves.

I've mentioned before how my parents approached this quandary. Mom had a cardiac arrest at age 74, was resuscitated by Dad who was still a practicing physician, and lived 16 more years. But when she was in her declining last four years and they had moved from totally independent living to a seniors' residence, they encountered a situation that influenced their future decisions. Mom had a minor acute illness and moved short-term into an associated facility.

She was there for a few days to get some antibiotics and nursing care, but in the next room was a woman, the wife of one of their friends, who had been in extended care for seven years. For the last four of those she no longer recognized her husband, yet he requested treatment of her bouts of pneumonia on three separate occasions. Dad and Mom each said, "Don't do that to me!" They had signed the appropriate end-of-life documents before Mom showed signs of initial dementia.

A 2011 article in Kaiser Health News stresses that making end-of-life decisions can be tough, especially if they aren't made in advance. But a professor of ethics was noted as saying more than 90% of all families who have a loved one in intensive care want to hear prognostic information that will help them make those difficult decisions.

Hospital care has changed, a lot, since I last saw inpatients. It used to be that the physician who organized your treatment was the same one you saw in her or his outpatient office. Now the primary care physicians I know, unless they are part of a residency program, don't see their long-term patients at all when they are hospitalized. Instead patients see an ER doc, a hospitalist (a physician whose focus is inpatient care) and, if they go to an ICU, an intensivist.

Intensivists are physicians who have completed specialty training, often in Internal Medicine or Anesthesia, and then take an additional two-to-three-year fellowship in critical care medicine (some are triple board certified, in lung disease (Pulmonology), for example). They are often thought of as primary critical care physicians, and in major academically oriented clinics and their associated hospitals (e.g., the Cleveland Clinic), they may provide most or all of the physician care in the ICU.

Do you need an intensivist?

The article from the NYT said we spend over $25 billion a year on long-term acute care in the United States. The path to landing in a LTCH often begins in an ICU. The major task for intensivists is keeping patients alive during critical illnesses. That often means deciding on short- or long-term ventilator support, the possibility of a tracheostomy (a surgically created hole through the front of the neck into the trachea, AKA the windpipe) to allow this, feeding tubes of several varieties, or long-term intravenous access.

I don't ever want to be on a ventilator long-term. I might allow one short-term if I had a clearly treatable, potentially curable patch of pneumonia, for instance, but I would want to set a time allowance for that.

When my mother quit eating, her physician wanted to create a long-term method of feeding her, a percutaneous endoscopic gastrostomy (PEG). If someone cannot eat and needs to be fed long term, one method of doing so is to place a PEG tube through the wall of the abdomen directly into the stomach.

This could be done for someone who has had a stroke and is at risk of aspirating food if fed normally. In my Mom's case, by then she had developed significant dementia and Dad said, "We've made our decisions; she is not having a PEG tube."

She could have gone into a LTCH and lived a while longer, but Dad knew that her refusal to eat meant she had come to a logical stop point.

There are an estimated 380,000 patients in LTCHs at present. Some (roughly 10 to 15%) are there for appropriate reasons and have a reasonable chance of recovery; many are not. A study by a Duke critical care specialist found half who enter these facilities die within a year; a majority of the remainder are in "custodial care."

I don't choose to join their ranks.

So there are some decisions that you and your family may want to make. I'd suggest you read the NYT articles and think about what your choices might be. It's never easy, but a careful discussion in advance with your long-term goals in mind makes sense.


Dengue fever and its major mosquito vector

June 21st, 2014

I don't like being bitten by mosquitoes any more than the rest of you do, but worldwide the real reason to avoid them, kill them or alter them is the enormous disease burden they cause. One recent estimate, surprising to me, said "mosquitoes have been responsible for half the deaths in human history." I was aware, having lived as an Air Force physician in the Philippines and traveled in South America and Africa, that malaria was one enormously dangerous, mosquito-carried disease, but there's a long list of other illnesses that contribute to the threat from these insects.

This one doesn't carry dengue

This one doesn't carry dengue

From 1690 to 1905, major epidemics of yellow fever struck parts of southern and eastern America: Boston, New York, Philadelphia, and New Orleans, killing over 40,000 people. A 2006 PBS website gives short summaries of nine of the outbreaks and alludes to even larger mortality figures.

And then there's dengue, a disease primarily transmitted by the bite of infected female Aedes aegypti mosquitoes. They don't make the telltale sound that alerts you to other mosquitoes; they also strike during the daytime and may follow their human target, biting repeatedly.

Dengue attacks 400 million people every year worldwide, mostly in the tropics and sub-tropics. Three-fourths of those infected never develop symptoms, and of the remaining 100 million, a large majority have a mild to moderate nonspecific acute illness with a fever. But 5% can have severe, even life-threatening disease with terrible joint and muscle pain (it's called break-bone fever), hemorrhages, and shock. The World Health Organization estimates 22,000 die from dengue yearly, but other estimates range from 12,000 to 50,000.
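Running those quoted estimates through some quick arithmetic of my own (these are the figures cited above, not independent data) shows what they imply about severity:

```python
# Rough arithmetic on the dengue estimates quoted above.
infections = 400_000_000        # estimated annual infections worldwide
symptomatic_fraction = 0.25     # three-fourths never develop symptoms
severe_fraction = 0.05          # of symptomatic cases
who_deaths = 22_000             # WHO yearly death estimate

symptomatic = infections * symptomatic_fraction
severe = symptomatic * severe_fraction

print(f"Symptomatic cases: {symptomatic:,.0f}")      # 100 million
print(f"Severe cases: {severe:,.0f}")                # 5 million
print(f"Deaths per severe case: {who_deaths / severe:.2%}")
```

So even among severe cases, the WHO figure implies well under 1% die, though the wide range of death estimates (12,000 to 50,000) makes that ratio quite uncertain.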

The first known case in the United States occurred in Philadelphia in 1780 and was documented by Benjamin Rush, the distinguished physician who was a signer of the Declaration of Independence.

The Center for Disease Control and Prevention (AKA the CDC) has an entire chapter on dengue in its "Infectious Diseases Related to Travel" publication and a shorter version with links for travelers. Their maps of disease distribution focus on warmer areas in Africa, Central and South America, Asia and Oceania.

There has been no vaccine available to prevent the disease and no specific antiviral treatment for those with severe cases of dengue. Because of known bleeding complications, those who get dengue are advised to avoid taking aspirin or any of the nonsteroidal anti-inflammatory drugs, AKA NSAIDs, such as ibuprofen.

The continental United States was essentially dengue-free for over fifty years, but marked increases in dengue infection rates have occurred in our hemisphere, mostly in South America and Mexico.

Now Aedes aegypti is back in Florida, Texas, and Hawaii. The article in The New Yorker mentioned a small 2009 outbreak of dengue in Key West with fewer than 30 cases, but that was the first real brush with the disease there in over seventy years. In 2010 there were twice as many cases. An entomologist (insect specialist) with the Florida Keys Mosquito Control District reminded the reporters that the manner in which the populace lived was crucial; from 1980 to 1999 there were only sixty-four cases on the Texas side of the Rio Grande and 100 times as many just across the river.

What was the difference? Likely screens on windows, cars with the AC running and windows closed, and how often people were exposed outdoors. Key West, in a 2013 followup, had seen no further cases, but the World Health Organization called dengue the "fastest-spreading vector-borne viral disease," saying cases had gone up thirty-fold over half a century.

Why has this happened and what can be done about it?

How can we do this?


Is this another consequence of global warming? After all, dengue has appeared in France and Croatia for the first time. But I just watched an online video by Dr. Paul Reiter, a world-famous medical entomologist who spent much of his professional career at the CDC's dengue control branch. It was obvious that he does not believe in man-made global warming (I do) or that any form of global temperature change is responsible for the spread of malaria or dengue.

How about used tires? He thinks they are great incubators for mosquitoes and billions of those tires have been moved around the globe. So Aedes aegypti has adapted to the city, in part because of our habit of having water-containing used tires around the places where we live.

I don't have any old tires in my yard and I change the dog's water bowl and the bird water outside frequently.

A few new ideas are out there: a British company called Oxitec has genetically modified (GM) mosquitoes, making the males able to mate, but also giving them a gene which kills their offspring soon after they hatch. An initial field trial in Brazil was successful in markedly reducing the population of disease-carrying adult females (remember, males don't bite humans for a blood meal; females do).

Further field trials of these GM mosquitoes, designated OX513A, have met with considerable opposition, and an engineer involved has published a paper examining the ethical issues. The lifespan of mosquitoes is short and they don't appear to be a major food source for other creatures; the most significant issue is likely fully informing the people in the test area, who may consider OX513A to be just another threat.

A French pharmaceutical company recently announced an experimental vaccine for dengue was moderately successful in a late-stage, placebo-controlled clinical trial involving 10,000 children in Southeast Asia, reducing dengue incidence by 56%. A similar clinical trial is underway in South America.

It's a bad disease, coming back at us, but perhaps there's some good news on the horizon.


Using (or at least minimizing) our food waste

May 21st, 2014

I recently read an article in The New York Times with the interesting title "Recycling the Leftovers." It was written by Stephanie Strom, one of their regular correspondents, and covered a variety of programs in America for recycling food scraps. Lynnette and I have been separating our own waste streams for at least ten years and have a garbage bin, a trash sack, a recycle sack and a composting pail in our kitchen and laundry room. Our waste-collecting company keeps adding new items that can be recycled, but at present we only put out two containers for them: trash goes to the curb to be picked up weekly and recyclables go out every other week.

Composting is one approach to food waste.


Now the city of Austin, Texas has plans to markedly extend its food waste pilot project; Strom's article says 14,000 Austin residences currently have a third garbage bin, one for food scraps, collected weekly. Twenty-five years ago the city started with a "Dillo Dirt" program; the city made over a quarter million dollars last year selling the end product, compost made from yard clippings and treated sewage sludge. The newer approach, adding organic waste, currently has enrolled less than 10% of the city's ~185,000 households; the plan is for all of them to be offered the service. I'm unaware of a city-wide program here in Fort Collins for food scrap recycling; ours ends up in a vermiculture bin that's outdoors, but in a fenced-in corner. The worms doing most of the work in turning food waste into compost have thus far survived our winters.

The concept is being highlighted nationally by the U.S. Food Waste Challenge (FWC), a joint project of the U.S. Department of Agriculture (USDA) and the Environmental Protection Agency (EPA). The goal of the FWC is to bring about a "fundamental shift" in the way we manage and even think about food and food waste. The USDA/EPA wants to have 400 organizations enrolled this year and 1,000 by 2020, and they are well on their way already, with an impressive list of governmental and private partners, including companies, colleges and universities, K-12 schools and at least one major hospital, having joined.

We as individuals can't join the FWC, but there is a webpage of suggestions for consumers. Basically it says shop thoughtfully, cook with care, serve portions that you'll eat then and there, save whatever can be kept (while eating what would otherwise spoil) and, if possible, grow part of your meal. It also mentions we should shop our own refrigerators first; plan meals before we go grocery shopping so as to buy only those items we actually need; freeze, preserve or can seasonal produce; ask restaurants to bag up leftovers; and be realistic at all-you-can-eat buffets.

I was at a writers' meeting recently and drove to the event with my long-time writing mentor. She said her family almost always eats everything she buys, but even with a husband and three teenagers on board I knew she was being modest. She obviously shops carefully and plans ahead.

Our lunch yesterday featured a quiche my wife (professionally a Jung) made that was "Jung Fun." It wasn't your typical recipe, but used up everything in the vegetable drawer that needed to be eaten ASAP. We still occasionally have spoiled vegetables and fruits, especially when our CSA gives us more than its usual abundance, but those go into the compost bin.

We did go to the CSA a few days ago to purchase four beefsteak tomato plants. We've got a special above-ground gadget for planting tomatoes and have consistently done well with those we bought at a nursery, but, having grown up eating beefsteak tomatoes, I'm really looking forward to having an abundance of them. Our local grocery store generally has good produce, much of it grown locally or regionally, yet it's been my experience that homegrown tomatoes are several orders of magnitude better than anything I can buy at a store.

Beefsteak tomatoes are yummy!


The EPA's Food Recovery Challenge webpage has a horrifying set of statistics from 2011 (they're still collecting/collating the 2013 stats apparently, but what happened to 2012?). Almost all (96%) of the 36 million tons of food waste generated in 2011 ended up in landfills or incinerators. The food sent to landfills breaks down and releases methane, a nasty greenhouse gas, twenty times as effective in increasing global temperature as CO2 is. More than a third of all methane released into the atmosphere comes from landfills (domesticated livestock accounts for 20% and natural gas and oil systems another 20%).
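To put that 96% figure in tons rather than percentages, here's a trivial check using only the numbers quoted above:

```python
# Quick check of the EPA's 2011 food waste figures quoted above.
total_food_waste_tons = 36_000_000                        # food waste generated in 2011
landfilled_or_incinerated = int(total_food_waste_tons * 0.96)  # the 96% that wasn't recovered

print(f"{landfilled_or_incinerated:,} tons")  # 34,560,000 tons
```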

While all that food is being wasted and much of it is contributing to global climate changes, 14.9% of U.S. households were food insecure in 2011, not knowing where they'd get their next meal. Fortunately we have a strong local Food Bank serving Larimer County and their "Plant It Forward" campaign's 2014 goal is to obtain 15,000 pounds of produce donated by local gardeners.

So where are you in the nationwide quest to cut food waste?


Food Safety Issues: America in 2014

March 19th, 2014

Having written recently about China's food problems, I knew there were some remaining in the United States, but their scope amazed me. Each year forty-eight million of us suffer from food poisoning. Over 125,000 of that group are ill enough to be hospitalized and 3,000 die.

Having seen those numbers on a government website, I decided to review the modern timeline of food-related illness in America and how our laws help prevent it.

One step in meat processing


My initial thought was of Upton Sinclair's 1906 novel, The Jungle, a powerful exposé of the American meat-packing industry. After its publication, public outcry led President Theodore Roosevelt to appoint a special commission to verify Sinclair's tale of the horrors of meat production in Chicago and elsewhere, and eventually led to the Meat Inspection Act of 1906 and the Pure Food and Drug Act.

For many years a so-called "Poke and Sniff" system prevailed. The 1906 law said federal employees could inspect and approve (or disapprove) any animal carcasses which were to be sold across state lines. The inspectors could physically monitor the slaughter line, touching and smelling the animals and cuts of meat. They could remove any meat that was rotten, damaged or had abrasions or growths. Some felt that provided only minimal protection for the public, but that's what we had for over eighty years.

I grew up in Wisconsin in the 40s and 50s. My father, in addition to his medical practice, was the local Public Health Officer and I remember going to inspect local area dairy herds with his sanitarian when I was a teenager. I don't recall major food safety issues surfacing in those decades, although there may have been some isolated cases that I didn't pay attention to.

I was in medical school from 1962 to 1966. During that time, two women died in Michigan from botulism, a rare but extremely serious paralytic disease caused by a toxin produced by a bacterium. In their case the toxin was in canned tuna fish. There were other botulism outbreaks in 1971, 1977, 1978 and 1983, with 59 people being affected in the largest such episode. All were related to food being improperly canned or prepared.

In 1985 a huge outbreak of another form of food poisoning happened. This one involved at least 16,284 people (and perhaps up to 200,000) in six different states and was caused by bacterial contamination of milk.

Some new laws only applied to a few food items.


The Department of Agriculture's food safety and inspection timeline appears to skip over a considerable period of time, although a number of laws were passed to strengthen federal regulation of the food chain. The 1957 Poultry Products Inspection Act and the 1970 Egg Products Inspection Act added to the government's ability to prevent food-related illness in specific areas, but wouldn't have prevented the major food-related episodes I just mentioned.

Then in late 1992 and early 1993 an E. coli outbreak sickened 623 people and killed 4 children in four western states (Washington, Idaho, Nevada and California). It was eventually traced to contamination of under-cooked Jack in the Box hamburgers with that common bowel bacterium. Those affected developed bloody diarrhea and, in a few cases, severe kidney disease from an entity termed hemolytic-uremic syndrome (HUS). HUS is the most common cause of acute kidney failure in children and usually occurs when an infection in the digestive system produces toxic substances that destroy red blood cells, causing severe kidney injury. The CDC traced the meat back to five slaughter plants in the United States and one in Canada.

In 1998 the USDA introduced a brand-new method for inspecting meat. The Hazard Analysis and Critical Control Point (HACCP) system had been pioneered by NASA. That agency had protected our astronauts by adopting a system of critical control points, anywhere a germ, invisible to the naked eye, could find its way into a food meant for a space mission.

Pinging off the NASA approach, the USDA also mandated that inspectors could order meat plants to do microbial testing. The meat industry became responsible for establishing and submitting their own HACCP plans. The USDA would then review a plan, approve it if it seemed appropriate, and inspectors could monitor the plants' compliance with their own safety plans. The problem is the age-old one of the fox guarding the hen-house; inspectors no longer had the power to physically examine the meat on the line. The acronym HACCP was often derided as "Have a cup of coffee and pray."

On January 10th, 2014 two articles were published that changed my mind: the first, on UPI.com's website, simply said, "U.S. Food Safety a big issue in 2014." It mentioned that already in 2014 the U.S. Department of Agriculture had shut down a meat-processing facility in the state of Minnesota.

The other online article was written by Dr. Margaret A. Hamburg, the Commissioner of Food and Drugs, i.e., the head of the FDA. It discusses the Food Safety Modernization Act (FSMA), signed by President Obama in early January, 2011. It was a reaction to the figures I mentioned at the start of this article.

This law gave the FDA "a legislative mandate to require comprehensive, science-based preventive controls across the food supply."

But let's look at its provisions, some of which make eminent sense and others, in my opinion, ask for the impossible.

On the one hand, the FSMA required food facilities to have a written preventive control plan. I agree with that idea, but note it's a complex process with multiple steps involved. Such a plan includes evaluating possible hazards, figuring out what one has to do to markedly alleviate or totally eliminate them, noting how you will monitor these control measures (and keep appropriate records) and specifying what you will do when issues arise. Oh, and by the way, you had a year and a half to do all that.

Other parts of the FSMA involved standards for safely producing and harvesting vegetables and fruits, plus another set involving the prevention of "intentional contamination" of food. The latter may be quite difficult. As the law is written, 600 such foreign food facilities must be inspected in its first year, with the number doubling for each of five additional years. Let's see, that's 600, 1,200, 2,400, 4,800, 9,600 and 19,200. Where in the world would the FDA get enough trained inspectors? And that's assuming that the foreign countries would allow such detailed examinations of their food-producing and exporting businesses.
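The doubling sequence above is easy to replay, and it also shows the cumulative burden, which the law's text doesn't spell out:

```python
# Replaying the FSMA foreign-facility inspection arithmetic from the text:
# 600 facilities the first year, doubling for each of five more years.
inspections = [600 * 2**year for year in range(6)]

print(inspections)       # [600, 1200, 2400, 4800, 9600, 19200]
print(sum(inspections))  # 37800 inspections over the six years combined
```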

One of every six Americans becomes ill from food-borne disease each year. Only a small fraction of them (approximately 1/4th of 1%) need to be hospitalized, and even of those who do, only 2.3% die. But another way of looking at those mortality statistics is to say it's equivalent to almost 10% of the number who die from motor vehicle accidents each year in this country.
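Those proportions line up with the raw counts given at the start of this article. A quick sketch, using 128,000 hospitalizations (the CDC's commonly cited round figure; the text says "over 125,000") and an assumed ~33,000 annual U.S. motor-vehicle deaths for the comparison:

```python
# Checking how the food-borne illness proportions line up with the raw counts.
ill = 48_000_000        # Americans sickened by food-borne disease each year
hospitalized = 128_000  # those ill enough to be hospitalized (CDC round figure)
deaths = 3_000          # deaths per year

print(round(hospitalized / ill * 100, 2))     # 0.27 (% of the ill; ~1/4 of 1%)
print(round(deaths / hospitalized * 100, 1))  # 2.3 (% of the hospitalized)
# Motor-vehicle deaths ran roughly 33,000/year at the time (assumed figure):
print(round(deaths / 33_000 * 100))           # 9 (%, i.e. "almost 10%")
```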


They're transplanting what?

February 24th, 2014

I read an article in The New York Times that gave me pause for a moment. It was on fecal transplants. Initially that didn't seem to make sense. Then I remembered there had been something on this topic last year in the New England Journal of Medicine, a Dutch study done at an academic center with the title "Duodenal Infusion of Donor Feces for Recurrent Clostridium difficile." So I did a Google search and found the Mayo Clinic's website for medical professionals, where the subject was titled "Quick, inexpensive and a 90 percent cure rate."

Why in the world would you need this kind of a transplant? Well, let's start with the possibility of eliminating a majority of 14,000 deaths a year in this country alone. The CDC website on the bacterial overgrowth that can cause the issue is a good resource, but let's start with a few basics. Your intestines normally have lots of different kinds of bacteria, up to 1,000 varieties according to some experts. The term "gut flora" is often used to mean our normal sea of bowel bacteria. But when you take antibiotics, especially for a prolonged time, you run the risk of destroying the balance between bacterial species and having some (that are normally harmless) cause major problems.

I may need another roll after this one!


One of these kinds of potentially nasty "bugs" is called Clostridium difficile technically, or C. diff as a shortcut name. The WebMD site has an easy-to-understand short tutorial on C. diff. When it becomes the predominant gut flora, it releases toxins that attack the bowel lining and cause severe diarrhea with up to 15 watery stools a day, fever, weight loss, abdominal pain and blood or pus in the stools. The disease often hits older patients (those over 65, so I'm in that higher-risk category) and, in the past, was usually treated with one of three antibiotics given orally. Up to a quarter of those so treated need a second round of antibiotics.

The Dutch study randomly assigned patients to standard treatment with a drug called vancomycin, or the same drug plus four liters of a bowel-cleansing solution, or the drug plus that bowel washout plus infusion of a solution of donor feces through a tube inserted through the patient's nose and into their stomach (typically called an NG tube, shorthand for nasogastric). Less than a third of those in the first two groups had their diarrhea resolve, while 81% of those given the fecal infusion (13 of 16) improved after one treatment, and only one of the three remaining patients didn't improve after a second infusion.

One Mayo Clinic branch had tried a fecal transplant in 2011 in a patient with severe C. diff colitis (inflammation of the large bowel). In that case the medical staff infused the patient with their brother's stool given via a colonoscope instead of an NG tube, therefore going up the intestinal tract, not down, and getting right to the colon. The patient had been bedridden for weeks prior to the procedure, but was able to go home within one day after it.

Since then the same Mayo branch has done 24 fecal transplants. Every one of the patients had their infection go away within a short period of time; only two had a recurrence of the disease (both had other illnesses). The senior nurse who played the major role in starting the Mayo program interviewed every patient and said their quality of life improved tremendously. Mayo now uses the procedure only for those who have severe relapsing C. diff infections, but is researching its use in other medical diseases.

Then in 2012 Mark Smith, who was a doctoral candidate, launched OpenBiome with three colleagues. It's a nonprofit 501(c)(3) organization they founded after a family member/friend had gone through seven rounds of vancomycin for a C. diff infection that lasted a year and a half. They call the procedure Fecal Microbiota Transplantation (FMT) and, according to the New York Times article, they've supplied more than 135 frozen, ready-to-use fecal microbiota preparations to over a dozen hospitals in the last five months. Much of the work is done in M.I.T.'s laboratories. All the medical facility needs is a doctor with an endoscope.

So have we solved the C. diff overgrowth problem or nearly so? I went back to a July 12, 2010 article in The New York Times titled "How Microbes Defend and Define Us." It described the work of a University of Minnesota gastroenterologist, Dr. Alexander Khoruts, who not only performed a fecal transplant on a woman with an intractable C. diff gut infection, but also looked closely at what bacteria were in her intestines before and after the procedure.

In this case the donor was the patient's husband and the analysis of the gut flora revealed his bacteria had taken over, supplanting the abnormal bacteria that were there before the transplant.

Khoruts continued to use the new procedure, performing fifteen transplants by 2010 with 13 cures, but according to the NYT article, he is now concerned that OpenBiome's model is just an early step. The Food and Drug Administration, in early 2013, classified fecal transplants as biological drugs. As such, any clinician who wished to use them would need to obtain an Investigational New Drug (IND) application, much as a pharmaceutical company would need in developing a new antibiotic.

Since then the FDA has relaxed their ruling, slightly, saying doctors performing FMTs for C. diff wouldn't be prosecuted. Smith and colleagues want FMT to be classified as a tissue, not a drug, allowing more research to be done on the procedure in other diseases and conditions and, at the same time, letting clinicians use FMT, at least for C. diff, without an IND permit or fear of FDA reversing its stance on such therapy.

There are a host of other diseases where FMT has been suggested as possibly effective in treatment. Some seem farfetched to me at first glance, but investigators appear interested in pursuing research on many of those conditions. I bet they would need an IND in such cases, even if FMT is reclassified as a tissue.

We all have bacteria in our colons, but in other places too.


Khoruts and others think FMT for C. diff is just the tip of the iceberg. The NIH has been carrying out a huge Human Microbiome Project since 2007, with the first five years of the investigational study being devoted to cataloging the microbiome of our noses, skin, mouths, GI tracts and urogenital systems. That term refers to the aggregate of all the microorganisms, including bacteria and fungi, that live on and in our bodies. From 2013 on they have shifted gears, aiming at an integrated dataset of biological properties in microbiome-associated diseases.

Having read a number of papers and looked at a variety of source materials on the concept I'm no longer astounded by the idea. It still sounds strange, but obviously reputable academic centers have pioneered this research with great results.

One question that seems unresolved was highlighted on a patient-website. Is my bowel flora the same as someone's who lives in another part of the world and eats a totally different diet?

But it seems like FMT, in one form or another, is here to stay.

Electronic Health Records & Medical Scribes

February 5th, 2014
Turn over the data entry to someone else, doctor.


Recently, in the online version of The New York Times, I saw an article by Katie Hafner titled "A Busy Doctor's Right Hand, Ever Ready to Type." The article described a new movement among medical personnel, one to hire scribes to make entries into an Electronic Health Record (EHR).

The concept made great sense to me, but it's clearly not a new one. Our ophthalmologists have, over the last fourteen years, routinely had an assistant who entered data into some form of a medical record, allowing the physician to concentrate on examining us.

Only five years back, the use of an EHR was clearly the exception for other medical personnel, with perhaps a tenth of physician office practices and hospitals utilizing them. Now that percentage is well over two-thirds.

So what are the problems with universal acceptance of EHRs?

One that I touched on in my previous post on EHRs is interoperability between different health-record systems. My translation of that term is that Dr. A, using, for instance, Epic at a UCH site like our local hospital, should be able to access and read my medical record from the Department of Defense or the Veterans Affairs systems. At the moment I doubt that's even remotely possible, and there will obviously be issues with patient confidentiality. Those should eventually be solvable, although the mechanism for doing so is well beyond my computer skill level.

But, for an individual practitioner, on a day-by-day patient-care basis, there's an entirely different set of issues.

I had mentioned in a recent post our pleasure at watching a Family Practice intern who kept eye contact with her patient (in this case my wife) while she examined her and informed her about test results.

The intern wasn't entering data, and there's the rub with an EHR. She presumably had the choice of doing her examination and keeping as much eye contact as possible with her patient while remembering all the accumulated data points, versus typing while she asked questions and, if she were a typical doc typist, looking at the keyboard and the screen much of the time.

The opposite end of the spectrum was a nurse who, in order to give Lynnette an ibuprofen tablet, spent twelve minutes (I timed the interaction) between my request for her pain med and the tablet being put in her mouth, mostly on the computer, occasionally glancing up to ask a question (e.g., "On a scale of one to ten, what is your pain level? What is your full name and date of birth?", the fourth time she'd asked that during her shift).

As the EHR has grown more complex, with more mandated information being necessitated by organizational, certifying and governmental entities, the potential for increased human-machine time has grown hugely, while the doctor-patient segment of a physician's day is squeezed.

The potential for burnout of physicians, especially in emergency medicine, family practice and primary care internal medicine, has increased. The link is to a free article that appeared in the Archives of Internal Medicine in 2012 comparing both burnout and satisfaction (with physicians' balancing work and outside life) to others in the United States. Bottom line: of the 7,000+ docs who filled in a survey, over 45% had some symptoms of burnout and were much less satisfied with their ability to find a counterpoise between their work time and the rest of their life than those with comparable professional degrees.

Burnout meant less enthusiasm for work, development of cynicism and less of a sense of accomplishment than those of us who practiced medicine years ago had. There are lots of components as to why this has become more common among "front-line" physicians, but as I've talked to some recently the EHR has been a very significant contributor.

This was a somewhat unexpected development for me, although based on what I had seen with my radiologists attempting to dictate into an earlier version of an EHR in 1988-1991, not one that I should have been surprised by.

Adding one more to the medical team should be easy.


There is a growing industry providing medical scribes to physicians and others and, since 2010, certification available through a non-profit, the American College of Medical Scribe Specialists. I was somewhat surprised that patients not only haven't objected to a scribe being present, but often have warmly welcomed them. They may be introduced as "my data entry specialist." Obviously, in teaching hospitals, patients see a team of physicians already. Only the most intimate parts of a physical examination would need to be conducted on a one-on-one basis. Then the scribe could be on the other side of a curtain and the doctor would verbally describe her or his findings.

If I had the choice of my physician looking at me almost all of the time and, in essence, dictating her findings (my own doctor is female) or having to type much of the time, my choice would be simple.

Then there's the possibility of a remote scribe. I had envisioned a future EHR which had set areas to be filled in and a practitioner being able to wear a headset and dictate into the EHR directly. I hadn't realized that some practices already have scribes who may be thousands of miles away from the patient-physician encounter, sometimes in India.

I went back to the New York Times article I mentioned initially and saw a quote from a family medicine physician who said, "Having the scribe has been life-changing." An article in the journal Health Affairs said two-thirds of a primary care doctor's time at work was spent on clerical duties that could be done by others. Another doctor said, "Making physicians into secretaries is not a winning proposition." She had surveyed over 50 primary care practices in the past five years, finding those who used scribes were more satisfied with their work and their choice of careers.

Doctors have been dictating patient records for fifty years, but those transcriptions often made their way to the chart many hours later. Having a scribe could cut that lag time immensely.

With our growing need for primary care physicians and the tendency for medical students to avoid those specialties, aiming toward more financially rewarding and less laborious fields in medicine, the advent of medical scribes may be not only a significant improvement for the lives of those already in front-line medical areas, but an inducement for new prospective physicians to join their ranks.

I'm heartily in favor of the idea.