Iron supplements don’t increase malaria risk
Photo by Nina Matthews
Taking iron supplements during pregnancy does not increase a woman’s risk of contracting malaria, according to research published in JAMA.
Investigators studied nearly 500 pregnant women in a malaria-endemic region, comparing those who received daily iron supplements to those who received placebo.
Roughly half of the women in each group developed malaria, and iron supplementation was associated with benefits for mothers and children.
Martin N. Mwangi, PhD, of Wageningen University in The Netherlands, and his colleagues conducted this research.
The team noted that current estimates suggest anemia affects 57% of pregnant women in Africa. Although iron deficiency is the most common cause, iron supplementation during pregnancy has uncertain health benefits.
There is some evidence to suggest that iron supplementation may increase the risk of infectious diseases, including malaria.
To investigate this association, Dr Mwangi and his colleagues studied 470 pregnant women living in a malaria-endemic area in Kenya. The subjects were randomized to daily supplementation with 60 mg of iron (n=237) or placebo (n=233) until 1 month postpartum.
All women received 5.7 mg iron per day through flour fortification during the intervention, as well as the usual intermittent preventive treatment against malaria.
Among the 470 participating women, 40 women (22 in the iron group and 18 in the placebo group) were lost to follow-up or excluded at birth. Twelve mothers were lost to follow-up postpartum (5 iron, 7 placebo). At study entry, 190 of 318 women (60%) were iron-deficient.
After childbirth, there was no significant difference in Plasmodium infection between the treatment groups. Infection occurred in 50.9% of women in the iron group and 52.1% in the placebo group (P=0.83).
There was a significant increase in hemoglobin concentration and a significant decrease in anemia among mothers who received iron (P<0.001 for both). Mothers in the iron group also had a significantly lower mean zinc protoporphyrin (ZPP)-heme ratio in whole blood (P<0.001) and erythrocytes (P<0.001).
Children born to mothers in the iron group had a significantly higher mean birth weight (P=0.002), lower risk of low birth weight (<2500 g, P=0.02), greater gestational age at delivery (P=0.009), and lower risk of premature birth (P=0.02).
However, there was no significant difference between the treatment groups with regard to birth-weight-for-gestational-age z score (P=0.20), neonatal length (P=0.07), head circumference (P=0.28), hemoglobin concentration in cord blood (P=0.14), cord blood ZPP-heme ratio (P=0.82), or cord erythrocyte ZPP-heme ratio (P=0.88).
Based on these results, the investigators said the benefits of universal iron supplementation during pregnancy (in countries where it is impractical to screen for iron status) outweigh the possible risks.
Blood cancer drugs set to be removed from CDF
Photo courtesy of CDC
England’s National Health Service (NHS) plans to remove several drugs used to treat hematologic malignancies from the Cancer Drugs Fund (CDF).
The plan is that, as of November 4, 2015, pomalidomide, lenalidomide, ibrutinib, dasatinib, brentuximab, bosutinib, and bendamustine will no longer be funded via the CDF for certain indications.
Ofatumumab was removed from the CDF list yesterday but is now available through the NHS.
Drugs used to treat solid tumor malignancies are set to be de-funded through the CDF in November as well.
However, the NHS said the proposal to remove a drug from the CDF is not necessarily a final decision.
In cases where a drug offers sufficient clinical benefit, the pharmaceutical company that markets it can reduce the price it asks the NHS to pay so the drug achieves a satisfactory level of value for money. The NHS said a number of such negotiations are underway.
In addition, patients who are currently receiving the drugs set to be removed from the CDF will continue to have access to those drugs.
About the CDF and the NHS
The CDF—set up in 2010 and currently due to run until March 2016—is money the government has set aside to pay for cancer drugs that haven’t been approved by the National Institute for Health and Care Excellence (NICE) and aren’t available within the NHS in England. Most cancer drugs are routinely funded outside of the CDF.
NHS England and NICE are planning to consult on a proposed new system for commissioning cancer drugs. The NHS said the new system will be designed to provide the agency with a more systematic approach to getting the best price for cancer drugs.
Reason for drug removals
The NHS previously increased the CDF budget from £200 million in 2013/14 to £280 million in 2014/15 and to £340 million from April 2015. This represents a total increase of 70% since August 2014.
However, current projections suggest that, in the absence of further prioritization, spending would rise to around £410 million this year, an overspend of £70 million. The NHS said this money could be used for other aspects of cancer treatment or for NHS services for other patient groups.
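These figures are internally consistent; as a quick arithmetic check using the budget numbers above:

\[
\frac{340}{200} = 1.70 \ \text{(a 70\% increase)}, \qquad 410 - 340 = 70 \ \text{(the projected overspend, in £ millions)}
\]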
Therefore, some drugs are set to be removed from the CDF. The NHS said all decisions on drugs to be maintained in the CDF were based on the advice of clinicians, the best available evidence, and the cost of the treatment.
“There is no escaping the fact that we face a difficult set of choices, but it is our duty to ensure we get maximum value from every penny available on behalf of patients,” said Peter Clark, chair of the CDF.
“We must ensure we invest in those treatments that offer the most benefit, based on rigorous evidence-based clinical analysis and an assessment of the cost of those treatments.”
While de-funding certain drugs will reduce costs, the CDF is not expected to be back on budget this financial year. However, the NHS expects the CDF to operate within its budget during 2016/17.
Blood cancer drugs to be removed
The following drugs are currently on the CDF list for the following indications, but they are set to be de-listed on November 4, 2015.
Bendamustine
For the treatment of chronic lymphocytic leukemia (CLL) where all the following criteria are met:
- Application made by and first cycle of systemic anticancer therapy to be prescribed by a consultant specialist specifically trained and accredited in the use of systemic anticancer therapy
- CLL (not licensed in this indication)
- Second-line indication, third-line indication, or fourth-line indication
- To be used within the treating Trust’s governance framework, as bendamustine is not licensed in this indication
For the treatment of relapsed mantle cell lymphoma (MCL) where all the following criteria are met:
- Application made by and first cycle of systemic anticancer therapy to be prescribed by a consultant specialist specifically trained and accredited in the use of systemic anticancer therapy
- MCL
- Option for second- or subsequent-line chemotherapy
- No previous treatment with bendamustine
- To be used within the treating Trust’s governance framework, as bendamustine is not licensed in this indication
*Bendamustine will remain on the CDF for other indications.
Bosutinib
For the treatment of refractory, chronic phase chronic myeloid leukemia (CML) where all the following criteria are met:
- Application made by and first cycle of systemic anticancer therapy to be prescribed by a consultant specialist specifically trained and accredited in the use of systemic anticancer therapy
- Chronic phase CML
- Refractory to nilotinib or dasatinib (if dasatinib accessed via a clinical trial or via its current approved CDF indication)
For the treatment of refractory, accelerated phase CML where all the following criteria are met:
- Application made by and first cycle of systemic anticancer therapy to be prescribed by a consultant specialist specifically trained and accredited in the use of systemic anticancer therapy
- Accelerated phase CML
- Refractory to nilotinib or dasatinib (if dasatinib accessed via a clinical trial or via its current approved CDF indication)
- Significant intolerance to nilotinib (grade 3 or 4 events)
For the treatment of accelerated phase CML where there is intolerance of treatments and where all the following criteria are met:
- Application made by and first cycle of systemic anticancer therapy to be prescribed by a consultant specialist specifically trained and accredited in the use of systemic anticancer therapy
- Accelerated phase CML
- Significant intolerance to dasatinib (grade 3 or 4 adverse events; if dasatinib accessed via its current approved CDF indication)
- Significant intolerance to nilotinib (grade 3 or 4 events)
*Bosutinib will still be available through the CDF for patients with chronic phase CML who are intolerant of other treatments.
Brentuximab
For the treatment of refractory, systemic anaplastic lymphoma where all the following criteria are met:
- Application made by and first cycle of systemic anticancer therapy to be prescribed by a consultant specialist specifically trained and accredited in the use of systemic anticancer therapy
- Relapsed or refractory systemic anaplastic large-cell lymphoma
For the treatment of relapsed or refractory CD30+ Hodgkin lymphoma where all the following criteria are met:
- Application made by and first cycle of systemic anticancer therapy to be prescribed by a consultant specialist specifically trained and accredited in the use of systemic anticancer therapy
- Relapsed or refractory CD30+ Hodgkin lymphoma
- Following autologous stem cell transplant or following at least 2 prior therapies when autologous stem cell transplant or multi-agent chemotherapy is not an option
Dasatinib
For the treatment of Philadelphia-chromosome-positive (Ph+) acute lymphoblastic leukemia where all the following criteria are met:
- Application made by and first cycle of systemic anticancer therapy to be prescribed by a consultant specialist specifically trained and accredited in the use of systemic anticancer therapy
- Refractory or significant intolerance or resistance to prior therapy including imatinib (grade 3 or 4 adverse events)
- Second-line indication or third-line indication
*Dasatinib will still be available for chronic phase and accelerated phase CML.
Ibrutinib
For the treatment of relapsed/refractory CLL where all the following criteria are met:
- Application made by and first cycle of systemic anticancer therapy to be prescribed by a consultant specialist specifically trained and accredited in the use of systemic anticancer therapy
- Confirmed CLL
- Must have received at least 1 prior therapy for CLL
- Considered not appropriate for treatment or retreatment with purine-analogue-based therapy due to:
- Failure to respond to chemo-immunotherapy or
- A progression-free interval of less than 3 years or
- Age of 70 years or more or
- Age of 65 years or more plus the presence of comorbidities or
- A 17p or TP53 deletion
- ECOG performance status of 0-2
- A neutrophil count of ≥0.75 x 10⁹/L
- A platelet count of ≥30 x 10⁹/L
- Patient not on warfarin or CYP3A4/5 inhibitors
- No prior treatment with idelalisib
For the treatment of relapsed/refractory MCL where all the following criteria are met:
- Application made by and first cycle of systemic anticancer therapy to be prescribed by a consultant specialist specifically trained and accredited in the use of systemic anticancer therapy
- Confirmed MCL with cyclin D1 overexpression or translocation breakpoints at t(11;14)
- Failure to achieve at least partial response with, or documented disease progression after, the most recent treatment regimen
- ECOG performance status of 0-2
- At least 1 but no more than 5 previous lines of treatment
Lenalidomide
For the second-line treatment of multiple myeloma (MM) where all the following criteria are met:
- Application made by and first cycle of systemic anticancer therapy to be prescribed by a consultant specialist specifically trained and accredited in the use of systemic anticancer therapy
- MM
- Second-line indication
- Contraindication to bortezomib or previously received bortezomib in the first-line setting
*Lenalidomide will still be available for patients with myelodysplastic syndromes with 5q deletion.
Pomalidomide
For the treatment of relapsed and refractory MM where the following criteria are met:
- Application made by and first cycle of systemic anticancer therapy to be prescribed by a consultant specialist specifically trained and accredited in the use of systemic anticancer therapy
- MM
- Performance status of 0-2
- Previously received treatment with adequate trials of at least all of the following options of therapy: bortezomib, lenalidomide, and alkylating agents
- Failed treatment with bortezomib or lenalidomide, as defined by: progression on or before 60 days of treatment, progressive disease 6 months or less after achieving a partial response, or intolerance to bortezomib
- Disease refractory to previous treatment
- No resistance to high-dose dexamethasone used in the last line of therapy
- No peripheral neuropathy of grade 2 or more
A complete list of proposed changes to the CDF, as well as the drugs that were de-listed on March 12, 2015, is available on the NHS website.
Critical Literature 2014
Keeping up with the medical literature in a field as broad as hospital medicine is a daunting task. In 2014 alone, over 9200 articles were published in top-tier internal medicine journals.[1] The authors selected articles from among these top journals using a nonsystematic process that involved reviewing articles brought to their attention by colleagues, literature searches, and online services. The focus was to identify articles of importance to the field of hospital medicine for their potential to be practice changing, provocative, or iconoclastic. After culling through hundreds of titles and abstracts, the authors reviewed 46 articles in full text and ultimately selected 14 for presentation here. Table 1 summarizes the key points.
Table 1. Key Points

1. Now that neprilysin inhibitors are approved by the FDA, hospitalists will see them prescribed as an alternative to ACE inhibitors, given their impressive benefits in cardiovascular mortality and heart failure hospitalizations.
2. Current evidence suggests that intravenous contrast given with CT scans may not significantly alter the incidence of acute kidney injury, its associated mortality, or the need for hemodialysis.
3. The CAM-S score is an important tool for prognostication in delirious patients. Patients with high CAM-S scores should be considered for goals-of-care conversations.
4. The melatonin agonist ramelteon shows promise for lowering incident delirium among elderly medical patients, though larger trials are still needed.
5. Polyethylene glycol may be an excellent alternative to lactulose for patients with acute hepatic encephalopathy once larger studies are done, as it is well tolerated and shows faster resolution of symptoms.
6. Nonselective β-blockers should no longer be offered to cirrhotic patients after they develop spontaneous bacterial peritonitis, as they are associated with increased mortality and acute kidney injury.
7. Current guidelines regarding prophylaxis against VTE in medical inpatients likely result in nonbeneficial use of medications for this purpose. It remains unclear which high-risk populations do benefit from pharmacologic prophylaxis.
8. DOACs are as effective as and safer than conventional therapy for treatment of VTE, though they are not recommended in patients with GFR <30 mL/min.
9. DOACs are more effective and safer (though they may increase the risk of gastrointestinal bleeding) than conventional therapy in patients with AF.
10. DOACs are as safe as and more effective than conventional therapy in elderly patients with VTE or AF, being mindful of dosing recommendations in this population.
11. Two new once-weekly antibiotics, dalbavancin and oritavancin, approved for skin and soft tissue infections, appear noninferior to vancomycin and have the potential to shorten hospitalizations and, in doing so, decrease costs.
12. Offering family members of a patient undergoing CPR the opportunity to observe has a durable impact on meaningful short- and long-term psychological outcomes. Clinicians should strongly consider making this offer.
AN APPROACHING PARADIGM SHIFT IN THE TREATMENT FOR HEART FAILURE
McMurray J, Packer M, Desai A, et al. Angiotensin-neprilysin inhibition versus enalapril in heart failure. N Engl J Med. 2014;371:993–1004.
Background
The Food and Drug Administration (FDA) last approved a new drug for heart failure (HF) 10 years ago.[2] The new PARADIGM-HF (Prospective Comparison of ARNI With ACEI to Determine Impact on Global Mortality and Morbidity in Heart Failure) study, comparing a novel combination of a neprilysin inhibitor and an angiotensin receptor blocker (ARB) to an angiotensin-converting enzyme (ACE) inhibitor, has cardiologists considering a possible change in the HF treatment algorithm. Neprilysin is a naturally occurring enzyme that breaks down the protective vasoactive peptides (brain natriuretic peptide, atrial natriuretic peptide, and bradykinin) made by the heart and the body in HF. These vasoactive peptides increase vasodilation and block sodium and water reabsorption. A neprilysin inhibitor extends the life of these vasoactive peptides, thus enhancing their effect, and inhibiting both neprilysin and the renin-angiotensin system should yield additional improvement in HF management. The neprilysin inhibitor was combined with an ARB instead of an ACE inhibitor because of significant angioedema seen in earlier-phase trials when it was combined with an ACE inhibitor, believed to be related to increases in bradykinin caused by both agents.
Findings
In this multicenter, blinded, randomized trial, over 10,000 patients with known HF (ejection fraction <35%, New York Heart Association class II or higher) went through 2 run-in periods to ensure tolerance of both enalapril and the study drug, a combination of a neprilysin inhibitor and valsartan (neprilysin-I/ARB). Eventually, 8442 patients were randomized to either enalapril (10 mg twice a day) or neprilysin-I/ARB (200 mg twice a day). The primary outcome was a combination of cardiovascular mortality and heart failure hospitalizations. The trial was stopped early at 27 months because of overwhelming benefit with neprilysin-I/ARB (21.8% vs 26.5%; P<0.001). There was a 20% reduction specifically in cardiovascular mortality (13.3% vs 16.5%; hazard ratio [HR]: 0.80; P<0.001). The number needed to treat (NNT) was 32. There was also a 21% reduction in the risk of hospitalization (P<0.001). More patients on neprilysin-I/ARB had symptomatic hypotension (14% vs 9.2%; P<0.001), but patients on the ACE inhibitor experienced more cough, hyperkalemia, and increases in serum creatinine.
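As a quick check of the NNT reported above (a standard calculation, using the cardiovascular mortality rates from the trial), the NNT is the reciprocal of the absolute risk reduction:

\[
\text{NNT} = \frac{1}{\text{ARR}} = \frac{1}{0.165 - 0.133} = \frac{1}{0.032} \approx 32
\]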
Cautions
There are 2 reasons clinicians may not see the same results in practice. First, the trial was stopped early, which can sometimes exaggerate benefits.[3] Second, the 2 run‐in periods eliminated patients who could not tolerate the medications at the trial doses. Additionally, although the study's authors were independent, the trial was funded by a pharmaceutical company.
Implications
This new combination of a neprilysin inhibitor and valsartan shows great promise for reducing cardiovascular mortality and heart failure hospitalizations compared to enalapril alone. Given the high morbidity and mortality of heart failure, a new agent in the treatment algorithm will be useful to patients and physicians. The drug was approved by the FDA in July 2015 and will likely be offered as an alternative to ACE inhibitors.
VENOUS CONTRAST‐INDUCED NEPHROTOXICITY: IS THERE REALLY A RISK?
McDonald J, McDonald R, Carter R, et al. Risk of intravenous contrast material-mediated acute kidney injury: a propensity score-matched study stratified by baseline-estimated glomerular filtration rate. Radiology. 2014;271(1):65–73.
McDonald R, McDonald J, Carter R, et al. Intravenous contrast material exposure is not an independent risk factor for dialysis or mortality. Radiology. 2014;273(3):714–725.
Background
It is common practice to withhold intravenous contrast material from computed tomography (CT) scans in patients with even moderately poor renal function out of concern for causing contrast-induced nephropathy (CIN). Our understanding of CIN is based largely on observational studies and outcomes of cardiac catheterizations, where larger amounts of contrast are given intra-arterially into an atherosclerotic aorta.[4] The exact mechanism of injury is not clear, possibly direct tubule toxicity or renal vasoconstriction.[5] CIN is defined as a rise in creatinine >0.5 mg/dL, or a >25% rise in serum creatinine, 24 to 48 hours after receiving intravenous contrast. Although it is usually self-limited, there is concern that patients who develop CIN have an increased risk of dialysis and death.[6] In the last few years, radiologists have started to question whether the risk of CIN is overstated. A recent meta-analysis of 13 studies demonstrated a similar likelihood of acute kidney injury in patients regardless of whether they received intravenous contrast.[7] If the true incidence of CIN after venous contrast is actually lower, this raises the question of whether we are unnecessarily withholding contrast from CTs and thereby reducing their diagnostic accuracy. Two 2014 observational studies provide additional evidence that the concern for CIN may be overstated.
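To make the definition concrete, here is a minimal sketch in Python encoding the thresholds quoted above; the function name and this encoding are our own illustration, not a clinical tool:

```python
def meets_cin_definition(baseline_cr: float, followup_cr: float) -> bool:
    """Both creatinine values in mg/dL; followup is the 24-48h value.

    Encodes the CIN definition quoted above: an absolute rise >0.5 mg/dL
    or a relative rise >25% from baseline after intravenous contrast.
    """
    absolute_rise = followup_cr - baseline_cr
    relative_rise = absolute_rise / baseline_cr
    return absolute_rise > 0.5 or relative_rise > 0.25

# Example: 1.0 -> 1.3 mg/dL is only a 0.3 mg/dL absolute rise, but a 30%
# relative rise, so it meets the definition.
print(meets_cin_definition(1.0, 1.3))  # True
```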
Findings
The 2 Mayo Clinic studies used the same database. They looked at all patients who underwent a contrast-enhanced or unenhanced thoracic, abdominal, or pelvic CT between January 2000 and December 2010 at the Mayo Clinic. After limiting the data to patients with pre- and post-CT creatinine measurements and excluding anyone on dialysis, with preexisting acute kidney injury, or who had received additional contrast within 14 days, they ended up with 41,229 patients, mostly inpatients. All of the patients were assigned propensity scores based on risk factors for the development of CIN and whether they would likely receive contrast. The patients were then subdivided into 4 renal function subgroups based on estimated glomerular filtration rate (eGFR). The patients who received contrast were matched based on their propensity scores to those who did not receive contrast within their eGFR subgroups. Unmatched patients were eliminated, leaving a cohort of 12,508 matched patients. The outcome of the first article was acute kidney injury (AKI), defined as a rise in creatinine >0.5 mg/dL at 24 to 48 hours. Though AKI rose with worsening eGFR subgroups (eGFR > 90 [1.2%] vs eGFR < 30 [14%]), the rates of AKI were the same regardless of contrast exposure. There was no statistical difference in any of the eGFR subgroups. The second study looked at important clinical outcomes: death and the need for dialysis. There was no statistical difference in emergent dialysis (odds ratio [OR]: 0.96; P=0.89) or 30-day mortality (HR: 0.97; P=0.45) regardless of whether the patients received contrast.
Cautions
In propensity matching, unmeasured confounders can bias the results. However, the issue of whether venous contrast causes CIN will unlikely be settled in a randomized controlled trial. For patients with severe renal failure (eGFR < 30), there were far fewer patients in this subgroup, making it harder to draw conclusions. The amount of venous contrast given was not provided. Finally, this study evaluated intravenous contrast for CTs, not intra‐arterial contrast.
Implications
These 2 studies raise doubt as to whether the incidence of AKI after contrast-enhanced CT can be attributed to the contrast itself. The rise in creatinine is probably multifactorial, reflecting lab variation, hydration, blood pressure changes, nephrotoxic drugs, and comorbid disease. In deciding whether to obtain a contrast-enhanced CT for patients with chronic kidney dysfunction, these studies provide more evidence to consider in the decision-making process. A conversation with the radiologist about the benefits gained from using contrast in an individual patient may be of value.
PREVENTION AND PROGNOSIS OF INPATIENT DELIRIUM
Hatta K, Kishi Y, Wada K, et al. Preventive effects of ramelteon on delirium: a randomized placebo-controlled trial. JAMA Psychiatry. 2014;71(4):397–403.
A new melatonin agonist dramatically reduced the incidence of delirium.
Background
Numerous medications and therapeutic approaches have been studied for the prevention of incident delirium in hospitalized medical and surgical patients, with varying success. Many of the tested medications also have the potential for significant undesirable side effects. An earlier small trial of melatonin appeared to show impressive efficacy for this purpose and to be well tolerated, but the substance is not regulated by the FDA.[8] Ramelteon, a melatonin receptor agonist, is approved by the FDA for insomnia, and the authors hypothesized that it, too, may be effective in delirium prevention.
Findings
This study was a multicenter, single‐blinded, randomized controlled trial of the melatonin‐agonist ramelteon versus placebo in elderly patients admitted to the hospital ward or ICU with serious medical conditions. Researchers excluded intubated patients or those with Lewy body dementia, psychiatric disorders, and severe liver disease. Patients received either ramelteon or placebo nightly for up to a week, and the primary end point was incident delirium as determined by a blinded observer using a validated assessment tool. Sixty‐seven patients were enrolled. The baseline characteristics in the arms of the trial were similar. In the placebo arm, 11 of 34 patients (32%) developed delirium during the 7‐day observation period. In the ramelteon arm, 1 of 33 (3%) developed delirium (P=0.003). The rate of drug discontinuation was the same in each arm.
Cautions
This study is small, and the single‐blinded design (the physicians and patients knew which group they were in but the observers did not) limits the validity of these results, mandating a larger double‐blinded trial.
Implications
In this small study, ramelteon showed a dramatic impact on preventing incident delirium in elderly (nonintubated) patients with serious medical conditions admitted to the ward or intensive care unit (ICU). If larger trials concur with the impact of this well-tolerated and inexpensive medication, the potential reduction in delirium incidence could have a dramatic impact on how care for delirium-vulnerable patients is conducted, as well as on the systems-level costs associated with delirium care. Further studies of this class of medications are needed to more definitively establish its value in delirium prevention.
THE CONFUSION ASSESSMENT METHOD SEVERITY SCORE CAN QUANTIFY PROGNOSIS FOR DELIRIOUS MEDICAL INPATIENTS
Inouye SK, Kosar CM, Tommet D, et al. The CAM-S: development and validation of a new scoring system for delirium severity in 2 cohorts. Ann Intern Med. 2014;160:526–533.
Background
Delirium is common in hospitalized elderly patients, and numerous studies show that there are both short- and long-term implications of developing delirium. Well-studied and validated tools have made identifying delirium fairly straightforward, yet its treatment remains difficult. Additionally, differentiating patients who will have a simpler clinical course from those at risk for a more morbid one has proved challenging. Using the Confusion Assessment Method (CAM), in both its short (4-item) and long (10-item) forms, as the basis for a prognostication tool would give future research on treatment a scale against which to measure impact and would allow clinicians to anticipate which patients are more likely to have difficult clinical courses.
Findings
The CAM Severity (CAM-S) score was derived in 1219 subjects participating in 2 ongoing studies: 1 included high-risk medical inpatients 70 years old or older, and the other included similarly aged patients undergoing major orthopedic, general, or vascular surgeries. Outcomes data were not available for the surgical patients. The CAM items were rated as either present/absent or absent/mild/severe, depending on the item, with an associated score attached to each item such that the 4-item CAM had a score of 0 to 7 and the 10-item CAM a score of 0 to 19 (Table 2). Clinical outcomes in the medical patient cohort showed a dose-response relationship with increasing CAM-S scores with respect to length of stay, adjusted cost, the combined 90-day end point of skilled nursing facility placement or death, and 90-day mortality. Specifically, for patients with a CAM-S (short form) score of 5 to 7, the 90-day rate of death or nursing home residence was 62%, and the 90-day postdischarge mortality rate was 36%.
Table 2. The CAM and the CAM-S (Short Form) Scoring

| CAM item | Rating | CAM-S score |
|---|---|---|
| Acute onset with fluctuating course | Absent | 0 |
| | Present | 1 |
| Inattention or distractibility | Absent | 0 |
| | Mild | 1 |
| | Severe | 2 |
| Disorganized thinking, illogical or unclear ideas | Absent | 0 |
| | Mild | 1 |
| | Severe | 2 |
| Alteration of consciousness | Absent | 0 |
| | Mild | 1 |
| | Severe | 2 |
| Total (short form) | | 0–7 |
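To make the short-form tally concrete, here is a minimal sketch in Python; the weights come from the table above, but the function and variable names are our own illustration, not the published instrument or a validated clinical tool:

```python
# Item weights for the short-form (4-item) CAM-S, per the table above.
SHORT_FORM_WEIGHTS = {
    "acute_onset_fluctuating_course": {"absent": 0, "present": 1},
    "inattention": {"absent": 0, "mild": 1, "severe": 2},
    "disorganized_thinking": {"absent": 0, "mild": 1, "severe": 2},
    "altered_consciousness": {"absent": 0, "mild": 1, "severe": 2},
}

def cam_s_short_form(ratings: dict) -> int:
    """Sum the per-item scores; the short form ranges from 0 to 7."""
    return sum(SHORT_FORM_WEIGHTS[item][level] for item, level in ratings.items())

# Example: fluctuating course + severe inattention + mild disorganized
# thinking + mild altered consciousness = 1 + 2 + 1 + 1 = 5, which falls
# in the high-risk 5-7 stratum described above.
print(cam_s_short_form({
    "acute_onset_fluctuating_course": "present",
    "inattention": "severe",
    "disorganized_thinking": "mild",
    "altered_consciousness": "mild",
}))  # 5
```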
Cautions
The CAM-S, like the CAM, may work less well in patients with hypoactive delirium. The scale has been applied in a surgical cohort, but those study outcomes were not presented in this article, which limits our ability to apply these results to a surgical population at present.
Implications
This study demonstrates that in medical inpatients, the CAM‐S is effective for prognostication. Moreover, the study points out that high‐scoring patients on the CAM‐S have quite poor prognoses, with more than one‐third dying by 3 months. This finding suggests that an important use of the CAM‐S is to identify patients about whom goals of care discussions should be held and end‐of‐life planning initiated if not previously done.
GET EXCITED ABOUT HEPATIC ENCEPHALOPATHY AGAIN: A NEW POSSIBLE TREATMENT
Rahimi R, Singal A, Cuthbert J, et al. Lactulose vs polyethylene glycol 3350-electrolyte solution for treatment of overt hepatic encephalopathy: the HELP randomized clinical trial. JAMA Intern Med. 2014;174(11):1727–1733.
Background
Lactulose has been the principal treatment for acute hepatic encephalopathy (HE) since 1966.[9] It theoretically works by lowering the pH of the colon and trapping ammonia as ammonium, which is then expelled. Alternatively, it may simply decrease transit time through the colon. In fact, earlier treatments for HE were cathartics such as magnesium salts. Unfortunately, 20% to 30% of patients respond poorly to lactulose, and patients do not like taking it. This new study tests whether a modern-day cathartic, polyethylene glycol, works as well as lactulose.
Findings
In this unblinded, randomized controlled trial, patients presenting to the emergency department with acute HE were assigned to either lactulose 20 to 30 g for a minimum of 3 doses over 24 hours or 4 L of polyethylene glycol (PEG) over 4 hours. The 2 groups were similar in severity and etiology of liver disease. Patients were allowed to have received 1 dose of lactulose in the emergency department prior to study enrollment, and they were excluded if taking rifaximin. The primary outcome was improvement in the hepatic encephalopathy scoring algorithm (HESA) by at least 1 grade at 24 hours.[10] The algorithm grades HE from 0 (no clinical findings of HE) to 5 (comatose). Initial mean HESA scores in the 2 groups were identical (2.3).
In the lactulose group, 13/25 patients (52%) improved by at least 1 HESA grade at 24 hours, and 2 patients (8%) cleared completely with a HESA score of 0. In comparison, 21/23 (91%) in the PEG group improved at 24 hours, and 10/23 (43%) cleared with a HESA score of 0 (P<0.01). The median time to HE resolution was 2 days in the lactulose group compared with 1 day in the PEG group (P=0.01). There were no differences in serious adverse events. The majority (76%) of the PEG group received the full 4 L of PEG.
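To make the primary outcome concrete, a responder under this definition is simply a patient whose HESA grade falls by at least 1 within 24 hours. The toy encoding below is ours and purely illustrative of that rule, not part of the trial's methods.

```python
# Illustrative (hypothetical) encoding of the HELP trial's primary outcome:
# improvement of at least 1 HESA grade, where grades run from
# 0 (no clinical findings of HE) to 5 (comatose).

def hesa_responder(baseline_grade: int, grade_24h: int) -> bool:
    """Return True if the patient improved by >= 1 grade at 24 hours."""
    assert 0 <= baseline_grade <= 5 and 0 <= grade_24h <= 5
    return baseline_grade - grade_24h >= 1

print(hesa_responder(2, 1))  # True: improved by 1 grade
print(hesa_responder(3, 3))  # False: unchanged
```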
Cautions
The main limitations of the trial were its small sample size, single‐center design, and lack of blinding. Additionally, 80% of the PEG group received 1 dose of lactulose prior to enrollment. Significantly more patients in the PEG group developed hypokalemia, which can worsen HE; therefore, if PEG is used for acute HE, potassium will need to be monitored.
Implications
The results are intriguing and may represent a new possible treatment for acute HE once larger studies are done. Interestingly, the ammonia level dropped further in the lactulose group than the PEG group, yet there was more cognitive improvement in the PEG group. This raises questions about the role of ammonia and catharsis in HE. Although lactulose and rifaximin continue to be the standard of care, cathartics may be returning as a viable alternative.
SHOULD β‐BLOCKERS BE STOPPED IN PATIENTS WITH CIRRHOSIS WHEN SPONTANEOUS BACTERIAL PERITONITIS OCCURS?
Mandorfer M, Bota S, Schwabl P, et al. Nonselective beta blockers increase risk for hepatorenal syndrome and death in patients with cirrhosis and spontaneous bacterial peritonitis. Gastroenterology. 2014;146:1680–1690.
Background
Nonselective β‐blockers (NSBBs) are considered the aspirin of hepatologists, as they are used for primary and secondary prevention of variceal bleeds in patients with cirrhosis.[11] Their benefit in reducing bleeding risk has been known since the 1980s, and more recently there has been evidence that they may reduce the risk of developing ascites in patients with compensated cirrhosis. Yet there has been some contradictory evidence suggesting reduced survival in patients with decompensated cirrhosis and infections who are on NSBBs. This has led to the "window hypothesis" of NSBBs in cirrhosis, in which NSBBs are beneficial only during a certain window in the progression of cirrhosis.[12] Early in cirrhosis, before the development of varices or ascites, NSBBs have no benefit. As cirrhosis progresses and portal hypertension develops, NSBBs play a major role in reducing bleeding from varices. However, in advanced cirrhosis, NSBBs may become harmful: in theory, they block the body's attempt to increase cardiac output during situations of increased physiologic stress, resulting in decreased mean arterial pressure and perfusion. This, in turn, causes end‐organ damage and an increased risk of death. Exactly when this NSBB window closes is unclear. A 2014 study suggests the window should close when patients develop spontaneous bacterial peritonitis (SBP).
Findings
This retrospective study followed 607 consecutive patients seen at a liver transplant center in Vienna, Austria, from 2006 to 2011. All of the patients were followed from the time of their first paracentesis and were excluded if SBP was diagnosed at that first paracentesis. Patients were grouped based on whether they took an NSBB. As expected, more patients on an NSBB had varices (90% vs 62%; P<0.001) and a lower mean heart rate (77.5 vs 83.9 beats/minute; P<0.001). However, the 2 groups were similar in mean arterial pressure, systolic blood pressure, Model for End‐Stage Liver Disease score (17.5), Child‐Pugh score (CPS) (50% were class C), and etiology of cirrhosis (55% were from alcoholic liver disease). Patients were followed for the development of SBP, and the primary outcome was transplant‐free survival. For the patients who never developed SBP, there was a 25% reduction in the risk of death for those on an NSBB, adjusted for varices and CPS stage (HR=0.75, P=0.027). However, for the 182 patients who developed SBP, those on an NSBB had a 58% increased risk of death, again adjusted for varices and CPS stage (HR=1.58, P=0.014). Among the patients who developed SBP, those on an NSBB also had a higher risk of hepatorenal syndrome (HRS) within 90 days (24% vs 11%, P=0.027). Although mean arterial pressure (MAP) had been similar in the 2 groups before SBP, after the development of SBP those on an NSBB had a significantly lower MAP (77.2 vs 82.6 mm Hg, P=0.005).
Cautions
This is a retrospective study, and although the authors controlled for varices and CPS, it is still possible the 2 groups were not similar. Whether patients were actually taking the NSBB is unknown, and doses of the NSBB were variable.
Implications
This study provides more evidence for the NSBB window hypothesis in the treatment of patients with cirrhosis. It suggests that the NSBB window closes when patients develop SBP, as NSBBs appear to increase mortality and the risk of HRS. Thus, NSBB therapy should probably be discontinued in cirrhotic patients who develop SBP. The question is, for how long? The editorial accompanying the article says permanently.[13]
VTE PROPHYLAXIS FOR MEDICAL INPATIENTS: IS IT A THING OF THE PAST?
Flanders SA, Greene T, Grant P, et al. Hospital performance for pharmacologic venous thromboembolism prophylaxis and rate of venous thromboembolism: a cohort study. JAMA Intern Med. 2014;174(10):1577–1584.
Background
Based on early research studies, many quality and regulatory organizations have stressed the importance of assessing hospitalized patients' venous thromboembolism (VTE) risk and providing pharmacologic or mechanical prophylaxis to those at increased risk. In 2011, a meta‐analysis of 40 studies of medical and stroke patients, including approximately 52,000 patients, failed to demonstrate a mortality benefit: for every 3 pulmonary embolisms (PEs) prevented per 1000 patients, prophylaxis caused 4 major bleeding episodes.[14] A second study in 2011, a multicenter, randomized controlled trial in medically complex patients deemed at high risk for VTE, also failed to demonstrate a mortality benefit.[15] Despite these and other trials showing questionable benefit, guidelines continue to recommend that high‐risk medical patients receive pharmacologic prophylaxis against VTE.
Findings
This retrospective cohort study evaluated 20,794 medical (non‐ICU) patients across 35 hospitals, excluding those with a Caprini score of <2 (ie, low risk for VTE). The authors divided the hospitals into tertiles based on adherence to VTE prophylaxis guidelines. Patients were followed to 90 days after hospitalization with telephone calls (reaching 56%) and chart reviews (100% reviewed) to identify clinically evident VTE events, excluding those that occurred within the first 3 days of the index hospitalization. The study identified no statistically significant differences among the tertiles in VTE rates, either in the hospital or at 90 days, though the overall VTE event rate was low. Interestingly, 85% of events took place postdischarge. Subgroup analyses also failed to identify a population of medical patients who benefited from prophylaxis.
Cautions
There is debate about whether the Caprini risk score is the best available VTE risk scoring system. This study also excluded surgical and ICU patients.
Implications
This trial adds to the mounting literature suggesting that current guidelines‐based pharmacologic VTE prophylaxis for medical patients may offer no clear benefit in terms of incident VTE events or mortality. Although it is not yet time to abandon VTE prophylaxis completely, this study does raise the important question of whether it is time to revisit the quality guidelines and regulatory standards around VTE prophylaxis in medical inpatients. It also highlights the difficulty in assessing medical patients for their VTE risk. Though this study is provocative and important for its real‐world setting, further studies are required.
OUT WITH THE OLD AND IN WITH THE NEW? SHOULD DIRECT ORAL ANTICOAGULANTS BE OUR FIRST CHOICE FOR TREATING PATIENTS WITH VTE AND ATRIAL FIBRILLATION?
van Es N, Coppens M, Schulman S, et al. Direct oral anticoagulants compared with vitamin K antagonists for acute venous thromboembolism: evidence from phase 3 trials. Blood. 2014;124(12):1968–1975.
For patients with acute VTE, direct oral anticoagulants work as well and are safer.
Background
There have been 6 large published randomized controlled trials of direct oral anticoagulants (DOACs) versus vitamin K antagonists (VKAs) in patients with acute VTE, with study sizes ranging from approximately 2500 to over 8000 subjects. All showed no significant difference between the arms with respect to efficacy (VTE or VTE‐related death) but had variable results with respect to major bleeding risk, a major concern given the nonreversibility of this class of medications. Additionally, subgroup analysis within these studies was challenging given sample size limitations.
Findings
These 6 studies were combined in a meta‐analysis to address the DOACs' overall efficacy and safety profile and to examine prespecified subgroups. The meta‐analysis included data from over 27,000 patients, evenly divided between DOACs (edoxaban, apixaban, rivaroxaban, and dabigatran) and VKAs, with a time in the therapeutic range (TTR) in the VKA arm of 64%. Overall, the primary efficacy endpoint (VTE and VTE‐related death) was similar (DOACs relative risk [RR]=0.90; 95% confidence interval [CI]: 0.77‐1.06), but major bleeding (DOACs RR=0.61; 95% CI: 0.45‐0.83; NNT=150) and combined fatal and intracranial bleeding (DOACs RR=0.37; 95% CI: 0.27‐0.68; NNT=314) favored the DOACs. In subgroup analysis, there was no efficacy difference between the therapeutic groups in the subsets with DVT specifically, with PE specifically, or with weight >100 kg, though safety data in these subsets were not evaluable. Patients with creatinine clearances of 30 to 49 mL/min demonstrated similar efficacy in both treatment arms, and the safety analysis in this subset with moderate renal impairment favored the DOAC arm. Cancer patients achieved better efficacy with similar safety with the DOACs, whereas elderly patients achieved both better safety and efficacy with DOACs.
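The reported relative risks and NNTs are linked by simple arithmetic: for a relative risk RR against a control‐arm event rate CER, the absolute risk reduction is CER × (1 − RR), and NNT is its reciprocal. The sketch below (variable names ours) inverts the reported NNT of 150 for major bleeding to recover the implied VKA‐arm event rate; it is a back‐of‐the‐envelope check, not a calculation from the paper.

```python
# Back-of-the-envelope check linking the reported RR and NNT for major
# bleeding (DOACs vs VKAs). Not the authors' analysis code.

rr_major_bleeding = 0.61  # relative risk, as reported
nnt_reported = 150        # number needed to treat, as reported

arr = 1 / nnt_reported                      # absolute risk reduction (~0.67%)
cer_implied = arr / (1 - rr_major_bleeding) # implied VKA-arm event rate
print(f"implied VKA major-bleeding rate: {cer_implied:.1%}")  # ~1.7%
```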
Cautions
As yet, there are inadequate data on patients with more advanced renal failure (creatinine clearance <30 mL/min) to advise using DOACs in that subset. Also, as there were no data comparing cancer patients with VTE that investigated DOACs versus low molecular weight heparins (the standard of care rather than warfarin since the CLOT [Comparison of Low‐molecular‐weight heparin versus Oral anticoagulant Therapy] trial[16]), the current meta‐analysis does not yet answer whether DOACs should be used in this population despite the efficacy benefit noted in the subgroup analysis.
Implications
This large meta‐analysis strongly suggests that the DOACs achieve treatment efficacy comparable to VKAs, with better safety profiles, in patients with acute VTE. In the subset of patients with moderate renal impairment (creatinine clearance 30 to 49 mL/min), it appears safe and effective to choose DOACs.
IN PATIENTS WITH ATRIAL FIBRILLATION, DOACs APPEAR MORE EFFECTIVE THAN VKAs WITH COMPARABLE OR BETTER SAFETY PROFILES
Ruff CT, Giugliano RP, Braunwald E, et al. Comparison of the efficacy and safety of new oral anticoagulants with warfarin in patients with atrial fibrillation: a meta‐analysis of randomized trials. Lancet. 2014;383(9921):955–962.
Background
Several meta‐analyses of the original phase 3 randomized trials comparing DOACs with VKAs for the treatment of atrial fibrillation (AF) had already been published. In 2013, an additional edoxaban trial, ENGAGE AF‐TIMI 48 (Effective Anticoagulation with Factor Xa Next Generation in Atrial Fibrillation–Thrombolysis in Myocardial Infarction 48), was published, and its inclusion offers a better opportunity to glean important subgroup information.[17]
Findings
This meta‐analysis included data on 71,683 patients, 42,411 in the DOAC arm and 29,272 in the warfarin arm; 2 of the trials were 3‐arm studies comparing warfarin to both a high dose and a low dose of the DOAC. Meta‐analyses of the 4 trials were broken down into a high‐dose subset (the 2 high‐dose arms plus the standard doses used in the other 2 trials) and a low‐dose subset (the 2 low‐dose arms plus the standard doses used in the other 2 trials). With respect to the efficacy endpoint (incident stroke or systemic embolization), the high‐dose subset analyses of the DOACs yielded a 19% reduction (P<0.0001; NNT=142) relative to the VKAs. The safety endpoint of major bleeding in this analysis identified a 14% reduction in the DOAC group that was nonsignificant (P=0.06). Within the high‐dose subset, analyses favored DOACs with respect to hemorrhagic stroke (51% reduction; P<0.0001; NNT=220), intracranial hemorrhage (52% reduction; P<0.0001; NNT=132), and overall mortality (10% reduction; P=0.0003; NNT=129), whereas DOACs increased the risk of gastrointestinal bleeding (25% increase; P=0.043; NNH=185). There was no significant difference between DOACs and warfarin with respect to ischemic stroke. The low‐dose subset had similar overall results, with even fewer hemorrhagic strokes balanced against a higher incidence of ischemic strokes in the DOAC arm than with warfarin. Other important subgroup analyses suggest the safety and efficacy impact of DOACs holds for both VKA‐naive and VKA‐experienced patients, though it was statistically significant only for VKA‐naive patients. Additionally, anticoagulation centers in the studies with a TTR <66% seemed to gain a safety advantage from the DOACs, whereas both TTR groups (<66% and ≥66%) appeared to achieve an efficacy benefit from DOACs.
Cautions
There are not sufficient data to suggest routinely switching patients tolerating and well managed on VKAs to DOACs for AF.
Implications
DOACs reduce stroke and systemic emboli in patients with AF without increasing intracranial bleeding or hemorrhagic stroke, though at the cost of increased gastrointestinal bleeding with the high‐dose regimens. Patients on the low‐dose regimens have an even lower hemorrhagic stroke risk, but that benefit is negated by a higher risk of ischemic stroke than with VKAs. Centers with lower TTRs (and perhaps, by extrapolation, patients with more difficulty staying in the therapeutic range) may gain more benefit by switching. DOAC therapy should be strongly considered as first‐line treatment for patients newly starting treatment for AF.
IN ELDERLY PATIENTS, THE DOACs APPEAR TO OFFER IMPROVED EFFICACY WITHOUT SACRIFICING SAFETY
Sardar P, Chatterjee S, Chaudhari S, Lip GYH. New oral anticoagulants in elderly adults: evidence from a meta‐analysis of randomized trials. J Am Geriatr Soc. 2014;62(5):857–864.
Background
The prevalence of AF rises with age, as does the prevalence of malignancy, limited mobility, and other comorbidities that increase the risk for VTEs. These factors may also increase the risk of bleeding with conventional therapy with heparins and VKAs. As such, understanding the implications of using DOACs in the elderly population is important.
Findings
This meta‐analysis included the elderly (age ≥75 years) subset of patients from existing AF treatment and VTE treatment and prophylaxis randomized trials comparing DOACs with VKAs, low‐molecular‐weight heparin (LMWH), aspirin, or placebo. The primary safety outcome was major bleeding. For AF trials, the efficacy endpoint was stroke or systemic embolization, whereas in VTE trials it was VTE or VTE‐related death. The authors were able to extract data on 25,031 patients across 10 trials that evaluated rivaroxaban, apixaban, and dabigatran (not edoxaban), with follow‐up ranging from 35 days to 2 years. For the safety outcome, the 2 arms showed no statistical difference (DOAC: 6.4%; conventional therapy: 6.3%; OR: 1.02; 95% CI: 0.73‐1.43). For the efficacy endpoint in VTE studies, DOACs were more effective (3.7% vs 7.0%; OR: 0.45; 95% CI: 0.27‐0.77; NNT=30). For AF, the efficacy analysis also favored DOACs (3.3% vs 4.7%; OR: 0.65; 95% CI: 0.48‐0.87; NNT=71). When the individual DOACs were analyzed for efficacy, rivaroxaban and apixaban both appeared to outperform the VKA/LMWH arm for both VTE and AF treatment, whereas data on dabigatran were available only for AF, also showing an efficacy benefit. Individual DOAC analyses for safety endpoints showed all 3 to be similar to VKA/LMWH.
Cautions
The authors note, however, that coexisting low body weight and renal insufficiency may influence dosing choices in this population; there are specific dosage recommendations in the elderly for some DOACs.
Implications
The use of DOACs in patients aged 75 years and older appears to confer a substantial efficacy advantage in the treatment of VTE and AF. The safety data presented in this meta‐analysis suggest that this class is comparable to VKA/LMWH therapy.
CHANGING INPATIENT MANAGEMENT OF SKIN INFECTIONS
Boucher H, Wilcox M, Talbot G, et al. Once‐weekly dalbavancin versus daily conventional therapy for skin infection. N Engl J Med. 2014;370:2169–2179.
Corey G, Kabler H, Mehra P, et al. Single‐dose oritavancin in the treatment of acute bacterial skin infections. N Engl J Med. 2014;370:2180–2190.
Background
There are over 870,000 hospital admissions yearly for skin infection, making it one of the most common reasons for hospitalization in the United States.[18] Management often requires lengthy treatment with intravenous antibiotics, especially with the emergence of methicillin‐resistant Staphylococcus aureus. Results from 2 large randomized, double‐blinded, multicenter clinical trials of new once‐weekly intravenous antibiotics were published. Dalbavancin and oritavancin are both lipoglycopeptides in the same family as vancomycin; what is unique is that their serum drug concentrations exceed the minimum inhibitory concentrations of target pathogens for over a week. Both drugs were compared with vancomycin in noninferiority trials, and the studies had similar outcomes. The dalbavancin results are presented below.
Findings
Researchers randomized 1312 patients with significant cellulitis, a large abscess, or a wound infection. Patients also had fever, leukocytosis, or bandemia, and the infection had to be deemed severe enough to require a minimum of 3 days of intravenous antibiotics. Patients could not have received any prior antibiotics. Over 80% of the patients had fevers, and more than half met criteria for the systemic inflammatory response syndrome. Patients were randomized to either dalbavancin (on day 1 and day 8) or vancomycin every 12 hours (1 g or 15 mg/kg), with both groups receiving placebo dosing of the other drug. The blinded physicians could decide to switch to an oral agent (placebo, or linezolid in the vancomycin group) anytime after day 3 and could stop antibiotics anytime after day 10; otherwise, all patients received 14 days of antibiotics.
The FDA‐approved outcome was cessation of spread of erythema at 48 to 72 hours and no fever on 3 independent readings. Results were similar in the dalbavancin group compared with the vancomycin–linezolid group (79.7% vs 79.8%), and dalbavancin was deemed noninferior to vancomycin. The blinded investigators' assessment of treatment success at 2 weeks was also similar (96% vs 96.7%, respectively). More treatment‐related adverse events occurred in the vancomycin–linezolid group (183 vs 139; P=0.02), and more deaths occurred in the vancomycin group (7 vs 1; P=0.03).
Cautions
These antibiotics have been shown effective only for complicated, acute bacterial skin infections; their performance for other gram‐positive infections is unknown. In the future, it is possible that patients with severe skin infections will receive a dose of these antibiotics on hospital day 1 and be sent home with close follow‐up. However, the study to confirm the efficacy and safety of that approach has not yet been done. Though the drugs appear safe, more clinical experience is needed before they become standard of care, especially because of their long half‐lives. Finally, these drugs are very expensive and provide broad‐spectrum gram‐positive coverage; they are not meant for a simple cellulitis.
Implications
These 2 new once‐weekly antibiotics, dalbavancin and oritavancin, are noninferior to vancomycin for acute bacterial skin infections. They provide alternative treatment choices for managing patients with significant infections requiring hospitalization and, in the future, may obviate hospitalization for some of these patients or significantly reduce their length of stay. Though the drugs are expensive, a significant reduction in hospitalization may offset their costs.
SHOULD THEY STAY OR SHOULD THEY GO? FAMILY PRESENCE DURING CPR MAY IMPROVE THE GRIEF PROCESS DURABLY
Jabre P, Tazarourte K, Azoulay E, et al. Offering the opportunity for family to be present during cardiopulmonary resuscitation: 1‐year assessment. Intensive Care Med. 2014;40:981–987.
Background
In 2013, a French study randomized adult family members of patients undergoing cardiopulmonary resuscitation (CPR) at home either to be invited to stay and watch the resuscitation or to receive no specific invitation.[19] At 90 days, those who were invited to watch (and 79% did) had fewer symptoms of post‐traumatic stress disorder (PTSD) (27% vs 37%) and anxiety (15% vs 23%), though not depression, than the group not offered the opportunity to watch (though 43% of that group watched anyway). There were 570 subjects (family members) in the trial, and a greater number in the control arm declined to participate in 90‐day follow‐up because of emotional distress. Notably, only 4% of the patients undergoing CPR in this study survived to day 28. Whether the apparent positive psychological impact of the offer to watch CPR was durable remained in question.
Findings
The study group followed the families for up to 1 year. Dropout rates at that point were similar (with the assumption, as in the prior study, that those who dropped out of either arm had PTSD symptoms). At follow‐up, subjects were again assessed for PTSD, anxiety, and depression symptoms, as well as for meeting criteria for a major depressive episode or complicated grief. Four hundred eight of the original 570 subjects underwent reevaluation. At 1 year, the group offered the chance to watch CPR had fewer PTSD symptoms (20% vs 32%) and depression symptoms (10% vs 16%), as well as fewer major depressive episodes (23% vs 31%) and less complicated grief (21% vs 36%), but without a durable impact on anxiety symptoms.
Cautions
The resuscitation efforts in question here occurred out of hospital (in the home). Part of the protocol for those family members observing CPR was that a clinician was assigned to stay with them and explain the resuscitation process as it occurred.
Implications
It is postulated that having the chance to observe CPR, if desired, may help the grieving process. This study clearly raises a question about the wisdom of routinely escorting patients' families out of the room during resuscitative efforts. It seems likely that the durable and important psychological effects observed in this study would similarly persist in emergency department and inpatient settings, where staff can be with patients' families to talk them through the events they are witnessing. It is time to ask families whether they prefer to stay and watch CPR, rather than automatically moving them to a waiting room.
Disclosure: Nothing to report.
- Journals in the 2014 release of the JCR. Available at: http://scientific.thomsonreuters.com/imgblast/JCRFullCovlist-2014.pdf. Accessed August 28, 2015.
- Neprilysin inhibition—a novel therapy for heart failure. N Engl J Med. 2014;371(11):1062–1064.
- Stopping randomized trials early for benefit and estimation of treatment effects: systematic review and meta‐regression analysis. JAMA. 2010;303(12):1180–1187.
- Intravenous contrast medium‐induced nephrotoxicity: is the medical risk really as great as we have come to believe? Radiology. 2010;256(1):21–28.
- Pathophysiology of contrast medium‐induced nephropathy. Kidney Int. 2005;68(1):14–22.
- Contrast‐induced acute kidney injury: short‐ and long‐term implications. Semin Nephrol. 2011;31(3):300–309.
- Frequency of acute kidney injury following intravenous contrast medium administration: a systematic review and meta‐analysis. Radiology. 2013;267(1):119–128.
- Melatonin decreases delirium in elderly patients: a randomized, placebo‐controlled trial. Int J Geriatr Psychiatry. 2011;26(7):687–694.
- Lactulose in the treatment of chronic portal‐systemic encephalopathy: a double‐blind clinical trial. N Engl J Med. 1969;281(8):408–412.
- Performance of the hepatic encephalopathy scoring algorithm in a clinical trial of patients with cirrhosis and severe hepatic encephalopathy. Am J Gastroenterol. 2009;104(6):1392–1400.
- The changing role of beta‐blocker therapy in patients with cirrhosis. J Hepatol. 2014;60(3):643–653.
- The window hypothesis: haemodynamic and non‐haemodynamic effects of beta‐blockers improve survival of patients with cirrhosis during a window in the disease. Gut. 2012;61(7):967–969.
- When should the beta‐blocker window in cirrhosis close? Gastroenterology. 2014;146(7):1597–1599.
- Venous thromboembolism prophylaxis in hospitalized medical patients and those with stroke: a background review for an American College of Physicians Clinical Practice Guideline. Ann Intern Med. 2011;155(9):602–615.
- LIFENOX Investigators. Low‐molecular‐weight heparin and mortality in acutely ill medical patients. N Engl J Med. 2011;365(26):2463–2472.
- Randomized Comparison of Low‐Molecular‐Weight Heparin versus Oral Anticoagulant Therapy for the Prevention of Recurrent Venous Thromboembolism in Patients with Cancer (CLOT) Investigators. Low‐molecular‐weight heparin versus a coumarin for the prevention of recurrent venous thromboembolism in patients with cancer. N Engl J Med. 2003;349(2):146–153.
- ENGAGE AF‐TIMI 48 Investigators. Edoxaban versus warfarin in patients with atrial fibrillation. N Engl J Med. 2013;369(22):2093–2104.
- Pharmacology and the treatment of complicated skin and skin‐structure infections. N Engl J Med. 2014;370(23):2238–2239.
- Family presence during cardiopulmonary resuscitation. N Engl J Med. 2013;368(11):1008–1018.
Keeping up with the medical literature in a field as broad as hospital medicine is a daunting task. In 2014 alone, there were over 9200 articles published in top‐tier internal medicine journals.[1] The authors have selected articles from among these top journals using a nonsystematic process that involved reviewing articles brought to their attention via colleagues, literature searches, and online services. The focus was to identify articles that would be of importance to the field of hospital medicine for their potential to be practice changing, provocative, or iconoclastic. After culling through hundreds of titles and abstracts, 46 articles were reviewed by both authors in full text, and ultimately 14 were selected for presentation here. Table 1 summarizes the key points.
|
1. Now that neprolysin inhibitors are approved by the FDA, hospitalists will see them prescribed as an alternative to ACE‐inhibitors given their impressive benefits in cardiovascular mortality and heart failure hospitalizations. |
2. Current evidence suggests that intravenous contrast given with CT scans may not significantly alter the incidence of acute kidney injury, its associated mortality, or the need for hemodialysis. |
3. The CAM‐S score is an important tool for prognostication in delirious patients. Those patients with high CAM‐S scores should be considered for goals of care conversations. |
4. The melatonin agonist, ramelteon, shows promise for lowering incident delirium among elderly medical patients, though larger trials are still needed. |
5. Polyethylene glycol may be an excellent alternative to lactulose for patients with acute hepatic encephalopathy once larger studies are done, as it is well tolerated and shows faster resolution of symptoms. |
6. Nonselective ‐blockers should no longer be offered to cirrhotic patients after they develop spontaneous bacterial peritonitis, as they are associated with increased mortality and acute kidney injury. |
7. Current guidelines regarding prophylaxis against VTE in medical inpatients likely result in nonbeneficial use of medications for this purpose. It remains unclear which high‐risk populations do benefit from pharmacologic prophylaxis. |
8. DOACs are as effective and are safer than conventional therapy for treatment of VTE, though they are not recommended in patients with GFR <30 mL/min. |
9. DOACs are more effective and are safer (though they may increase risk of gastrointestinal bleeding) than conventional therapy in patients with AF. |
10. DOACs are as safe and more effective than conventional therapy in elderly patients with VTE or AF, being mindful of dosing recommendations in this population. |
11. Two new once‐weekly antibiotics, dalbavancin and oritavancin, approved for skin and soft tissue infections, appear noninferior to vancomycin and have the potential to shorten hospitalizations and, in doing so, may decrease cost. |
12. Offering family members of a patient undergoing CPR the opportunity to observe has durable impact on meaningful short‐ and long‐term psychological outcomes. Clinicians should strongly consider making this offer. |
AN APPROACHING PARADIGM SHIFT IN THE TREATMENT FOR HEART FAILURE
McMurray J, Packer M, Desai A, et al. Angiotensin‐neprilysin inhibition versus enalapril in heart failure. N Engl J Med. 2014;371:9931004.
Background
The last drug approved by the Food and Drug Administration (FDA) for heart failure (HF) was 10 years ago.[2] The new PARADIGM (Prospective Comparison of ARNI With ACEI to Determine Impact on Global Mortality and Morbidity in Heart Failure) heart failure study comparing a novel combination drug of a neprilysin inhibitor and angiotensin receptor blocker (ARB) to an angiotensin‐converting enzyme (ACE) inhibitor has cardiologists considering a possible change in the HF treatment algorithm. Neprilysin is a naturally occurring enzyme that breaks down the protective vasoactive peptides (brain natriuretic peptide, atrial natriuretic peptide, and bradykinin) made by the heart and the body in HF. These vasoactive peptides function to increase vasodilation and block sodium and water reabsorption. This novel neprilysin inhibitor extends the life of these vasoactive peptides, thus enhancing their effect. By inhibiting both neprilysin and the renin‐angiotensin system, there should be additional improvement in HF management. The neprilysin inhibitor was combined with an ARB instead of an ACE inhibitor because of significant angioedema seen in earlier phase trials when combined with an ACE inhibitor. This is believed related to increases in bradykinin due to both agents.
Findings
In this multicenter, blinded, randomized trial, over 10,000 patients with known HF (ejection fraction<35%, New York Heart Association class II or higher) went through 2 run‐in periods to ensure tolerance of both enalapril and the study drug, a combination of a neprilysin inhibitor and valsartan (neprilysin‐I/ARB). Eventually 8442 patients underwent randomization to either enalapril (10 mg twice a day) or neprilysin‐I/ARB (200 mg twice a day). The primary outcome was a combination of cardiovascular mortality and heart failure hospitalizations. The trial was stopped early at 27 months because of overwhelming benefit with neprilysin‐I/ARB (21.8% vs 26.5%; P<0.001). There was a 20% reduction specifically in cardiovascular mortality (13.3% vs 16.5%; hazard ratio [HR]: 0.80; P<0.001). The number needed to treat (NNT) was 32. There was also a 21% reduction in the risk of hospitalization (P<0.001). More patients with neprilysin‐I/ARB had symptomatic hypotension (14% vs 9.2%; P<0.001) but patients on the ACE inhibitor experienced more cough, hyperkalemia, and increases in their serum creatinine.
Cautions
There are 2 reasons clinicians may not see the same results in practice. First, the trial was stopped early, which can sometimes exaggerate benefits.[3] Second, the 2 run‐in periods eliminated patients who could not tolerate the medications at the trial doses. Additionally, although the study's authors were independent, the trial was funded by a pharmaceutical company.
Implications
This new combination drug of a neprilysin inhibitor and valsartan shows great promise at reducing cardiovascular mortality and hospitalizations for heart failure compared to enalapril alone. Given the high morbidity and mortality of heart failure, having a new agent in the treatment algorithm will be useful to patients and physicians. The drug was just approved by the FDA in July 2015 and will likely be offered as an alternative to ACE inhibitors.
VENOUS CONTRAST‐INDUCED NEPHROTOXICITY: IS THERE REALLY A RISK?
McDonald J, McDonald R, Carter R, et al. Risk of intravenous contrast material‐mediated acute kidney injury: a propensity score‐matched study stratified by baseline‐estimated glomerular filtration rate. Radiology. 2014;271(1):6573.
McDonald R, McDonald J, Carter R, et al. Intravenous contrast material exposure is not an independent risk factor for dialysis or mortality. Radiology. 2014;273(3):714725.
Background
It is a common practice to withhold intravenous contrast material from computed tomography (CT) scans in patients with even moderately poor renal function out of concern for causing contrast‐induced nephropathy (CIN). Our understanding of CIN is based largely on observational studies and outcomes of cardiac catheterizations, where larger amounts of contrast are given intra‐arterially into an atherosclerotic aorta.[4] The exact mechanism of injury is not clear, possibly from direct tubule toxicity or renal vasoconstriction.[5] CIN is defined as a rise in creatinine >0.5 mg/dL or >25% rise in serum creatinine 24 to 48 hours after receiving intravenous contrast. Although it is usually self‐limited, there is concern that patients who develop CIN have an increase risk of dialysis and death.[6] In the last few years, radiologists have started to question whether the risk of CIN is overstated. A recent meta‐analysis of 13 studies demonstrated a similar likelihood of acute kidney injury in patients regardless of receiving intravenous contrast.[7] If the true incidence of CIN after venous contrast is actually lower, this raises the question of whether we are unnecessarily withholding contrast from CTs and thereby reducing their diagnostic accuracy. Two 2014 observational studies provide additional evidence that the concern for CIN may be overstated.
Findings
The 2 Mayo Clinic studies used the same database. They looked at all patients who underwent a contrast‐enhanced or unenhanced thoracic, abdominal, or pelvic CT between January 2000 and December 2010 at the Mayo Clinic. After limiting the data to patients with pre‐ and post‐CT creatinine measurements and excluding anyone on dialysis, with preexisting acute kidney injury, or who had received additional contrast within 14 days, they ended up with 41,229 patients, mostly inpatients. All of the patients were assigned propensity scores based on risk factors for the development of CIN and whether they would likely receive contrast. The patients were then subdivided into 4 renal function subgroups based on estimated glomerular filtration rate (eGFR). The patients who received contrast were matched based on their propensity scores to those who did not received contrast within their eGFR subgroups. Unmatched patients were eliminated, leaving a cohort of 12,508 matched patients. The outcome of the first article was acute kidney injury (AKI) defined as a rise in creatinine >0.5 mg/dL at 24 to 48 hours. Though AKI rose with worsening eGFR subgroups (eGFR > 90 [1.2%] vs eGFR < 30 [14%]), the rates of AKI were the same regardless of contrast exposure. There was no statistical difference in any of the eGFR subgroups. The second study looked at important clinical outcomesdeath and the need for dialysis. There was no statistical difference for emergent dialysis (odds ratio [OR]: 0.96, P=0.89) or 30‐day mortality (HR: 0.97; P=0.45) regardless of whether the patients received contrast or not.
Cautions
In propensity matching, unmeasured confounders can bias the results. However, the issue of whether venous contrast causes CIN will unlikely be settled in a randomized controlled trial. For patients with severe renal failure (eGFR < 30), there were far fewer patients in this subgroup, making it harder to draw conclusions. The amount of venous contrast given was not provided. Finally, this study evaluated intravenous contrast for CTs, not intra‐arterial contrast.
Implications
These 2 studies raise doubt as to whether the incidence of AKI after contrast‐enhanced CT can be attributed to the contrast itself. What exactly causes the rise in creatinine is probably multifactorial including lab variation, hydration, blood pressure changes, nephrotoxic drugs, and comorbid disease. In trying to decide whether to obtain a contrast‐enhanced CT for patients with chronic kidney dysfunction, these studies provide more evidence to consider in the decision‐making process. A conversation with the radiologist about the benefits gained from using contrast in an individual patient may be of value.
PREVENTION AND PROGNOSIS OF INPATIENT DELIRIUM
Hatta K, Yasuhiro K, Wada K, et al. Preventive effects of ramelteon on delirium: a randomized placebo controlled trial. JAMA Psych. 2014;71(4):397403.
A new melatonin agonist dramatically improves delirium incidence.
Background
Numerous medications and therapeutic approaches have been studied to prevent incident delirium in hospitalized medical and surgical patients with varying success. Many of the tested medications also have the potential for significant undesirable side effects. An earlier small trial of melatonin appeared to have impressive efficacy for this purpose and be well tolerated, but the substance is not regulated by the FDA.[8] Ramelteon, a melatonin receptor agonist, is approved by the FDA for insomnia, and authors hypothesized that it, too, may be effective in delirium prevention.
Findings
This study was a multicenter, single‐blinded, randomized controlled trial of the melatonin‐agonist ramelteon versus placebo in elderly patients admitted to the hospital ward or ICU with serious medical conditions. Researchers excluded intubated patients or those with Lewy body dementia, psychiatric disorders, and severe liver disease. Patients received either ramelteon or placebo nightly for up to a week, and the primary end point was incident delirium as determined by a blinded observer using a validated assessment tool. Sixty‐seven patients were enrolled. The baseline characteristics in the arms of the trial were similar. In the placebo arm, 11 of 34 patients (32%) developed delirium during the 7‐day observation period. In the ramelteon arm, 1 of 33 (3%) developed delirium (P=0.003). The rate of drug discontinuation was the same in each arm.
Cautions
This study is small, and the single‐blinded design (the physicians and patients knew which group they were in but the observers did not) limits the validity of these results, mandating a larger double‐blinded trial.
Implications
Ramelteon showed a dramatic impact on preventing incident delirium on elderly hospitalized patients with serious medical conditions admitted to the ward or intensive care unit (ICU) (nonintubated) in this small study. If larger trials concur with the impact of this well‐tolerated and inexpensive medication, the potential for delirium incidence reduction could have a dramatic impact on how care for delirium‐vulnerable patients is conducted as well as the systems‐level costs associated with delirium care. Further studies of this class of medications are needed to more definitively establish its value in delirium prevention.
THE CONFUSION ASSESSMENT METHOD SEVERITY SCORE CAN QUANTIFY PROGNOSIS FOR DELIRIOUS MEDICAL INPATIENTS
Innoye SK, Kosar CM, Tommet D, et al. The CAM‐S: development and validation of a new scoring system for delirium in 2 cohorts. Ann Intern Med. 2014;160:526533.
Background
Delirium is common in hospitalized elderly patients, and numerous studies show that there are both short‐ and long‐term implications of developing delirium. Using well studied and validated tools has made identifying delirium fairly straightforward, yet its treatment remains difficult. Additionally, differentiating which patients will have a simpler clinical course from those at risk for a more morbid one has proved challenging. Using the Confusion Assessment Method (CAM), both in its short (4‐item) and long (10‐item) forms, as the basis for a prognostication tool, would allow for future research on treatment to have a scale against which to measure impact, and would allow clinicians to anticipate which patients were more likely to have difficult clinical courses.
Findings
The CAM Severity (CAM‐S) score was derived in 1219 subjects participating in 2 ongoing studies: 1 included high‐risk medical inpatients 70 years old or older, and the other included similarly aged patients undergoing major orthopedic, general, or vascular surgeries. Outcomes data were not available for the surgical patients. The CAM items were rated as either present/absent or absent/mild/severe, depending on the item, with an associated score attached to each item such that the 4‐item CAM had a score of 0 to 7 and the 10‐item CAM 0 to 19 (Table 2). Clinical outcomes from the medical patients cohort showed a dose response with increasing CAM‐S scores with respect to length of stay, adjusted cost, combined 90‐day end points of skilled nursing facility placement or death, and 90‐day mortality. Specifically, for patients with a CAM‐S (short form) score of 5 to 7, the 90‐day rate of death or nursing home residence was 62%, whereas the 90‐day postdischarge mortality rate was 36%.
The CAM | The CAM‐S | |
---|---|---|
| ||
Acute onset with fluctuating course | Absent | 0 |
Present | 1 | |
Inattention or distractability | Absent | 0 |
Mild | 1 | |
Severe | 2 | |
Disorganized thinking, illogical or unclear ideas | Absent | 0 |
Mild | 1 | |
Severe | 2 | |
Alteration of consciousness | Absent | 0 |
Mild | 0 | |
Severe | 2 | |
Total | 07 |
Cautions
The CAM‐S, like the CAM, may work less well in patients with hypoactive delirium. This scale has been applied in a surgical cohort, but study outcomes were not presented in this article. This absence limits our ability to apply these results to a surgical population presently.
Implications
This study demonstrates that in medical inpatients, the CAM‐S is effective for prognostication. Moreover, the study points out that high‐scoring patients on the CAM‐S have quite poor prognoses, with more than one‐third dying by 3 months. This finding suggests that an important use of the CAM‐S is to identify patients about whom goals of care discussions should be held and end‐of‐life planning initiated if not previously done.
GET EXCITED ABOUT HEPATIC ENCEPHALOPATHY AGAINA NEW POSSIBLE TREATMENT
Rahimi R, Singal A, Cuthbert J, et al. Lactulose vs polyethylene glycol 3350‐electrolyte solution for treatment of overt hepatic encephalopathy. The HELP randomized clinical trial. JAMA Intern Med. 2014;174(11):17271733.
Background
Lactulose has been the principle treatment for acute hepatic encephalopathy (HE) since 1966.[9] It theoretically works by lowering the pH of the colon and trapping ammonia as ammonium, which is then expelled. Alternatively, it may simply decrease transit time through the colon. In fact, earlier treatments for HE were cathartics such as magnesium salts. Unfortunately 20% tp 30% of patients are poor responders to lactulose, and patients do not like it. This new study tests whether a modern‐day cathartic, polyethylene glycol, works as well as lactulose.
Findings
In this unblinded, randomized controlled trial, patients presenting to the emergency department with acute HE were assigned to either lactulose 20 to 30 g for a minimum of 3 doses over 24 hours or 4 L of polyethylene glycol (PEG) over 4 hours. The2 groups were similar in severity and etiology of liver disease. Patients were allowed to have received 1 dose of lactulose given in the emergency department prior to study enrollment. They were excluded if taking rifaximin. The primary outcome was improvement in the hepatic encephalopathy scoring algorithm (HESA) by 1 grade at 24 hours.[10] The algorithm scores HE from 0 (no clinical findings of HE) to 5 (comatose). Initial mean HESA scores in the 2 groups were identical (2.3).
In the lactulose group, 13/25 (52%) improved by at least 1 HESA score at 24 hours. Two patients (8%) completely cleared with a HESA score of 0. In comparison, 21/23 (91%) in the PEG group improved at 24 hours, and 10/23 (43%) had cleared with a HESA score of 0 (P<0.01). The median time to HE resolution was 2 days in the lactulose group compared with 1 day in the PEG group (P=0.01). There were no differences in serious adverse events. The majority (76%) of the PEG group received the full 4 L of PEG.
Cautions
The main limitations of the trial were the small sample size, that it was a single‐center study, and the fact it was unblinded. Additionally, 80% of the PEG group received 1 dose of lactulose prior to enrollment. Statistically, more patients in the PEG group developed hypokalemia, which can worsen HE. Therefore, if PEG is used for acute HE, potassium will need to be monitored.
Implications
The results are intriguing and may represent a new possible treatment for acute HE once larger studies are done. Interestingly, the ammonia level dropped further in the lactulose group than the PEG group, yet there was more cognitive improvement in the PEG group. This raises questions about the role of ammonia and catharsis in HE. Although lactulose and rifaximin continue to be the standard of care, cathartics may be returning as a viable alternative.
SHOULD ‐BLOCKERS BE STOPPED IN PATIENTS WITH CIRRHOSIS WHEN SPONTANEOUS BACTERIAL PERITONITIS OCCURS?
Mandorfer M, Bota S, Schwabi P, et al. Nonselective beta blockers increase risk for hepatorenal syndrome and death in patients with cirrhosis and spontaneous bacterial peritonitis. Gastroenterology. 2014;146:16801690.
Background
Nonselective ‐blockers (NSBBs) are considered the aspirin of hepatologists, as they are used for primary and secondary prevention of variceal bleeds in patients with cirrhosis.[11] Since the 1980s, their benefit in reducing bleeding risk has been known, and more recently there has been evidence that they may reduce the risk of developing ascites in patients with compensated cirrhosis. Yet, there has been some contradictory evidence suggesting reduced survival in patients with decompensated cirrhosis and infections on NSBBs. This has led to the window hypothesis of NSBBs in cirrhosis, where NSBBs are beneficial only during a certain window period during the progression of cirrhosis.[12] Early on in cirrhosis, before the development of varices or ascites, NSBBs have no benefit. As cirrhosis progresses and portal hypertension develops, NSBBs play a major role in reducing bleeding from varices. However, in advanced cirrhosis, NSBBs may become harmful. In theory, they block the body's attempt to increase cardiac output during situations of increased physiologic stress, resulting in decreased mean arterial pressure and perfusion. This, in turn, causes end‐organ damage and increased risk of death. When exactly this NSBB window closes is unclear. A 2014 study suggests the window should close when patients develop spontaneous bacterial peritonitis (SBP).
Findings
This retrospective study followed 607 consecutive patients seen at a liver transplant center in Vienna, Austria, from 2006 to 2011. All of the patients were followed from the time of their first paracentesis. They were excluded if SBP was diagnosed during the first paracentesis. Patients were grouped based on whether they took an NSBB. As expected, more patients on an NSBB had varices (90% vs 62%; P<0.001) and a lower mean heart rate (77.5 vs 83.9 beats/minute; P<0.001). However, the 2 groups were similar in mean arterial pressure, systolic blood pressure, Model for End‐Stage Liver Disease score (17.5), Childs Pugh Score (CPS) (50% were C), and in the etiology of cirrhosis (55% were from alcoholic liver disease). They followed the patients for development of SBP. The primary outcome was transplant‐free survival. For the patients who never developed SBP, there was a 25% reduction in the risk of death for those on an NSBB adjusted for varices and CPS stage (HR=0.75, P=0.027). However, for the 182 patients who developed SBP, those on an NSBB had a 58% increase risk of death, again adjusted for varices and CPS stage (HR=1.58, P=0.014). Among the patients who developed SBP, there was a higher risk of hepatorenal syndrome (HRS) within 90 days for those on an NSBB (24% vs 11%, P=0.027). Although the mean arterial pressures (MAP) had been similar in the 2 groups before SBP, after the development of SBP, those on an NSBB had a significantly lower MAP (77.2 vs 82.6 mm Hg, P=0.005).
Cautions
This is a retrospective study, and although the authors controlled for varices and CPS, it is still possible the 2 groups were not similar. Whether patients were actually taking the NSBB is unknown, and doses of the NSBB were variable.
Implications
This study provides more evidence for the NSBB window hypothesis in the treatment of patients with cirrhosis. It suggests that the window on NSBB closes when patients develop SBP, as NSBBs appear to increase mortality and the risk of HRS. Thus, NSBB therapy should probably be discontinued in cirrhotic patients developing SBP. The question is for how long? The editorial accompanying the article says permanently.[13]
VTE PROPHYLAXIS FOR MEDICAL INPATIENTS: IS IT A THING OF THE PAST?
Flanders SA, Greene T, Grant P, et al. Hospital performance for pharmacologic venous thromboembolism prophylaxis and rate of venous thromboembolism. A cohort study. JAMA Intern Med. 2014;174(10):15771584.
Background
Based on early research studies, many quality and regulatory organizations have stressed the importance of assessing hospitalized patients' venous thromboembolism (VTE) risk and prophylaxing those patients at increased risk either pharmacologically or mechanically. In 2011, a meta‐analysis of 40 studies of medical and stroke patients including approximately 52,000 patients failed to demonstrate a mortality benefit, showing that for every 3 pulmonary embolisms (PEs) prevented, it caused 4 major bleeding episodes per 1000 patients.[14] A second study in 2011, a multicenter, randomized controlled trial with medically complex patients deemed high risk for VTE, also failed to demonstrate a mortality benefit.[15] Despite these and other trials showing questionable benefit, guidelines continue to recommend that high‐risk medical patients should get pharmacologic prophylaxis against VTE.
Findings
This retrospective cohort trial retrospectively evaluated a cohort of 20,794 medical patients (non‐ICU) across 35 hospitals, excluding those with a Caprini score of <2 (ie, low risk for VTE). The authors divided the hospitals into tertiles based on adherence to VTE prophylaxis guidelines. Patients were followed to 90 days after hospitalization with telephone calls (reaching 56%) and chart reviews (100% reviewed) to identify clinically evident VTE events, excluding those that occurred within the first 3 days of index hospitalization. The study identified no statistically significant differences among the tertiles in terms of VTE rates, either in the hospital or at 90 days, though the overall VTE event rate was low. Interestingly, 85% of events took place postdischarge. Subgroup analyses also failed to identify a population of medical patients who benefited from prophylaxis.
Cautions
Debate about whether the Caprini risk score is the best available VTE risk scoring system exists. This study also excluded surgical and ICU patients.
Implications
This trial adds to the mounting literature suggesting that current guidelines‐based pharmacologic VTE prophylaxis for medical patients may offer no clear benefit in terms of incident VTE events or mortality. Although it is not yet time to abandon VTE prophylaxis completely, this study does raise the important question of whether it is time to revisit the quality guidelines and regulatory standards around VTE prophylaxis in medical inpatients. It also highlights the difficulty in assessing medical patients for their VTE risk. Though this study is provocative and important for its real‐world setting, further studies are required.
OUT WITH THE OLD AND IN WITH THE NEW? SHOULD DIRECT ORAL ANTICOAGULANTS BE OUR FIRST CHOICE FOR CARING FOR PATIENTS WITH VTE AND ATRIAL FIBRILLATION?
van Es N, Coppens M, Schulman S. et al. Direct oral anticoagulants compared with vitamin K antagonists for acute venous thromboembolism: evidence from phase 3 trials. Blood. 2014;124(12):19681975.
For patients with acute VTE, direct oral anticoagulants work as well and are safer.
Background
There have been 6 large published randomized controlled trials of direct oral anticoagulants (DOACs) versus vitamin K antagonists (VKAs) in patients with acute VTE. Study sizes range from approximately 2500 to over 8000 subjects. All showed no significant difference between the arms with respect to efficacy (VTE or VTE‐related death) but had variable results with respect to major bleeding risk, a major concern given the nonreversibility of this group of medications. Additionally, subgroup analysis within these studies was challenging given sample size issues.
Findings
These 6 studies were combined in a meta‐analysis to address the DOACs' overall efficacy and safety profile, as well as looking in prespecified subgroups. The meta‐analysis included data from over 27,000 patients, evenly divided between DOACs (edoxaban, apixaban, rivaroxaban, and dabigatran) and VKAs, with the time in the therapeutic range (TTR) in the VKA arm being 64%. Overall, the primary efficacy endpoint (VTE and VTE‐related death) was similar (DOACs relative tisk [RR]=0.90; 95% confidence interval [CI]: 0.77‐1.06) but major bleeding (DOACs RR=0.61; 95% CI: 0.45‐0.83; NNT=150) and combined fatal and intracranial bleeding (DOACs RR=0.37; 95% CI: 0.27‐0.68; NNT=314) favored the DOACs. In subgroup analysis, there was no efficacy difference between the therapeutic groups in the subset specifically with DVT or with PE, or with patients weighing >100 kg, though safety data in these subsets were not evaluable. Patients with creatinine clearances of 30 to 49 mL/min demonstrated similar efficacy in both treatment arms, and the safety analysis in this subset with moderate renal impairment was better in the DOAC arm. Cancer patients achieved better efficacy with similar safety with the DOACs, whereas elderly patients achieved both better safety and efficacy with DOACs.
Cautions
As yet, there are inadequate data on patients with more advanced renal failure (creatinine clearance <30 mL/min) to advise using DOACs in that subset. Also, as there were no data comparing cancer patients with VTE that investigated DOACs versus low molecular weight heparins (the standard of care rather than warfarin since the CLOT [Comparison of Low‐molecular‐weight heparin versus Oral anticoagulant Therapy] trial[16]), the current meta‐analysis does not yet answer whether DOACs should be used in this population despite the efficacy benefit noted in the subgroup analysis.
Implications
This large meta‐analysis strongly suggests that, in patients with acute VTE, the DOACs achieve treatment efficacy comparable to VKAs with better safety profiles. In the subset of patients with moderate renal impairment (creatinine clearance 30–49 mL/min), it appears safe and effective to choose DOACs.
IN PATIENTS WITH ATRIAL FIBRILLATION, DOACs APPEAR MORE EFFECTIVE THAN VKAs WITH COMPARABLE OR BETTER SAFETY PROFILES
Ruff CT, Giugliano RP, Braunwald E, et al. Comparison of the efficacy and safety of new oral anticoagulants with warfarin in patients with atrial fibrillation: a meta‐analysis of randomized trials. Lancet. 2014;383(9921):955–962.
Background
A 2013 trial of edoxaban, ENGAGE AF‐TIMI 48 (Effective Anticoagulation with Factor Xa Next Generation in Atrial Fibrillation–Thrombolysis in Myocardial Infarction 48), added to the previously published meta‐analyses of the original phase 3 randomized trials comparing the DOACs with VKAs for treatment of atrial fibrillation (AF); its inclusion offers a better opportunity to glean important subgroup information.[17]
Findings
This meta‐analysis included data on 71,683 patients, 42,411 in the DOAC arm and 29,272 in the warfarin arm, as 2 of the trials were 3‐arm studies comparing warfarin to a high dose and a low dose of the DOAC. Meta‐analyses of the 4 trials were broken down into a high‐dose subset (the 2 high‐dose arms plus the standard doses used in the other 2 trials) and a low‐dose subset (the 2 low‐dose arms plus the standard doses used in the other 2 trials). With respect to the efficacy endpoint (incident stroke or systemic embolization), the high‐dose subset analyses of the DOACs yielded a 19% reduction (P<0.0001; NNT=142) relative to the VKAs. The safety endpoint of major bleeding in this analysis identified a 14% reduction in the DOAC group that was nonsignificant (P=0.06). Within the high‐dose subset, analyses favored DOACs with respect to hemorrhagic stroke (51% reduction; P<0.0001; NNT=220), intracranial hemorrhage (52% reduction; P<0.0001; NNT=132), and overall mortality (10% reduction; P=0.0003; NNT=129), whereas they increased the risk of gastrointestinal bleeding (25% increase; P=0.043; NNH=185). There was no significant difference between DOACs and warfarin with respect to ischemic stroke. The low‐dose subset had similar overall results, with even fewer hemorrhagic strokes balanced against a higher incidence of ischemic strokes in the DOAC arm than in the warfarin arm. Other important subgroup analyses suggest the safety and efficacy benefits of DOACs hold for both VKA‐naive and VKA‐experienced patients, though they reached statistical significance only in VKA‐naive patients. Additionally, the anticoagulation centers included in the study that had a TTR <66% seemed to gain a safety advantage from the DOACs, whereas both TTR groups (<66% and ≥66%) appeared to achieve an efficacy benefit from DOACs.
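The same reciprocal relationship between NNT and absolute risk difference lets one back out the approximate event rates these relative effects imply; the estimates below are ours, not figures reported in the meta‐analysis. For stroke or systemic embolization, a 19% relative reduction with NNT=142 implies a warfarin‐arm event risk of

\[
p_{\mathrm{warfarin}} \approx \frac{1}{142 \times 0.19} \approx 3.7\%,
\]

and for gastrointestinal bleeding, a 25% relative increase with NNH=185 implies a baseline risk of roughly \(1/(185 \times 0.25) \approx 2.2\%\) over the trials' follow‐up periods.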
Cautions
There are insufficient data to support routinely switching patients who are tolerating and well managed on VKAs to DOACs for AF.
Implications
DOACs reduce stroke and systemic emboli in patients with AF without increasing intracranial bleeding or hemorrhagic stroke, though at the cost of increased gastrointestinal bleeding in patients on the high‐dose regimens. Patients on the low‐dose regimens have an even lower hemorrhagic stroke risk, but this benefit is negated by a higher risk of ischemic stroke than with VKAs. Centers with lower TTRs (and perhaps, by extrapolation, patients who have more difficulty staying in the therapeutic range) may gain more benefit from switching. Patients newly starting treatment for AF should be strongly considered for DOAC therapy as the first line.
IN ELDERLY PATIENTS, THE DOACs APPEAR TO OFFER IMPROVED EFFICACY WITHOUT SACRIFICING SAFETY
Sardar P, Chatterjee S, Chaudhari S, Lip GYH. New oral anticoagulants in elderly adults: evidence from meta‐analysis of randomized trials. J Am Geriatr Soc. 2014;62(5):857–864.
Background
The prevalence of AF rises with age, as does the prevalence of malignancy, limited mobility, and other comorbidities that increase the risk for VTEs. These factors may also increase the risk of bleeding with conventional therapy with heparins and VKAs. As such, understanding the implications of using DOACs in the elderly population is important.
Findings
This meta‐analysis included the elderly (age ≥75 years) subset of patients from existing AF treatment and VTE treatment and prophylaxis randomized trials comparing DOACs with VKAs, low‐molecular‐weight heparin (LMWH), aspirin, or placebo. The primary safety outcome was major bleeding. For AF trials, the efficacy endpoint was stroke or systemic embolization, whereas in VTE trials it was VTE or VTE‐related death. The authors were able to extract data on 25,031 patients across 10 trials that evaluated rivaroxaban, apixaban, and dabigatran (not edoxaban), with follow‐up data ranging from 35 days to 2 years. For safety outcomes, the 2 arms showed no statistical difference (DOAC: 6.4%; conventional therapy: 6.3%; OR: 1.02; 95% CI: 0.73‐1.43). For efficacy endpoints in VTE studies, DOACs were more effective (3.7% vs 7.0%; OR: 0.45; 95% CI: 0.27‐0.77; NNT=30). For AF, the efficacy analysis also favored DOACs (3.3% vs 4.7%; OR: 0.65; 95% CI: 0.48‐0.87; NNT=71). When analyzed by individual DOAC, rivaroxaban and apixaban both appeared to outperform the VKA/LMWH arm for both VTE and AF treatment, whereas data on dabigatran were available only for AF, also showing an efficacy benefit. Individual DOAC analyses for safety endpoints showed all 3 to be similar to VKA/LMWH.
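Here the reported NNTs can be verified directly from the event rates, since the NNT is the reciprocal of the absolute risk reduction:

\[
\mathrm{NNT}_{\mathrm{VTE}} = \frac{1}{0.070 - 0.037} = \frac{1}{0.033} \approx 30,
\qquad
\mathrm{NNT}_{\mathrm{AF}} = \frac{1}{0.047 - 0.033} = \frac{1}{0.014} \approx 71.
\]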
Cautions
The authors note, however, that coexisting low body weight and renal insufficiency may influence dosing choices in this population. There are specific dosage recommendations in the elderly for some DOACs.
Implications
The use of DOACs in patients aged 75 years and older appears to confer a substantial efficacy advantage in the treatment of VTE and AF. The safety data presented in this meta‐analysis suggest that this class is comparable to VKA/LMWH medications.
CHANGING INPATIENT MANAGEMENT OF SKIN INFECTIONS
Boucher H, Wilcox M, Talbot G, et al. Once‐weekly dalbavancin versus daily conventional therapy for skin infection. N Engl J Med. 2014;370:2169–2179.
Corey G, Kabler H, Mehra P, et al. Single‐dose oritavancin in the treatment of acute bacterial skin infections. N Engl J Med. 2014;370:2180–2190.
Background
There are over 870,000 hospital admissions yearly for skin infection, making it one of the most common reasons for hospitalization in the United States.[18] Management often requires lengthy treatment with intravenous antibiotics, especially since the emergence of methicillin‐resistant Staphylococcus aureus. In 2014, results were published from 2 large randomized, double‐blinded, multicenter clinical trials of new once‐weekly intravenous antibiotics. Dalbavancin and oritavancin are both lipoglycopeptides in the same family as vancomycin. What is unique is that their serum drug concentrations exceed the minimum inhibitory concentrations of target pathogens for over a week. Both drugs were compared to vancomycin in noninferiority trials, and the studies had similar outcomes. The dalbavancin results are presented below.
Findings
Researchers randomized 1312 patients with significant cellulitis, a large abscess, or a wound infection. Patients also had fever, leukocytosis, or bandemia, and the infection had to be deemed severe enough to require a minimum of 3 days of intravenous antibiotics. The patients could not have received any prior antibiotics. Over 80% of the patients had fevers, and more than half met the criteria for systemic inflammatory response syndrome. Patients were randomized to either dalbavancin (on day 1 and day 8) or vancomycin every 12 hours (1 g or 15 mg/kg), with both groups receiving placebo dosing of the other drug. The blinded physicians could decide to switch to an oral agent (placebo in the dalbavancin group, linezolid in the vancomycin group) anytime after day 3, and could stop antibiotics anytime after day 10. Otherwise, all patients received 14 days of antibiotics.
The FDA‐approved outcome was cessation of spread of erythema at 48 to 72 hours plus the absence of fever at 3 independent readings. Results were similar in the dalbavancin group compared to the vancomycin–linezolid group (79.7% vs 79.8%), and dalbavancin was deemed noninferior to vancomycin. The blinded investigators' assessment of treatment success at 2 weeks was also similar (96% vs 96.7%, respectively). More treatment‐related adverse events occurred in the vancomycin–linezolid group (183 vs 139; P=0.02), and more deaths occurred in the vancomycin group (7 vs 1; P=0.03).
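For readers unfamiliar with the noninferiority framework, the question is not whether dalbavancin is better, only whether it is not unacceptably worse: the lower bound of the 95% CI for the difference in response rates must lie above a prespecified margin \(-M\). A minimal sketch with the figures above (the 10‐percentage‐point margin is our assumption for illustration, not a number quoted in this summary):

\[
\Delta = p_{\mathrm{dalbavancin}} - p_{\mathrm{vancomycin}} = 79.7\% - 79.8\% = -0.1\ \mathrm{pp},
\]

and noninferiority is declared if the lower 95% CI bound around \(\Delta\) stays above \(-M = -10\) pp; with response rates this close, the criterion was comfortably met.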
Cautions
These antibiotics have only been shown effective for complicated, acute bacterial skin infections; their performance for other gram‐positive infections is unknown. In the future, it is possible that patients with severe skin infections will receive a dose of these antibiotics on hospital day 1 and be sent home with close follow‐up. However, the study confirming the efficacy and safety of that approach has not yet been done. Though the drugs appear safe, more clinical experience is needed before they become standard of care, especially because of their long half‐lives. Finally, these drugs are very expensive and provide broad‐spectrum gram‐positive coverage; they are not meant for simple cellulitis.
Implications
These 2 new once‐weekly antibiotics, dalbavancin and oritavancin, are noninferior to vancomycin for acute bacterial skin infections. They provide alternative treatment choices for managing patients with significant infections requiring hospitalization. In the future, they may change the need for hospitalization of these patients or significantly reduce their length of stay. Though the drugs are expensive, a significant reduction in hospitalization may offset their cost.
SHOULD THEY STAY OR SHOULD THEY GO? FAMILY PRESENCE DURING CPR MAY IMPROVE THE GRIEF PROCESS DURABLY
Jabre P, Tazarourte K, Azoulay E, et al. Offering the opportunity for family to be present during cardiopulmonary resuscitation: 1‐year assessment. Intensive Care Med. 2014;40:981–987.
Background
In 2013, a French study randomized adult family members of patients undergoing cardiopulmonary resuscitation (CPR) at home either to be invited to stay and watch the resuscitation or to receive no specific invitation.[19] At 90 days, this study revealed that those who were invited to watch (and 79% did) had fewer symptoms of post‐traumatic stress disorder (PTSD) (27% vs 37%) and anxiety (15% vs 23%), though not depression, than did the group not offered the opportunity to watch (though 43% watched anyway). There were 570 subjects (family members) in the trial, of whom a greater number in the control arm declined to participate in the 90‐day follow‐up because of emotional distress. Notably, only 4% of the patients undergoing CPR in this study survived to day 28. Whether the apparent positive psychological impact on families of the offer to watch CPR was durable remained in question.
Findings
The study group followed the families for up to 1 year. At that time, dropout rates were similar (with the assumption, as in the prior study, that those who dropped out of either arm had PTSD symptoms). At follow‐up, subjects were again assessed for PTSD, anxiety, and depression symptoms, as well as for meeting criteria for having had a major depressive episode or complicated grief. Four hundred eight of the original 570 subjects were able to undergo reevaluation. The 1‐year results showed that the group offered the chance to watch CPR had fewer PTSD symptoms (20% vs 32%) and depression symptoms (10% vs 16%), as well as fewer major depressive episodes (23% vs 31%) and less complicated grief (21% vs 36%), but without a durable impact on anxiety symptoms.
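Translated into absolute terms (a crude, unadjusted calculation from the reported proportions, treating the offer itself as the exposure), these differences are sizable:

\[
\mathrm{NNT}_{\mathrm{PTSD}} = \frac{1}{0.32 - 0.20} \approx 8,
\qquad
\mathrm{NNT}_{\mathrm{complicated\ grief}} = \frac{1}{0.36 - 0.21} \approx 7,
\]

that is, roughly 1 fewer family member with PTSD symptoms for every 8 offered the chance to be present, and 1 fewer case of complicated grief for every 7.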
Cautions
The resuscitation efforts in question here occurred out of hospital (in the home). Part of the protocol for those family members observing CPR was that a clinician was assigned to stay with them and explain the resuscitation process as it occurred.
Implications
It is postulated that having the chance to observe CPR, if desired, may help the grieving process. This study clearly raises a question about the wisdom of routinely escorting patients' families out of the room during resuscitative efforts. It seems likely that the durable and important psychological effects observed in this study would similarly persist in emergency department and inpatient settings, where staff can be with patients' families to talk them through the events they are witnessing. It is time to ask families whether they prefer to stay and watch CPR, and not to automatically move them to a waiting room.
Disclosure: Nothing to report.
REFERENCES
1. Journals in the 2014 release of the JCR. Available at: http://scientific.thomsonreuters.com/imgblast/JCRFullCovlist-2014.pdf. Accessed August 28, 2015.
2. Neprilysin inhibition—a novel therapy for heart failure. N Engl J Med. 2014;371(11):1062–1064.
3. Stopping randomized trials early for benefit and estimation of treatment effects: systematic review and meta‐regression analysis. JAMA. 2010;303(12):1180–1187.
4. Intravenous contrast medium‐induced nephrotoxicity: is the medical risk really as great as we have come to believe? Radiology. 2010;256(1):21–28.
5. Pathophysiology of contrast medium‐induced nephropathy. Kidney Int. 2005;68(1):14–22.
6. Contrast‐induced acute kidney injury: short‐ and long‐term implications. Semin Nephrol. 2011;31(3):300–309.
7. Frequency of acute kidney injury following intravenous contrast medium administration: a systematic review and meta‐analysis. Radiology. 2013;267(1):119–128.
8. Melatonin decreases delirium in elderly patients: a randomized, placebo‐controlled trial. Int J Geriatr Psychiatry. 2011;26(7):687–694.
9. Lactulose in the treatment of chronic portal‐systemic encephalopathy. A double‐blind clinical trial. N Engl J Med. 1969;281(8):408–412.
10. Performance of the hepatic encephalopathy scoring algorithm in a clinical trial of patients with cirrhosis and severe hepatic encephalopathy. Am J Gastroenterol. 2009;104(6):1392–1400.
11. The changing role of beta‐blocker therapy in patients with cirrhosis. J Hepatol. 2014;60(3):643–653.
12. The window hypothesis: haemodynamic and non‐haemodynamic effects of beta‐blockers improve survival of patients with cirrhosis during a window in the disease. Gut. 2012;61(7):967–969.
13. When should the beta‐blocker window in cirrhosis close? Gastroenterology. 2014;146(7):1597–1599.
14. Venous thromboembolism prophylaxis in hospitalized medical patients and those with stroke: a background review for an American College of Physicians Clinical Practice Guideline. Ann Intern Med. 2011;155(9):602–615.
15. LIFENOX Investigators. Low‐molecular‐weight heparin and mortality in acutely ill medical patients. N Engl J Med. 2011;365(26):2463–2472.
16. Randomized Comparison of Low‐Molecular‐Weight Heparin versus Oral Anticoagulant Therapy for the Prevention of Recurrent Venous Thromboembolism in Patients with Cancer (CLOT) Investigators. Low‐molecular‐weight heparin versus a coumarin for the prevention of recurrent venous thromboembolism in patients with cancer. N Engl J Med. 2003;349(2):146–153.
17. ENGAGE AF‐TIMI 48 Investigators. Edoxaban versus warfarin in patients with atrial fibrillation. N Engl J Med. 2013;369(22):2093–2104.
18. Pharmacology and the treatment of complicated skin and skin‐structure infections. N Engl J Med. 2014;370(23):2238–2239.
19. Family presence during cardiopulmonary resuscitation. N Engl J Med. 2013;368(11):1008–1018.
Secular Trends in AB Resistance
Among hospitalized patients with serious infections, the choice of empiric therapy plays a key role in outcomes.[1, 2, 3, 4, 5, 6, 7, 8, 9] Rising rates and variable patterns of antimicrobial resistance, however, complicate the selection of appropriate empiric therapy. Amidst this shifting landscape of antimicrobial resistance, gram-negative bacteria, and specifically Acinetobacter baumannii (AB), remain a considerable challenge.[10] On the one hand, AB is a less frequent cause of serious infections than organisms like Pseudomonas aeruginosa or Enterobacteriaceae in severely ill hospitalized patients.[11, 12] On the other, AB has evolved a variety of resistance mechanisms and exhibits unpredictable susceptibility patterns.[13] These factors combine to increase the likelihood of administering inappropriate empiric therapy for an infection caused by AB and thereby raise the risk of death.[14] Because clinicians may not routinely consider AB as the potential culprit pathogen in the patient they are treating, and because the organism is highly resistant in vitro, routine gram-negative coverage may frequently be inadequate for AB infections.
To address the poor outcomes related to inappropriate empiric therapy in the setting of AB, one requires an appreciation of the longitudinal changes and geographic differences in the susceptibility of this pathogen. Thus, we aimed to examine secular trends in the resistance of AB to antimicrobial agents whose effectiveness against this microorganism was well supported in the literature during the study timeframe.[15]
METHODS
To determine the prevalence of predefined resistance patterns among AB in respiratory and bloodstream infection (BSI) specimens, we examined The Surveillance Network (TSN) database from Eurofins. We explored data collected between 2003 and 2012. The database has been used extensively for surveillance purposes since 1994 and has previously been described in detail.[16, 17, 18, 19, 20] Briefly, TSN is a warehouse of routine clinical microbiology data collected from a nationally representative sample of microbiology laboratories in 217 hospitals in the United States. To minimize selection bias, laboratories are included based on their geography and the demographics of the populations they serve.[18] Only clinically significant samples are reported. No personal identifying information for source patients is available in this database. Only source laboratories that perform antimicrobial susceptibility testing according to standard Food and Drug Administration-approved testing methods, and that interpret susceptibility in accordance with Clinical and Laboratory Standards Institute breakpoints, are included.[21] (See Supporting Table 4 in the online version of this article for minimum inhibitory concentration [MIC] breakpoint changes over the course of the study; current colistin and polymyxin breakpoints were applied retrospectively.) All enrolled laboratories undergo a pre-enrollment site visit. Logical filters are used for routine quality control to detect unusual susceptibility profiles and to ensure appropriate testing methods. Repeat testing and reporting are done as necessary.[18]
Laboratory samples are reported as susceptible, intermediate, or resistant. We grouped isolates with intermediate MICs together with the resistant ones for the purposes of the current analysis. Duplicate isolates were excluded. Only samples representing 1 of the 2 infections of interest, respiratory or BSI, were included.
We examined 3 time periods (2003 to 2005, 2006 to 2008, and 2009 to 2012) for the prevalence of AB's resistance to the following antibiotics: carbapenems (imipenem, meropenem, doripenem), aminoglycosides (tobramycin, amikacin), tetracyclines (minocycline, doxycycline), polymyxins (colistin, polymyxin B), ampicillin-sulbactam, and trimethoprim-sulfamethoxazole. Antimicrobial resistance was defined by a designation of intermediate or resistant in the susceptibility category. Resistance to a class of antibiotics was defined as resistance to all drugs within the class for which testing was available. An organism was multidrug resistant (MDR) if it was resistant to at least 1 antimicrobial in at least 3 of the drug classes examined.[22] Resistance to a combination of 2 drugs was present if the specimen was resistant to both drugs in the combination for which testing was available. We examined the data by infection type, time period, the 9 US Census divisions, and location of origin of the sample.
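Because these definitions drive every rate reported below, it may help to see them as executable logic. The following is a minimal sketch, assuming a per-isolate dictionary of susceptibility calls; the record format, function names, and example isolate are illustrative assumptions, not the study's actual pipeline.

```python
# Minimal sketch of the resistance definitions above (record format assumed).
DRUG_CLASSES = {
    "carbapenem": ["imipenem", "meropenem", "doripenem"],
    "aminoglycoside": ["tobramycin", "amikacin"],
    "tetracycline": ["minocycline", "doxycycline"],
    "polymyxin": ["colistin", "polymyxin B"],
    "ampicillin/sulbactam": ["ampicillin/sulbactam"],
    "trimethoprim/sulfamethoxazole": ["trimethoprim/sulfamethoxazole"],
}

def is_resistant(call):
    # Intermediate isolates are grouped with resistant ones, as in the analysis.
    return call in ("intermediate", "resistant")

def class_resistant(isolate, drugs):
    # Class resistance = resistant to every drug in the class that was tested;
    # returns None if no drug in the class was tested.
    tested = [isolate[d] for d in drugs if d in isolate]
    return all(is_resistant(c) for c in tested) if tested else None

def is_mdr(isolate):
    # MDR = resistant to at least 1 agent in at least 3 of the classes examined.
    hits = sum(
        any(is_resistant(isolate[d]) for d in drugs if d in isolate)
        for drugs in DRUG_CLASSES.values()
    )
    return hits >= 3

def combo_resistant(isolate, drug_a, drug_b):
    # Combination resistance = resistant to both drugs, where tested
    # (assumes at least one of the two drugs was tested).
    return all(is_resistant(isolate[d]) for d in (drug_a, drug_b) if d in isolate)

# Hypothetical isolate tested against five agents:
isolate = {
    "imipenem": "resistant",
    "tobramycin": "intermediate",
    "minocycline": "susceptible",
    "colistin": "susceptible",
    "trimethoprim/sulfamethoxazole": "resistant",
}
print(is_mdr(isolate))                                       # True: 3 classes hit
print(class_resistant(isolate, DRUG_CLASSES["carbapenem"]))  # True (only imipenem tested)
print(combo_resistant(isolate, "imipenem", "tobramycin"))    # True
```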
All categorical variables are reported as percentages. Continuous variables are reported as means ± standard deviations and/or medians with the interquartile range (IQR). We did not pursue hypothesis testing due to a high risk of type I error in this large dataset. Therefore, only clinically important trends are highlighted.
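One way to see the authors' concern: with tens of thousands of isolates, differences far too small to matter clinically still produce small P values, so formal testing adds little. A brief illustration (the proportions and group sizes are invented for the example, not drawn from the study data):

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """Two-proportion z statistic with a pooled standard error."""
    p = (p1 * n1 + p2 * n2) / (n1 + n2)  # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# A clinically trivial 1.5-point gap between two groups of 15,000 isolates:
z = two_prop_z(0.360, 15_000, 0.345, 15_000)
print(f"z = {z:.2f}")  # ~2.72, i.e., two-sided P ~0.007 despite a tiny difference
```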
RESULTS
Among the 39,320 AB specimens, 81.1% were derived from a respiratory source and 18.9% represented BSI. Demographics of source patients are listed in Table 1. Notably, the median age of those with respiratory infection (58 years; IQR 38, 73) was higher than that of patients with BSI (54.5 years; IQR 36, 71), and there were proportionally fewer females among respiratory patients (39.9%) than among those with BSI (46.0%). Though only 24.3% of all BSI samples originated from the intensive care unit (ICU), 40.5% of respiratory specimens came from that location. The plurality of all specimens was collected in the 2003 to 2005 time interval (41.3%), followed by 2006 to 2008 (34.7%), with a minority coming from 2009 to 2012 (24.0%). The proportions of specimens from respiratory and BSI sources were similar in all time periods examined (Table 1). Geographically, the South Atlantic division contributed the most samples (24.1%) and East South Central the fewest (2.6%) (Figure 1). The vast majority of all samples came from hospital wards (78.6%), of which roughly one-half (37.5% of all samples) originated in the ICU. Fewer came from outpatient sources (18.3%), and a small minority (2.5%) from nursing homes.

| | Pneumonia | BSI | All |
| --- | --- | --- | --- |
| Total, N (%) | 31,868 (81.1) | 7,452 (18.9) | 39,320 |
| Age, y | | | |
| Mean (SD) | 57.7 (37.4) | 57.6 (40.6) | 57.7 (38.0) |
| Median (IQR 25, 75) | 58 (38, 73) | 54.5 (36, 71) | 57 (37, 73) |
| Gender, female, n (%) | 12,725 (39.9) | 3,425 (46.0) | 16,150 (41.1) |
| ICU, n (%) | 12,919 (40.5) | 1,809 (24.3) | 14,728 (37.5) |
| Time period, n (% of total) | | | |
| 2003–2005 | 12,910 (40.5) | 3,340 (44.8) | 16,250 (41.3) |
| 2006–2008 | 11,205 (35.2) | 2,435 (32.7) | 13,640 (34.7) |
| 2009–2012 | 7,753 (24.3) | 1,677 (22.5) | 9,430 (24.0) |
Figure 2 depicts overall resistance patterns by individual drugs, drug classes, and frequently used combinations of agents. Although doripenem had the highest rate of resistance numerically (90.3%), its susceptibility was tested only in a small minority of specimens (n=31, 0.08%). Resistance to trimethoprim‐sulfamethoxazole was high (55.3%) based on a large number of samples tested (n=33,031). Conversely, colistin as an agent and polymyxins as a class exhibited the highest susceptibility rates of over 90%, though the numbers of samples tested for susceptibility to these drugs were also small (colistin n=2,086, 5.3%; polymyxins n=3,120, 7.9%) (Figure 2). Among commonly used drug combinations, carbapenem+aminoglycoside (18.0%) had the lowest resistance rates, and nearly 30% of all AB specimens tested met the criteria for MDR.

Over time, resistance to carbapenems more than doubled, from 21.0% in 2003 to 2005 to 47.9% in 2009 to 2012 (Table 2). Although relatively few samples were tested for colistin susceptibility (n=2,086, 5.3%), resistance to this drug also more than doubled, from 2.8% (95% confidence interval: 1.9-4.2) in 2006 to 2008 to 6.9% (95% confidence interval: 5.7-8.2) in 2009 to 2012. As a class, however, polymyxins exhibited stable resistance rates over the time frame of the study (Table 2). The prevalence of MDR AB rose from 21.4% in 2003 to 2005 to 33.7% in 2006 to 2008, and remained stable at 35.2% in 2009 to 2012. Resistance to even such broad combinations as carbapenem+ampicillin/sulbactam nearly tripled, from 13.2% in 2003 to 2005 to 35.5% in 2009 to 2012. Notably, although resistance rates to all other agents rose or remained stable between 2003 and 2012, resistance to minocycline diminished from 56.5% in 2003 to 2005 to 36.6% in 2006 to 2008 and 30.5% in 2009 to 2012. (See Supporting Table 1 in the online version of this article for time trends stratified by respiratory vs BSI specimens; trends were directionally similar in both.)
| Drug/Combination | 2003–2005 N | 2003–2005 % (95% CI) | 2006–2008 N | 2006–2008 % (95% CI) | 2009–2012 N | 2009–2012 % (95% CI) |
| --- | --- | --- | --- | --- | --- | --- |
| Amikacin | 12,949 | 25.2 (24.5–26.0) | 10,929 | 35.2 (34.3–36.1) | 6,292 | 45.7 (44.4–46.9) |
| Tobramycin | 14,549 | 37.1 (36.3–37.9) | 11,877 | 41.9 (41.0–42.8) | 7,901 | 39.2 (38.1–40.3) |
| Aminoglycoside | 14,505 | 22.5 (21.8–23.2) | 11,967 | 30.6 (29.8–31.4) | 7,736 | 34.8 (33.8–35.8) |
| Doxycycline | 173 | 36.4 (29.6–43.8) | 38 | 29.0 (17.0–44.8) | 32 | 34.4 (20.4–51.7) |
| Minocycline | 1,388 | 56.5 (53.9–59.1) | 902 | 36.6 (33.5–39.8) | 522 | 30.5 (26.7–34.5) |
| Tetracycline | 1,511 | 55.4 (52.9–57.9) | 940 | 36.3 (33.3–39.4) | 546 | 30.8 (27.0–34.8) |
| Doripenem | NR | NR | 9 | 77.8 (45.3–93.7) | 22 | 95.5 (78.2–99.2) |
| Imipenem | 14,728 | 21.8 (21.2–22.5) | 12,094 | 40.3 (39.4–41.2) | 6,681 | 51.7 (50.5–52.9) |
| Meropenem | 7,226 | 37.0 (35.9–38.1) | 5,628 | 48.7 (47.3–50.0) | 4,919 | 47.3 (45.9–48.7) |
| Carbapenem | 15,490 | 21.0 (20.4–21.7) | 12,975 | 38.8 (38.0–39.7) | 8,778 | 47.9 (46.9–49.0) |
| Ampicillin/sulbactam | 10,525 | 35.2 (34.3–36.2) | 9,413 | 44.9 (43.9–45.9) | 6,460 | 41.2 (40.0–42.4) |
| Colistin | NR | NR | 783 | 2.8 (1.9–4.2) | 1,303 | 6.9 (5.7–8.2) |
| Polymyxin B | 105 | 7.6 (3.9–14.3) | 796 | 12.8 (10.7–15.3) | 321 | 6.5 (4.3–9.6) |
| Polymyxin | 105 | 7.6 (3.9–14.3) | 1,563 | 7.9 (6.6–9.3) | 1,452 | 6.8 (5.6–8.2) |
| Trimethoprim/sulfamethoxazole | 13,640 | 52.5 (51.7–53.3) | 11,535 | 57.1 (56.2–58.0) | 7,856 | 57.6 (56.5–58.7) |
| MDR | 16,249 | 21.4 (20.7–22.0) | 13,640 | 33.7 (33.0–34.5) | 9,431 | 35.2 (34.2–36.2) |
| Carbapenem+aminoglycoside | 14,601 | 8.9 (8.5–9.4) | 12,333 | 21.3 (20.6–22.0) | 8,256 | 29.3 (28.3–30.3) |
| Aminoglycoside+ampicillin/sulbactam | 10,107 | 12.9 (12.3–13.6) | 9,077 | 24.9 (24.0–25.8) | 6,200 | 24.3 (23.2–25.3) |
| Aminoglycoside+minocycline | 1,359 | 35.6 (33.1–38.2) | 856 | 21.4 (18.8–24.2) | 503 | 24.5 (20.9–28.4) |
| Carbapenem+ampicillin/sulbactam | 10,228 | 13.2 (12.5–13.9) | 9,145 | 29.4 (28.4–30.3) | 6,143 | 35.5 (34.3–36.7) |

NR = not reported; N = number of isolates tested; MDR = multidrug resistant; CI = confidence interval.
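The article does not state how the confidence intervals in Table 2 were computed, but a standard Wilson score interval for a binomial proportion reproduces the published bounds to within rounding of the reported percentages. A quick check against the amikacin row for 2003 to 2005 (the resistant count is back-calculated from the published percentage, since the raw count is not reported):

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    margin = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - margin, center + margin

# Amikacin, 2003-2005 (Table 2): 25.2% of 12,949 isolates resistant.
n = 12_949
resistant = round(0.252 * n)  # back-calculated; raw count not published
lo, hi = wilson_ci(resistant, n)
print(f"{100 * lo:.1f}-{100 * hi:.1f}")  # 24.5-26.0, matching the table
```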
Regionally, examining resistance by classes and combinations of antibiotics, trimethoprim-sulfamethoxazole consistently exhibited the highest rates of resistance, ranging from the lowest in New England (28.8%) to the highest in the East North Central division (69.9%) (see Supporting Table 2 in the online version of this article). The rates of resistance to tetracyclines ranged from 0.0% in New England to 52.6% in the Mountain division, and to polymyxins from 0.0% in the East South Central division to 23.4% in New England. Generally, New England had the lowest rates of resistance (from 0.0% to tetracyclines to 28.8% to trimethoprim-sulfamethoxazole) and the Mountain division the highest (from 0.9% to polymyxins to 52.6% to tetracyclines). The rates of MDR AB ranged from 8.0% in New England to 50.4% in the Mountain division (see Supporting Table 2 in the online version of this article).
Examining resistances to drug classes and combinations by the location of the source specimen revealed that trimethoprim‐sulfamethoxazole once again exhibited the highest rate of resistance across all locations (see Supporting Table 3 in the online version of this article). Despite their modest contribution to the overall sample pool (n=967, 2.5%), organisms from nursing home subjects had the highest prevalence of resistance to aminoglycosides (36.3%), tetracyclines (57.1%), and carbapenems (47.1%). This pattern held true for combination regimens examined. Nursing homes also vastly surpassed other locations in the rates of MDR AB (46.5%). Interestingly, the rates of MDR did not differ substantially among regular inpatient wards (29.2%), the ICU (28.7%), and outpatient locations (26.2%) (see Supporting Table 3 in the online version of this article).
DISCUSSION
In this large multicenter survey, we have documented rising rates of AB resistance to clinically important antimicrobials in the United States. On the whole, all antimicrobials except minocycline exhibited either large or small increases in resistance. Alarmingly, even colistin, a true last-resort treatment for AB, lost a considerable amount of activity, with the resistance rate rising from 2.8% in 2006 to 2008 to 6.9% in 2009 to 2012. The single encouraging trend we observed was that resistance to minocycline appeared to diminish substantially, from over one-half of all AB tested in 2003 to 2005 to just under one-third in 2009 to 2012.
Although we did note a rise in the MDR AB, our data suggest a lower percentage of all AB that meets the MDR phenotype criteria compared to reports by other groups. For example, the Center for Disease Dynamics and Economic Policy (CDDEP), analyzing the same data as our study, reports a rise in MDR AB from 32.1% in 1999 to 51.0% in 2010.[23] This discrepancy is easily explained by the fact that we included polymyxins, tetracyclines, and trimethoprim‐sulfamethoxazole in our evaluation, whereas the CDDEP did not examine these agents. Furthermore, we omitted fluoroquinolones, a drug class with high rates of resistance, from our study, because we were interested in focusing only on antimicrobials with clinical data in AB infections.[22] In addition, we limited our evaluation to specimens derived from respiratory or BSI sources, whereas the CDDEP data reflect any AB isolate present in TSN.
We additionally confirm that there is substantial geographic variation in resistance patterns. Thus, despite different definitions, our data agree with those from the CDDEP that the MDR prevalence is highest in the Mountain and East North Central divisions, and lowest in New England overall.[23] The wide variations underscore the fact that it is not valid to speak of national rates of resistance, but rather it is important to concentrate on the local patterns. This information, though important from the macroepidemiologic standpoint, is likely still not granular enough to help clinicians make empiric treatment decisions. In fact, what is needed for that is real‐time antibiogram data specific to each center and even each unit within each center.
The latter point is further illustrated by our analysis of the locations of origin of the specimens. Contrary to the common presumption that the ICU harbors the highest rate of resistant organisms, specimens derived from nursing homes carried perhaps the most resistant organisms. In other words, the nursing home is the setting most likely to harbor patients with respiratory infections and BSIs caused by resistant AB. These data are in agreement with several other recent investigations. In a period-prevalence survey conducted in Maryland in 2009 by Thom and colleagues, long-term care facilities had the highest prevalence of any AB, as well as of isolates resistant to imipenem, MDR isolates, and extensively drug-resistant organisms.[24] Mortensen and coworkers confirmed the high prevalence of AB and AB resistance in long-term care facilities, and extended this finding to suggest evidence of intra- and interhospital spread of these pathogens.[25] Our data confirm this concerning finding at the national level and point to a potential target for infection prevention efforts.
An additional finding of some concern is that, among specimens whose location of origin was reported in the database, the highest proportion of colistin resistance occurred in the outpatient setting (6.6%, compared to 5.4% among ICU specimens, for example). Although these infections would likely meet the definition for healthcare-associated infection, AB as a community-acquired respiratory pathogen is not unprecedented either in the United States or abroad.[26, 27, 28, 29, 30] It is reassuring, however, that most other antimicrobials examined in our study exhibited higher rates of susceptibility in specimens derived from outpatient settings than in those from the hospital or the nursing home.
Our study has a number of strengths. As a large multicenter survey, it is representative of AB susceptibility patterns across the United States, which makes it highly generalizable. We focused on antibiotics for which clinical evidence is available, adding a practical dimension to the results. Another pragmatic consideration is the examination of the data by geographic distribution, allowing an additional layer of granularity for clinical decisions. At the same time, the study has some limitations. The TSN database consists of microbiology samples from hospital laboratories. Although we attempted to reduce the risk of duplication, repeat sampling remains a possibility because of how samples are numbered in the database. Despite our stratification by geography and location of origin of the specimen, the data are likely still not granular enough for the local risk-stratification decisions clinicians make daily about the choice of empiric therapy. Some of the MIC breakpoints changed over the period of the study (see Supporting Table 4 in the online version of this article). Because these changes occurred in the last year of data collection (2012), they should have had only a minimal impact, if any, on the observed rates of resistance. Additionally, because resistance rates evolve rapidly, more current data are required for effective clinical decision making.
In summary, we have demonstrated that the last decade has seen an alarming increase in the rates of AB resistance to multiple clinically important antimicrobial agents and classes. We have further emphasized the importance of granularity in susceptibility data for helping clinicians make sensible decisions about empiric therapy in hospitalized patients with serious infections. Finally, and potentially most disturbingly, the nursing home appears to be a robust reservoir for the spread of resistant AB. All of these observations highlight the urgent need to develop novel antibiotics and nontraditional agents, such as antibodies and vaccines, to combat AB infections, and they carry important infection prevention implications if we are to contain the looming threat of the end of antibiotics.[31]
Disclosure
This study was funded by a grant from Tetraphase Pharmaceuticals, Watertown, MA.
- National Nosocomial Infections Surveillance (NNIS) System Report. Am J Infect Control. 2004;32:470–485.
- National surveillance of antimicrobial resistance in Pseudomonas aeruginosa isolates obtained from intensive care unit patients from 1993 to 2002. Antimicrob Agents Chemother. 2004;48:4606–4610.
- Health care-associated pneumonia and community-acquired pneumonia: a single-center experience. Antimicrob Agents Chemother. 2007;51:3568–3573.
- Clinical importance of delays in the initiation of appropriate antibiotic treatment for ventilator-associated pneumonia. Chest. 2002;122:262–268.
- ICU-Acquired Pneumonia Study Group. Modification of empiric antibiotic treatment in patients with pneumonia acquired in the intensive care unit. Intensive Care Med. 1996;22:387–394.
- Antimicrobial therapy escalation and hospital mortality among patients with HCAP: a single center experience. Chest. 2008;134:963–968.
- Surviving Sepsis Campaign: international guidelines for management of severe sepsis and septic shock: 2008. Crit Care Med. 2008;36:296–327.
- Inappropriate antibiotic therapy in Gram-negative sepsis increases hospital length of stay. Crit Care Med. 2011;39:46–51.
- Inadequate antimicrobial treatment of infections: a risk factor for hospital mortality among critically ill patients. Chest. 1999;115:462–474.
- Centers for Disease Control and Prevention. Antibiotic resistance threats in the United States, 2013. Available at: http://www.cdc.gov/drugresistance/threat-report-2013/pdf/ar-threats-2013-508.pdf#page=59. Accessed December 29, 2014.
- National Healthcare Safety Network (NHSN) Team and Participating NHSN Facilities. Antimicrobial-resistant pathogens associated with healthcare-associated infections: summary of data reported to the National Healthcare Safety Network at the Centers for Disease Control and Prevention, 2009–2010. Infect Control Hosp Epidemiol. 2013;34:1–14.
- Multi-drug resistance, inappropriate initial antibiotic therapy and mortality in Gram-negative severe sepsis and septic shock: a retrospective cohort study. Crit Care. 2014;18(6):596.
- Global challenge of multidrug-resistant Acinetobacter baumannii. Antimicrob Agents Chemother. 2007;51:3471–3484.
- Predictors of hospital mortality among septic ICU patients with Acinetobacter spp. bacteremia: a cohort study. BMC Infect Dis. 2014;14:572.
- Treatment of Acinetobacter infections. Clin Infect Dis. 2010;51:79–84.
- Increasing resistance of Acinetobacter species to imipenem in United States hospitals, 1999–2006. Infect Control Hosp Epidemiol. 2010;31:196–197.
- Trends in resistance to carbapenems and third-generation cephalosporins among clinical isolates of Klebsiella pneumoniae in the United States, 1999–2010. Infect Control Hosp Epidemiol. 2013;34:259–268.
- Antimicrobial resistance in key bloodstream bacterial isolates: electronic surveillance with the Surveillance Network Database—USA. Clin Infect Dis. 1999;29:259–263.
- Community-associated methicillin-resistant Staphylococcus aureus in outpatients, United States, 1999–2006. Emerg Infect Dis. 2009;15:1925–1930.
- Prevalence of antimicrobial resistance in bacteria isolated from central nervous system specimens as reported by U.S. hospital laboratories from 2000 to 2002. Ann Clin Microbiol Antimicrob. 2004;3:3.
- Performance standards for antimicrobial susceptibility testing: twenty-second informational supplement. CLSI document M100-S22. Wayne, PA: Clinical and Laboratory Standards Institute; 2012.
- Multidrug-resistant, extensively drug-resistant and pandrug-resistant bacteria: an international expert proposal for interim standard definitions for acquired resistance. Clin Microbiol Infect. 2012;18:268–281.
- CDDEP: The Center for Disease Dynamics, Economics and Policy. Resistance map: Acinetobacter baumannii overview. Available at: http://www.cddep.org/projects/resistance_map/acinetobacter_baumannii_overview. Accessed January 16, 2015.
- Maryland MDRO Prevention Collaborative. Assessing the burden of Acinetobacter baumannii in Maryland: a statewide cross-sectional period prevalence survey. Infect Control Hosp Epidemiol. 2012;33:883–888.
- Multidrug-resistant Acinetobacter baumannii infection, colonization, and transmission related to a long-term care facility providing subacute care. Infect Control Hosp Epidemiol. 2014;35:406–411.
- Severe community-acquired pneumonia due to Acinetobacter baumannii. Chest. 2001;120:1072–1077.
- Fulminant community-acquired Acinetobacter baumannii pneumonia as a distinct clinical syndrome. Chest. 2006;129:102–109.
- Community-acquired Acinetobacter baumannii pneumonia. Rev Clin Esp. 2003;203:284–286.
- Antimicrobial drug-resistant microbes associated with hospitalized community-acquired and healthcare-associated pneumonia: a multi-center study in Taiwan. J Formos Med Assoc. 2013;112:31–40.
- Antimicrobial resistance in Hispanic patients hospitalized in San Antonio, TX with community-acquired pneumonia. Hosp Pract (1995). 2010;38:108–113.
- Centers for Disease Control and Prevention. CDC director blog. The end of antibiotics. Can we come back from the brink? Available at: http://blogs.cdc.gov/cdcdirector/2014/05/05/the-end-of-antibiotics-can-we-come-back-from-the-brink/. Published May 5, 2014. Accessed January 16, 2015.
Among hospitalized patients with serious infections, the choice of empiric therapy plays a key role in outcomes.[1, 2, 3, 4, 5, 6, 7, 8, 9] Rising rates and variable patterns of antimicrobial resistance, however, complicate selecting appropriate empiric therapy. Amidst this shifting landscape of resistance to antimicrobials, gram‐negative bacteria and specifically Acinetobacter baumannii (AB), remain a considerable challenge.[10] On the one hand, AB is a less‐frequent cause of serious infections than organisms like Pseudomonas aeruginosa or Enterobacteriaceae in severely ill hospitalized patients.[11, 12] On the other, AB has evolved a variety of resistance mechanisms and exhibits unpredictable susceptibility patterns.[13] These factors combine to increase the likelihood of administering inappropriate empiric therapy when faced with an infection caused by AB and, thereby, raising the risk of death.[14] The fact that clinicians may not routinely consider AB as the potential culprit pathogen in the patient they are treating along with this organism's highly in vitro resistant nature, may result in routine gram‐negative coverage being frequently inadequate for AB infections.
To address the poor outcomes related to inappropriate empiric therapy in the setting of AB, one requires an appreciation of the longitudinal changes and geographic differences in the susceptibility of this pathogen. Thus, we aimed to examine secular trends in the resistance of AB to antimicrobial agents whose effectiveness against this microorganism was well supported in the literature during the study timeframe.[15]
METHODS
To determine the prevalence of predefined resistance patterns among AB in respiratory and blood stream infection (BSI) specimens, we examined The Surveillance Network (TSN) database from Eurofins. We explored data collected between years 2003 and 2012. The database has been used extensively for surveillance purposes since 1994, and has previously been described in detail.[16, 17, 18, 19, 20] Briefly, TSN is a warehouse of routine clinical microbiology data collected from a nationally representative sample of microbiology laboratories in 217 hospitals in the United States. To minimize selection bias, laboratories are included based on their geography and the demographics of the populations they serve.[18] Only clinically significant samples are reported. No personal identifying information for source patients is available in this database. Only source laboratories that perform antimicrobial susceptibility testing according standard Food and Drug Administrationapproved testing methods and that interpret susceptibility in accordance with the Clinical Laboratory Standards Institute breakpoints are included.[21] (See Supporting Table 4 in the online version of this article for minimum inhibitory concentration (MIC) changes over the course of the studycurrent colistin and polymyxin breakpoints applied retrospectively). All enrolled laboratories undergo a pre‐enrollment site visit. Logical filters are used for routine quality control to detect unusual susceptibility profiles and to ensure appropriate testing methods. Repeat testing and reporting are done as necessary.[18]
Laboratory samples are reported as susceptible, intermediate, or resistant. We grouped isolates with intermediate MICs together with the resistant ones for the purposes of the current analysis. Duplicate isolates were excluded. Only samples representing 1 of the 2 infections of interest, respiratory or BSI, were included.
We examined 3 time periods2003 to 2005, 2006 to 2008, and 2009 to 2012for the prevalence of AB's resistance to the following antibiotics: carbapenems (imipenem, meropenem, doripenem), aminoglycosides (tobramycin, amikacin), tetracyclines (minocycline, doxycycline), polymyxins (colistin, polymyxin B), ampicillin‐sulbactam, and trimethoprim‐sulfamethoxazole. Antimicrobial resistance was defined by the designation of intermediate or resistant in the susceptibility category. Resistance to a class of antibiotics was defined as resistance to all drugs within the class for which testing was available. The organism was multidrug resistant (MDR) if it was resistant to at least 1 antimicrobial in at least 3 drug classes examined.[22] Resistance to a combination of 2 drugs was present if the specimen was resistant to both of the drugs in the combination for which testing was available. We examined the data by infection type, time period, the 9 US Census divisions, and location of origin of the sample.
All categorical variables are reported as percentages. Continuous variables are reported as meansstandard deviations and/or medians with the interquartile range (IQR). We did not pursue hypothesis testing due to a high risk of type I error in this large dataset. Therefore, only clinically important trends are highlighted.
RESULTS
Among the 39,320 AB specimens, 81.1% were derived from a respiratory source and 18.9% represented BSI. Demographics of source patients are listed in Table 1. Notably, the median age of those with respiratory infection (58 years; IQR 38, 73) was higher than among patients with BSI (54.5 years; IQR 36, 71), and there were proportionally fewer females among respiratory patients (39.9%) than those with BSI (46.0%). Though only 24.3% of all BSI samples originated from the intensive are unit (ICU), 40.5% of respiratory specimens came from that location. The plurality of all specimens was collected in the 2003 to 2005 time interval (41.3%), followed by 2006 to 2008 (34.7%), with a minority coming from years 2009 to 2012 (24.0%). The proportions of collected specimens from respiratory and BSI sources were similar in all time periods examined (Table 1). Geographically, the South Atlantic division contributed the most samples (24.1%) and East South Central the fewest (2.6%) (Figure 1). The vast majority of all samples came from hospital wards (78.6%), where roughly one‐half originated in the ICU (37.5%). Fewer still came from outpatient sources (18.3%), and a small minority (2.5%) from nursing homes.

Pneumonia | BSI | All | |
---|---|---|---|
| |||
Total, N (%) | 31,868 (81.1) | 7,452 (18.9) | 39,320 |
Age, y | |||
Mean (SD) | 57.7 (37.4) | 57.6 (40.6) | 57.7 (38.0) |
Median (IQR 25, 75) | 58 (38, 73) | 54.5 (36, 71) | 57 (37, 73) |
Gender, female (%) | 12,725 (39.9) | 3,425 (46.0) | 16,150 (41.1) |
ICU (%) | 12,9191 (40.5) | 1,809 (24.3) | 14,7284 (37.5) |
Time period, % total | |||
20032005 | 12,910 (40.5) | 3,340 (44.8) | 16,250 (41.3) |
20062008 | 11,205 (35.2) | 2,435 (32.7) | 13,640 (34.7) |
20092012 | 7,753 (24.3) | 1,677 (22.5) | 9,430 (24.0) |
Figure 2 depicts overall resistance patterns by individual drugs, drug classes, and frequently used combinations of agents. Although doripenem had the highest rate of resistance numerically (90.3%), its susceptibility was tested only in a small minority of specimens (n=31, 0.08%). Resistance to trimethoprim‐sulfamethoxazole was high (55.3%) based on a large number of samples tested (n=33,031). Conversely, colistin as an agent and polymyxins as a class exhibited the highest susceptibility rates of over 90%, though the numbers of samples tested for susceptibility to these drugs were also small (colistin n=2,086, 5.3%; polymyxins n=3,120, 7.9%) (Figure 2). Among commonly used drug combinations, carbapenem+aminoglycoside (18.0%) had the lowest resistance rates, and nearly 30% of all AB specimens tested met the criteria for MDR.

Over time, resistance to carbapenems more‐than doubled, from 21.0% in 2003 to 2005 to 47.9% in 2009 to 2012 (Table 2). Although relatively few samples were tested for colistin susceptibility (n=2,086, 5.3%), resistance to this drug also more than doubled from 2.8% (95% confidence interval: 1.9‐4.2) in 2006 to 2008 to 6.9% (95% confidence interval: 5.7‐8.2) in 2009 to 2012. As a class, however, polymyxins exhibited stable resistance rates over the time frame of the study (Table 2). Prevalence of MDR AB rose from 21.4% in 2003 to 2005 to 33.7% in 2006 to 2008, and remained stable at 35.2% in 2009 to 2012. Resistance to even such broad combinations as carbapenem+ampicillin/sulbactam nearly tripled from 13.2% in 2003 to 2005 to 35.5% in 2009 to 2012. Notably, between 2003 and 2012, although resistance rates either rose or remained stable to all other agents, those to minocycline diminished from 56.5% in 2003 to 2005 to 36.6% in 2006 to 2008 to 30.5% in 2009 to 2012. (See Supporting Table 1 in the online version of this article for time trends based on whether they represented respiratory or BSI specimens, with directionally similar trends in both.)
Drug/Combination | Time Period | ||||||||
---|---|---|---|---|---|---|---|---|---|
20032005 | 20062008 | 20092012 | |||||||
Na | %b | 95% CI | N | % | 95% CI | N | % | 95% CI | |
| |||||||||
Amikacin | 12,949 | 25.2 | 24.5‐26.0 | 10.929 | 35.2 | 34.3‐36.1 | 6,292 | 45.7 | 44.4‐46.9 |
Tobramycin | 14,549 | 37.1 | 36.3‐37.9 | 11,877 | 41.9 | 41.0‐42.8 | 7,901 | 39.2 | 38.1‐40.3 |
Aminoglycoside | 14,505 | 22.5 | 21.8‐23.2 | 11,967 | 30.6 | 29.8‐31.4 | 7,736 | 34.8 | 33.8‐35.8 |
Doxycycline | 173 | 36.4 | 29.6‐43.8 | 38 | 29.0 | 17.0‐44.8 | 32 | 34.4 | 20.4‐51.7 |
Minocycline | 1,388 | 56.5 | 53.9‐50.1 | 902 | 36.6 | 33.5‐39.8 | 522 | 30.5 | 26.7‐34.5 |
Tetracycline | 1,511 | 55.4 | 52.9‐57.9 | 940 | 36.3 | 33.3‐39.4 | 546 | 30.8 | 27.0‐34.8 |
Doripenem | NR | NR | NR | 9 | 77.8 | 45.3‐93.7 | 22 | 95.5 | 78.2‐99.2 |
Imipenem | 14,728 | 21.8 | 21.2‐22.5 | 12,094 | 40.3 | 39.4‐41.2 | 6,681 | 51.7 | 50.5‐52.9 |
Meropenem | 7,226 | 37.0 | 35.9‐38.1 | 5,628 | 48.7 | 47.3‐50.0 | 4,919 | 47.3 | 45.9‐48.7 |
Carbapenem | 15,490 | 21.0 | 20.4‐21.7 | 12,975 | 38.8 | 38.0‐39.7 | 8,778 | 47.9 | 46.9‐49.0 |
Ampicillin/sulbactam | 10,525 | 35.2 | 34.3‐36.2 | 9,413 | 44.9 | 43.9‐45.9 | 6,460 | 41.2 | 40.0‐42.4 |
Colistin | NR | NR | NR | 783 | 2.8 | 1.9‐4.2 | 1,303 | 6.9 | 5.7‐8.2 |
Polymyxin B | 105 | 7.6 | 3.9‐14.3 | 796 | 12.8 | 10.7‐15.3 | 321 | 6.5 | 4.3‐9.6 |
Polymyxin | 105 | 7.6 | 3.9‐14.3 | 1,563 | 7.9 | 6.6‐9.3 | 1,452 | 6.8 | 5.6‐8.2 |
Trimethoprim/sulfamethoxazole | 13,640 | 52.5 | 51.7‐53.3 | 11,535 | 57.1 | 56.2‐58.0 | 7,856 | 57.6 | 56.5‐58.7 |
MDRc | 16,249 | 21.4 | 20.7‐22.0 | 13,640 | 33.7 | 33.0‐34.5 | 9,431 | 35.2 | 34.2‐36.2 |
Carbapenem+aminoglycoside | 14,601 | 8.9 | 8.5‐9.4 | 12,333 | 21.3 | 20.6‐22.0 | 8,256 | 29.3 | 28.3‐30.3 |
Aminoglycoside+ampicillin/sulbactam | 10,107 | 12.9 | 12.3‐13.6 | 9,077 | 24.9 | 24.0‐25.8 | 6,200 | 24.3 | 23.2‐25.3 |
Aminoglycosie+minocycline | 1,359 | 35.6 | 33.1‐38.2 | 856 | 21.4 | 18.8‐24.2 | 503 | 24.5 | 20.9‐28.4 |
Carbapenem+ampicillin/sulbactam | 10,228 | 13.2 | 12.5‐13.9 | 9,145 | 29.4 | 28.4‐30.3 | 6,143 | 35.5 | 34.3‐36.7 |
Regionally, examining resistance by classes and combinations of antibiotics, trimethoprim‐sulfamethoxazole exhibited consistently the highest rates of resistance, ranging from the lowest in the New England (28.8%) to the highest in the East North Central (69.9%) Census divisions (See Supporting Table 2 in the online version of this article). The rates of resistance to tetracyclines ranged from 0.0% in New England to 52.6% in the Mountain division, and to polymyxins from 0.0% in the East South Central division to 23.4% in New England. Generally, New England enjoyed the lowest rates of resistance (0.0% to tetracyclines to 28.8% to trimethoprim‐sulfamethoxazole), and the Mountain division the highest (0.9% to polymyxins to 52.6% to tetracyclines). The rates of MDR AB ranged from 8.0% in New England to 50.4% in the Mountain division (see Supporting Table 2 in the online version of this article).
Examining resistances to drug classes and combinations by the location of the source specimen revealed that trimethoprim‐sulfamethoxazole once again exhibited the highest rate of resistance across all locations (see Supporting Table 3 in the online version of this article). Despite their modest contribution to the overall sample pool (n=967, 2.5%), organisms from nursing home subjects had the highest prevalence of resistance to aminoglycosides (36.3%), tetracyclines (57.1%), and carbapenems (47.1%). This pattern held true for combination regimens examined. Nursing homes also vastly surpassed other locations in the rates of MDR AB (46.5%). Interestingly, the rates of MDR did not differ substantially among regular inpatient wards (29.2%), the ICU (28.7%), and outpatient locations (26.2%) (see Supporting Table 3 in the online version of this article).
DISCUSSION
In this large multicenter survey we have documented the rising rates of AB resistance to clinically important antimicrobials in the United States. On the whole, all antimicrobials, except for minocycline, exhibited either large or small increases in resistance. Alarmingly, even colistin, a true last resort AB treatment, lost a considerable amount of activity against AB, with the resistance rate rising from 2.8% in 2006 to 2008 to 6.9% in 2009 to 2012. The single encouraging trend that we observed was that resistance to minocycline appeared to diminish substantially, going from over one‐half of all AB tested in 2003 to 2005 to just under one‐third in 2009 to 2012.
Although we did note a rise in the MDR AB, our data suggest a lower percentage of all AB that meets the MDR phenotype criteria compared to reports by other groups. For example, the Center for Disease Dynamics and Economic Policy (CDDEP), analyzing the same data as our study, reports a rise in MDR AB from 32.1% in 1999 to 51.0% in 2010.[23] This discrepancy is easily explained by the fact that we included polymyxins, tetracyclines, and trimethoprim‐sulfamethoxazole in our evaluation, whereas the CDDEP did not examine these agents. Furthermore, we omitted fluoroquinolones, a drug class with high rates of resistance, from our study, because we were interested in focusing only on antimicrobials with clinical data in AB infections.[22] In addition, we limited our evaluation to specimens derived from respiratory or BSI sources, whereas the CDDEP data reflect any AB isolate present in TSN.
We additionally confirm that there is substantial geographic variation in resistance patterns. Thus, despite different definitions, our data agree with those from the CDDEP that the MDR prevalence is highest in the Mountain and East North Central divisions, and lowest in New England overall.[23] The wide variations underscore the fact that it is not valid to speak of national rates of resistance, but rather it is important to concentrate on the local patterns. This information, though important from the macroepidemiologic standpoint, is likely still not granular enough to help clinicians make empiric treatment decisions. In fact, what is needed for that is real‐time antibiogram data specific to each center and even each unit within each center.
The latter point is further illustrated by our analysis of locations of origin of the specimens. In this analysis, we discovered that, contrary to the common presumption that the ICU has the highest rate of resistant organisms, specimens derived from nursing homes represent perhaps the most intensely resistant organisms. In other words, the nursing home is the setting most likely to harbor patients with respiratory infections and BSIs caused by resistant AB. These data are in agreement with several other recent investigations. In a period‐prevalence survey conducted in the state of Maryland in 2009 by Thom and colleagues, long‐term care facilities were found to have the highest prevalence of any AB, and also those resistant to imipenem, MDR, and extensively drug‐resistant organisms.[24] Mortensen and coworkers confirmed the high prevalence of AB and AB resistance in long‐term care facilities, and extended this finding to suggest that there is evidence for intra‐ and interhospital spread of these pathogens.[25] Our data confirm this concerning finding at the national level, and point to a potential area of intervention for infection prevention.
An additional finding of some concern is that the highest proportion of colistin resistance among those specimens, whose location of origin was reported in the database, was the outpatient setting (6.6% compared to 5.4% in the ICU specimens, for example). Although these infections would likely meet the definition for healthcare‐associated infection, AB as a community‐acquired respiratory pathogen is not unprecedented either in the United States or abroad.[26, 27, 28, 29, 30] It is, however, reassuring that most other antimicrobials examined in our study exhibit higher rates of susceptibility in the specimens derived from the outpatient settings than either from the hospital or the nursing home.
Our study has a number of strengths. As a large multicenter survey, it is representative of AB susceptibility patterns across the United States, which makes it highly generalizable. We focused on antibiotics for which clinical evidence is available, thus adding a practical dimension to the results. Another pragmatic consideration is examining the data by geographic distributions, allowing an additional layer of granularity for clinical decisions. At the same time it suffers from some limitations. The TSN database consists of microbiology samples from hospital laboratories. Although we attempted to reduce the risk of duplication, because of how samples are numbered in the database, repeat sampling remains a possibility. Despite having stratified the data by geography and the location of origin of the specimen, it is likely not granular enough for local risk stratification decisions clinicians make daily about the choices of empiric therapy. Some of the MIC breakpoints have changed over the period of the study (see Supporting Table 4 in the online version of this article). Because these changes occurred in the last year of data collection (2012), they should have had only a minimal, if any, impact on the observed rates of resistance in the time frame examined. Additionally, because resistance rates evolve rapidly, more current data are required for effective clinical decision making.
In summary, we have demonstrated that the last decade has seen an alarming increase in the rate of resistance of AB to multiple clinically important antimicrobial agents and classes. We have further emphasized the importance of granularity in susceptibility data to help clinicians make sensible decisions about empiric therapy in hospitalized patients with serious infections. Finally, and potentially most disturbingly, the nursing home as a location appears to be a robust reservoir for spread for resistant AB. All of these observations highlight the urgent need to develop novel antibiotics and nontraditional agents, such as antibodies and vaccines, to combat AB infections, in addition to having important infection prevention implications if we are to contain the looming threat of the end of antibiotics.[31]
Disclosure
This study was funded by a grant from Tetraphase Pharmaceuticals, Watertown, MA.
Among hospitalized patients with serious infections, the choice of empiric therapy plays a key role in outcomes.[1, 2, 3, 4, 5, 6, 7, 8, 9] Rising rates and variable patterns of antimicrobial resistance, however, complicate selecting appropriate empiric therapy. Amidst this shifting landscape of resistance to antimicrobials, gram‐negative bacteria and specifically Acinetobacter baumannii (AB), remain a considerable challenge.[10] On the one hand, AB is a less‐frequent cause of serious infections than organisms like Pseudomonas aeruginosa or Enterobacteriaceae in severely ill hospitalized patients.[11, 12] On the other, AB has evolved a variety of resistance mechanisms and exhibits unpredictable susceptibility patterns.[13] These factors combine to increase the likelihood of administering inappropriate empiric therapy when faced with an infection caused by AB and, thereby, raising the risk of death.[14] The fact that clinicians may not routinely consider AB as the potential culprit pathogen in the patient they are treating along with this organism's highly in vitro resistant nature, may result in routine gram‐negative coverage being frequently inadequate for AB infections.
To address the poor outcomes related to inappropriate empiric therapy in the setting of AB, one requires an appreciation of the longitudinal changes and geographic differences in the susceptibility of this pathogen. Thus, we aimed to examine secular trends in the resistance of AB to antimicrobial agents whose effectiveness against this microorganism was well supported in the literature during the study timeframe.[15]
METHODS
To determine the prevalence of predefined resistance patterns among AB in respiratory and blood stream infection (BSI) specimens, we examined The Surveillance Network (TSN) database from Eurofins. We explored data collected between years 2003 and 2012. The database has been used extensively for surveillance purposes since 1994, and has previously been described in detail.[16, 17, 18, 19, 20] Briefly, TSN is a warehouse of routine clinical microbiology data collected from a nationally representative sample of microbiology laboratories in 217 hospitals in the United States. To minimize selection bias, laboratories are included based on their geography and the demographics of the populations they serve.[18] Only clinically significant samples are reported. No personal identifying information for source patients is available in this database. Only source laboratories that perform antimicrobial susceptibility testing according standard Food and Drug Administrationapproved testing methods and that interpret susceptibility in accordance with the Clinical Laboratory Standards Institute breakpoints are included.[21] (See Supporting Table 4 in the online version of this article for minimum inhibitory concentration (MIC) changes over the course of the studycurrent colistin and polymyxin breakpoints applied retrospectively). All enrolled laboratories undergo a pre‐enrollment site visit. Logical filters are used for routine quality control to detect unusual susceptibility profiles and to ensure appropriate testing methods. Repeat testing and reporting are done as necessary.[18]
Laboratory samples are reported as susceptible, intermediate, or resistant. We grouped isolates with intermediate MICs together with the resistant ones for the purposes of the current analysis. Duplicate isolates were excluded. Only samples representing 1 of the 2 infections of interest, respiratory or BSI, were included.
We examined 3 time periods2003 to 2005, 2006 to 2008, and 2009 to 2012for the prevalence of AB's resistance to the following antibiotics: carbapenems (imipenem, meropenem, doripenem), aminoglycosides (tobramycin, amikacin), tetracyclines (minocycline, doxycycline), polymyxins (colistin, polymyxin B), ampicillin‐sulbactam, and trimethoprim‐sulfamethoxazole. Antimicrobial resistance was defined by the designation of intermediate or resistant in the susceptibility category. Resistance to a class of antibiotics was defined as resistance to all drugs within the class for which testing was available. The organism was multidrug resistant (MDR) if it was resistant to at least 1 antimicrobial in at least 3 drug classes examined.[22] Resistance to a combination of 2 drugs was present if the specimen was resistant to both of the drugs in the combination for which testing was available. We examined the data by infection type, time period, the 9 US Census divisions, and location of origin of the sample.
All categorical variables are reported as percentages. Continuous variables are reported as meansstandard deviations and/or medians with the interquartile range (IQR). We did not pursue hypothesis testing due to a high risk of type I error in this large dataset. Therefore, only clinically important trends are highlighted.
RESULTS
Among the 39,320 AB specimens, 81.1% were derived from a respiratory source and 18.9% represented BSI. Demographics of source patients are listed in Table 1. Notably, the median age of those with respiratory infection (58 years; IQR 38, 73) was higher than among patients with BSI (54.5 years; IQR 36, 71), and there were proportionally fewer females among respiratory patients (39.9%) than those with BSI (46.0%). Though only 24.3% of all BSI samples originated from the intensive are unit (ICU), 40.5% of respiratory specimens came from that location. The plurality of all specimens was collected in the 2003 to 2005 time interval (41.3%), followed by 2006 to 2008 (34.7%), with a minority coming from years 2009 to 2012 (24.0%). The proportions of collected specimens from respiratory and BSI sources were similar in all time periods examined (Table 1). Geographically, the South Atlantic division contributed the most samples (24.1%) and East South Central the fewest (2.6%) (Figure 1). The vast majority of all samples came from hospital wards (78.6%), where roughly one‐half originated in the ICU (37.5%). Fewer still came from outpatient sources (18.3%), and a small minority (2.5%) from nursing homes.

Pneumonia | BSI | All | |
---|---|---|---|
| |||
Total, N (%) | 31,868 (81.1) | 7,452 (18.9) | 39,320 |
Age, y | |||
Mean (SD) | 57.7 (37.4) | 57.6 (40.6) | 57.7 (38.0) |
Median (IQR 25, 75) | 58 (38, 73) | 54.5 (36, 71) | 57 (37, 73) |
Gender, female (%) | 12,725 (39.9) | 3,425 (46.0) | 16,150 (41.1) |
ICU (%) | 12,9191 (40.5) | 1,809 (24.3) | 14,7284 (37.5) |
Time period, % total | |||
20032005 | 12,910 (40.5) | 3,340 (44.8) | 16,250 (41.3) |
20062008 | 11,205 (35.2) | 2,435 (32.7) | 13,640 (34.7) |
20092012 | 7,753 (24.3) | 1,677 (22.5) | 9,430 (24.0) |
Figure 2 depicts overall resistance patterns by individual drugs, drug classes, and frequently used combinations of agents. Although doripenem had the highest rate of resistance numerically (90.3%), its susceptibility was tested only in a small minority of specimens (n=31, 0.08%). Resistance to trimethoprim‐sulfamethoxazole was high (55.3%) based on a large number of samples tested (n=33,031). Conversely, colistin as an agent and polymyxins as a class exhibited the highest susceptibility rates of over 90%, though the numbers of samples tested for susceptibility to these drugs were also small (colistin n=2,086, 5.3%; polymyxins n=3,120, 7.9%) (Figure 2). Among commonly used drug combinations, carbapenem+aminoglycoside (18.0%) had the lowest resistance rates, and nearly 30% of all AB specimens tested met the criteria for MDR.

Over time, resistance to carbapenems more‐than doubled, from 21.0% in 2003 to 2005 to 47.9% in 2009 to 2012 (Table 2). Although relatively few samples were tested for colistin susceptibility (n=2,086, 5.3%), resistance to this drug also more than doubled from 2.8% (95% confidence interval: 1.9‐4.2) in 2006 to 2008 to 6.9% (95% confidence interval: 5.7‐8.2) in 2009 to 2012. As a class, however, polymyxins exhibited stable resistance rates over the time frame of the study (Table 2). Prevalence of MDR AB rose from 21.4% in 2003 to 2005 to 33.7% in 2006 to 2008, and remained stable at 35.2% in 2009 to 2012. Resistance to even such broad combinations as carbapenem+ampicillin/sulbactam nearly tripled from 13.2% in 2003 to 2005 to 35.5% in 2009 to 2012. Notably, between 2003 and 2012, although resistance rates either rose or remained stable to all other agents, those to minocycline diminished from 56.5% in 2003 to 2005 to 36.6% in 2006 to 2008 to 30.5% in 2009 to 2012. (See Supporting Table 1 in the online version of this article for time trends based on whether they represented respiratory or BSI specimens, with directionally similar trends in both.)
Table 2. Resistance to Individual Agents, Drug Classes, and Combinations Over Time

| Drug/Combination | 2003-2005 N | 2003-2005 % (95% CI) | 2006-2008 N | 2006-2008 % (95% CI) | 2009-2012 N | 2009-2012 % (95% CI) |
| --- | --- | --- | --- | --- | --- | --- |
| Amikacin | 12,949 | 25.2 (24.5-26.0) | 10,929 | 35.2 (34.3-36.1) | 6,292 | 45.7 (44.4-46.9) |
| Tobramycin | 14,549 | 37.1 (36.3-37.9) | 11,877 | 41.9 (41.0-42.8) | 7,901 | 39.2 (38.1-40.3) |
| Aminoglycoside | 14,505 | 22.5 (21.8-23.2) | 11,967 | 30.6 (29.8-31.4) | 7,736 | 34.8 (33.8-35.8) |
| Doxycycline | 173 | 36.4 (29.6-43.8) | 38 | 29.0 (17.0-44.8) | 32 | 34.4 (20.4-51.7) |
| Minocycline | 1,388 | 56.5 (53.9-59.1) | 902 | 36.6 (33.5-39.8) | 522 | 30.5 (26.7-34.5) |
| Tetracycline | 1,511 | 55.4 (52.9-57.9) | 940 | 36.3 (33.3-39.4) | 546 | 30.8 (27.0-34.8) |
| Doripenem | NR | NR | 9 | 77.8 (45.3-93.7) | 22 | 95.5 (78.2-99.2) |
| Imipenem | 14,728 | 21.8 (21.2-22.5) | 12,094 | 40.3 (39.4-41.2) | 6,681 | 51.7 (50.5-52.9) |
| Meropenem | 7,226 | 37.0 (35.9-38.1) | 5,628 | 48.7 (47.3-50.0) | 4,919 | 47.3 (45.9-48.7) |
| Carbapenem | 15,490 | 21.0 (20.4-21.7) | 12,975 | 38.8 (38.0-39.7) | 8,778 | 47.9 (46.9-49.0) |
| Ampicillin/sulbactam | 10,525 | 35.2 (34.3-36.2) | 9,413 | 44.9 (43.9-45.9) | 6,460 | 41.2 (40.0-42.4) |
| Colistin | NR | NR | 783 | 2.8 (1.9-4.2) | 1,303 | 6.9 (5.7-8.2) |
| Polymyxin B | 105 | 7.6 (3.9-14.3) | 796 | 12.8 (10.7-15.3) | 321 | 6.5 (4.3-9.6) |
| Polymyxin | 105 | 7.6 (3.9-14.3) | 1,563 | 7.9 (6.6-9.3) | 1,452 | 6.8 (5.6-8.2) |
| Trimethoprim/sulfamethoxazole | 13,640 | 52.5 (51.7-53.3) | 11,535 | 57.1 (56.2-58.0) | 7,856 | 57.6 (56.5-58.7) |
| MDR | 16,249 | 21.4 (20.7-22.0) | 13,640 | 33.7 (33.0-34.5) | 9,431 | 35.2 (34.2-36.2) |
| Carbapenem+aminoglycoside | 14,601 | 8.9 (8.5-9.4) | 12,333 | 21.3 (20.6-22.0) | 8,256 | 29.3 (28.3-30.3) |
| Aminoglycoside+ampicillin/sulbactam | 10,107 | 12.9 (12.3-13.6) | 9,077 | 24.9 (24.0-25.8) | 6,200 | 24.3 (23.2-25.3) |
| Aminoglycoside+minocycline | 1,359 | 35.6 (33.1-38.2) | 856 | 21.4 (18.8-24.2) | 503 | 24.5 (20.9-28.4) |
| Carbapenem+ampicillin/sulbactam | 10,228 | 13.2 (12.5-13.9) | 9,145 | 29.4 (28.4-30.3) | 6,143 | 35.5 (34.3-36.7) |

N = number of specimens tested for susceptibility; NR = not reported for that period.
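The percentages in Table 2 are simple resistance proportions, and the accompanying 95% confidence intervals can be reproduced with a standard binomial interval. The authors' exact interval method is not stated in this excerpt, so the Wilson score interval in the sketch below is an assumption; applied to the tabulated N, it reproduces the amikacin 2003 to 2005 row.

```python
# Sketch: checking the 95% CIs in Table 2 with a Wilson score interval
# (an assumption; the study's exact interval method is not stated here).
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return center - half, center + half

# Amikacin, 2003-2005: 25.2% resistant among 12,949 isolates tested
resistant = round(0.252 * 12949)          # ~3,263 resistant isolates
lo, hi = wilson_ci(resistant, 12949)
print(f"{100 * lo:.1f}-{100 * hi:.1f}")   # 24.5-26.0, as reported in Table 2
```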
Regionally, examining resistance by classes and combinations of antibiotics, trimethoprim-sulfamethoxazole consistently exhibited the highest rates of resistance, ranging from a low of 28.8% in the New England Census division to a high of 69.9% in East North Central (see Supporting Table 2 in the online version of this article). Resistance to tetracyclines ranged from 0.0% in New England to 52.6% in the Mountain division, and to polymyxins from 0.0% in the East South Central division to 23.4% in New England. Generally, New England enjoyed the lowest rates of resistance (from 0.0% for tetracyclines to 28.8% for trimethoprim-sulfamethoxazole) and the Mountain division the highest (from 0.9% for polymyxins to 52.6% for tetracyclines). The rates of MDR AB ranged from 8.0% in New England to 50.4% in the Mountain division (see Supporting Table 2 in the online version of this article).
Examining resistance to drug classes and combinations by location of the source specimen revealed that trimethoprim-sulfamethoxazole again exhibited the highest rate of resistance across all locations (see Supporting Table 3 in the online version of this article). Despite their modest contribution to the overall sample pool (n=967, 2.5%), organisms from nursing home residents had the highest prevalence of resistance to aminoglycosides (36.3%), tetracyclines (57.1%), and carbapenems (47.1%). This pattern held true for the combination regimens examined. Nursing homes also vastly surpassed other locations in the rate of MDR AB (46.5%). Interestingly, the rates of MDR did not differ substantially among regular inpatient wards (29.2%), the ICU (28.7%), and outpatient locations (26.2%) (see Supporting Table 3 in the online version of this article).
DISCUSSION
In this large multicenter survey we have documented the rising rates of AB resistance to clinically important antimicrobials in the United States. On the whole, all antimicrobials except minocycline exhibited increases in resistance, whether large or small. Alarmingly, even colistin, a true last-resort treatment for AB, lost a considerable amount of activity, with the resistance rate rising from 2.8% in 2006 to 2008 to 6.9% in 2009 to 2012. The single encouraging trend we observed was a substantial decline in minocycline resistance, from over one-half of all AB tested in 2003 to 2005 to just under one-third in 2009 to 2012.
Although we did note a rise in MDR AB, our data suggest a lower percentage of all AB meeting the MDR phenotype criteria than reported by other groups. For example, the Center for Disease Dynamics and Economic Policy (CDDEP), analyzing the same data as our study, reports a rise in MDR AB from 32.1% in 1999 to 51.0% in 2010.[23] This discrepancy is easily explained by differences in scope: we included polymyxins, tetracyclines, and trimethoprim-sulfamethoxazole in our evaluation, whereas the CDDEP did not examine these agents. Furthermore, we omitted fluoroquinolones, a drug class with high rates of resistance, because we were interested in focusing only on antimicrobials with clinical data in AB infections.[22] In addition, we limited our evaluation to specimens derived from respiratory or BSI sources, whereas the CDDEP data reflect any AB isolate present in TSN.
We additionally confirm that there is substantial geographic variation in resistance patterns. Thus, despite different definitions, our data agree with those from the CDDEP that the MDR prevalence is highest in the Mountain and East North Central divisions, and lowest in New England overall.[23] The wide variations underscore the fact that it is not valid to speak of national rates of resistance, but rather it is important to concentrate on the local patterns. This information, though important from the macroepidemiologic standpoint, is likely still not granular enough to help clinicians make empiric treatment decisions. In fact, what is needed for that is real‐time antibiogram data specific to each center and even each unit within each center.
The latter point is further illustrated by our analysis of locations of origin of the specimens. In this analysis, we discovered that, contrary to the common presumption that the ICU has the highest rate of resistant organisms, specimens derived from nursing homes represent perhaps the most intensely resistant organisms. In other words, the nursing home is the setting most likely to harbor patients with respiratory infections and BSIs caused by resistant AB. These data are in agreement with several other recent investigations. In a period‐prevalence survey conducted in the state of Maryland in 2009 by Thom and colleagues, long‐term care facilities were found to have the highest prevalence of any AB, and also those resistant to imipenem, MDR, and extensively drug‐resistant organisms.[24] Mortensen and coworkers confirmed the high prevalence of AB and AB resistance in long‐term care facilities, and extended this finding to suggest that there is evidence for intra‐ and interhospital spread of these pathogens.[25] Our data confirm this concerning finding at the national level, and point to a potential area of intervention for infection prevention.
An additional finding of some concern is that, among specimens whose location of origin was reported in the database, the highest proportion of colistin resistance arose in the outpatient setting (6.6%, compared, for example, to 5.4% in ICU specimens). Although these infections would likely meet the definition for healthcare-associated infection, AB as a community-acquired respiratory pathogen is not unprecedented either in the United States or abroad.[26, 27, 28, 29, 30] It is, however, reassuring that most other antimicrobials examined in our study exhibited higher rates of susceptibility in specimens derived from outpatient settings than in those from either the hospital or the nursing home.
Our study has a number of strengths. As a large multicenter survey, it is representative of AB susceptibility patterns across the United States, which makes it highly generalizable. We focused on antibiotics for which clinical evidence is available, thus adding a practical dimension to the results. Another pragmatic consideration is examining the data by geographic distribution, allowing an additional layer of granularity for clinical decisions. At the same time, the study suffers from some limitations. The TSN database consists of microbiology samples from hospital laboratories; although we attempted to reduce the risk of duplication, because of how samples are numbered in the database, repeat sampling remains a possibility. Although we stratified the data by geography and by location of origin of the specimen, the stratification is likely still not granular enough to support the local risk assessments clinicians make daily when choosing empiric therapy. Some of the MIC breakpoints changed over the period of the study (see Supporting Table 4 in the online version of this article); because these changes occurred in the last year of data collection (2012), they should have had only a minimal, if any, impact on the observed rates of resistance in the time frame examined. Additionally, because resistance rates evolve rapidly, more current data are required for effective clinical decision making.
In summary, we have demonstrated that the last decade has seen an alarming increase in the rate of resistance of AB to multiple clinically important antimicrobial agents and classes. We have further emphasized the importance of granularity in susceptibility data to help clinicians make sensible decisions about empiric therapy in hospitalized patients with serious infections. Finally, and potentially most disturbingly, the nursing home appears to be a robust reservoir for the spread of resistant AB. All of these observations carry important infection prevention implications and highlight the urgent need to develop novel antibiotics and nontraditional agents, such as antibodies and vaccines, to combat AB infections, if we are to contain the looming threat of the end of antibiotics.[31]
Disclosure
This study was funded by a grant from Tetraphase Pharmaceuticals, Watertown, MA.
1. National Nosocomial Infections Surveillance (NNIS) System Report. Am J Infect Control. 2004;32:470–485.
2. National surveillance of antimicrobial resistance in Pseudomonas aeruginosa isolates obtained from intensive care unit patients from 1993 to 2002. Antimicrob Agents Chemother. 2004;48:4606–4610.
3. Health care-associated pneumonia and community-acquired pneumonia: a single-center experience. Antimicrob Agents Chemother. 2007;51:3568–3573.
4. Clinical importance of delays in the initiation of appropriate antibiotic treatment for ventilator-associated pneumonia. Chest. 2002;122:262–268.
5. Modification of empiric antibiotic treatment in patients with pneumonia acquired in the intensive care unit. ICU-Acquired Pneumonia Study Group. Intensive Care Med. 1996;22:387–394.
6. Antimicrobial therapy escalation and hospital mortality among patients with HCAP: a single center experience. Chest. 2008;134:963–968.
7. Surviving Sepsis Campaign: international guidelines for management of severe sepsis and septic shock: 2008. Crit Care Med. 2008;36:296–327.
8. Inappropriate antibiotic therapy in Gram-negative sepsis increases hospital length of stay. Crit Care Med. 2011;39:46–51.
9. Inadequate antimicrobial treatment of infections: a risk factor for hospital mortality among critically ill patients. Chest. 1999;115:462–474.
10. Centers for Disease Control and Prevention. Antibiotic resistance threats in the United States, 2013. Available at: http://www.cdc.gov/drugresistance/threat-report-2013/pdf/ar-threats-2013-508.pdf#page=59. Accessed December 29, 2014.
11. Antimicrobial-resistant pathogens associated with healthcare-associated infections: summary of data reported to the National Healthcare Safety Network at the Centers for Disease Control and Prevention, 2009–2010. National Healthcare Safety Network (NHSN) Team and Participating NHSN Facilities. Infect Control Hosp Epidemiol. 2013;34:1–14.
12. Multi-drug resistance, inappropriate initial antibiotic therapy and mortality in Gram-negative severe sepsis and septic shock: a retrospective cohort study. Crit Care. 2014;18(6):596.
13. Global challenge of multidrug-resistant Acinetobacter baumannii. Antimicrob Agents Chemother. 2007;51:3471–3484.
14. Predictors of hospital mortality among septic ICU patients with Acinetobacter spp. bacteremia: a cohort study. BMC Infect Dis. 2014;14:572.
15. Treatment of Acinetobacter infections. Clin Infect Dis. 2010;51:79–84.
16. Increasing resistance of Acinetobacter species to imipenem in United States hospitals, 1999–2006. Infect Control Hosp Epidemiol. 2010;31:196–197.
17. Trends in resistance to carbapenems and third-generation cephalosporins among clinical isolates of Klebsiella pneumoniae in the United States, 1999–2010. Infect Control Hosp Epidemiol. 2013;34:259–268.
18. Antimicrobial resistance in key bloodstream bacterial isolates: electronic surveillance with the Surveillance Network Database—USA. Clin Infect Dis. 1999;29:259–263.
19. Community-associated methicillin-resistant Staphylococcus aureus in outpatients, United States, 1999–2006. Emerg Infect Dis. 2009;15:1925–1930.
20. Prevalence of antimicrobial resistance in bacteria isolated from central nervous system specimens as reported by U.S. hospital laboratories from 2000 to 2002. Ann Clin Microbiol Antimicrob. 2004;3:3.
21. Performance standards for antimicrobial susceptibility testing: twenty-second informational supplement. CLSI document M100-S22. Wayne, PA: Clinical and Laboratory Standards Institute; 2012.
22. Multidrug-resistant, extensively drug-resistant and pandrug-resistant bacteria: an international expert proposal for interim standard definitions for acquired resistance. Clin Microbiol Infect. 2012;18:268–281.
23. CDDEP: The Center for Disease Dynamics, Economics and Policy. Resistance map: Acinetobacter baumannii overview. Available at: http://www.cddep.org/projects/resistance_map/acinetobacter_baumannii_overview. Accessed January 16, 2015.
24. Assessing the burden of Acinetobacter baumannii in Maryland: a statewide cross-sectional period prevalence survey. Maryland MDRO Prevention Collaborative. Infect Control Hosp Epidemiol. 2012;33:883–888.
25. Multidrug-resistant Acinetobacter baumannii infection, colonization, and transmission related to a long-term care facility providing subacute care. Infect Control Hosp Epidemiol. 2014;35:406–411.
26. Severe community-acquired pneumonia due to Acinetobacter baumannii. Chest. 2001;120:1072–1077.
27. Fulminant community-acquired Acinetobacter baumannii pneumonia as a distinct clinical syndrome. Chest. 2006;129:102–109.
28. Community-acquired Acinetobacter baumannii pneumonia. Rev Clin Esp. 2003;203:284–286.
29. Antimicrobial drug-resistant microbes associated with hospitalized community-acquired and healthcare-associated pneumonia: a multi-center study in Taiwan. J Formos Med Assoc. 2013;112:31–40.
30. Antimicrobial resistance in Hispanic patients hospitalized in San Antonio, TX with community-acquired pneumonia. Hosp Pract (1995). 2010;38:108–113.
31. Centers for Disease Control and Prevention. CDC director blog: the end of antibiotics. Can we come back from the brink? Available at: http://blogs.cdc.gov/cdcdirector/2014/05/05/the-end-of-antibiotics-can-we-come-back-from-the-brink/. Published May 5, 2014. Accessed January 16, 2015.
© 2015 Society of Hospital Medicine
Vasoactive Medications Safe in ICU via Peripheral Intravenous Access
Clinical question: Can vasoactive medications be safely given in the ICU via peripheral intravenous (PIV) access instead of central venous access?
Background: Vasoactive medications are given to a variety of patients in shock to maintain hemodynamic function. These medications are given through central venous catheters, partly out of concern for extravasation and tissue injury from PIV access use; however, placement and use of central catheters are also associated with significant morbidity.
Study design: Single-arm, observational, consecutive patient study.
Setting: Single, 18-bed medical ICU.
Synopsis: Investigators identified 734 ICU patients who received vasoactive medications through PIV lines between September 2002 and June 2014. The cohort was 54% male, with an average age of 72 years and an average SAPS II score of 75. Norepinephrine, dopamine, and phenylephrine were included in the study, and the decision to use these medications was based on clinical judgment. A specific pre-approved protocol, covering PIV and vein size and location, use of ultrasound confirmation, and a maximum duration of 72 hours, was used to administer these medications via PIV. Extravasation was immediately treated with injected phentolamine and topical nitroglycerin.
The average duration of PIV vasoactive medication use was 49 hours. Of the study patients, 13% eventually required central catheters, 2% experienced peripheral extravasation of medication, and none experienced tissue injury as defined by the study group.
Because the study was observational, there was no control group, and outcomes/efficacy compared to central catheters could not be assessed. Patient characteristics and other variables were not controlled for, and its single-center design makes reproducibility uncertain.
Bottom line: Vasoactive medications can be safely and feasibly administered to ICU patients through PIV lines using adequate protocols.
Citation: Cardenas-Garcia J, Schaub KF, Belchikov YG, Narasimhan M, Koenig SJ, Mayo PH. Safety of peripheral intravenous administration of vasoactive medication [published online ahead of print May 26, 2015]. J Hosp Med. doi: 10.1002/jhm.2394.
Cost, Frequency of Emergency Department Revisits Evaluated
Clinical question: What is the cost and frequency of ED revisits within three days and 30 days?
Background: ED revisits place a financial and resource utilization burden on the medical system, yet the costs and rates of these return visits remain poorly characterized.
Study design: Observational study.
Setting: Six states, using Healthcare Cost and Utilization Project databases.
Synopsis: An observational study examined data from 2006-2010 across six states to determine cost and frequency of ED revisits within a 30-day period from initial ED treatment and discharge. The study examined revisit rates within the first three days of discharge, as well as the 30 days following discharge from the initial presentation.
Three-day revisit rates were 8.2%, with 29% resulting in admission; 32% of the revisits took place at a different institution.
The 30-day revisit rate was 19.9%, with 28% resulting in admission. The most common diagnoses were skin and soft tissue infections (23.9%) and abdominal pain (9.7%). The vast majority of revisits (89%) resulted in the same diagnosis as the first encounter.
Cost of the revisits was more difficult to assess because only one of the six states (Florida) had full cost data; costs for the other states were extrapolated. In Florida, three-day revisit costs accounted for 30.3% of all primary visit costs, and 30-day revisit costs amounted to 118% of all primary ED visit costs within that time period.
There was not always an indication of whether the revisit was due to a planned revisit, worsening of symptoms, or inadequate initial treatment, however, leaving the evaluation of cost and revisit burden incomplete.
Bottom line: Initial evaluation of ED revisits shows that rates and cost are significant, though the nature of the revisits remains underevaluated. Preliminary data demonstrate that ED revisits are a significant cost to the healthcare system, though the number of preventable revisits remains unknown.
Citation: Duseja R, Bardach NS, Lin GA, et al. Revisit rates and associated costs after an emergency department encounter. Ann Intern Med. 2015;162(11):750-756.
Early, Late Hospital Readmission Factors Differ
Clinical question: What are the differences between factors associated with early (zero to seven days after discharge) and late (eight to 30 days after discharge) readmission?
Background: Thirty-day readmission rates are a quality metric; however, recent evidence challenges the notion that readmissions represent unnecessary and preventable healthcare use. It remains unclear whether the 30-day window post-discharge represents a homogeneous period or whether different factors contribute to readmission at different points within it.
Study design: Retrospective, single-center, cohort study.
Setting: Large, urban teaching hospital.
Synopsis: Based on 13,355 admissions representing 8,078 patients over a two-year period, the overall readmission rate was 19.7%, with 7.8% early (zero to seven days post-discharge) readmissions, and 11.9% late (eight to 30 days post-discharge) readmissions. Variables were categorized as indicators of acute illness burden, chronic illness burden, patient care process factors, and social determinants of health.
Several markers of acute illness burden were associated with early readmission only. Some markers of chronic illness burden were associated with late readmissions only (e.g. hemodialysis), while others were associated with readmissions throughout the 30-day period. Worse social determinants of health increased odds of readmission in both periods.
The single-center design allowed the study to examine detailed clinical variables; however, this approach limits the generalizability of the results.
Bottom line: Policies to reduce 30-day readmissions should reflect the different risk factors at play across that time frame.
Citation: Graham KL, Wilker EH, Howell MD, Davis RB, Marcantonio ER. Differences between early and late readmissions among patients: A cohort study. Ann Intern Med. 2015;162(11):741-749.
Patient Adherence to Pharmacological Thromboprophylaxis Improves with Interventions
Clinical question: How can patient adherence to pharmacological thromboprophylaxis be improved?
Background: Prior studies suggest that the hospital-wide prevalence of nonadministration of VTE thromboprophylaxis orders ranges from 5% to 13%, with patient refusal listed as the most common reason for nonadministration.
Study design: Quasi-experimental, pre-post intervention, with intervention and control units.
Setting: Academic medical center in Philadelphia.
Synopsis: Researchers identified 20,208 admissions for the study; 8,293 (41%) admissions occurred prior to the intervention and 11,915 (59%) after. The three-part intervention, which was composed of (1) standardized nurse response to patient refusal, (2) integration of daily assessment of VTE into rounds, and (3) regular audit with feedback, resulted in a decrease in nonadministration rates during the intervention. Rates continued to decline in the 21-month follow-up period.
After the intervention, the rate of missed doses of pharmacological thromboprophylaxis decreased from 24.7% to 14.7% (P<0.01). This was due to a decrease in patient refusal from 18.3% to 9.4% (P<0.01).
Although there was a decrease in the missed doses of thromboprophylaxis, there was no statistically significant change in the rate of hospital-associated VTE.
Bottom line: A multifaceted intervention resulted in a decrease in the proportion of missed and refused doses of pharmacological VTE thromboprophylaxis, but this was not associated with a statistically significant change in VTE rates.
Citation: Baillie CA, Guevara JP, Boston RC, Hecht TE. A unit-based intervention aimed at improving patient adherence to pharmacological thromboprophylaxis [published online ahead of print June 2, 2015]. BMJ Qual Saf. doi:10.1136/bmjqs-2015-003992.
Mortality Risk in Patients Older than 75 Presenting with Non-ST-Elevation Acute Coronary Syndrome
Clinical question: Is there a score that will predict the mortality rate in elderly patients presenting with a non-ST-elevation myocardial infarction (NSTEMI)?
Background: Although they represent only 9% of patients in clinical trials, patients over the age of 75 make up one third of patients with NSTEMI, accounting for more than half of NSTEMI-related mortality.
Study design: Retrospective cohort analysis for score calculator design, with prospective cohort validation.
Setting: The retrospective cohort was derived from a meta-analysis of 55 papers. The prospective validation arm used a cohort of patients from a randomized multicenter Italian trial.
Synopsis: The authors developed and validated a mortality predictor for patients 75 and older who present with an NSTEMI. The calculator assigns two points for hemoglobin less than 10 g/dL and one point each for elevated troponin levels, ischemic ECG changes, estimated glomerular filtration rate (eGFR) less than 45, and a previous vascular event. Predicted probabilities of death within one year ranged from 2% (score of zero) to 75% (score of six). The calculator allowed stratification into low (score: zero to one), intermediate (score: two), or high (score: three or greater) risk. High-risk patients appeared to benefit from intervention, with significantly reduced risk of mortality (odds ratio 0.44).
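Because the score is additive, it is straightforward to compute. The sketch below encodes the point values stated in the synopsis (hemoglobin <10 g/dL scores two points; each of the other four markers scores one) and the published risk strata; the function interface and any threshold details beyond those stated above are illustrative assumptions, not the authors' published calculator code.

```python
# Minimal sketch of the additive 1-year mortality risk score described above
# (maximum score: 6). Variable names are illustrative assumptions.

def nstemi_mortality_score(hemoglobin_g_dl, troponin_elevated,
                           ecg_ischemic_changes, egfr, prior_vascular_event):
    score = 2 if hemoglobin_g_dl < 10 else 0
    score += int(troponin_elevated)
    score += int(ecg_ischemic_changes)
    score += int(egfr < 45)
    score += int(prior_vascular_event)
    return score

def risk_stratum(score):
    if score <= 1:
        return "low"           # predicted 1-year mortality ~2% at score 0
    if score == 2:
        return "intermediate"
    return "high"              # up to ~75% predicted mortality at score 6

s = nstemi_mortality_score(9.2, True, False, 40, True)
print(s, risk_stratum(s))      # 5 high
```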
Bottom line: A simple risk calculator stratifies elderly patients into low, intermediate, or high risk to predict mortality from NSTEMI. High-risk patients appear to achieve a mortality benefit from intervention.
Citation: Angeli F, Cavallini C, Verdecchia P, et al. A risk score for predicting 1-year mortality in patients ≥75 years of age presenting with non-ST-elevation acute coronary syndrome. Am J Cardiol. 2015;116(2):208-213.
An Ancient Recipe to Cure a Modern Pathogen
At the Society for General Microbiology Annual Conference (March 30–April 2, 2015), Harrison et al presented a paper describing their experience with a 1000-year-old antimicrobial remedy with antistaphylococcal activity (S19We1006). The unique team of investigators, consisting of an expert in Viking studies and 3 microbiologists from the University of Nottingham, England, together with another microbiologist from Texas Tech University (Lubbock, Texas) who performs mouse model studies, pooled their talents to reconstruct and test an ancient recipe for treating eyelash follicle infection. The potion not only killed methicillin-resistant Staphylococcus aureus (MRSA) grown in established biofilms but also performed as well as, if not better than, conventional antibiotics on MRSA-infected skin wounds in mice.
“Take cropleek and garlic, of both equal quantities, pound them well together . . . take wine and bullocks gall, mix with the leek . . . let it stand 9 days in the brass vessel” is an excerpt of the recipe for treatment of a sty from the 9th-century Anglo-Saxon text Bald’s Leechbook (translated from Old English). The modern-day chefs cooked their potion using: (1) 2 Allium species (equal amounts of garlic and either leek or onion, finely chopped and crushed in a mortar for 2 minutes), (2) wine (25 mL [0.87 fl oz] of an organic vintage from a historic English vineyard near Glastonbury), (3) bullock gall (bile from a cow’s stomach dissolved in distilled water), and (4) brass (glass bottles with squares of brass sheet immersed in the mixture were used, because a brass vessel would be not only hard to sterilize but also expensive). After brewing in the improvised “brass vessel,” the solution was purified by straining and left to chill at 4°C for 9 days before use.
What’s the issue?
Advances in technology provide the opportunity to create new medical treatments. However, efficacious therapies may remain hidden in ancient texts, waiting to be discovered. Historic Chinese literature has been the source of modern-day drugs such as artemisinin for Plasmodium falciparum malaria. The ancient recipe in Bald’s Leechbook may be the next major advance in the topical management of MRSA. I anticipate that physicians may one day be prescribing “Bald’s potion” to treat impetigo. What do you think?
Suggested Readings
- AncientBiotics—a medieval remedy for modern day superbugs [news release]? United Kingdom: The University of Nottingham; March 30, 2015. http://www.nottingham.ac.uk/news/pressreleases/2015/march/ancientbiotics---a-medieval-remedy-for-modern-day-superbugs.aspx. Accessed August 12, 2015.
- Feilden T. 1,000-year-old onion and garlic eye remedy kills MRSA. BBC News. March 30, 2015. http://www.bbc.com/news/uk-england-nottinghamshire-32117815. Accessed August 12, 2015.
- Wilson C. Anglo-Saxon remedy kills hospital superbug MRSA. New Scientist. March 30, 2015. http://www.newscientist.com/article/dn27263-anglosaxon-remedy-kills-hospital-superbug-mrsa.html#.VTPpsqazDzl. Accessed August 12, 2015.