Malperfusion key in aortic dissection repair outcomes

Indication for surgery unchanged
Article Type
Changed
Wed, 01/02/2019 - 09:56

 

Early repair is the standard of care for patients with type A aortic dissection, but the presence of malperfusion rather than the timing of surgery may be a major determinant in patient survival both in the hospital and in the long term, according to an analysis of patients with acute type A aortic dissection over a 17-year period at the University of Bristol (England).

“Malperfusion at presentation rather than the timing of intervention is the major risk factor for death in both the short term and long term in patients undergoing surgical repair of type A aortic dissection,” Pradeep Narayan, FRCS, and his colleagues said in reporting their findings in the July issue of the Journal of Thoracic and Cardiovascular Surgery (154:81-6). Nonetheless, Dr. Narayan and his colleagues acknowledged that early operation prevents the development of malperfusion and is the best option for restoring normal perfusion for patients who already have malperfusion.

Their study compared two groups of patients who had surgery for repair of acute type A aortic dissection over a 17-year period: 72 in the early-surgery group, who had operative repair within 12 hours of symptom onset, and 80 in the late-surgery group, who had the operation 12 hours or more after symptoms first appeared. A total of 205 patients underwent surgical repair for acute type A aortic dissection in that period, but the timing of surgery from symptom onset was recorded in only 152 cases. The median time between arrival at the center and surgery was 3 hours.

Dr. Narayan and his coauthors reported that 60 of the 152 patients (39%) had malperfusion. Organ malperfusion was actually more common in the early-surgery group than in the late-surgery group, although the difference was not significant: 48.6% vs. 31.3% (P = .29). Early mortality was also similar between the two groups: 19.4% in the early-surgery group and 13.8% in the late-surgery group (P = .8). The study found no difference in late survival between the two groups.

Dr. Narayan and his coauthors reported that malperfusion and concomitant coronary artery bypass grafting were independent predictors of death, with hazard ratios of 2.65 (P = .01) and 3.03 (P = .03), respectively. Modeled as a nonlinear variable, time to surgery showed an inverse relationship with late mortality (HR, 0.51; P = .26); modeled as a linear variable and adjusted for other covariates, including malperfusion, it had no significant effect on survival (HR, 1.01; P = .09).
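For readers less familiar with hazard ratios, a rough back-of-the-envelope illustration (not taken from the study) of what an HR of 2.65 implies: under the proportional-hazards assumption, the exposed group's survival curve is the baseline curve raised to the power of the hazard ratio. The 80% baseline survival figure below is hypothetical.

```python
import math

# Illustrative only: the study reports a hazard ratio (HR) of 2.65 for
# malperfusion. Under the proportional-hazards assumption,
#   S_exposed(t) = S_baseline(t) ** HR
# The 80% baseline survival used here is a hypothetical example,
# not a number from the paper.

def survival_with_hazard_ratio(baseline_survival: float, hr: float) -> float:
    """Survival probability in the exposed group at the same time point."""
    return baseline_survival ** hr

baseline = 0.80          # hypothetical survival without malperfusion
hr_malperfusion = 2.65   # hazard ratio reported in the study
print(round(survival_with_hazard_ratio(baseline, hr_malperfusion), 3))  # → 0.554
```

So, in this hypothetical, a 2.65-fold hazard turns 80% survival into roughly 55% survival at the same follow-up point, which conveys the scale of risk the authors describe.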

“The main finding of the present study is that almost 40% of patients undergoing repair of type A aortic dissection had evidence of malperfusion,” Dr. Narayan and his coauthors said. “The second important finding is that the presence of malperfusion was associated with significantly increased risk of death in both the short-term and long-term follow-up.” While a delayed operation was associated with a reduced risk of death, the association was not significant after accounting for malperfusion.

Dr. Narayan and his coauthors acknowledged limitations of their study, the most important of which was the inclusion of different types of malperfusion as a single variable. In addition, the small sample size may explain the lack of statistically significant differences between the two groups.

Dr. Narayan and his coauthors had no financial relationships to disclose.
 


Malperfusion has the potential to serve as a marker for the need for surgery in type A aortic dissection, but the inability to identify the true risk of developing malperfusion in the first 12-24 hours after acute type A dissection means that the indication for early surgery will remain unchanged, James I. Fann, MD, of Stanford (Calif.) University said in his invited commentary (J Thorac Cardiovasc Surg. 2017;154:87-8).

“The findings of Narayan and colleagues impel us to review the history of the development of the classification and treatment (or in fact vice versa) of acute type A dissection and to acknowledge that early timing of surgery in these high-risk patients was originally proposed to prevent malperfusion and to respond to the most catastrophic complications,” Dr. Fann said.

But Dr. Fann cautioned against “being dismissive” of their findings, because such questioning and re-evaluation are essential in developing appropriate treatments. “Now, the question is whether we can identify the cohort of patients who are at lower risk for the development of malperfusion and tailor their treatment,” he said.

Dr. Fann had no financial relationships to disclose.


 

Early repair is the standard of care for patients with type A aortic dissection, but the presence of malperfusion rather than the timing of surgery may be a major determinant in patient survival both in the hospital and in the long term, according to an analysis of patients with acute type A aortic dissection over a 17-year period at the University of Bristol (England).

“Malperfusion at presentation rather than the timing of intervention is the major risk factor for death in both the short term and long term in patients undergoing surgical repair of type A aortic dissection,” Pradeep Narayan, FRCS, and his colleagues said in reporting their findings in the July issue of the Journal of Thoracic and Cardiovascular Surgery (154:81-6). Nonetheless, Dr. Narayan and his colleagues acknowledged that early operation prevents the development of malperfusion and is the best option for restoring normal perfusion for patients who already have malperfusion.

Their study analyzed results from two different groups of patients who had surgery for repair of acute type A aortic dissection over a 17-year period: 72 in the early surgery group that had operative repair within 12 hours of symptom onset; and 80 in the late-surgery group that had the operation 12 hours or more after symptoms first appeared. A total of 205 patients underwent surgical repair for acute type A aortic dissection in that period, but only 152 cases had recorded the timing of surgery from onset of symptoms. The median time between arrival at the center and surgery was 3 hours.

Dr. Narayan and his coauthors reported that 39% (60) of the 152 patients had malperfusion. Organ malperfusion was actually more common in the early surgery group, although the difference was not significant: 48.6% vs. 31.3% in the late-surgery group (P = .29). Early mortality was also similar between the two groups: 19.4% in the early surgery group and 13.8% in the late surgery group (P = .8). In terms of late survival, the study found no difference between the two groups.

Dr. Narayan and his coauthors reported that malperfusion and concomitant coronary artery bypass grafting were independent predictors of survival, with hazard ratios of 2.65 (P = .01) and 3.03 (P = .03), respectively. As a nonlinear variable, time to surgery showed an inverse relationship with late mortality (HR, 0.51; P = .26), but as a linear variable when adjusted for other covariates, including malperfusion, it did not affect survival (HR, 1.01; P = .09).

“The main finding of the present study is that almost 40% of patients undergoing repair of type A aortic dissection had evidence of malperfusion,” Dr. Narayan and his coauthors said. “The second important finding is that the presence of malperfusion was associated with significantly increased risk of death in both the short-term and long-term follow-up.” While a delayed operation was associated with a reduced risk of death, it was not significant when accounting for malperfusion.

Dr. Narayan and his coauthors acknowledged limitations of their study, the most important of which was the including of different types of malperfusion as a single variable. Also, the small sample size may explain the lack of statistically significant differences between the two groups.

Dr. Narayan and his coauthors had no financial relationships to disclose.
 

 

Early repair is the standard of care for patients with type A aortic dissection, but the presence of malperfusion rather than the timing of surgery may be a major determinant in patient survival both in the hospital and in the long term, according to an analysis of patients with acute type A aortic dissection over a 17-year period at the University of Bristol (England).

“Malperfusion at presentation rather than the timing of intervention is the major risk factor for death in both the short term and long term in patients undergoing surgical repair of type A aortic dissection,” Pradeep Narayan, FRCS, and his colleagues said in reporting their findings in the July issue of the Journal of Thoracic and Cardiovascular Surgery (154:81-6). Nonetheless, Dr. Narayan and his colleagues acknowledged that early operation prevents the development of malperfusion and is the best option for restoring normal perfusion for patients who already have malperfusion.

Their study analyzed results from two different groups of patients who had surgery for repair of acute type A aortic dissection over a 17-year period: 72 in the early surgery group that had operative repair within 12 hours of symptom onset; and 80 in the late-surgery group that had the operation 12 hours or more after symptoms first appeared. A total of 205 patients underwent surgical repair for acute type A aortic dissection in that period, but only 152 cases had recorded the timing of surgery from onset of symptoms. The median time between arrival at the center and surgery was 3 hours.

Dr. Narayan and his coauthors reported that 39% (60) of the 152 patients had malperfusion. Organ malperfusion was actually more common in the early surgery group, although the difference was not significant: 48.6% vs. 31.3% in the late-surgery group (P = .29). Early mortality was also similar between the two groups: 19.4% in the early surgery group and 13.8% in the late surgery group (P = .8). In terms of late survival, the study found no difference between the two groups.

Dr. Narayan and his coauthors reported that malperfusion and concomitant coronary artery bypass grafting were independent predictors of survival, with hazard ratios of 2.65 (P = .01) and 3.03 (P = .03), respectively. As a nonlinear variable, time to surgery showed an inverse relationship with late mortality (HR, 0.51; P = .26), but as a linear variable when adjusted for other covariates, including malperfusion, it did not affect survival (HR, 1.01; P = .09).

“The main finding of the present study is that almost 40% of patients undergoing repair of type A aortic dissection had evidence of malperfusion,” Dr. Narayan and his coauthors said. “The second important finding is that the presence of malperfusion was associated with significantly increased risk of death in both the short-term and long-term follow-up.” While a delayed operation was associated with a reduced risk of death, it was not significant when accounting for malperfusion.

Dr. Narayan and his coauthors acknowledged limitations of their study, the most important of which was the including of different types of malperfusion as a single variable. Also, the small sample size may explain the lack of statistically significant differences between the two groups.

Dr. Narayan and his coauthors had no financial relationships to disclose.
 

FROM THE JOURNAL OF THORACIC AND CARDIOVASCULAR SURGERY

Vitals

 

Key clinical point: Malperfusion is a main determinant of outcomes for patients having surgical repair for acute type A aortic dissection.

Major finding: Patients in the early surgery group (surgery within 12 hours of onset) were more likely to have malperfusion than those who had surgery later, 47% vs. 31%.

Data source: Single-center analysis of 152 operations for repair of acute type A aortic dissections over a 17-year period.

Disclosures: Dr. Narayan and his coauthors had no financial relationships to disclose.


Short, simple antibiotic courses effective in latent TB

Article Type
Changed
Fri, 01/18/2019 - 16:56

 

Latent tuberculosis infection can be safely and effectively treated with 3- and 4-month medication regimens, including those using once-weekly dosing, according to results from a new meta-analysis.

The findings, published online July 31 in Annals of Internal Medicine, bolster evidence that shorter antibiotic regimens using rifamycins alone or in combination with other drugs are a viable alternative to the longer courses (Ann Intern Med. 2017;167:248-55).

The new study compared efficacy and toxicity across treatment strategies and found no significant differences between shorter rifamycin-based regimens and isoniazid-based regimens lasting 6 months or longer. Previous research in latent TB has indicated that shorter courses are also likely to achieve better patient adherence (BMC Infect Dis. 2016;16:257).

For their research, Dominik Zenner, MD, an epidemiologist with Public Health England in London, and his colleagues updated a meta-analysis they published in 2014. The team added 8 new randomized studies to the 53 that had been included in the earlier paper (Ann Intern Med. 2014 Sep;161:419-28).

Using pairwise comparisons and a Bayesian network analysis, Dr. Zenner and his colleagues found comparable efficacy, relative to placebo, for isoniazid regimens of 6 months or more; rifampicin-isoniazid regimens of 3 or 4 months; rifampicin-only regimens; and rifampicin-pyrazinamide regimens (P less than .05 for all).
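To give a sense of what the pairwise side of such an analysis involves, here is a minimal sketch of fixed-effect, inverse-variance pooling of odds ratios on made-up trial data. The Bayesian network model the authors actually used is considerably more involved; this only shows the basic pooling step, and every number below is hypothetical.

```python
import math

# Sketch of pairwise fixed-effect (inverse-variance) meta-analysis.
# Each hypothetical trial is a 2x2 table summarized as
# (events_treatment, n_treatment, events_control, n_control).

def log_or_and_var(a, n1, c, n2):
    """Log odds ratio and its variance from one trial's 2x2 table."""
    b, d = n1 - a, n2 - c
    log_or = math.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d
    return log_or, var

def pooled_odds_ratio(trials):
    """Inverse-variance weighted pooled odds ratio across trials."""
    num = den = 0.0
    for a, n1, c, n2 in trials:
        lo, v = log_or_and_var(a, n1, c, n2)
        w = 1 / v
        num += w * lo
        den += w
    return math.exp(num / den)

# Hypothetical trials: progression to active TB, treatment vs. placebo.
trials = [(4, 300, 12, 290), (7, 500, 20, 510), (3, 250, 9, 240)]
print(round(pooled_odds_ratio(trials), 2))
```

On these invented numbers the pooled odds ratio comes out well below 1, i.e., fewer active TB cases under treatment, which is the direction of effect the meta-analysis reports for the real regimens.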

Importantly, a rifapentine-based regimen in which patients took a weekly dose for 12 weeks was as effective as the others.

“We think that you can get away with shorter regimens,” Dr. Zenner said in an interview. Although 3- to 4-month courses are already recommended in some countries, including the United Kingdom, for most patients with latent TB, “clinicians in some settings have been quite slow to adopt them,” he said.

The U.S. Centers for Disease Control and Prevention currently recommend multiple treatment strategies for latent TB, depending on patient characteristics. These include 6 or 9 months of isoniazid; 3 months of once-weekly isoniazid and rifapentine; or 4 months of daily rifampin.

In the meta-analysis, rifamycin-only regimens performed as well as did those regimens that also used isoniazid, the study showed, suggesting that, for most patients who can safely be treated with rifamycins, “there is no added gain of using isoniazid,” Dr. Zenner said.

He noted that the longer isoniazid-alone regimens are nonetheless effective and appropriate for some, including people who might have potential drug interactions, such as HIV patients taking antiretroviral medications.

About 2 billion people worldwide are estimated to have latent TB, and most will never develop active TB. However, because latent TB acts as the reservoir for active TB, screening high-risk groups and close contacts of TB patients, and treating latent infections, is a public health priority.

But many of these asymptomatic patients will get lost between a positive screen result and successful treatment completion, Dr. Zenner said.

“We have huge drop-offs in the cascade of treatment, and treatment completion is one of the worries,” he said. “Whether it makes a huge difference in compliance to take only 12 doses is not sufficiently studied, but it does make a lot of sense. By reducing the pill burden, as we call it, we think that we will see quite good adherence rates – but that’s a subject of further detailed study.”

The investigators noted as a limitation of their study that hepatotoxicity outcomes were not available for all studies and that some of the included trials had a potential for bias. They did not see statistically significant differences in treatment efficacy between regimens in HIV-positive and HIV-negative patients, but noted in their analysis that “efficacy may have been weaker in HIV-positive populations.”

The U.K. National Institute for Health Research provided some funding for Dr. Zenner and his colleagues’ study. One coauthor, Helen Stagg, PhD, reported nonfinancial support from Sanofi during the study, and financial support from Otsuka for unrelated work.


 


FROM ANNALS OF INTERNAL MEDICINE

Vitals

 

Key clinical point: Rifamycin-only treatment of latent TB works as well as combination regimens, and shorter dosing schedules show no loss in efficacy vs. longer ones.

Major finding: Rifamycin-only regimens, rifampicin-isoniazid regimens of 3 or 4 months, and rifampicin-pyrazinamide regimens were all effective, compared with placebo and with isoniazid regimens of 6, 12, and 72 months.

Data source: A network meta-analysis of 61 randomized trials, 8 of them published in the last 3 years.

Disclosures: The National Institute for Health Research (UK) funded some co-authors; one co-author disclosed a financial relationship with a pharmaceutical firm.


Opioid use higher in adults with health conditions

Let’s share the load
Article Type
Changed
Tue, 05/03/2022 - 15:22

 

Use of prescription opioids is higher among adults with health conditions such as cirrhosis and diabetes, compared with those who do not have the conditions, according to an analysis of national survey data.

In 2015, reported use of opioids was 71.7% in adults with cirrhosis, compared with 37.8% among those without the condition. That was the largest difference for any of the health conditions included in a report by Beth Han, MD, PhD, of the Substance Abuse and Mental Health Services Administration in Rockville, Md., which conducts the ongoing survey, and her associates (Ann Intern Med. 2017 July 31. doi: 10.7326/M17-0865).

The condition with the next-highest reported use of prescription opioids was chronic obstructive pulmonary disease, at 61.7%, compared with 36.8% for those without it. Any use of opioids was reported by 48.9% of those with diabetes and 36.5% of those without it, with respective figures of 46.1% and 35.8% for hypertension and 45.8% and 36.5% for cancer, Dr. Han and her associates noted in their analysis of 2015 data for 51,200 adults from the National Survey on Drug Use and Health.

Of those with cirrhosis who reported any use of prescription opioids, 86.1% said that they did so without misuse, while the other four conditions had rates ranging from 91.3% to 93.9%. Among those with chronic obstructive pulmonary disease, 6.2% misused opioids without use disorder, and 2.5% had opioid use disorder. These estimates were not available for cirrhosis because of low statistical precision, but the corresponding figures were 6.9% and 1.5% for diabetes, 6% and 2.1% for hypertension, and 5.3% and 0.8% for cancer, the investigators said.

Overall prescription opioid use in 2015 was 37.8% of the civilian, noninstitutionalized adult population, about 91.8 million individuals. Estimates suggest that 4.7% (11.5 million) of all adults misused them in some way and that 0.8% (1.9 million) had a use disorder, they reported.
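Because the survey reports both percentages and head counts, each pair implies a size for the underlying adult population, which offers a quick internal consistency check. This short sketch (illustrative arithmetic, not part of the study's methods) uses only the figures quoted above.

```python
# Quick consistency check: each (count, share) pair reported by the
# survey implies a size for the civilian, noninstitutionalized adult
# population. The two pairs should agree roughly if the figures are
# internally consistent.

def implied_population(count_millions: float, share: float) -> float:
    """Adult population (in millions) implied by a count and its share."""
    return count_millions / share

any_use = implied_population(91.8, 0.378)   # any prescription opioid use
disorder = implied_population(1.9, 0.008)   # opioid use disorder
print(round(any_use, 1), round(disorder, 1))  # → 242.9 237.5
```

Both pairs point to an adult population of roughly 240 million, which is in line with the U.S. civilian, noninstitutionalized adult population in 2015.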

“Among adults with misuse of prescription opioids, 59.9% used them without a prescription at least once in 2015, and 40.8% obtained them from friends or relatives for free for their most recent episode of misuse. Such widespread social availability of prescription opioids suggests that they are commonly dispensed in amounts not fully consumed by the patients to whom they are prescribed,” the authors wrote.

Funding for the study came from the Substance Abuse and Mental Health Services Administration, the National Institute on Drug Abuse, and the Office of the Assistant Secretary for Planning and Evaluation of the Department of Health and Human Services. One investigator reported stock holdings in 3M, General Electric, and Pfizer, and another reported stock holdings in Eli Lilly, General Electric, and Sanofi. Dr. Han and the other three investigators disclosed that they had no conflicts of interest.


Talk to any busy full-time primary care physician, and it becomes evident that writing an opioid prescription is much easier than exploring other options for addressing chronic pain in the course of a 15-minute visit. The same stressful work conditions likely also make it difficult for primary care providers to appropriately monitor patients who take opioids in the long term with urine drug tests and pill counts to assess for opioid diversion or other substance use.

A potential solution to the problem of the overburdened primary care physician is to distribute some of the work to other members of the health care team. Indeed, we have found that using a nurse care manager with a registry increased receipt of guideline-concordant care (urine drug testing and patient-provider agreements) among patients receiving long-term opioid therapy. The intervention also resulted in reductions in opioid doses at a large urban safety-net hospital and three community health centers.
 

Karen E. Lasser, MD, is with Boston Medical Center and Boston University. Her remarks are excerpted from an editorial response (Ann Intern Med. 2017 Jul 31. doi: 10.7326/M17-1559) to Dr. Han’s study.

Let’s share the load

 

Use of prescription opioids is higher among adults with health conditions such as cirrhosis and diabetes, compared with those who do not have the conditions, according to an analysis of national survey data.

In 2015, reported use of opioids was 71.7% in adults with cirrhosis, compared with 37.8% for those who did not have cirrhosis. That is the largest difference among any of the various health conditions included in a report by Beth Han, MD, PhD, of the Substance Abuse and Mental Health Services Administration in Rockville, Md., which conducts the ongoing survey, and her associates (Ann Intern Med. 2017 Jul 31. doi: 10.7326/M17-0865).

The condition with the next-highest reported use of prescription opioids was chronic obstructive pulmonary disease, at 61.7%, compared with 36.8% for those without it. Any use of opioids was reported by 48.9% of those with diabetes and 36.5% of those without it, with respective figures of 46.1% and 35.8% for hypertension and 45.8% and 36.5% for cancer, Dr. Han and her associates noted in their analysis of 2015 data for 51,200 adults from the National Survey on Drug Use and Health.

Of those with cirrhosis who reported any use of prescription opioids, 86.1% said that they did so without misuse, while the other four conditions had rates ranging from 91.3% to 93.9%. Among those with chronic obstructive pulmonary disease, 6.2% misused opioids without use disorder, and 2.5% had opioid use disorder. These estimates were not available for cirrhosis because of low statistical precision, but the corresponding figures were 6.9% and 1.5% for diabetes, 6% and 2.1% for hypertension, and 5.3% and 0.8% for cancer, the investigators said.

Overall prescription opioid use in 2015 was 37.8% for the civilian, noninstitutionalized adult population, about 91.8 million individuals. Estimates suggest that 4.7% (11.5 million) of all adults misused them in some way, and that 0.8% (1.9 million) had a use disorder, they reported.
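As a quick arithmetic check (not part of the article), the percentages can be multiplied back out against the implied adult population:

```python
# Back-of-envelope check of the survey figures quoted above.
# Inputs are the article's reported values; everything else is derived.
users = 91.8e6      # adults reporting any prescription opioid use in 2015
use_rate = 0.378    # that same group as a share of all adults (37.8%)

adults = users / use_rate          # implied adult population
misuse = 0.047 * adults            # 4.7% reported misuse
use_disorder = 0.008 * adults      # 0.8% met criteria for a use disorder

print(f"{adults / 1e6:.1f}M adults")          # ~242.9M
print(f"{misuse / 1e6:.1f}M misused")         # ~11.4M
print(f"{use_disorder / 1e6:.1f}M disorder")  # ~1.9M
```

Note that it is the 0.8% figure, not the 4.7% one, that corresponds to roughly 1.9 million adults.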

“Among adults with misuse of prescription opioids, 59.9% used them without a prescription at least once in 2015, and 40.8% obtained them from friends or relatives for free for their most recent episode of misuse. Such widespread social availability of prescription opioids suggests that they are commonly dispensed in amounts not fully consumed by the patients to whom they are prescribed,” the authors wrote.

Funding for the study came from the Substance Abuse and Mental Health Services Administration, the National Institute on Drug Abuse, and the Office of the Assistant Secretary for Planning and Evaluation of the Department of Health and Human Services. One investigator reported stock holdings in 3M, General Electric, and Pfizer, and another reported stock holdings in Eli Lilly, General Electric, and Sanofi. Dr. Han and the other three investigators disclosed that they had no conflicts of interest.

 

FROM ANNALS OF INTERNAL MEDICINE


Tips for Living With Narcolepsy


Click here to download the PDF.

Neurology Reviews - 25(8):36

Thyroid-nodule size boosts serum thyroglobulin’s diagnostic value


 

Normalizing the serum thyroglobulin level by thyroid nodule size produced a strongly significant link between this marker and nodule malignancy in a review of nearly 200 surgically treated patients at three Montreal centers.

After normalization, the serum thyroglobulin of patients with a malignant nodule averaged 51 mcg/L*cm, more than double the average 23 mcg/L*cm among patients with benign nodules, Neil Verma, MD, said at the World Congress on Thyroid Cancer.
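The normalization described here is a simple ratio: serum thyroglobulin concentration divided by nodule diameter. A minimal sketch (the function name and the sample raw values are illustrative, not taken from the study):

```python
def size_normalized_tg(serum_tg_mcg_per_l: float, nodule_diameter_cm: float) -> float:
    """Divide serum thyroglobulin (mcg/L) by nodule diameter (cm)."""
    if nodule_diameter_cm <= 0:
        raise ValueError("nodule diameter must be positive")
    return serum_tg_mcg_per_l / nodule_diameter_cm

# Hypothetical raw values chosen so that a 2-cm nodule reproduces the
# reported group averages of 51 and 23 mcg/L*cm:
print(size_normalized_tg(102.0, 2.0))  # 51.0
print(size_normalized_tg(46.0, 2.0))   # 23.0
```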

Incorporation of normalized serum thyroglobulin level into the McGill Thyroid Nodule Score Plus (MTNS+) could improve the score’s predictive accuracy, said Dr. Verma, who is now a researcher at the University of Toronto but performed his analysis while at McGill University in Montreal. Normalization by size makes the serum thyroglobulin level more reflective of the nodule’s activity, he explained. The MTNS+ already includes non–normalized serum thyroglobulin as a component: It is 1 of the 23 risk factors for thyroid cancer used to calculate the MTNS+ (Thyroid. 2014 May 19;24[5]:852-7).

But the senior investigator on the study said that, even if the MTNS+ gets a little more accurate by using a nodule size-normalized serum thyroglobulin level, its clinical utility will soon be eclipsed by widespread reliance on molecular tests, which outperform the score’s combination of clinical and conventional laboratory measures. It’s only a matter of cost, said Richard J. Payne, MD, a head and neck surgeon at McGill.

The MTNS+ “is a cheap version of molecular testing,” Dr. Payne said in an interview. “I believe the MTNS+ will be outdated within 5 years – once molecular testing becomes cheaper” than it is now. He estimated that currently, at his institution, the cost for molecular testing of a single thyroid nodule runs between $1,000 and $5,000 in Canadian dollars (about $800-$4,000 U.S.). Furthermore, it is not routinely covered by Canadian provincial medical payers at this time, he said. A small number of his patients opt to pay for molecular testing themselves.

Routine reimbursement for molecular diagnostic tests for the malignancy of thyroid nodules was discussed at a recent meeting of Canadian head and neck surgeons, who decided to lobby provincial governments to try to get it covered, according to Dr. Payne. “I’d be very surprised if we don’t have government coverage within 4-5 years,” in part because the cost for molecular testing will likely fall significantly in that time frame, he predicted.

The analysis reported by Dr. Verma included 196 patients with thyroid nodules who underwent a partial or total thyroidectomy at any of three McGill teaching hospitals during 2010-2015. He determined the benign or malignant status of their nodules based on their histology. The analysis he presented also showed that malignancy had no clear relationship to nodule size. Nodules that were less than 2 cm in diameter were about as likely to be malignant as were those that were 3 cm or larger in diameter, Dr. Verma reported.

Size-normalized serum thyroglobulin will now be incorporated into the MTNS+, which will be the fourth change to the original MTNS scoring system since it was developed more than a decade ago, noted Dr. Payne. But, while the MTNS+ allows better prediction of malignant potential than does the Bethesda system for evaluating nodule cytopathology in a fine-needle aspirate, it still falls short of molecular testing in its predictive accuracy, Dr. Payne said.

Dr. Verma and Dr. Payne had no disclosures.


 


AT WCTC 2017

Vitals

 

Key clinical point: Normalizing serum thyroglobulin levels to the size of a patient’s thyroid nodule increased the marker’s accuracy for predicting nodule malignancy.

Major finding: The average size-normalized serum thyroglobulin level was 51 mcg/L*cm in patients with malignant nodules and 23 mcg/L*cm with benign nodules.

Data source: Review of 196 patients who underwent partial or complete thyroidectomy at any of three Montreal centers.

Disclosures: Dr. Verma and Dr. Payne had no disclosures.


Are women of advanced maternal age at increased risk for severe maternal morbidity?


EXPERT COMMENTARY

While numerous studies have investigated the risk of perinatal outcomes with advancing maternal age, the primary objective of a recent study by Lisonkova and colleagues was to examine the association between advancing maternal age and severe maternal morbidities and mortality.

Details of the study

The population-based retrospective cohort study compared age-specific rates of severe maternal morbidities and mortality among 828,269 pregnancies in Washington state between 2003 and 2013. Singleton births to women 15 to 60 years of age were included; out-of-hospital births were excluded. Information was obtained by linking the Birth Events Record Database (which includes information on maternal, pregnancy, and labor and delivery characteristics and birth outcomes), and the Comprehensive Hospital Abstract Reporting System database (which includes diagnostic and procedural codes for all hospitalizations in Washington state).

The primary objective was to examine the association between age and severe maternal morbidities. Maternal morbidities were divided into categories: antepartum hemorrhage, respiratory morbidity, thromboembolism, cerebrovascular morbidity, acute cardiac morbidity, severe postpartum hemorrhage, maternal sepsis, renal failure, obstetric shock, complications of anesthesia and obstetric interventions, and need for life-saving procedures. A composite outcome, composed of severe maternal morbidities, intensive care unit admission, and maternal mortality, was also created.

Rates of severe morbidities in the age groups 15 to 19, 20 to 24, 25 to 29, 30 to 34, 35 to 39, 40 to 44, and ≥45 years were compared with those in the referent category (25 to 29 years). Additional comparisons were performed for ages 45 to 49 and ≥50 years for the composite outcome and for morbidities with high incidence. Logistic regression and sensitivity analyses were used to control for demographic and prepregnancy characteristics, underlying medical conditions, assisted conception, and delivery characteristics.

Severe maternal morbidities demonstrated a J-shaped association with age: the lowest rates of morbidity were observed in women 20 to 34 years of age, and steeply increasing rates of morbidity were observed for women aged 40 and older. One notable exception was the rate of sepsis, which was increased in teen mothers compared with all other groups.

The unadjusted rate of the composite outcome of severe maternal morbidity and mortality was 2.1% in teenagers, 1.5% among women 25 to 29 years, 2.3% among those aged 40 to 44, and 3.6% among women aged 45 and older.
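From these unadjusted composite rates, crude odds ratios against the 25-to-29 referent group can be recomputed directly (a back-of-envelope sketch only; the study’s adjusted estimates differ):

```python
def odds_ratio(p_group: float, p_ref: float) -> float:
    """Crude odds ratio of a group's event rate versus a referent rate
    (both expressed as proportions)."""
    return (p_group / (1 - p_group)) / (p_ref / (1 - p_ref))

referent = 0.015  # composite rate among women 25 to 29 years
for label, rate in [("15-19", 0.021), ("40-44", 0.023), (">=45", 0.036)]:
    print(label, round(odds_ratio(rate, referent), 2))
```

This yields crude odds ratios of roughly 1.41, 1.55, and 2.45, consistent with the steep rise after age 40 described above.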

Although rates were somewhat attenuated after adjustment for demographic and prepregnancy characteristics, chronic medical conditions, assisted conception, and delivery characteristics, most morbidities remained significantly increased among women aged 39 years and older, including the composite outcome. Among the individual morbidities considered, increased risk was highest for renal failure, amniotic fluid embolism, cardiac morbidity, and shock, with adjusted odds ratios of 2.0 or greater for women older than 39 years.


Related article:
Reducing maternal mortality in the United States—Let’s get organized!

Study strengths and weaknesses

This study contributes substantially to the existing literature that demonstrates higher rates of pregnancy-associated morbidities in women of increasing maternal age.1,2 Prior studies in this area focused on perinatal morbidity and mortality and on obstetric outcomes such as cesarean delivery.3–5 This large-scale study examined the association between advancing maternal age and a variety of serious maternal morbidities. In another study, Callaghan and Berg found a similar pattern among mortalities, with high rates of mortality attributable to hemorrhage, embolism, and cardiomyopathy in women aged 40 years and older.1

Exclusion of multiple gestations. As in any study, we must consider the methodology, and it is notable that Lisonkova and colleagues’ study excluded multiple gestations. Given the associations among advanced maternal age, use of assisted reproductive technology, and multiple gestations, a high rate of multiple gestations would be expected among women of advanced maternal age. (Generally, maternal age of at least 35 years is considered “advanced,” with greater than 40 years “very advanced.”) Since multiple gestations tend to be associated with increases in morbidity, excluding these pregnancies would likely bias the study results toward the null. If multiple gestations had been included, the rates of serious maternal morbidities in older women might be even higher than those demonstrated, potentially strengthening the associations reported here.

WHAT THIS EVIDENCE MEANS FOR PRACTICE

This large, retrospective study (level II evidence) suggests that women of advancing age are at significantly increased risk of severe maternal morbidities, even after controlling for preexisting medical conditions. We therefore recommend that clinicians inform and counsel women who are considering pregnancy at an advanced age, and those considering oocyte cryopreservation as a means of extending their reproductive life span, about the increased maternal morbidities associated with pregnancy at age 40 and older. 

-- Amy E. Judy, MD, MPH, and Yasser Y. El-Sayed, MD

 


References
  1. Callaghan WM, Berg CJ. Pregnancy-related mortality among women aged 35 years and older, United States, 1991–1997. Obstet Gynecol. 2003;102(5 pt 1):1015–1021.
  2. McCall SJ, Nair M, Knight M. Factors associated with maternal mortality at advanced maternal age: a population-based case-control study. BJOG. 2017;124(8):1225–1233.
  3. Yogev Y, Melamed N, Bardin R, Tenenbaum-Gavish K, Ben-Shitrit G, Ben-Haroush A. Pregnancy outcome at extremely advanced maternal age. Am J Obstet Gynecol. 2010;203(6):558.e1–e7.
  4. Gilbert WM, Nesbitt TS, Danielsen B. Childbearing beyond age 40: pregnancy outcome in 24,032 cases. Obstet Gynecol. 1999;93(1):9–14.
  5. Luke B, Brown MB. Elevated risks of pregnancy complications and adverse outcomes with increasing maternal age. Hum Reprod. 2007;22(5):1264–1272.
Author and Disclosure Information

Amy E. Judy, MD, MPH, is Clinical Assistant Professor, Division of Maternal-Fetal Medicine and Obstetrics, Department of Obstetrics and Gynecology, Stanford University School of Medicine, Stanford, California.

Yasser Y. El-Sayed, MD, is the Charles B. and Ann L. Johnson Professor and the Division Director of Maternal-Fetal Medicine and Obstetrics, Department of Obstetrics and Gynecology, Stanford University School of Medicine.

The authors report no financial relationships relevant to this article.

OBG Management - 29(8):52-51

WHAT THIS EVIDENCE MEANS FOR PRACTICE

This large, retrospective study (level II evidence) suggests that women of advancing age are at significantly increased risk of severe maternal morbidities, even after controlling for preexisting medical conditions. We therefore recommend that clinicians inform and counsel women who are considering pregnancy at an advanced age, and those considering oocyte cryopreservation as a means of extending their reproductive life span, about the increased maternal morbidities associated with pregnancy at age 40 and older. 

-- Amy E. Judy, MD, MPH, and Yasser Y. El-Sayed, MD

 

Share your thoughts! Send your Letter to the Editor to [email protected]. Please include your name and the city and state in which you practice.

EXPERT COMMENTARY

While numerous studies have investigated perinatal outcomes with advancing maternal age, the primary objective of a recent study by Lisonkova and colleagues was to examine the association between advancing maternal age and severe maternal morbidity and mortality.

Details of the study

The population-based retrospective cohort study compared age-specific rates of severe maternal morbidities and mortality among 828,269 pregnancies in Washington state between 2003 and 2013. Singleton births to women 15 to 60 years of age were included; out-of-hospital births were excluded. Information was obtained by linking the Birth Events Record Database (which includes information on maternal, pregnancy, and labor and delivery characteristics and birth outcomes) and the Comprehensive Hospital Abstract Reporting System database (which includes diagnostic and procedural codes for all hospitalizations in Washington state).

The primary objective was to examine the association between age and severe maternal morbidities. Maternal morbidities were divided into categories: antepartum hemorrhage, respiratory morbidity, thromboembolism, cerebrovascular morbidity, acute cardiac morbidity, severe postpartum hemorrhage, maternal sepsis, renal failure, obstetric shock, complications of anesthesia and obstetric interventions, and need for life-saving procedures. A composite outcome, composed of severe maternal morbidities, intensive care unit admission, and maternal mortality, was also created.

Rates of severe morbidities in the age groups 15 to 19, 20 to 24, 25 to 29, 30 to 34, 35 to 39, 40 to 44, and ≥45 years were compared with the referent category (25 to 29 years). Additional comparisons were performed for ages 45 to 49 and ≥50 years for the composite outcome and for morbidities with high incidence. Logistic regression and sensitivity analyses were used to control for demographic and prepregnancy characteristics, underlying medical conditions, assisted conception, and delivery characteristics.

Severe maternal morbidities demonstrated a J-shaped association with age: the lowest rates of morbidity were observed in women 20 to 34 years of age, and steeply increasing rates of morbidity were observed for women aged 40 and older. One notable exception was the rate of sepsis, which was increased in teen mothers compared with all other groups.

The unadjusted rate of the composite outcome of severe maternal morbidity and mortality was 2.1% in teenagers, 1.5% among women 25 to 29 years, 2.3% among those aged 40 to 44, and 3.6% among women aged 45 and older.
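The jump from 1.5% in the referent group to 3.6% in the oldest group corresponds to an unadjusted odds ratio of roughly 2.5. A minimal sketch of that arithmetic, reconstructed from the percentages reported above (illustrative only; the study's adjusted odds ratios came from logistic regression on patient-level data, which these summary figures cannot reproduce):

```python
# Unadjusted odds ratios for the composite outcome, reconstructed from the
# percentages reported in the text (not from patient-level data).
rates = {"15-19": 0.021, "25-29 (referent)": 0.015, "40-44": 0.023, ">=45": 0.036}

def odds(p):
    """Convert a proportion to odds."""
    return p / (1 - p)

referent = odds(rates["25-29 (referent)"])
unadjusted_or = {group: round(odds(p) / referent, 2) for group, p in rates.items()}
print(unadjusted_or)
```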

Although rates were somewhat attenuated after adjustment for demographic and prepregnancy characteristics, chronic medical conditions, assisted conception, and delivery characteristics, most morbidities remained significantly increased among women aged 39 years and older, including the composite outcome. Among the individual morbidities considered, increased risk was highest for renal failure, amniotic fluid embolism, cardiac morbidity, and shock, with adjusted odds ratios of 2.0 or greater for women older than 39 years.



Study strengths and weaknesses

This study contributes substantially to the existing literature demonstrating higher rates of pregnancy-associated morbidity with increasing maternal age.1,2 Prior studies in this area focused on perinatal morbidity and mortality and on obstetric outcomes such as cesarean delivery.3–5 This large-scale study examined the association between advancing maternal age and a variety of serious maternal morbidities. In an earlier study, Callaghan and Berg found a similar pattern in mortality, with high rates of death attributable to hemorrhage, embolism, and cardiomyopathy in women aged 40 years and older.1

Exclusion of multiple gestations. As in any study, we must consider the methodology, and it is notable that Lisonkova and colleagues’ study excluded multiple gestations. Given the associations between advanced maternal age, assisted reproductive technology, and multiple gestation, a high rate of multiple gestations would be expected among women of advanced maternal age. (Generally, maternal age of at least 35 years is considered “advanced,” with greater than 40 years “very advanced.”) Since multiple gestations tend to be associated with increased morbidity, excluding these pregnancies would likely bias the study results toward the null. If multiple gestations had been included, the rates of serious maternal morbidities in older women might be even higher than those demonstrated, potentially strengthening the associations reported here.

WHAT THIS EVIDENCE MEANS FOR PRACTICE

This large, retrospective study (level II evidence) suggests that women of advancing age are at significantly increased risk of severe maternal morbidities, even after controlling for preexisting medical conditions. We therefore recommend that clinicians inform and counsel women who are considering pregnancy at an advanced age, and those considering oocyte cryopreservation as a means of extending their reproductive life span, about the increased maternal morbidities associated with pregnancy at age 40 and older. 

-- Amy E. Judy, MD, MPH, and Yasser Y. El-Sayed, MD

 


References
  1. Callaghan WM, Berg CJ. Pregnancy-related mortality among women aged 35 years and older, United States, 1991–1997. Obstet Gynecol. 2003;102(5 pt 1):1015–1021.
  2. McCall SJ, Nair M, Knight M. Factors associated with maternal mortality at advanced maternal age: a population-based case-control study. BJOG. 2017;124(8):1225–1233.
  3. Yogev Y, Melamed N, Bardin R, Tenenbaum-Gavish K, Ben-Shitrit G, Ben-Haroush A. Pregnancy outcome at extremely advanced maternal age. Am J Obstet Gynecol. 2010;203(6):558.e1–e7.
  4. Gilbert WM, Nesbitt TS, Danielsen B. Childbearing beyond age 40: pregnancy outcome in 24,032 cases. Obstet Gynecol. 1999;93(1):9–14.
  5. Luke B, Brown MB. Elevated risks of pregnancy complications and adverse outcomes with increasing maternal age. Hum Reprod. 2007;22(5):1264–1272.
Issue
OBG Management - 29(8)
Display Headline
Are women of advanced maternal age at increased risk for severe maternal morbidity?

Clinical Outcomes After Conversion from Low-Molecular-Weight Heparin to Unfractionated Heparin for Venous Thromboembolism Prophylaxis


From the Anne Arundel Health System Research Institute, Annapolis, MD.

 

Abstract

  • Objective: To measure clinical outcomes associated with heparin-induced thrombocytopenia (HIT) and acquisition costs of heparin after implementing a new order set promoting unfractionated heparin (UFH) use instead of low-molecular-weight heparin (LMWH) for venous thromboembolism (VTE) prophylaxis.
  • Methods: This was a single-center, retrospective, pre-post intervention analysis utilizing pharmacy, laboratory, and clinical data sources. Subjects were patients receiving VTE thromboprophylaxis with heparin at an acute care hospital. Usage rates for UFH and LMWH, acquisition costs for heparins, number of HIT assays, best practice advisories for HIT, and confirmed cases of HIT and HIT with thrombosis were assessed.
  • Results: After the order set intervention, UFH use increased from 43% of all prophylaxis orders to 86%. Net annual savings in acquisition costs for VTE prophylaxis were $131,000. After the intervention, HIT best practice advisories and the number of monthly HIT assays fell 35% and 15%, respectively. In the 9-month pre-intervention period, HIT and HITT occurred in zero of 6717 patients receiving VTE prophylaxis. In the 25 months of post-intervention follow-up, HIT occurred in 3 of 44,240 patients receiving VTE prophylaxis (P = 0.86), 2 of whom had HITT, all after receiving UFH. The median duration of UFH and LMWH use was 3.0 and 3.5 days, respectively.
  • Conclusion: UFH use in hospitals can be safely maintained or increased among patient subpopulations that are not at high risk for HIT. A more nuanced approach to prophylaxis, taking into account individual patient risk and expected duration of therapy, may provide desired cost savings without provoking HIT.

Key words: heparin; heparin-induced thrombocytopenia; venous thromboembolism prophylaxis; cost-effectiveness.

 

Heparin-induced thrombocytopenia (HIT) and its more severe clinical complication, HIT with thrombosis (HITT), complicate the use of heparin products for venous thromboembolic (VTE) prophylaxis. The clinical characteristics and time course of thrombocytopenia in relation to heparin are well characterized (typically a 30%–50% drop in platelet count 5–10 days after exposure), if not absolute. Risk calculation tools help to judge the clinical probability and guide ordering of appropriate confirmatory tests [1]. The incidence of HIT is higher with unfractionated heparin (UFH) than with low-molecular-weight heparin (LMWH). A meta-analysis of 5 randomized or prospective nonrandomized trials indicated a risk of 2.6% (95% CI, 1.5%–3.8%) for UFH and 0.2% (95% CI, 0.1%–0.4%) for LMWH [2], though the analysis was heavily weighted by studies of orthopedic surgery patients, a high-risk group. However, not all patients are at equal risk for HIT, suggesting that LMWH may not be necessary for all patients [3]. Unfortunately, LMWH is considerably more expensive for hospitals to purchase than UFH, raising costs for a prophylactic treatment that is widely utilized. On the other hand, the higher incidence of HIT and HITT associated with UFH can erode any cost savings because of the additional cost of diagnosing HIT and the need for temporary or long-term treatment with even more expensive alternative anticoagulants. Indeed, a recent retrospective study suggested that the excess costs of evaluating and treating HIT were approximately $267,000 per year in Canadian dollars [4]. But contrary data have also been reported. A retrospective study of the consequences of increased prophylactic UFH use found no increase in ordered HIT assays, in the results of HIT testing, or in inferred positive cases despite 71% growth in the number of patients receiving UFH prophylaxis [5].

In 2013, the pharmacy and therapeutics (P&T) committee decided to encourage the use of UFH over LMWH for VTE prophylaxis by changing order sets to favor UFH over LMWH (enoxaparin). Given the uncertainty about the excess risk of HIT, a monitoring work group was created to assess for any increase in either HIT or HITT that might follow, including in any patient readmitted with thrombosis within 30 days of discharge. In this paper, we report the impact of a hospital-wide conversion to UFH for VTE prophylaxis on the incidence of VTE, HIT, and HITT; on the acquisition costs of UFH and LMWH; and on the use of alternative prophylactic anticoagulant medications.

Methods

Setting

Anne Arundel Medical Center is a 383-bed acute care hospital with about 30,000 adult admissions and 10,000 inpatient surgeries annually. The average length of stay is approximately 3.6 days, with a median patient age of 59 years. Caucasians comprise 75.3% of the admitted population and African Americans 21.4%. Most patients are on Medicare (59%), while 29.5% have private insurance, 6.6% are on Medicaid, and 4.7% self-pay. The 9 most common principal medical diagnoses are sepsis, heart failure, chronic obstructive pulmonary disease, pneumonia, myocardial infarction, ischemic stroke, urinary tract infection, cardiac arrhythmia, and other infection. The 6 most common procedures are newborn delivery (with and without cesarean section), joint replacement surgery, bariatric procedures, cardiac catheterization, other abdominal surgeries, and thoracotomy. The predominant care model is internal medicine physician and physician assistant acute care hospitalists attending both medicine and surgical patients. Obstetrical hospitalists care for admitted obstetric patients. Patients admitted to the intensive care units were attended only by critical care-trained physician specialists. No trainees cared for the patients described in this study.

P&T Committee

The P&T committee is a multidisciplinary group of health care professionals selected for appointment by the chairs of the committee (the chair of medicine and the director of pharmacy) and approved by the president of the medical staff. The committee has oversight responsibility for all medication policies and order sets involving medications, as well as for monitoring clinical outcomes related to medications.

Electronic Medical Record and Best Practice Advisory

Throughout the study period, both pre- and post-intervention, the electronic medical record (EMR) in use was Epic (Verona, WI), which was used for all ordering and laboratory results. A best practice advisory (BPA) in the EMR alerted providers to all cases of thrombocytopenia < 100,000/mm3 when there was a concurrent order for any heparin. The advisory highlighted the thrombocytopenia, advised providers to consider HIT as a diagnosis and to order confirmatory tests if clinically appropriate, and provided a direct link to the HIT assay order screen. The advisory did not access information from prior admissions where heparin might have been used, nor did it determine the percentage drop from the baseline platelet count.

HIT Case Definition and Assays

The 2 laboratory tests for HIT on which this study is based are the heparin-induced platelet antibody test (also known as anti-PF4) and the serotonin release assay (SRA). The heparin-induced platelet antibody test is an enzyme-linked immunosorbent assay (ELISA) that detects IgG, IgM, and IgA antibodies against the platelet factor 4/heparin complex. This test was reported as positive if the optical density was 0.4 or higher, and a positive result generated an automatic request for an SRA, a functional assay that measures heparin-dependent platelet activation. The decision to order the SRA was therefore a “reflex” test and was not made with any knowledge of the clinical characteristics of the case. The HIT assays were performed by a reference laboratory, Quest Diagnostics, in its Chantilly, VA facility. HIT was considered present when both a characteristic pattern of thrombocytopenia after heparin use was seen [1] and the confirmatory SRA was positive at a level of > 20% release.
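The reflex-testing rule described above can be sketched as follows (a simplified illustration; the function names and inputs are hypothetical, but the thresholds are those stated in the text: an ELISA optical density of 0.4 or higher triggers the SRA, and HIT is confirmed only with a characteristic clinical pattern plus > 20% release):

```python
# Sketch of the reflex-testing rule described in the text; names are
# hypothetical, thresholds are those stated above.
ELISA_POSITIVE_OD = 0.4    # anti-PF4 optical density cutoff
SRA_POSITIVE_RELEASE = 20  # percent serotonin release cutoff

def reflex_sra_triggered(elisa_od):
    """A positive anti-PF4 ELISA automatically generates an SRA request."""
    return elisa_od >= ELISA_POSITIVE_OD

def hit_confirmed(elisa_od, sra_release_pct, clinical_pattern):
    """HIT was diagnosed only when the characteristic pattern of
    thrombocytopenia after heparin exposure was present AND the
    confirmatory SRA exceeded 20% release."""
    return (clinical_pattern
            and reflex_sra_triggered(elisa_od)
            and sra_release_pct > SRA_POSITIVE_RELEASE)
```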

 

 

Order Set Modifications

After the P&T committee decision to emphasize UFH for VTE prophylaxis in October 2013, the relevant electronic order sets were altered to highlight the fact that UFH was the first choice for VTE prophylaxis. The order sets still allowed LMWH (enoxaparin) or alternative anticoagulants at the prescribers’ discretion but indicated they were a second choice. Doses of UFH and LMWH in the order sets were standard based upon weight and estimates of creatinine clearance and, in the case of dosing frequency for UFH, based upon the risk of VTE. Order sets for the therapeutic treatment of VTE were not changed.

Data Collection and Analysis

The clinical research committee, the local oversight board for research and performance improvement analyses, reviewed this project and determined that it qualified as a performance improvement analysis based upon the standards of the U.S. Office for Human Research Protections. Some data were extracted from patient medical records and stored in a customized, password-protected database. Access to the database was limited to members of the analysis team, and the data were stripped of all patient identifiers under the HIPAA privacy rule standard for de-identification (45 CFR 164.514(b)) immediately following the collection of all data elements from the medical record.

An internal pharmacy database was used to determine the volume and actual acquisition cost of prophylactic anticoagulant doses administered during both the pre- and post-intervention periods. To determine whether clinical suspicion for HIT increased after the intervention, a definitive listing of all ordered HIT assays was obtained from laboratory billing records for the 9 months before the conversion (January 2013–September 2013) and for 25 months after the intervention (beginning in November 2013 so as not to include the conversion month). To determine whether the HIT assays were associated with a higher risk score, we identified all cases in which a HIT assay was ordered and retroactively calculated the pretest probability score known as the 4T score [1]. Simultaneously, separate clinical work groups reviewed all cases of hospital-acquired thrombosis, whatever their cause, including patients readmitted with thrombosis up to 30 days after discharge, and all episodes of bleeding due to anticoagulant use. A chi-square analysis of the incidence of HIT pre- and post-intervention was performed.
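For the HIT counts reported below in Results (0 of 6717 pre-intervention vs. 3 of 44,240 post-intervention), a 2 × 2 chi-square can be computed by hand; this sketch uses only the standard library, and since the published P value (.86) may reflect a corrected or slightly different test, it should be read as illustrative:

```python
import math

# Hand-rolled Pearson chi-square (df = 1) for a 2x2 table [[a, b], [c, d]].
# Counts below are the HIT cases vs. non-cases reported in Results; this is
# an illustrative recalculation, not the study's own analysis script.
def chi_square_2x2(a, b, c, d):
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # For 1 degree of freedom, the upper-tail probability is erfc(sqrt(chi2/2)).
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

chi2, p = chi_square_2x2(0, 6717, 3, 44240 - 3)
print(f"chi2 = {chi2:.3f}, p = {p:.2f}")  # well short of the 3.84 significance cutoff
```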

Results

Heparin Use and Acquisition Costs

[Figure 1]
Annual hospital admissions and patient-days both increased 3% during the study period, to 29,975 annual adult admissions and 96,426 hospital days. However, the number of patients receiving any prophylactic anticoagulant increased 15% over the period as a result of a simultaneous anti-VTE education campaign. After the modification of the order sets, there was a rapid increase in orders for prophylactic UFH and a corresponding decrease in the use of LMWH (Figure 1). A few patients received both at different times during their hospitalization. Prophylactic UFH orders increased from 43% of all prophylaxis orders pre-intervention to 86% by the second year of the intervention. There was a corresponding reduction in LMWH use, from 53% of all prophylaxis orders to 11% after the intervention. Other anticoagulants, fondaparinux and bivalirudin, together composed 5% and 4% of all prophylaxis orders in the pre- and post-intervention periods, respectively. Pharmacy acquisition costs averaged $3.50 for daily doses of heparin and $24 for LMWH. With the shift in ordering pattern, acquisition costs of prophylactic LMWH fell by $165,000 annually, with a concurrent offsetting increase in acquisition costs of UFH of $34,000, for a net savings of $131,000. The median duration of use was 3.5 days for UFH and 3.0 days for LMWH. Only 25% of treated patients had heparin exposure of > 5 days. Bleeding complications from any prophylaxis regimen were < 0.5% in both the pre- and post-intervention periods.
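The net-savings arithmetic in the paragraph above is simple enough to restate explicitly (figures taken directly from the text):

```python
# Restating the acquisition-cost arithmetic from the text: net savings is the
# fall in LMWH spending minus the offsetting rise in UFH spending.
lmwh_annual_savings = 165_000  # annual drop in LMWH acquisition cost, $
ufh_annual_increase = 34_000   # offsetting rise in UFH acquisition cost, $
net_annual_savings = lmwh_annual_savings - ufh_annual_increase

# The per-patient-day gap driving the saving ($24 vs. $3.50 daily doses):
daily_gap = 24.00 - 3.50

print(f"net annual savings: ${net_annual_savings:,}; daily gap: ${daily_gap:.2f}")
```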

HIT Assays and Incidence of HIT and HITT

[Figure 2]
HIT assay results were returned to the chart within a median of 4 days for anti-platelet factor 4 and 5 days for SRA. Data on the number of HIT assays ordered were missing for 1 month in 2013 and 1 month in 2014 due to missing laboratory billing data. On average, the best practice advisory for HIT fired 35% less frequently after the intervention (Figure 2). The number of ordered serum ELISA HIT assays decreased slightly after the intervention, from a mean of 18.2 per month to 15.4 per month, despite the increase in UFH use. The distribution of the retrospectively measured 4T scores for patients with HIT testing is shown in the Table. There were no differences in the retrospectively calculated scores between the pre- and post-intervention periods. Most patients had a moderate-risk 4T score prior to having a HIT assay ordered.
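For readers unfamiliar with the 4T score: each of the four items (magnitude of thrombocytopenia, timing, thrombosis, and other causes of thrombocytopenia) is scored 0 to 2, and the standard cutoffs from Lo et al [1] map the total to a pretest-probability category. A minimal sketch of that mapping (the cutoffs are from the published score, not from this article's data):

```python
# Standard 4T pretest-probability categories (cutoffs from Lo et al [1]):
# each item scores 0-2, so totals run 0-8.
def four_t_category(thrombocytopenia, timing, thrombosis, other_causes):
    items = (thrombocytopenia, timing, thrombosis, other_causes)
    if any(item not in (0, 1, 2) for item in items):
        raise ValueError("each 4T item scores 0, 1, or 2")
    total = sum(items)
    if total <= 3:
        return "low"
    if total <= 5:
        return "intermediate"
    return "high"

print(four_t_category(2, 2, 0, 1))  # total of 5 -> "intermediate"
```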

In the 9 months pre-intervention, HIT and HITT occurred in zero of 6717 patients receiving at least 1 dose of VTE prophylaxis. In the 25 months of post-intervention follow-up, 44,240 patients received prophylaxis with either heparin product. HIT (clinical suspicion with a positive antibody test and confirmatory SRA) occurred in 3 patients, 2 of whom had HITT, all after UFH. This difference was not statistically significant by chi-square analysis (P = 0.86).

 

 

Discussion

Because the efficacy of UFH and LMWH for VTE prophylaxis is equivalent [6], choosing between them involves many factors, including patient-level risk factors such as renal function and risk of bleeding, as well as other considerations such as nursing time, patient preference, risk of HIT, and acquisition cost. Indeed, the most recent version of the American College of Chest Physicians guidelines for prophylaxis against VTE notes that both drugs are recommended with an evidence grade of 1B [7]. Cost is among the considerations deemed appropriate in choosing among agents. The difference in acquisition costs of > $20 per patient per day can have a major financial impact on a hospital’s pharmacy budget and may be decisive. But a focus only on acquisition cost is shortsighted, as the 2 medications have different complication rates with regard to HIT. Thus the need to track HIT incidence after protocol changes are made is paramount.

In our study, we did not measure thrombocytopenia as an endpoint because acquired thrombocytopenia is too common and multifactorial to be meaningful. Rather, we used the clinical suspicion for HIT, as measured by both the number of times the BPA fired warnings of low platelets in the setting of recent heparin use and the number of times clinicians suspected HIT enough to order a HIT assay. We also used actual outcomes (clinically adjudicated cases of HIT and HITT). Our data show substantial compliance among clinicians with the voluntary conversion, with an immediate and sustained shift such that UFH was used in 86% of patients. Corresponding cost savings were achieved in heparin acquisition. Unlike some prior reports, there was a minimal burden of HIT, as measured by the absence of any increase in BPA firings or monthly HIT assays and by the unchanged clinical risk (4T) scores among patients in whom the test was ordered pre- and post-intervention. HIT rates were not statistically different after the order set conversion took effect.

Our results and study design are similar, but not identical, to those of Zhou et al, who found that a campaign to increase VTE prophylaxis resulted in a 71% increase in UFH use over 5 years but no increase in the number of HIT assays ordered or in the distribution of HIT assay results, both surrogate endpoints [5]. But not all analyses of heparin order interventions show similar results. A recent study of a heparin avoidance program in a Canadian tertiary care hospital showed reductions of 79% and 91% in adjudicated cases of HIT and HITT, respectively [4]. Moreover, hospital-related expenditures for HIT decreased by nearly $267,000 (Canadian dollars) per year, though the additional acquisition costs of LMWH were not stated. A small retrospective heparin avoidance protocol among orthopedic surgery patients showed a reduction in HIT incidence from 5.2% with UFH to 0% after universal substitution of LMWH for UFH [8]. A recent systematic review identified only 3 prospective studies, involving over 1398 postoperative surgical patients, that measured HIT and HITT as outcomes [9]. The review authors, in a pooled analysis, found a lower incidence of HIT and HITT with LMWH postoperatively but downgraded the evidence to “low quality” due to methodologic issues and concerns over bias. A nested case-control study of adult medical patients found that HIT was 6 times more common with UFH than with LMWH and that the cost of admissions associated with HIT was 3.5 times higher than for those without HIT, though this increase in costs is not necessarily due to the HIT diagnosis itself but may be a marker of patients with more severe illness [10]. The duration of heparin therapy was not stated.

There are several potential reasons that our data differ from some of the previous reports described above. We used a strict definition of HIT, requiring the serotonin release assay to be positive in the appropriate clinical setting, and did not rely solely upon antibody tests to make the diagnosis, a less rigorous standard found in some studies. Furthermore, our results may differ from previous reports because of differences in patient risk and duration of therapy. Our institution does not perform cardiac surgery, and the very large orthopedic surgery programs do not generally use heparin. Another potentially important difference from prior studies is that many of the patients treated at this institution did not receive heparin long enough to be considered at risk; only a quarter were treated for longer than 5 days, generally considered a minimum [11]. This is less than half the duration of therapy of the patients in the studies included in the meta-analysis of HIT incidence [2].

 

 

We do not contend that UFH is as safe as LMWH with regard to HIT for all populations, but rather that the increased risk is not manifest in all patient populations and settings, and so the increased cost may not be justified in low-risk patients. Indeed, while variability in HIT risk among patients is well documented [3,12], the guidelines for prophylaxis do not generally take this into account when recommending particular VTE prophylaxis strategies. Clinical practice guidelines do, however, recommend different degrees of platelet count monitoring based on the risk of HIT.

Our study has limitations, chief of which is the retrospective nature of the analysis; however, the methodology we used was similar to that of previous publications [4,5,8]. We may have missed some cases of HIT if a clinician did not order the assay in all appropriate patients, but there is no reason to think that likelihood was any different pre- and post-intervention. In addition, though we reviewed every case of hospital-acquired thrombosis, it is possible that the clinical reviewers missed cases of HITT, especially if the thrombosis occurred before a substantial drop in the platelet count, which is rare but possible. Here too, the chance of missing actual cases did not change between the pre- and post-intervention periods. Our study examined prophylactic, not therapeutic, heparin use. Finally, while noting the acquisition cost reduction achieved with conversion to UFH, we were not able to calculate any excess expense attributable to the rare cases of HIT and HITT that occurred. We believe our results are generalizable to hospitals with similar patient profiles.

The idea that patients with different risk factors might do well with different prophylaxis strategies needs to be better appreciated. Such information could be used as a guide to more individualized prophylaxis strategy aided by clinical decision support embedded within the EMR. In this way the benefit of LMWH in avoiding HIT could be reserved for those patients at greatest risk of HIT while simultaneously allowing hospitals not to overspend for prophylaxis in patients who will not benefit from LMWH. Such a strategy would need to be tested prospectively before widespread adoption.

As a result of our internal analysis, we have altered our EMR-based best practice advisory to conform to the 2013 American Society of Hematology guidelines [15], which are more informative than our original BPA. Specifically, the old advisory warned only if the platelet count was < 100,000/mm3 in association with heparin. The revised advisory notifies providers of a > 30% fall regardless of the absolute count and informs prescribers of the 4T score to encourage more optimal use of the HIT assay, avoiding its use for low risk scores and encouraging its use for moderate to high risk scores. We are also strengthening the emphasis that moderate to high risk 4T patients receive alternative anticoagulation until results of the HIT assay are available, as we found this not to be a universal practice. We recommend similar self-inspection to other institutions.
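The change in alert logic described above amounts to replacing an absolute-count trigger with a relative-fall trigger; a minimal sketch, with hypothetical function names:

```python
# Sketch of the old vs. revised best-practice-advisory triggers described in
# the text; function names and inputs are hypothetical.
def old_bpa_fires(platelet_count, on_heparin):
    """Old rule: alert only on an absolute count < 100,000/mm3 with heparin."""
    return on_heparin and platelet_count < 100_000

def revised_bpa_fires(baseline_count, current_count, on_heparin):
    """Revised rule: alert on a > 30% fall from baseline, regardless of the
    absolute count, when heparin is being given."""
    if not on_heparin or baseline_count <= 0:
        return False
    percent_fall = (baseline_count - current_count) / baseline_count * 100
    return percent_fall > 30

# A 40% fall from 350,000 to 210,000 is missed by the old rule but caught
# by the revised one.
print(old_bpa_fires(210_000, True), revised_bpa_fires(350_000, 210_000, True))
```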

Corresponding author: Barry R. Meisenberg, MD, Anne Arundel Medical Center, 2001 Medical Parkway, Annapolis, MD 21401, [email protected].

Financial disclosures: None.

Author contributions: conception and design, JR, BRM; analysis and interpretation of data, KW, JR, BRM; drafting of article, JR, BRM; critical revision of the article, KW, JR, BRM; statistical expertise, KW, JR; administrative or technical support, JR; collection and assembly of data, KW, JR.

References

1. Lo GK, Juhl D, Warkentin TE, et al. Evaluation of pretest clinical score (4T's) for the diagnosis of heparin-induced thrombocytopenia in two clinical settings. J Thromb Haemost 2006;4:759–65.

2. Martel N, Lee J, Wells PS. Risk for heparin-induced thrombocytopenia with unfractionated and low-molecular-weight heparin thromboprophylaxis: a meta-analysis. Blood 2005;106:2710–5.

3. Warkentin TE, Sheppard JI, Horsewood P, et al. Impact of the patient population on the risk for heparin-induced thrombocytopenia. Blood 2000;96:1703–8.

4. McGowan KE, Makari J, Diamantouros A, et al. Reducing the hospital burden of heparin-induced thrombocytopenia: impact of an avoid-heparin program. Blood 2016;127:1954–9.

5. Zhou A, Winkler A, Emamifar A, et al. Is the incidence of heparin-induced thrombocytopenia affected by the increased use of heparin for VTE prophylaxis? Chest 2012;142:1175–8.

6. Mismetti P, Laporte-Simitsidis S, Tardy B, et al. Prevention of venous thromboembolism in internal medicine with unfractionated or low-molecular-weight heparins: a meta-analysis of randomised clinical trials. Thromb Haemost 2000;83:14–19.

7. Guyatt GH, Akl EA, Crowther M, et al; for the American College of Chest Physicians Antithrombotic Therapy and Prevention of Thrombosis Panel. Antithrombotic therapy and prevention of thrombosis, 9th ed: American College of Chest Physicians evidence-based clinical practice guidelines. Chest 2012;141(2 Suppl):7S–47S.

8. Greinacher A, Eichler P, Lietz T, Warkentin TE. Replacement of unfractionated heparin by low-molecular-weight heparin for postorthopedic surgery antithrombotic prophylaxis lowers the overall risk of symptomatic thrombosis because of a lower frequency of heparin-induced thrombocytopenia. Blood 2005;106:2921–2.

9. Junqueira DRG, Zorzela LM, Perini E. Unfractionated heparin versus low molecular weight heparin for avoiding heparin-induced thrombocytopenia in postoperative patients. Cochrane Database Syst Rev 2017;4:CD007557.

10. Creekmore FM, Oderda GM, Pendleton RC, Brixner DI. Incidence and economic implications of heparin-induced thrombocytopenia in medical patients receiving prophylaxis for venous thromboembolism. Pharmacotherapy 2006;26:1348–445.

11. Warkentin TE, Kelton JG. Temporal aspects of heparin-induced thrombocytopenia. N Engl J Med 2001;344:1286–92.

12. Warkentin TE, Sheppard JA, Sigouin CS, et al. Gender imbalance and risk factor interactions in heparin-induced thrombocytopenia. Blood 2006;108:2937–41.

13. Camden R, Ludwig S. Prophylaxis against venous thromboembolism in hospitalized medically ill patients: update and practical approach. Am J Health Syst Pharm 2012;71:909–17.

14. Linkins LA, Dans AL, Moores LK, et al. Treatment and prevention of heparin-induced thrombocytopenia: antithrombotic therapy and prevention of thrombosis, 9th ed: American College of Chest Physicians evidence-based clinical practice guidelines. Chest 2012;141(2 Suppl):e495S–e530S.

15. Cuker A, Crowther MA. 2013 clinical practice guideline on the evaluation and management of adults with suspected heparin-induced thrombocytopenia. Accessed 19 May 2017 at www.hematology.org/search.aspx?q=heparin+induced+thrombocytopenia.

Journal of Clinical Outcomes Management - August 2017, Vol. 24, No 8

From the Anne Arundel Health System Research Institute, Annapolis, MD.

 

Abstract

  • Objective: To measure clinical outcomes associated with heparin-induced thrombocytopenia (HIT) and acquisition costs of heparin after implementing a new order set promoting unfractionated heparin (UFH) use instead of low-molecular-weight heparin (LMWH) for venous thromboembolism (VTE) prophylaxis.
  • Methods: This was a single-center, retrospective, pre-post intervention analysis utilizing pharmacy, laboratory, and clinical data sources. Subjects were patients receiving VTE thromboprophylaxis with heparin at an acute care hospital. Usage rates for UFH and LMWH, acquisition costs for heparins, number of HIT assays, best practice advisories for HIT, and confirmed cases of HIT and HIT with thrombosis (HITT) were assessed.
  • Results: After order set intervention, UFH use increased from 43% of all prophylaxis orders to 86%. Net annual savings in acquisition costs for VTE prophylaxis was $131,000. After the intervention, HIT best practice advisories and number of monthly HIT assays fell 35% and 15%, respectively. In the 9-month pre-intervention period, HIT and HITT occurred in zero of 6717 patients receiving VTE prophylaxis. In the 25 months of post-intervention follow-up, HIT occurred in 3 of 44,240 patients (P = 0.86) receiving VTE prophylaxis, 2 of whom had HITT, all after receiving UFH. The median duration of UFH and LMWH use was 3.0 and 3.5 days, respectively.
  • Conclusion: UFH use in hospitals can be safely maintained or increased among patient subpopulations that are not at high risk for HIT. A more nuanced approach to prophylaxis, taking into account individual patient risk and expected duration of therapy, may provide desired cost savings without provoking HIT.

Key words: heparin; heparin-induced thrombocytopenia; venous thromboembolism prophylaxis; cost-effectiveness.

 

Heparin-induced thrombocytopenia (HIT) and its more severe clinical complication, HIT with thrombosis (HITT), complicate the use of heparin products for venous thromboembolism (VTE) prophylaxis. The clinical characteristics and time course of thrombocytopenia in relation to heparin are well characterized (typically a 30%–50% drop in platelet count 5–10 days after exposure), if not absolute. Risk calculation tools help to judge the clinical probability and guide ordering of appropriate confirmatory tests [1]. The incidence of HIT is higher with unfractionated heparin (UFH) than with low-molecular-weight heparin (LMWH). A meta-analysis of 5 randomized or prospective nonrandomized trials indicated a risk of 2.6% (95% CI, 1.5%–3.8%) for UFH and 0.2% (95% CI, 0.1%–0.4%) for LMWH [2], though the analysis was heavily weighted by studies of orthopedic surgery patients, a high-risk group. However, not all patients are at equal risk for HIT, suggesting that LMWH may not be necessary for all patients [3]. Unfortunately, LMWH is considerably more expensive for hospitals to purchase than UFH, raising costs for a prophylactic treatment that is widely utilized. However, the higher incidence of HIT and HITT associated with UFH can erode any cost savings because of the additional cost of diagnosing HIT and the need for temporary or long-term treatment with even more expensive alternative anticoagulants. Indeed, a recent retrospective study suggested that the excess costs of evaluating and treating HIT were approximately $267,000 per year in Canadian dollars [4]. But contrary data have also been reported: a retrospective study of the consequences of increased prophylactic UFH use found no increase in ordered HIT assays, in the results of HIT testing, or in inferred positive cases, despite 71% growth in the number of patients receiving UFH prophylaxis [5].

In 2013, our pharmacy and therapeutics (P&T) committee decided to encourage the use of UFH over LMWH (enoxaparin) for VTE prophylaxis by making changes to the relevant order sets. Given the uncertainty about the excess risk of HIT, a monitoring work group was created to assess for any increase in either HIT or HITT that might follow, including any patient readmitted with thrombosis within 30 days of discharge. In this paper, we report the impact of a hospital-wide conversion to UFH for VTE prophylaxis on the incidence of VTE, HIT, and HITT; on the acquisition costs of UFH and LMWH; and on the use of alternative prophylactic anticoagulant medications.

Methods

Setting

Anne Arundel Medical Center is a 383-bed acute care hospital with about 30,000 adult admissions and 10,000 inpatient surgeries annually. The average length of stay is approximately 3.6 days, with a median patient age of 59 years. Caucasians comprise 75.3% of the admitted population and African Americans 21.4%. Most patients are on Medicare (59%), while 29.5% have private insurance, 6.6% are on Medicaid, and 4.7% self-pay. The 9 most common principal medical diagnoses are sepsis, heart failure, chronic obstructive pulmonary disease, pneumonia, myocardial infarction, ischemic stroke, urinary tract infection, cardiac arrhythmia, and other infection. The 6 most common procedures are newborn delivery (with and without caesarean section), joint replacement surgery, bariatric procedures, cardiac catheterization, other abdominal surgeries, and thoracotomy. The predominant medical care model is internal medicine and physician assistant acute care hospitalists attending both medicine and surgical patients. Obstetrical hospitalists care for admitted obstetric patients. Patients admitted to the intensive care units are attended only by critical care-trained physician specialists. No trainees cared for the patients described in this study.

P&T Committee

The P&T committee is a multidisciplinary group of health care professionals selected for appointment by the chairs of the committee (the chair of medicine and the director of pharmacy) and approved by the president of the medical staff. The committee has oversight responsibility for all medication policies and order sets involving medications, as well as for monitoring medication-related clinical outcomes.

Electronic Medical Record and Best Practice Advisory

Throughout the study period, both pre- and post-intervention, the EMR in use was Epic (Verona, WI), used for all ordering and laboratory results. A best practice advisory (BPA) was in place in the EMR that alerted providers to all cases of thrombocytopenia < 100,000/mm3 when there was a concurrent order for any heparin. The BPA highlighted the thrombocytopenia, advised providers to consider HIT as a diagnosis and to order confirmatory tests if clinically appropriate, and provided a direct link to the HIT assay order screen. The BPA did not access information from prior admissions where heparin might have been used, nor did it determine the percentage drop from the baseline platelet count.
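The advisory rule just described is simple enough to sketch in code. This is a hypothetical illustration, not the hospital's actual Epic logic; the function name and drug set are illustrative assumptions, while the < 100,000/mm3 cutoff and the concurrent-heparin condition come from the text:

```python
# Illustrative sketch (NOT the actual EMR rule) of the pre-intervention
# best practice advisory: fire when the platelet count is below
# 100,000/mm3 AND an order for any heparin product is active.

HEPARIN_PRODUCTS = {"unfractionated heparin", "enoxaparin", "dalteparin"}

def original_bpa_fires(platelet_count_per_mm3: int, active_orders: set) -> bool:
    """Return True if the pre-intervention HIT advisory should fire."""
    on_heparin = any(drug in HEPARIN_PRODUCTS for drug in active_orders)
    return platelet_count_per_mm3 < 100_000 and on_heparin
```

For example, a count of 85,000/mm3 with an active enoxaparin order would fire the advisory, while the same count with no heparin order would not. Note this sketch, like the original BPA, ignores the baseline count entirely.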

HIT Case Definition and Assays

The 2 laboratory tests for HIT on which this study is based are the heparin-induced platelet antibody test (also known as anti-PF4) and the serotonin release assay. The heparin-induced platelet antibody test is an enzyme-linked immunosorbent assay (ELISA) that detects IgG, IgM, and IgA antibodies against the platelet factor 4/heparin (PF4/heparin) complex. This test was reported as positive if the optical density was 0.4 or higher, and a positive result generated an automatic request for a serotonin release assay (SRA), a functional assay that measures heparin-dependent platelet activation. The SRA was therefore a "reflex" test, ordered without any knowledge of the clinical characteristics of the case. The HIT assays were performed by a reference lab, Quest Diagnostics, at its Chantilly, VA facility. HIT was considered present when both a characteristic pattern of thrombocytopenia occurring after heparin use was seen [1] and the confirmatory SRA was positive at a level of > 20% release.
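The two-step reflex sequence above can be sketched as follows. This is a minimal illustration; the classification helper is an assumption, while the OD ≥ 0.4 ELISA cutoff, the automatic SRA reflex, and the > 20% release criterion come from the text:

```python
# Sketch of the reflex testing sequence: anti-PF4 ELISA with optical
# density >= 0.4 triggers an automatic SRA; HIT is confirmed only when
# the SRA shows > 20% release in the appropriate clinical context.
from typing import Optional

def classify_hit_workup(elisa_od: float, sra_release_pct: Optional[float],
                        clinical_pattern_present: bool) -> str:
    if elisa_od < 0.4:
        return "ELISA negative; no SRA reflexed"
    if sra_release_pct is None:
        return "ELISA positive; SRA pending"
    if sra_release_pct > 20 and clinical_pattern_present:
        return "HIT confirmed"
    return "HIT not confirmed"
```

The design point is that the SRA order is mechanical (a lab-side reflex on the ELISA result), while the final case adjudication still requires the clinical pattern of thrombocytopenia.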


Order Set Modifications

After the P&T committee decision to emphasize UFH for VTE prophylaxis in October 2013, the relevant electronic order sets were altered to highlight UFH as the first choice for VTE prophylaxis. The order sets still allowed LMWH (enoxaparin) or alternative anticoagulants at the prescriber's discretion but indicated they were a second choice. Doses of UFH and LMWH in the order sets were standardized based upon weight and estimated creatinine clearance and, in the case of dosing frequency for UFH, upon the risk of VTE. Order sets for the therapeutic treatment of VTE were not changed.

Data Collection and Analysis

The clinical research committee, the local oversight board for research and performance improvement analyses, reviewed this project and determined that it qualified as a performance improvement analysis based upon the standards of the U.S. Office for Human Research Protections. Some data were extracted from patient medical records and stored in a customized, password-protected database. Access to the database was limited to members of the analysis team, and the data were stripped of all patient identifiers under the HIPAA privacy rule standard for de-identification (45 CFR 164.514(b)) immediately following collection of all data elements from the medical record.

An internal pharmacy database was used to determine the volume and actual acquisition cost of prophylactic anticoagulant doses administered during both the pre- and post-intervention time periods. To determine if clinical suspicion for HIT increased after the intervention, a definitive listing of all ordered HIT assays was obtained from laboratory billing records for the 9 months (January 2013–September 2013) before the conversion and for 25 months after the intervention (beginning in November 2013 so as not to include the conversion month). To determine if the HIT assays were associated with a higher risk score, we identified all cases in which the HIT assay was ordered and retroactively calculated the probability score known as the 4T score [1]. Simultaneously, separate clinical work groups reviewed all cases of hospital-acquired thrombosis, whatever their cause, including patients readmitted with thrombosis up to 30 days after discharge, as well as episodes of bleeding due to anticoagulant use. A chi-square analysis of the incidence of HIT pre- and post-intervention was performed.
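The 4T score used for the retrospective scoring is the published pretest-probability tool [1]: four components (Thrombocytopenia, Timing, Thrombosis, oTher causes), each scored 0–2, with totals of 0–3, 4–5, and 6–8 mapping to low, intermediate, and high pretest probability. A minimal sketch, in which the helper function itself is an illustrative assumption:

```python
# Sketch of 4T score categorization [1]: each component scored 0-2,
# total 0-8; <=3 low, 4-5 intermediate, 6-8 high pretest probability.

def four_t_category(thrombocytopenia: int, timing: int,
                    thrombosis: int, other_causes: int) -> str:
    for points in (thrombocytopenia, timing, thrombosis, other_causes):
        if points not in (0, 1, 2):
            raise ValueError("each 4T component is scored 0, 1, or 2")
    total = thrombocytopenia + timing + thrombosis + other_causes
    if total <= 3:
        return "low"
    if total <= 5:
        return "intermediate"
    return "high"
```

For example, component scores of 2, 2, 1, and 1 (total 6) give a high pretest probability, while 1, 0, 0, and 1 (total 2) give a low one.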

Results

Heparin Use and Acquisition Costs

Annual hospital admissions and patient-days both increased 3% during the study time period, to 29,975 annual adult admissions and 96,426 hospital days. However, the number of patients receiving any prophylactic anticoagulant increased 15% over the time period as a result of a simultaneous anti-VTE education campaign. After the modification of the order sets, there was a rapid increase in orders for prophylactic UFH and a corresponding decrease in the use of LMWH (Figure 1). A few patients received both at different times during their hospitalization. Prophylactic UFH orders increased from 43% of all prophylactic use pre-intervention to 86% by the second year of the intervention. There was a corresponding reduction in use of LMWH from 53% of all prophylaxis orders to 11% after the intervention. Other anticoagulants, fondaparinux and bivalirudin, together comprised 5% and 4% of all prophylactic anticoagulant use in the pre- and post-intervention time periods, respectively. Pharmacy acquisition costs averaged $3.50 for daily doses of heparin and $24 for LMWH. With the shift in ordering pattern, acquisition costs of prophylactic LMWH fell by $165,000 annually, with a concurrent offsetting increase in acquisition costs of UFH of $34,000, for a net savings of $131,000. The median duration of use was 3.0 days for UFH and 3.5 days for LMWH. Only 25% of treated patients had heparin exposure of > 5 days. Bleeding complications from any prophylaxis regimen were < 0.5% in both the pre- and post-intervention time periods.
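The cost figures above can be checked with simple arithmetic; all dollar amounts are taken from the text:

```python
# Back-of-envelope check of the reported acquisition costs: the
# per-patient-day difference between LMWH and UFH, and the net annual
# savings from the reported LMWH reduction and UFH increase.

UFH_DAILY_COST = 3.50    # reported average daily acquisition cost, UFH
LMWH_DAILY_COST = 24.00  # reported average daily acquisition cost, LMWH

per_patient_day_difference = LMWH_DAILY_COST - UFH_DAILY_COST  # $20.50/day

lmwh_annual_reduction = 165_000  # annual fall in LMWH acquisition costs
ufh_annual_increase = 34_000     # offsetting rise in UFH acquisition costs
net_annual_savings = lmwh_annual_reduction - ufh_annual_increase  # $131,000
```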

HIT Assays and Incidence of HIT and HITT

HIT assay results returned to the chart within a median of 4 days for anti-platelet factor 4 and 5 days for the SRA. Data on the number of HIT assays ordered were missing for 1 month in 2013 and 1 month in 2014 due to missing laboratory billing data. On average, the best practice advisory for HIT fired 35% less frequently after the intervention (Figure 2). The number of ordered serum ELISA HIT assays decreased slightly after the intervention, from a mean of 18.2 per month to 15.4 per month, despite the increase in UFH use. The distribution of the retrospectively measured 4T scores for patients with HIT testing is shown in the Table. There were no differences in the retrospectively calculated scores between the pre- and post-intervention time periods. Most patients had a moderate-risk 4T score prior to having a HIT assay ordered.

In the 9 months pre-intervention, HIT and HITT occurred in zero of 6717 patients receiving at least 1 dose of VTE prophylaxis. In the 25 months of post-intervention follow-up, 44,240 patients received prophylaxis with either heparin product. HIT (clinical suspicion with a positive antibody test and confirmatory SRA) occurred in 3 patients, 2 of whom had HITT, all after UFH. This difference was not statistically significant by chi-square analysis (P = 0.86).
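The incidence comparison can be sketched as a standard 2x2 chi-square using only the Python standard library (for 1 degree of freedom, the p-value equals erfc(sqrt(x/2))). Note that this uncorrected Pearson statistic will not exactly reproduce the published P = 0.86, which may reflect a continuity correction or other software defaults; the sketch only illustrates that the difference is far from significant:

```python
# Pearson chi-square (no continuity correction) for the 2x2 table
# [[HIT pre, no-HIT pre], [HIT post, no-HIT post]] built from the
# reported counts: 0/6717 pre-intervention vs 3/44,240 post-intervention.
from math import erfc, sqrt

def chi_square_2x2(a: int, b: int, c: int, d: int):
    """Return (statistic, p_value) for the table [[a, b], [c, d]]."""
    n = a + b + c + d
    stat = 0.0
    # each cell's expected count = row total * column total / grand total
    for obs, row, col in ((a, a + b, a + c), (b, a + b, b + d),
                          (c, c + d, a + c), (d, c + d, b + d)):
        expected = row * col / n
        stat += (obs - expected) ** 2 / expected
    p_value = erfc(sqrt(stat / 2))  # exact chi2 survival function for df = 1
    return stat, p_value

stat, p = chi_square_2x2(0, 6717, 3, 44_237)  # HIT / no-HIT, pre vs post
```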


Discussion

Because the efficacy of UFH and LMWH for VTE prophylaxis is equivalent [6], choosing between them involves many factors, including patient-level considerations such as renal function and risk of bleeding, as well as nursing time, patient preference, risk of HIT, and acquisition cost. Indeed, the most recent version of the American College of Chest Physicians guidelines for prophylaxis against VTE notes that both drugs are recommended with an evidence grade of 1B [7]. Cost is among the considerations deemed appropriate in choosing among agents. A difference in acquisition costs of > $20 per patient per day can have a major financial impact on a hospital's pharmacy budget and may be decisive. But a focus only on acquisition cost is shortsighted, as the 2 medications have different complication rates with regard to HIT. Thus the need to track HIT incidence after protocol changes are made is paramount.

In our study, we did not measure thrombocytopenia as an endpoint because acquired thrombocytopenia is too common and multifactorial to be meaningful. Rather, we used the clinical suspicion for HIT, as measured by both the number of times the BPA fired warnings of low platelets in the setting of recent heparin use and the number of times clinicians suspected HIT enough to order a HIT assay. We also used actual outcomes (clinically adjudicated cases of HIT and HITT). Our data show substantial compliance among clinicians with the voluntary conversion to UFH, with an immediate and sustained shift such that UFH was used in 86% of patients. Corresponding cost savings were achieved in heparin acquisition. Unlike some prior reports, there was a minimal burden of HIT, as measured by the reduced number of BPA firings and monthly HIT assays and by the unchanged distribution of clinical risk 4T scores among patients in whom the test was ordered pre- and post-intervention. HIT rates were not statistically different after the order set conversion took effect.

Our results and study design are similar, but not identical, to those of Zhou et al, who found that a campaign to increase VTE prophylaxis resulted in a 71% increase in UFH use over 5 years but no increase in the number of HIT assays ordered or in the distribution of HIT assay results, both surrogate endpoints [5]. But not all analyses of heparin order interventions show similar results. A recent study of a heparin avoidance program in a Canadian tertiary care hospital showed reductions of 79% and 91% in adjudicated cases of HIT and HITT, respectively [4]. Moreover, hospital-related expenditures for HIT decreased by nearly $267,000 (Canadian dollars) per year, though the additional acquisition costs of LMWH were not stated. A small retrospective heparin avoidance protocol among orthopedic surgery patients showed a reduction in HIT incidence from 5.2% with UFH to 0% with LMWH after universal substitution of LMWH for UFH [8]. A recent systematic review identified only 3 prospective studies, involving over 1398 postoperative surgical patients, that measured HIT and HITT as outcomes [9]. The review authors, in a pooled analysis, found a lower incidence of HIT and HITT with LMWH postoperatively but downgraded the evidence to "low quality" due to methodologic issues and concerns over bias. A nested case-control study of adult medical patients found that HIT was 6 times more common with UFH than with LMWH and that the cost of admissions associated with HIT was 3.5 times higher than for those without HIT, though this increase in costs is not necessarily due to the HIT diagnosis itself but may be a marker of more severe illness [10]. The duration of heparin therapy was not stated.

There are several potential reasons that our data differ from some of the previous reports described above. We used a strict definition of HIT, requiring the serotonin release assay to be positive in the appropriate clinical setting, and did not rely solely upon antibody tests to make the diagnosis, a less rigorous standard found in some studies. Furthermore, our results may differ from previous reports because of differences in patient risk and duration of therapy. Our institution does not perform cardiac surgery, and the very large orthopedic surgery programs do not generally use heparin. Another potentially important difference from prior studies is that many of the patients treated at this institution did not receive heparin long enough to be considered at risk; only a quarter were treated for longer than 5 days, generally considered a minimum [11]. This is less than half the duration of heparin exposure among patients in the studies included in the meta-analysis of HIT incidence [2].


We do not contend that UFH is as safe as LMWH with regard to HIT for all populations, but rather that the increased risk is not manifest in all patient populations and settings, and so the increased cost may not be justified in low-risk patients. Indeed, while variability in HIT risk among patients is well documented [3,12], the guidelines for prophylaxis do not generally take this into account when recommending particular VTE prophylaxis strategies. Clinical practice guidelines do, however, recommend different degrees of platelet count monitoring based on the risk of HIT.

Our study had limitations, chief of which is the retrospective nature of the analysis; however, the methodology we used was similar to that of previous publications [4,5,8]. We may have missed some cases of HIT if clinicians did not order the assay in all appropriate patients, but there is no reason to think that likelihood was any different pre- and post-intervention. In addition, though we reviewed every case of hospital-acquired thrombosis, it is possible that the clinical reviewers missed cases of HITT, especially if the thrombosis occurred before a substantial drop in the platelet count, which is rare but possible. Here too, the chance of missing actual cases did not change between the pre- and post-intervention periods. Our study examined prophylactic, not therapeutic, heparin use. Finally, while noting the acquisition cost reduction achieved with conversion to UFH, we were not able to calculate any excess expense attributable to the rare cases of HIT and HITT that did occur. We believe our results are generalizable to hospitals with similar patient profiles.

The idea that patients with different risk factors might do well with different prophylaxis strategies needs to be better appreciated. Such information could be used to guide a more individualized prophylaxis strategy aided by clinical decision support embedded within the EMR. In this way, the benefit of LMWH in avoiding HIT could be reserved for those patients at greatest risk of HIT, while hospitals avoid overspending on prophylaxis for patients who will not benefit from LMWH. Such a strategy would need to be tested prospectively before widespread adoption.

As a result of our internal analysis, we have altered our EMR-based best practice alert to conform to the 2013 American Society of Hematology guidelines [15], making it more informative than our original BPA. Specifically, the old BPA warned only if the platelet count was < 100,000/mm3 in association with heparin. The revised BPA fires if there is a > 30% fall in platelet count regardless of the absolute value, and it informs prescribers of the 4T score to encourage more optimal use of the HIT assay: avoiding it for low-risk scores and encouraging it for moderate- to high-risk scores. We are also strengthening the emphasis that patients with moderate- to high-risk 4T scores receive alternative anticoagulation until the results of the HIT assay are available, as we found this was not universal practice. We recommend similar self-inspection to other institutions.
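A hypothetical sketch of the revised advisory logic follows. The function and the message strings are illustrative assumptions; only the > 30% fall trigger and the 4T-based guidance come from the text:

```python
# Illustrative sketch (NOT the actual EMR rule) of the revised BPA:
# fire on a > 30% fall from the baseline platelet count regardless of
# the absolute value, then tailor the message to the 4T score so the
# HIT assay is discouraged for low-risk scores and encouraged (with
# alternative anticoagulation) for moderate-to-high scores.
from typing import Optional

def revised_bpa(baseline_platelets: int, current_platelets: int,
                on_heparin: bool, four_t_total: int) -> Optional[str]:
    if not on_heparin or baseline_platelets <= 0:
        return None
    fall = (baseline_platelets - current_platelets) / baseline_platelets
    if fall <= 0.30:
        return None
    if four_t_total <= 3:
        return "Platelet fall > 30%: 4T score low; HIT assay discouraged"
    return ("Platelet fall > 30%: 4T score moderate/high; order HIT assay "
            "and consider alternative anticoagulation pending results")
```

Unlike the original advisory, this rule fires on a fall from 200,000 to 120,000/mm3 (a 40% drop that never crosses the old 100,000 threshold) but stays silent for a 20% drop.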

Corresponding author: Barry R. Meisenberg, MD, Anne Arundel Medical Center, 2001 Medical Parkway, Annapolis, MD 21401, [email protected].

Financial disclosures: None.

Author contributions: conception and design, JR, BRM; analysis and interpretation of data, KW, JR, BRM; drafting of article, JR, BRM; critical revision of the article, KW, JR, BRM; statistical expertise, KW, JR; administrative or technical support, JR; collection and assembly of data, KW, JR.

From the Anne Arundel Health System Research Institute, Annapolis, MD.

 

Abstract

  • Objective: To measure clinical outcomes associated with heparin-induced thrombocytopenia (HIT) and acquisition costs of heparin after implementing a new order set promoting unfractionated heparin (UFH) use instead of low-molecular-weight heparin (LMWH) for venous thromboembolism (VTE) prophylaxis.
  • Methods: This was single-center, retrospective, pre-post intervention analysis utilizing pharmacy, laboratory, and clinical data sources. Subjects were patients receiving VTE thromboprophyalxis with heparin at an acute care hospital. Usage rates for UFH and LMWH, acquisition costs for heparins, number of HIT assays, best practice advisories for HIT, and confirmed cases of HIT and HIT with thrombosis were assessed.
  • Results: After order set intervention, UFH use increased from 43% of all prophylaxis orders to 86%. Net annual savings in acquisition costs for VTE prophylaxis was $131,000. After the intervention, HIT best practice advisories and number of monthly HIT assays fell 35% and 15%, respectively. In the 9-month pre-intervention period, HIT and HITT occurred in zero of 6717 patients receiving VTE prophylaxis. In the 25 months of post-intervention follow-up, HIT occurred in 3 of 44,240 patients (P = 0.86) receiving VTE prophylaxis, 2 of whom had HITT, all after receiving UFH. The median duration of UFH and LMWH use was 3.0 and 3.5 days, respectively.
  • Conclusion: UFH use in hospitals can be safely maintained or increased among patient subpopulations that are not at high risk for HIT. A more nuanced approach to prophylaxis, taking into account individual patient risk and expected duration of therapy, may provide desired cost savings without provoking HIT.

Key words: heparin; heparin-induced thrombocytopenia; venous thromboembolism prophylaxis; cost-effectiveness.

 

Heparin-induced thrombocytopenia (HIT) and its more severe clinical complication, HIT with thrombosis (HITT), complicate the use of heparin products for venous thromboembolic (VTE) prophylaxis. The clinical characteristics and time course of thrombocytopenia in relation to heparin are well characterized (typically 30%–50% drop in platelet count 5–10 days after exposure), if not absolute. Risk calculation tools help to judge the clinical probability and guide ordering of appropriate confirmatory tests [1]. The incidence of HIT is higher with unfractionated heparin (UFH) than with low-molecular-weight heparin (LMWH). A meta-analysis of 5 randomized or prospective nonrandomized trials indicated a risk of 2.6% (95% CI, 1.5%–3.8%) for UFH and 0.2% (95% CI, 0.1%–0.4%) for LMWH [2], though the analyzed studies were heavily weighted by studies of orthopedic surgery patients, a high-risk group. However, not all patients are at equal risk for HIT, suggesting that LMWH may not be necessary for all patients [3]. Unfortunately, LMWH is considerably more expensive for hospitals to purchase than UFH, raising costs for a prophylactic treatment that is widely utilized. However, the higher incidence of HIT and HITT associated with UFH can erode any cost savings because of the additional cost of diagnosing HIT and need for temporary or long-term treatment with even more expensive alternative anticoagulants. Indeed, a recent retrospective study suggested that the excess costs of evaluating and treating HIT were approximately $267,000 per year in Canadian dollars [4].But contrary data has also been reported. A retrospective study of the consequences of increased prophylactic UFH use found no increase in ordered HIT assays or in the results of HIT testing or of inferred positive cases despite a growth of 71% in the number of patients receiving UFH prophylaxis [5].

In 2013, the pharmacy and therapeutics committee made a decision to encourage the use of UFH over LMWH for VTE prophylaxis by making changes to order sets to favor UFH over LMWH (enoxaparin). Given the uncertainty about excess risk of HIT, a monitoring work group was created to assess for any increase of either HIT or HITT that might follow, including any patient readmitted with thrombosis within 30 days of a discharge. In this paper, we report the impact of a hospital-wide conversion to UFH for VTE prophylaxis on the incidence of VTE, HIT, and HITT and acquisition costs of UFH and LMWH and use of alternative prophylactic anticoagulant medications.

Methods

Setting

Anne Arundel Medical Center is a 383-bed acute care hospital with about 30,000 adult admissions and 10,000 inpatient surgeries annually. The average length of stay is approximately 3.6 days with a patient median age of 59 years. Caucasians comprise 75.3% of the admitted populations and African Americans 21.4%. Most patients are on Medicare (59%), while 29.5% have private insurance, 6.6% are on Medicaid, and 4.7% self-pay. The 9 most common medical principal diagnoses are sepsis, heart failure, chronic obstructive pulmonary disease, pneumonia, myocardial infarction, ischemic stroke, urinary tract infection, cardiac arrhythmia, and other infection. The 6 most common procedures include newborn delivery (with and without caesarean section), joint replacement surgery, bariatric procedures, cardiac catheterizations, other abdominal surgeries, and thoracotomy. The predominant medical care model is internal medicine and physician assistant acute care hospitalists attending both medicine and surgical patients. Obstetrical hospitalists care for admitted obstetric patients. Patients admitted to the intensive care units had only critical care trained physician specialists as attending physicians. No trainees cared for the patients described in this study.

P&T Committee

The P&T committee is a multidisciplinary group of health care professionals selected for appointment by the chairs of the committee (chair of medicine and director of pharmacy) and approved by the president of the medical staff. The committee has oversight responsibility for all medication policies, order sets involving medications, as well as the monitoring of clinical outcomes as they regard medications.

Electronic Medical Record and Best Practice Advisory

Throughout this study period both pre-and post-intervention, the EMR in use was Epic (Verona WI), used for all ordering and lab results. A best practice advisory was in place in the EMR that alerted providers to all cases of thrombocytopenia < 100,000/mm3 when there was concurrent order for any heparin. The best practice advisory highlighted the thrombocytopenia, advised the providers to consider HIT as a diagnosis and to order confirmation tests if clinically appropriate, providing a direct link to the HIT assay order screen. The best practice advisory did not access information from prior admissions where heparin might have been used nor determine the percentage drop from the baseline platelet count.

HIT Case Definition and Assays

The 2 laboratory tests for HIT on which this study is based are the heparin-induced platelet antibody test (also known as anti-PF4) and the serotonin release assay. The heparin-induced platelet antibody test is an enzyme-linked immunosorbent assay (ELISA) that detects IgG, IgM, and IgA antibodies against the platelet factor 4/heparin complex (PF4/heparin). This test was reported as positive if the optical density was 0.4 or higher and generated an automatic request for a serotonin release assay (SRA), a functional assay that measures heparin-dependent platelet activation. The decision to order the SRA was therefore a “reflex” test and not made with any knowledge of the clinical characteristics of the case. The HIT assays were performed by a reference lab, Quest Diagnostics, at its Chantilly, VA, facility. HIT was said to be present when both a characteristic pattern of thrombocytopenia occurring after heparin use was seen [1] and the confirmatory SRA was positive at a level of > 20% release.
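The two-step case definition (reflex ELISA-to-SRA ordering, then confirmation) reduces to two simple predicates. A minimal sketch of the thresholds described above; function names are hypothetical, not the reference lab's interface.

```python
def elisa_reflexes_to_sra(optical_density: float) -> bool:
    # A positive ELISA (optical density >= 0.4) automatically
    # generated an SRA request, with no clinical input.
    return optical_density >= 0.4

def hit_confirmed(characteristic_thrombocytopenia: bool,
                  sra_release_pct: float) -> bool:
    # HIT required both the characteristic post-heparin platelet
    # pattern and a positive SRA at > 20% release.
    return characteristic_thrombocytopenia and sra_release_pct > 20.0
```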

Order Set Modifications

After the P&T committee decision in October 2013 to emphasize UFH for VTE prophylaxis, the relevant electronic order sets were altered to highlight UFH as the first choice for VTE prophylaxis. The order sets still allowed LMWH (enoxaparin) or alternative anticoagulants at the prescriber's discretion but indicated that they were second choices. Doses of UFH and LMWH in the order sets were standard, based upon weight and estimated creatinine clearance and, in the case of UFH dosing frequency, upon the risk of VTE. Order sets for the therapeutic treatment of VTE were not changed.

Data Collection and Analysis

The clinical research committee, the local oversight board for research and performance improvement analyses, reviewed this project and determined that it qualified as a performance improvement analysis based upon the standards of the U.S. Office of Human Research Protections. Some data were extracted from patient medical records and stored in a customized and password-protected database. Access to the database was limited to members of the analysis team and stripped of all patient identifiers under the HIPAA privacy rule standard for de-identification from 45 CFR 164.514(b) immediately following the collection of all data elements from the medical record.

An internal pharmacy database was used to determine the volume and actual acquisition cost of prophylactic anticoagulant doses administered during both the pre- and post-intervention time periods. To determine if clinical suspicion for HIT increased after the intervention, a definitive listing of all ordered HIT assays was obtained from laboratory billing records for the 9 months (January 2013–September 2013) before the conversion and for the 25 months after the intervention (beginning in November 2013 so as not to include the conversion month). To determine if the HIT assays were associated with a higher risk score, we identified all cases in which the HIT assay was ordered and retroactively measured the probability score known as the 4T score [1]. Simultaneously, separate clinical work groups reviewed all cases of hospital-acquired thrombosis, whatever their cause, including patients readmitted with thrombosis up to 30 days after discharge and episodes of bleeding due to anticoagulant use. A chi-square analysis of the incidence of HIT pre- and post-intervention was performed.
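For readers unfamiliar with the 4T score [1]: each of four clinical features (degree of thrombocytopenia, timing of the platelet fall, thrombosis, and other causes of thrombocytopenia) is scored 0-2 and summed, and totals of 0-3, 4-5, and 6-8 are conventionally read as low, intermediate, and high pretest probability. A minimal sketch with hypothetical function names, not the study's actual scoring tool:

```python
def four_t_total(thrombocytopenia: int, timing: int,
                 thrombosis: int, other_causes: int) -> int:
    """Sum the four 4T components, each scored 0-2 per Lo et al. [1]."""
    for component in (thrombocytopenia, timing, thrombosis, other_causes):
        if component not in (0, 1, 2):
            raise ValueError("each 4T component is scored 0, 1, or 2")
    return thrombocytopenia + timing + thrombosis + other_causes

def four_t_risk(total: int) -> str:
    """Map a 4T total (0-8) to the conventional risk band."""
    if total <= 3:
        return "low"
    if total <= 5:
        return "intermediate"
    return "high"
```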

Results

Heparin Use and Acquisition Costs

[Figure 1]
Annual hospital admissions and patient-days both increased 3% during the study time period, to 29,975 annual adult admissions and 96,426 hospital days. However, the number of patients receiving any prophylactic anticoagulant increased 15% over the time period as a result of a simultaneous anti-VTE education campaign. After the modification in order sets, there was a rapid increase in orders for prophylactic UFH and a corresponding decrease in the use of LMWH (Figure 1). A few patients received both at different times during their hospitalization. Prophylactic UFH orders increased from 43% of all prophylactic use pre-intervention to 86% by the second year of the intervention. There was a corresponding reduction in use of LMWH from 53% of all prophylaxis orders to 11% after the intervention. Other anticoagulants, fondaparinux and bivalirudin, together constituted 5% and 4% of all prophylactic anticoagulant use in the pre- and post-intervention time periods, respectively. Pharmacy acquisition costs averaged $3.50 for daily doses of UFH and $24 for LMWH. With the shift in ordering pattern, acquisition costs of prophylactic LMWH fell by $165,000 annually, with a concurrent offsetting increase in acquisition costs of UFH of $34,000, for a net savings of $131,000. The median duration of use was 3.5 days for UFH and 3.0 days for LMWH. Only 25% of treated patients had heparin exposure of > 5 days. Bleeding complications from any prophylaxis regimen were < 0.5% in both pre- and post-intervention time periods.
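The net-savings figure follows directly from the two reported cost changes; a trivial check using the article's own numbers:

```python
# Back-of-envelope check of the reported net savings. The $165,000
# annual LMWH decrease and $34,000 UFH increase are the article's
# figures (daily acquisition costs: $3.50 UFH vs. $24 LMWH).
lmwh_cost_decrease = 165_000   # annual drop in LMWH acquisition cost
ufh_cost_increase = 34_000     # offsetting rise in UFH acquisition cost
net_savings = lmwh_cost_decrease - ufh_cost_increase
print(net_savings)  # 131000, matching the reported net savings
```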

HIT Assays and Incidence of HIT and HITT

[Figure 2]
HIT assay results returned to the chart within a median of 4 days for the anti-platelet factor 4 antibody and 5 days for the SRA. Data on the number of HIT assays ordered were missing for 1 month in 2013 and 1 month in 2014 due to missing laboratory billing data. On average, the best practice advisory for HIT fired 35% less frequently after the intervention (Figure 2). The number of ordered serum ELISA HIT assays decreased slightly after the intervention, from a mean of 18.2 per month to 15.4 per month, despite the increase in UFH use. The distribution of the retrospectively measured 4T scores for patients with HIT testing is shown in the Table.

[Table]

There were no differences in the retrospectively calculated scores between the pre- and post-intervention time periods. Most patients had a moderate-risk 4T score prior to having a HIT assay ordered.

In the 9 months pre-intervention, HIT and HITT occurred in zero of 6717 patients receiving at least 1 dose of VTE prophylaxis. In the 25 months of post-intervention follow-up, 44,240 patients received prophylaxis with either heparin preparation. HIT (clinical suspicion with positive antibody and confirmatory SRA) occurred in 3 patients, 2 of whom had HITT, all after UFH. This difference in incidence was not statistically significant by chi-square analysis (P = 0.86).
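For illustration, a 2x2 chi-square with Yates continuity correction (standard library only) applied to these counts confirms a nonsignificant difference. This is a sketch, not the authors' exact computation; with expected counts this small an exact test would usually be preferred, and the sketch does not attempt to reproduce the reported P = 0.86.

```python
import math

def chi_square_2x2(a: int, b: int, c: int, d: int,
                   yates: bool = True) -> tuple[float, float]:
    """Pearson chi-square for the 2x2 table [[a, b], [c, d]].

    Returns (statistic, p_value) with 1 degree of freedom.
    Illustrative only, not the authors' exact method.
    """
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, row_total, col_total in ((a, row1, col1), (b, row1, col2),
                                      (c, row2, col1), (d, row2, col2)):
        expected = row_total * col_total / n
        diff = abs(obs - expected)
        if yates:
            diff = max(diff - 0.5, 0.0)  # continuity correction
        stat += diff * diff / expected
    # Survival function of chi-square with 1 df: P = erfc(sqrt(x / 2))
    p_value = math.erfc(math.sqrt(stat / 2.0))
    return stat, p_value

# Pre-intervention: 0 HIT among 6717 prophylaxed patients;
# post-intervention: 3 HIT among 44,240 (so 44,237 without HIT).
stat, p = chi_square_2x2(0, 6717, 3, 44_237)
print(p > 0.05)  # the difference is not statistically significant
```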

Discussion

Because the efficacy of UFH and LMWH for VTE prophylaxis is equivalent [6], choosing between them involves many factors, including patient-level risk factors such as renal function and risk of bleeding, as well as other considerations such as nursing time, patient preference, risk of HIT, and acquisition cost. Indeed, the most recent version of the American College of Chest Physicians guidelines for prophylaxis against VTE notes that both drugs are recommended with an evidence grade of 1B [7]. Cost is among the considerations considered appropriate in choosing among agents. The difference in acquisition costs of > $20 per patient per day can have a major financial impact on a hospital's pharmacy budget and may be decisive. But a focus only on acquisition cost is shortsighted, as the 2 medications have different complication rates with regard to HIT. Thus the need to track HIT incidence after protocol changes are made is paramount.

In our study, we did not measure thrombocytopenia as an endpoint because acquired thrombocytopenia is too common and multifactorial to be meaningful. Rather, we used the clinical suspicion for HIT, as measured by both the number of times the BPA fired warnings of low platelets in the setting of recent heparin use and the number of times clinicians suspected HIT strongly enough to order a HIT assay. We also used actual outcomes (clinically adjudicated cases of HIT and HITT). Our data show substantial compliance among clinicians with the voluntary conversion, with an immediate and sustained shift such that UFH was used in 86% of patients. Corresponding cost savings were achieved in heparin acquisition. Unlike some prior reports, there was a minimal burden of HIT, as measured by BPA alert frequency and monthly HIT assay counts, neither of which rose despite increased UFH use, and by the unchanged clinical risk 4T scores among patients in whom the test was ordered pre- and post-intervention. HIT rates were not statistically different after the order set conversion took effect.

Our results and study design are similar but not identical to those of Zhou et al, who found that a campaign to increase VTE prophylaxis resulted in a 71% increase in UFH use over 5 years but no increase in the number of HIT assays ordered or in the distribution of HIT assay results, both surrogate endpoints [5]. But not all analyses of heparin order interventions show similar results. A recent study of a heparin avoidance program in a Canadian tertiary care hospital showed reductions of 79% and 91% in adjudicated cases of HIT and HITT, respectively [4]. Moreover, hospital-related expenditures for HIT decreased by nearly $267,000 (Canadian dollars) per year, though the additional acquisition costs of LMWH were not stated. A small retrospective heparin avoidance protocol among orthopedic surgery patients showed a reduction in HIT incidence from 5.2% with UFH to 0% after universal substitution of LMWH for UFH [8]. A recent systematic review identified only 3 prospective studies, involving over 1398 postoperative surgical patients, that measured HIT and HITT as outcomes [9]. The review authors, in pooled analysis, found a lower incidence of HIT and HITT with LMWH postoperatively but downgraded the evidence to “low quality” due to methodologic issues and concerns over bias. A nested case-control study of adult medical patients found that HIT was 6 times more common with UFH than with LMWH and that the cost of admissions associated with HIT was 3.5 times higher than for those without HIT, though this increase in costs is not necessarily due to the HIT diagnosis itself but may be a marker of patients with more severe illness [10]. The duration of heparin therapy was not stated.

There are several potential reasons that our data differ from some of the previous reports described above. We used a strict definition of HIT, requiring the serotonin release assay to be positive in the appropriate clinical setting, and did not rely solely upon antibody tests to make the diagnosis, a less rigorous standard found in some studies. Furthermore, our results may differ from previous reports because of differences in patient risk and duration of therapy. Our institution does not perform cardiac surgery, and the very large orthopedic surgery programs do not generally use heparin. Another potentially important difference from prior studies is that many of the patients treated at this institution did not receive heparin long enough to be considered at risk; only a quarter were treated for longer than 5 days, generally considered a minimum [11]. This is less than half the duration of the patients in the studies included in the meta-analysis of HIT incidence [2].

We do not contend that UFH is as safe as LMWH with regard to HIT for all populations, but rather that the increased risk is not manifest in all patient populations and settings, and so the increased cost may not be justified in low-risk patients. Indeed, while variability in HIT risk among patients is well documented [3,12], the guidelines for prophylaxis do not generally take this into account when recommending particular VTE prophylaxis strategies. Clinical practice guidelines do, however, recommend different degrees of platelet count monitoring based on the risk of HIT.

Our study had limitations, chief of which is the retrospective nature of the analysis; however, the methodology we used was similar to that of previous publications [4,5,8]. We may have missed some cases of HIT if a clinician did not order the assay in all appropriate patients, but there is no reason to think that likelihood was any different pre- and post-intervention. In addition, though we reviewed every case of hospital-acquired thrombosis, it is possible that the clinical reviewers missed cases of HITT, especially if the thrombosis occurred before a substantial drop in the platelet count, which is rare but possible. Here too, the chance of missing actual cases did not change between the pre- and post-intervention periods. Our study examined prophylactic, not therapeutic, heparin use. Finally, while noting the acquisition cost reduction achieved with conversion to UFH, we were not able to calculate any excess expense attributable to the rare cases of HIT and HITT that occurred. We believe our results are generalizable to hospitals with similar patient profiles.

The idea that patients with different risk factors might do well with different prophylaxis strategies needs to be better appreciated. Such information could be used as a guide to more individualized prophylaxis strategy aided by clinical decision support embedded within the EMR. In this way the benefit of LMWH in avoiding HIT could be reserved for those patients at greatest risk of HIT while simultaneously allowing hospitals not to overspend for prophylaxis in patients who will not benefit from LMWH. Such a strategy would need to be tested prospectively before widespread adoption.

As a result of our internal analysis, we have altered our EMR-based best practice alert to conform to the 2013 American Society of Hematology guidelines [15], making it more informative than our original BPA. Specifically, the old alert warned only if the platelet count was < 100,000/mm3 in association with heparin. The revised alert notifies providers of a > 30% fall from the baseline platelet count regardless of the absolute count and presents the 4T score to encourage more optimal use of the HIT assay, discouraging its use for low-risk scores and encouraging its use for moderate- to high-risk scores. We are also strengthening the emphasis that patients with moderate- to high-risk 4T scores receive alternative anticoagulation until results of the HIT assay are available, as we found this not to be universal practice. We recommend similar self-inspection to other institutions.
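The revised alert logic described above can be sketched as follows. Names are hypothetical; this is an illustration of the described rules, not the actual EMR configuration.

```python
# Sketch of the revised advisory: alert on a > 30% fall from the
# baseline platelet count regardless of the absolute value, and use
# the 4T total to steer HIT assay ordering. Hypothetical names.

def revised_bpa_should_fire(baseline_platelets: int,
                            current_platelets: int,
                            on_heparin: bool) -> bool:
    """Fire on a > 30% fall from baseline in a patient on heparin."""
    if not on_heparin or baseline_platelets <= 0:
        return False
    pct_fall = (baseline_platelets - current_platelets) / baseline_platelets
    return pct_fall > 0.30

def assay_recommended(four_t_total: int) -> bool:
    # Discourage HIT assays for low-risk 4T totals (<= 3); encourage
    # them for moderate- to high-risk totals (>= 4), per the guidance
    # the revised alert conveys.
    return four_t_total >= 4
```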

Corresponding author: Barry R. Meisenberg, MD, Anne Arundel Medical Center, 2001 Medical Parkway, Annapolis, MD 21401, [email protected].

Financial disclosures: None.

Author contributions: conception and design, JR, BRM; analysis and interpretation of data, KW, JR, BRM; drafting of article, JR, BRM; critical revision of the article, KW, JR, BRM; statistical expertise, KW, JR; administrative or technical support, JR; collection and assembly of data, KW, JR.

References

1. Lo GK, Juhl D, Warkentin TE, et al. Evaluation of pretest clinical score (4T’s) for the diagnosis of heparin-induced thrombocytopenia in two clinical settings. J Thromb Haemost 2006;4:759–65.

2. Martel N, Lee J, Wells PS. Risk for heparin-induced thrombocytopenia with unfractionated and low-molecular-weight heparin thromboprophylaxis: a meta-analysis. Blood 2005; 106:2710–5.

3. Warkentin TE, Sheppard JI, Horsewood P, et al. Impact of the patient population on the risk for heparin-induced thrombocytopenia. Blood 2000;96:1703–8.

4. McGowan KE, Makari J, Diamantouros A, et al. Reducing the hospital burden of heparin-induced thrombocytopenia: impact of an avoid heparin program. Blood 2016; 127:1954–9.

5. Zhou A, Winkler A, Emamifar A, et al. Is the incidence of heparin-induced thrombocytopenia affected by the increased use of heparin for VTE prophylaxis? Chest 2012; 142:1175–8.

6.   Mismetti P, Laporte-Simitsidis S, Tardy B, et al. Prevention of venous thromboembolism in internal medicine with unfractionated or low-molecular-weight heparins: a meta-analysis of randomised clinical trials. Thromb Haemost 2000;83:14–19.

7.   Guyatt GH, Akl EA, Crowther M, et al; for the American College of Chest Physicians Antithrombotic Therapy and Prevention of Thrombosis Panel.  Antithrombotic therapy and prevention of thrombosis. 9th ed. American College of Chest Physicians Evidence-Based Clinical Practice Guidelines. Chest 2012;141(2 Suppl):7S–47S.

8. Greinacher A, Eichler P, Lietz T, Warkentin TE. Replacement of unfractionated heparin by low-molecular-weight heparin for postorthopedic surgery antithrombotic prophylaxis lowers the overall risk of symptomatic thrombosis because of a lower frequency of heparin-induced thrombocytopenia. Blood 2005;106:2921–2.

9. Junqueira DRG, Zorzela LM, Perini E. Unfractionated heparin versus low molecular weight heparin for avoiding heparin-induced thrombocytopenia in postoperative patients. Cochrane Database Syst Rev 2017;4:CD007557.

10. Creekmore FM, Oderda GM, Pendleton RC, Brixner DI. Incidence and economic implications of heparin-induced thrombocytopenia in medical patients receiving prophylaxis for venous thromboembolism. Pharmacotherapy 2006;26:1348–445.

11. Warkentin TE, Kelton JG. Temporal aspects of heparin-induced thrombocytopenia. N Engl J Med 2001;344:1286–92.

12. Warkentin TE, Sheppard JA, Sigouin CS, et al. Gender imbalance and risk factor interactions in heparin-induced thrombocytopenia. Blood 2006;108:2937–41.

13. Camden R, Ludwig S. Prophylaxis against venous thromboembolism in hospitalized medically ill patients: Update and practical approach. Am J Health Syst Pharm 2012;71:909–17.

14. Linkins LA, Dans AL, Moores LK, et al. Treatment and prevention of heparin-induced thrombocytopenia: Antithrombotic therapy and prevention of thrombosis. 9th ed. American College of Chest Physicians Evidence-Based Clinical Practice Guidelines. Chest 2012;141(2 Suppl):e495S–e530S.

15. Cuker A, Crowther MA. 2013 Clinical practice guideline on the evaluation and management of adults with suspected heparin-induced thrombocytopenia. Accessed 19 May 2017 at www.hematology.org/search.aspx?q=heparin+induced+thrombocytopenia.


Issue
Journal of Clinical Outcomes Management - August 2017, Vol. 24, No 8
Display Headline
Clinical Outcomes After Conversion from Low-Molecular-Weight Heparin to Unfractionated Heparin for Venous Thromboembolism Prophylaxis

Lessons emerge from Europe’s first enterovirus-related brain stem encephalitis outbreak


– Ninety-two percent of Spanish children sickened during the first-ever outbreak of enterovirus-associated brain stem encephalitis in western Europe survived with no long-term sequelae, Nuria Worner, MD, reported at the annual meeting of the European Society for Paediatric Infectious Diseases.

“We think that aggressive treatments should be restricted to those patients with important neurologic involvement,” declared Dr. Worner of Vall d’Hebron University Hospital in Barcelona. “We can say that no patients with milder involvement and without warning signs during the first 24 hours after onset of neurologic involvement went on to develop fulminant symptoms.”

Notable outbreaks of enterovirus A71 (EV-A71)-associated brain stem encephalitis occurred in Southeast Asia, Australia, and China in the late 1990s.

Dr. Worner reported on 196 children treated for laboratory-confirmed EV-A71–associated brain stem encephalitis at 16 Spanish hospitals in April-December 2016. Their median age was 25 months, 57% were male, and symptoms of mild viral illness were present for a median of 2 days before neurologic symptoms arose. Prior to presenting to a hospital, 21% of the children had been diagnosed with hand-foot-and-mouth disease, and 13% with herpangina.

Initial preadmission symptoms included fever in 94% of cases, sleepiness in 86%, ataxia in 75%, tremor in 47%, myoclonus in 40%, and a rash in 26%.

Fifty-five percent of the children had EV RNA isolated from both throat and feces, 26% from the throat only, and 19% only from their feces. Eighty-seven percent of serotyped EV were EV-A71.

Ninety percent of children underwent lumbar puncture. Particularly noteworthy was the finding that EV was detected in the cerebrospinal fluid of a mere 3% of patients, although pleocytosis was present in 84%.

Brain MRI showed brain stem encephalitis along with myelitis in 50% of patients, brain stem myelitis without encephalitis in 29%, myelitis elsewhere in 2%, and normal findings in 19%.

Ground zero for the outbreak was Barcelona and the surrounding region of Catalonia; indeed, 130 of the 196 (66%) affected children came from there. The Catalan health department and pediatric infectious disease specialists quickly created standardized case severity definitions and treatment recommendations, which were distributed nationally.

Mild EV-A71–associated brain stem disease was defined as two or more of the following: tremor, myoclonus, mild ataxia, and/or significant drowsiness. The recommendation in these mild cases was for no treatment other than supportive care and careful in-hospital monitoring.

Patients with moderate involvement had to meet the definition for mild disease plus more pronounced ataxia or bulbar motor neuron involvement marked by slurred speech, drooling, dysphagia, apnea, abolition of the gag reflex, and/or an abnormal respiratory pattern. Moderately affected patients received two doses of intravenous immunoglobulin (IVIG), each dosed at 1 g/kg per 24 hours. Admission to the pediatric ICU was individualized for patients with moderate EV-A71–associated brain stem encephalitis.

Severe disease was categorized as bulbar motor neuron involvement plus neurogenic cardiorespiratory failure. Those patients were uniformly admitted to a pediatric ICU and given the two doses of IVIG. The need for systemic steroids was determined on an individual basis.
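The three severity tiers and their attached recommendations can be restated as a small sketch. This is an illustrative restatement of the criteria reported above, not the Catalan protocol document; the names and exact rule structure are assumptions.

```python
# Illustrative triage per the reported definitions: mild (>= 2 of
# tremor, myoclonus, mild ataxia, significant drowsiness) -> supportive
# care; moderate (mild criteria plus pronounced ataxia or bulbar
# involvement) -> IVIG 1 g/kg/24 h x 2 doses, PICU individualized;
# severe (bulbar involvement plus neurogenic cardiorespiratory
# failure) -> PICU plus IVIG, steroids individualized.

def classify_severity(mild_sign_count: int,
                      pronounced_ataxia_or_bulbar: bool,
                      cardiorespiratory_failure: bool) -> str:
    """mild_sign_count counts tremor, myoclonus, mild ataxia, drowsiness."""
    if pronounced_ataxia_or_bulbar and cardiorespiratory_failure:
        return "severe"
    if pronounced_ataxia_or_bulbar and mild_sign_count >= 2:
        return "moderate"
    if mild_sign_count >= 2:
        return "mild"
    return "unclassified"

TREATMENT = {
    "mild": "supportive care and careful in-hospital monitoring",
    "moderate": "IVIG 1 g/kg/24 h x 2 doses; PICU admission individualized",
    "severe": "PICU admission and IVIG x 2 doses; steroids individualized",
}
```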

Forty percent of patients received IVIG and systemic steroids, 24% received IVIG only, 2% systemic steroids only, and 34% received no treatment other than supportive care.

Twenty-six percent of children were admitted to a pediatric ICU for a median stay of 3.5 days. Nine percent of children were placed on mechanical ventilation.

As the disease evolved, the most frequent neurologic complications included slurred speech in 15% of children, abnormal breathing pattern in 11%, seizures in 10%, acute flaccid paralysis in 9%, and cardiorespiratory failure with pulmonary edema in 9%, all occurring within the first hours after hospital admission.

The median hospital length of stay for the full study population was 6 days. The survival rate was 99.5%, with the sole death being due to cardiorespiratory failure.

With 1-6 months of follow-up since the acute episode of EV-A71–associated brain stem encephalitis, the long-term sequelae included two cases of limb paresis and two cases of paresis of a cranial nerve, one child with residual seizures, and one with hypoxic-ischemic encephalopathy.

Asked why the fatality rate in the Spanish outbreak was so much lower than in the earlier Australasian outbreaks, Dr. Worner cited Catalan physicians' quick recognition of what was underway and, more importantly, a difference in the EV-A71 viral subgenotype. Most of the severe cases in Asia and Australia involved the C-4 subgenotype, while in Spain the predominant subgenotype involved in the outbreak was C-1.

As for the curious finding that EV was detectable in the cerebrospinal fluid of a mere 3% of the Spanish children, she said the explanation is unknown. One possibility is that the CNS symptoms were due to a parenchymal brain infection rather than to EV-A71 infection of meningeal tissue; alternatively, the CNS involvement may have been a manifestation of an immunologic response to the infection rather than being due to the virus itself.

Dr. Worner reported having no financial conflicts of interest.
Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event
Related Articles

 

– Ninety-two percent of Spanish children sickened during the first-ever outbreak of enterovirus-associated brain stem encephalitis in western Europe survived with no long-term sequelae, Nuria Worner, MD, reported at the annual meeting of the European Society for Paediatric Infectious Diseases.

“We think that aggressive treatments should be restricted to those patients with important neurologic involvement,” declared Dr. Worner of Vall d’Hebron University Hospital in Barcelona. “We can say that no patients with milder involvement and without warning signs during the first 24 hours after onset of neurologic involvement went on to develop fulminant symptoms.”

Notable outbreaks of enterovirus A71 (EV-A71)-associated brain stem encephalitis occurred in Southeast Asia, Australia, and China in the late 1990s.

Dr. Worner reported on 196 children treated for laboratory-confirmed EV-A71–associated brain stem encephalitis at 16 Spanish hospitals in April-December 2016. Their median age was 25 months, 57% were male, and a median of 2 days of symptoms of mild viral illness transpired before neurologic symptoms arose. Prior to presenting to a hospital, 21% of the children had been diagnosed with hand-foot-and-mouth disease, and 13% with herpangina.

Initial preadmission symptoms included fever in 94% of cases, sleepiness in 86%, ataxia in 75%, tremor in 47%, myoclonus in 40%, and a rash in 26%.

Fifty-five percent of the children had EV RNA isolated from both throat and feces, 26% from the throat only, and 19% only from their feces. Eighty-seven percent of serotyped EV were EV-A71.

Ninety percent of children underwent lumbar puncture. Particularly noteworthy was the finding that EV was detected in the cerebrospinal fluid of a mere 3% of patients, although pleocytosis was present in 84%.

Brain MRI showed brain stem encephalitis along with myelitis in 50% of patients, brain stem myelitis without encephalitis in 29%, myelitis elsewhere in 2%, and normal findings in 19%.

Ground zero for the outbreak was Barcelona and the surrounding region of Catalonia; indeed, 130 of the 196 (66%) affected children came from there. The Catalan health department and pediatric infectious disease specialists quickly created standardized case severity definitions and treatment recommendations; they distributed them nationally.

Mild EV-A71–associated brain stem disease was defined as two or more of the following: tremor, myoclonus, mild ataxia, and/or significant drowsiness. The recommendation in these mild cases was for no treatment other than supportive care and careful in-hospital monitoring.


Ninety-two percent of Spanish children sickened during the first-ever outbreak of enterovirus-associated brain stem encephalitis in western Europe survived with no long-term sequelae, Nuria Worner, MD, reported at the annual meeting of the European Society for Paediatric Infectious Diseases.

“We think that aggressive treatments should be restricted to those patients with important neurologic involvement,” declared Dr. Worner of Vall d’Hebron University Hospital in Barcelona. “We can say that no patients with milder involvement and without warning signs during the first 24 hours after onset of neurologic involvement went on to develop fulminant symptoms.”

Notable outbreaks of enterovirus A71 (EV-A71)-associated brain stem encephalitis occurred in Southeast Asia, Australia, and China in the late 1990s.

Dr. Worner reported on 196 children treated for laboratory-confirmed EV-A71–associated brain stem encephalitis at 16 Spanish hospitals in April-December 2016. Their median age was 25 months, 57% were male, and a median of 2 days of symptoms of mild viral illness transpired before neurologic symptoms arose. Prior to presenting to a hospital, 21% of the children had been diagnosed with hand-foot-and-mouth disease, and 13% with herpangina.

Initial preadmission symptoms included fever in 94% of cases, sleepiness in 86%, ataxia in 75%, tremor in 47%, myoclonus in 40%, and a rash in 26%.

Fifty-five percent of the children had EV RNA isolated from both throat and feces, 26% from the throat only, and 19% only from their feces. Eighty-seven percent of serotyped EV were EV-A71.

Ninety percent of children underwent lumbar puncture. Particularly noteworthy was the finding that EV was detected in the cerebrospinal fluid of a mere 3% of patients, although pleocytosis was present in 84%.

Brain MRI showed brain stem encephalitis along with myelitis in 50% of patients, brain stem myelitis without encephalitis in 29%, myelitis elsewhere in 2%, and normal findings in 19%.

Ground zero for the outbreak was Barcelona and the surrounding region of Catalonia; indeed, 130 of the 196 (66%) affected children came from there. The Catalan health department and pediatric infectious disease specialists quickly created standardized case severity definitions and treatment recommendations; they distributed them nationally.

Mild EV-A71–associated brain stem disease was defined as two or more of the following: tremor, myoclonus, mild ataxia, and/or significant drowsiness. The recommendation in these mild cases was for no treatment other than supportive care and careful in-hospital monitoring.

Patients with moderate involvement had to meet the definition for mild disease plus more pronounced ataxia or bulbar motor neuron involvement marked by slurred speech, drooling, dysphagia, apnea, abolition of the gag reflex, and/or an abnormal respiratory pattern. Moderately affected patients received two doses of intravenous immunoglobulin (IVIG), each dosed at 1 g/kg per 24 hours. Admission to the pediatric ICU was individualized for patients with moderate EV-A71–associated brain stem encephalitis.

Severe disease was categorized as bulbar motor neuron involvement plus neurogenic cardiorespiratory failure. Those patients were uniformly admitted to a pediatric ICU and given the two doses of IVIG. The need for systemic steroids was determined on an individual basis.
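The three-tier grading above amounts to a simple decision rule. A minimal sketch of that logic follows; the function and category names are hypothetical and merely encode the definitions quoted above, not an official implementation of the Catalan protocol:

```python
# Illustrative encoding of the Catalan severity definitions for
# EV-A71-associated brain stem encephalitis (hypothetical helper).
MILD_SIGNS = {"tremor", "myoclonus", "mild ataxia", "significant drowsiness"}
BULBAR_SIGNS = {"slurred speech", "drooling", "dysphagia", "apnea",
                "abolished gag reflex", "abnormal respiratory pattern"}

def grade_severity(findings):
    """Return 'severe', 'moderate', 'mild', or None for a set of findings."""
    findings = set(findings)
    mild = len(findings & MILD_SIGNS) >= 2        # two or more mild signs
    bulbar = bool(findings & BULBAR_SIGNS)        # bulbar motor neuron involvement
    pronounced = "pronounced ataxia" in findings
    if bulbar and "neurogenic cardiorespiratory failure" in findings:
        return "severe"      # uniform PICU admission + 2 doses of IVIG
    if mild and (pronounced or bulbar):
        return "moderate"    # 2 doses of IVIG; PICU individualized
    if mild:
        return "mild"        # supportive care and in-hospital monitoring
    return None
```

The ordering matters: severe disease is checked first because it is defined independently of the mild criteria, while moderate disease is defined as mild disease plus pronounced ataxia or bulbar involvement.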

Forty percent of patients received IVIG and systemic steroids, 24% received IVIG only, 2% systemic steroids only, and 34% received no treatment other than supportive care.

Twenty-six percent of children were admitted to a pediatric ICU for a median stay of 3.5 days. Nine percent of children were placed on mechanical ventilation.

As the disease evolved, the most frequent neurologic complications included slurred speech in 15% of children, abnormal breathing pattern in 11%, seizures in 10%, acute flaccid paralysis in 9%, and cardiorespiratory failure with pulmonary edema in 9%, all occurring within the first hours after hospital admission.

The median hospital length of stay for the full study population was 6 days. The survival rate was 99.5%, with the sole death being due to cardiorespiratory failure.

With 1-6 months of follow-up since the acute episode of EV-A71–associated brain stem encephalitis, the long-term sequelae included two cases of limb paresis and two cases of paresis of a cranial nerve, one child with residual seizures, and one with hypoxic-ischemic encephalopathy.

Asked why the fatality rate in the Spanish outbreak was so much lower than in the earlier Australasian outbreaks, Dr. Worner cited Catalan physicians’ quick recognition of what was underway – and, more importantly, a difference in the EV-A71 viral subgenotype. Most of the severe cases in Asia and Australia involved the C-4 subgenotype, whereas in Spain the predominant subgenotype in the outbreak was C-1.

As for the curious finding that EV was detectable in the cerebrospinal fluid of a mere 3% of the Spanish children, she said the explanation is unknown. One possibility is that the CNS symptoms were due to a parenchymal brain infection rather than to EV-A71 infection of meningeal tissue; the other is that the CNS involvement was a manifestation of an immunologic response to the infection rather than a direct effect of the virus itself.

Dr. Worner reported having no financial conflicts of interest.

 

 

Article Source

AT ESPID 2017

Vitals

 

Key clinical point: Reserve aggressive therapies for children with enterovirus-A71–associated brain stem encephalitis for the subgroup with severe neurologic complications early on.

Major finding: Ninety-two percent of Spanish children involved in an outbreak of enterovirus-associated brain stem encephalitis survived with no long-term sequelae.

Data source: A retrospective review of 196 children treated for laboratory-confirmed EV-A71–associated brain stem encephalitis at 16 Spanish hospitals in April-December 2016 during the first-ever outbreak of this condition in western Europe.

Disclosures: Dr. Worner reported having no financial conflicts of interest.


Exploring Noninvasive Presurgical Brain Mapping

Article Type
Changed
Tue, 08/15/2017 - 07:57
Options include magnetoencephalography, functional MRI, and transcranial magnetic stimulation.

Although invasive procedures like cortical stimulation mapping and the Wada test have traditionally been employed to determine the location of motor and language-related areas of the brain prior to ablative epilepsy surgery, a recent review of noninvasive procedures suggests several may have merit. Papanicolaou et al outline the value of magnetoencephalography, functional magnetic resonance imaging, and transcranial magnetic stimulation to accomplish the same purpose and explain the rationale and conditions in which these approaches may be worth considering.

Papanicolaou AC, Rezaie R, Narayana S et al. On the relative merits of invasive and non-invasive pre-surgical brain mapping: New tools in ablative epilepsy surgery [published online ahead of print July 3, 2017]. Epilepsy Res. 2017; doi: 10.1016/j.eplepsyres.2017.07.002.


Implementation of a Communication Training Program Is Associated with Reduction of Antipsychotic Medication Use in Nursing Homes

Article Type
Changed
Wed, 02/28/2018 - 14:42

Study Overview

Objective. To evaluate the effectiveness of OASIS, a large-scale, statewide communication training program, on the reduction of antipsychotic use in nursing homes (NHs).

Design. Quasi-experimental longitudinal study with external controls.

Setting and participants. The participants were residents living in NHs between 1 March 2011 and 31 August 2013. The intervention group consisted of NHs in Massachusetts that were enrolled in the OASIS intervention and the control group consisted of NHs in Massachusetts and New York. The Centers for Medicare & Medicaid Services Minimum Data Set (MDS) 3.0 data was analyzed to determine medication use and behavior of residents of NHs. Residents of these NHs were excluded if they had a US Food and Drug Administration (FDA)-approved indication for antipsychotic use (eg, schizophrenia); were short-term residents (length of stay < 90 days); or had missing data on psychopharmacological medication use or behavior.

Intervention. OASIS is an educational program that targeted both direct care and non-direct care staff in NHs to assist them in meeting the needs and challenges of caring for long-term care residents. Utilizing a train-the-trainer model, OASIS program coordinators and champions from each intervention NH participated in an 8-hour in-person training session that focused on enhancing communication skills between NH staff and residents with cognitive impairment. These trainers subsequently delivered the OASIS program to staff at their respective NHs using a team-based care approach. Additional support for the OASIS educational program, such as telephone support, 12 webinars, 2 regional seminars, and 2 booster sessions, was provided to participating NHs.

Main outcome measures. The main outcome measure was facility-level prevalence of antipsychotic use in long-term NH residents captured by MDS in the 7 days preceding the MDS assessment. The secondary outcome measures were facility-level quarterly prevalence of psychotropic medications that may have been substituted for antipsychotic medications (ie, anxiolytics, antidepressants, and hypnotics) and behavioral disturbances (ie, physically abusive behavior, verbally abusive behavior, and rejecting care). All secondary outcomes were dichotomized in the 7 days preceding the MDS assessment and aggregated at the facility level for each quarter.

The analysis utilized an interrupted time series model of facility-level prevalence of antipsychotic medication use, other psychotropic medication use, and behavioral disturbances to evaluate the OASIS intervention’s effectiveness in participating facilities compared with control NHs. This methodology allowed the assessment of changes in the trend of antipsychotic use after the OASIS intervention controlling for historical trends. Data from the 18-month pre-intervention (baseline) period was compared with that of a 3-month training phase, a 6-month implementation phase, and a 3-month maintenance phase.
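The interrupted time series logic described here – a pre-existing trend, plus a possible level change and slope change at the intervention point – can be sketched as a segmented regression. The following is a minimal illustration on synthetic quarterly data (numpy only; all numbers and variable names are invented for illustration and are not the study's data):

```python
import numpy as np

# Synthetic quarterly antipsychotic prevalence (%): 6 baseline quarters
# with a shallow secular decline, then 4 post-intervention quarters
# with a steeper decline.
quarters = np.arange(10)                        # t = 0..9
post = (quarters >= 6).astype(float)            # 1 after the intervention
time_since = np.where(post == 1, quarters - 6, 0.0)

rng = np.random.default_rng(0)
prevalence = 34.0 - 0.2 * quarters - 1.0 * time_since + rng.normal(0, 0.1, 10)

# Segmented regression: intercept + baseline trend + step change + slope change
X = np.column_stack([np.ones(10), quarters, post, time_since])
beta, *_ = np.linalg.lstsq(X, prevalence, rcond=None)
intercept, base_slope, level_change, slope_change = beta

# slope_change estimates the extra per-quarter decline attributable to the
# intervention, net of the pre-existing secular trend.
print(round(base_slope, 2), round(slope_change, 2))
```

This is the sense in which the model "controls for historical trends": the intervention effect is read off the change in slope (and level) relative to the extrapolated baseline trajectory, not from a simple before/after difference in means.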

Main results. 93 NHs received the OASIS intervention (27 with a high prevalence of antipsychotic use) while 831 NHs did not (non-intervention controls). The intervention NHs had a higher prevalence of antipsychotic use before OASIS training (baseline period) than the control NHs (34.1% vs. 22.7%, P < 0.001). Compared with controls, the intervention NHs were smaller (122 beds [interquartile range {IQR}, 88–152 beds] vs. 140 beds [IQR, 104–200 beds]; P < 0.001), more likely to be for profit (77.4% vs. 62.0%, P = 0.009), to have corporate ownership (93.5% vs. 74.6%, P < 0.001), and to provide resident-only councils (78.5% vs. 52.9%, P < 0.001). The intervention NHs had higher registered nurse (RN) staffing hours per resident (0.8 vs. 0.7; P = 0.01) but lower certified nursing assistant (CNA) hours per resident (2.3 vs. 2.4; P = 0.04) than control NHs. There was no difference in licensed practical nurse hours per resident between groups.

All 93 intervention NHs completed the 8-hour in-person training session and attended an average of 6.5 (range, 0–12) subsequent support webinars. Thirteen NHs (14.0%) attended no regional seminars, 32 (34.4%) attended one, and 48 (51.6%) attended both. Four NHs (4.3%) attended one booster session, and 13 (14.0%) attended both. The NH staff most often trained in the OASIS training program were the directors of nursing, RNs, CNAs, and activities personnel. Support staff including housekeeping and dietary were trained in about half of the reporting intervention NHs, while physicians and nurse practitioners participated infrequently. Concurrent training programs in dementia care (Hand-in-Hand, Alzheimer Association training, MassPRO dementia care training) were implemented in 67.2% of intervention NHs.

In the intervention NHs, the prevalence of antipsychotic prescribing decreased from 34.1% at baseline to 26.5% at the study end (7.6% absolute reduction, 22.3% relative reduction). In comparison, the prevalence of antipsychotic prescribing in control NHs decreased from 22.7% to 18.8% over the same period (3.9% absolute reduction, 17.2% relative reduction). During the OASIS implementation phase, the intervention NHs had a reduction in prevalence of antipsychotic use (–1.20% [95% confidence interval {CI}, –1.85% to –0.09% per quarter]) greater than that of the control NHs (–0.23% [95% CI, –0.47% to 0.01% per quarter]), resulting in a net OASIS influence of –0.97% (95% CI, –1.85% to –0.09% per quarter; P = 0.03). The antipsychotic use reduction observed in the implementation phase was not sustained in the maintenance phase (difference of 0.93%; 95% CI, –0.66% to 2.54%; P = 0.48). No increases in other psychotropic medication use (anxiolytics, antidepressants, hypnotics) or behavioral disturbances (physically abusive behavior, verbally abusive behavior, and rejecting care) were observed during the OASIS training and implementation phases.
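The absolute and relative reductions quoted above follow directly from the prevalence figures; a quick arithmetic check (illustrative Python, not from the study):

```python
def reductions(baseline, final):
    """Return (absolute, relative-%) reduction between two prevalences (%)."""
    absolute = baseline - final
    return absolute, 100 * absolute / baseline

# Intervention NHs: 34.1% -> 26.5%
abs_i, rel_i = reductions(34.1, 26.5)   # 7.6 points; 7.6/34.1 ~ 22.3% relative
# Control NHs: 22.7% -> 18.8%
abs_c, rel_c = reductions(22.7, 18.8)   # 3.9 points; 3.9/22.7 ~ 17.2% relative
```

The distinction matters when baselines differ, as here: the intervention group's larger absolute drop partly reflects its higher starting prevalence, which is why the relative reductions are closer together than the absolute ones.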

Conclusion. The OASIS communication training program reduced the prevalence of antipsychotic use in NHs during its implementation phase, but its effect was not sustained in the subsequent maintenance phase. The use of other psychotropic medications and behavior disturbances did not increase during the implementation of OASIS program. The findings from this study provided further support for utilizing nonpharmacologic programs to treat behavioral and psychological symptoms of dementia in older adults who reside in NHs.

Commentary

The use of both conventional and atypical antipsychotic medications is associated with a dose-related, approximately 2-fold increased risk of sudden cardiac death in older adults [1,2]. In 2006, the FDA issued a public health advisory stating that both conventional and atypical antipsychotic medications are associated with an increased risk of mortality in elderly patients treated for dementia-related psychosis. Despite this black box warning and growing recognition that antipsychotic medications are not indicated for the treatment of dementia-related psychosis, the off-label use of antipsychotic medications to treat behavioral and psychological symptoms of dementia in older adults remains a common practice in nursing homes [3]. Thus, there is an urgent need to assess and develop effective interventions that reduce the practice of antipsychotic medication prescribing in long-term care. To that effect, the study reported by Tjia et al appropriately investigated the impact of the OASIS communication training program, a nonpharmacologic intervention, on the reduction of antipsychotic use in NHs.

This study was well designed and had a number of strengths. It utilized an interrupted time series model, one of the strongest quasi-experimental approaches due to its robustness to threats of internal validity, for evaluating longitudinal effects of an intervention intended to improve the quality of medication use. Moreover, this study included a large sample size and comparison facilities from the same geographical areas (NHs in Massachusetts and New York State) that served as external controls. Several potential weaknesses of the study were identified. Because facility-level aggregate data from NHs were used for analysis, individual level (long-term care resident) characteristics were not accounted for in the analysis. In addition, while the post-OASIS intervention questionnaire response rate was 65.6% (61 of 93 intervention NHs), a higher response rate would provide better characterization of NH staff that participated in OASIS program training, program completion rate, and a more complete representation of competing dementia care training programs concurrently implemented in these NHs.

Several studies, most utilizing various provider education methods, have explored whether such interventions could curb antipsychotic use in NHs, with limited success. The largest successful intervention was reported by Meador et al [4], in which a focused provider education program facilitated a relative reduction in antipsychotic medication use of 23% compared with control NHs. However, the implementation of this specific program was time- and resource-intensive, requiring geropsychiatry evaluation for all physicians (45 to 60 min), nurse-educator in-service programs for NH staff (5 to 6 one-hour sessions), management specialist consultation for NH administrators (4 hours), and an evening meeting for the families of NH residents. The current study by Tjia et al, the largest study to date conducted in the context of competing dementia care training programs and increased awareness of the danger of antipsychotic use in the elderly, similarly showed a meaningful reduction in antipsychotic medication use in NHs that received the OASIS communication training program. The OASIS program appears to be less resource-intensive than the provider education program modeled by Meador et al, and its train-the-trainer model is likely more adaptable to the limitations (eg, low staffing and staff turnover) inherent in NHs. The beneficial effect of the OASIS program on reduction of antipsychotic medication prescribing was observed despite low participation by prescribers (11.5% of physicians and 11.5% of nurse practitioners). Although it is unclear why this was observed, the finding is intriguing: a communication training program that (1) reframes challenging behavior of NH residents with cognitive impairment as communication of unmet needs, (2) trains staff to anticipate resident needs, and (3) integrates resident strengths into daily care plans can nonetheless alter provider prescribing behavior.
The implication is that provider practice in managing behavioral and psychological symptoms of dementia can be improved by optimizing communication training for NH staff. Taken together, this study adds to the evidence in favor of utilizing nonpharmacologic interventions to reduce antipsychotic use in long-term care.

Applications for Clinical Practice

OASIS, a communication training program for NH staff, reduces antipsychotic medication use in NHs during its implementation phase. Future studies need to investigate pragmatic methods to sustain the beneficial effect of OASIS after its implementation phase.

 

—Fred Ko, MD, MS, Icahn School of Medicine at Mount Sinai, New York, NY

References

1. Ray WA, Chung CP, Murray KT, et al. Atypical antipsychotic drugs and the risk of sudden cardiac death. N Engl J Med 2009;360:225–35.

2. Wang PS, Schneeweiss S, Avorn J, et al. Risk of death in elderly users of conventional vs. atypical antipsychotic medications. N Engl J Med 2005;353:2335–41.

3. Chen Y, Briesacher BA, Field TS, et al. Unexplained variation across US nursing homes in antipsychotic prescribing rates. Arch Intern Med 2010;170:89–95.

4. Meador KG, Taylor JA, Thapa PB, et al. Predictors of antipsychotic withdrawal or dose reduction in a randomized controlled trial of provider education. J Am Geriatr Soc 1997;45:207–10.

Issue
Journal of Clinical Outcomes Management - August 2017, Vol. 24, No 8
Publications
Topics
Sections

Study Overview

Objective. To evaluate the effectiveness of OASIS, a large-scale, statewide communication training program, on the reduction of antipsychotic use in nursing homes (NHs).

Design. Quasi-experimental longitudinal study with external controls.

Setting and participants. The participants were residents living in NHs between 1 March 2011 and 31 August 2013. The intervention group consisted of NHs in Massachusetts that were enrolled in the OASIS intervention and the control group consisted of NHs in Massachusetts and New York. The Centers for Medicare & Medicaid Services Minimum Data Set (MDS) 3.0 data was analyzed to determine medication use and behavior of residents of NHs. Residents of these NHs were excluded if they had a US Food and Drug Administration (FDA)-approved indication for antipsychotic use (eg, schizophrenia); were short-term residents (length of stay < 90 days); or had missing data on psychopharmacological medication use or behavior.

Intervention. The OASIS is an educational program that targeted both direct care and non-direct care staff in NHs to assist them in meeting the needs and challenges of caring for long-term care residents. Utilizing a train-the-trainer model, OASIS program coordinators and champions from each intervention NH participated in an 8-hour in-person training session that focused on enhancing communication skills between NH staff and residents with cognitive impairment. These trainers subsequently instructed the OASIS program to staff at their respective NHs using a team-based care approach. Addi-tional support of the OASIS educational program, such as telephone support, 12 webinars, 2 regional seminars, and 2 booster sessions, were provided to participating NHs.

Main outcome measures. The main outcome measure was facility-level prevalence of antipsychotic use in long-term NH residents captured by MDS in the 7 days preceding the MDS assessment. The secondary outcome measures were facility-level quarterly prevalence of psychotropic medications that may have been substituted for antipsychotic medications (ie, anxiolytics, antidepressants, and hypnotics) and behavioral disturbances (ie, physically abusive behavior, verbally abusive behavior, and rejecting care). All secondary outcomes were dichotomized in the 7 days preceding the MDS assessment and aggregated at the facility level for each quarter.

The analysis utilized an interrupted time series model of facility-level prevalence of antipsychotic medication use, other psychotropic medication use, and behavioral disturbances to evaluate the OASIS intervention’s effectiveness in participating facilities compared with control NHs. This methodology allowed the assessment of changes in the trend of antipsychotic use after the OASIS intervention controlling for historical trends. Data from the 18-month pre-intervention (baseline) period was compared with that of a 3-month training phase, a 6-month implementation phase, and a 3-month maintenance phase.

Main results. 93 NHs received OASIS intervention (27 with high prevalence of antipsychotic use) while 831 NHs did not (non-intervention control). The intervention NHs had a higher prevalence of antipsychotic use before OASIS training (baseline period) than the control NHs (34.1% vs. 22.7%, P < 0.001). The intervention NHs compared to controls were smaller in size (122 beds [interquartile range {IQR}, 88–152 beds] vs. 140 beds; [IQR, 104–200 beds]; P < 0.001), more likely to be for profit (77.4% vs. 62.0%, P = 0.009), had corporate ownership (93.5% vs. 74.6%, P < 0.001), and provided resident-only councils (78.5% vs. 52.9%, P < 0.001). The intervention NHs had higher registered nurse (RN) staffing hours per resident (0.8 vs. 0.7; P = 0.01) but lower certified nursing assistant (CNA) hours per resident (2.3 vs. 2.4; P = 0.04) than control NHs. There was no difference in licensed practical nurse hours per resident between groups.

All 93 intervention NHs completed the 8-hour in-person training session and attended an average of 6.5 (range, 0–12) subsequent support webinars. Thirteen NHs (14.0%) attended no regional seminars, 32 (34.4%) attended one, and 48 (51.6%) attended both. Four NHs (4.3%) attended one booster session, and 13 (14.0%) attended both. The NH staff most often trained in the OASIS training program were the directors of nursing, RNs, CNAs, and activities personnel. Support staff including housekeeping and dietary were trained in about half of the reporting intervention NHs, while physicians and nurse practitioners participated infrequently. Concurrent training programs in dementia care (Hand-in-Hand, Alzheimer Association training, MassPRO dementia care training) were implemented in 67.2% of intervention NHs.

In the intervention NHs, the prevalence of antipsych-otic prescribing decreased from 34.1% at baseline to 26.5% at the study end (7.6% absolute reduction, 22.3% relative reduction). In comparison, the prevalence of antipsychotic prescribing in control NHs decreased from 22.7% to 18.8% over the same period (3.9% absolute reduction, 17.2% relative reduction). During the OASIS implementation phase, the intervention NHs had a reduc-tion in prevalence of antipsychotic use (–1.20% [95% confidence interval {CI}, –1.85% to –0.09% per quarter]) greater than that of the control NHs (–0.23% [95% CI, –0.47% to 0.01% per quarter]), resulting in a net OASIS influence of –0.97% (95% CI, –1.85% to –0.09% per quarter; P = 0.03). The antipsychotic use reduction observed in the implementation phase was not sustained in the maintenance phase (difference of 0.93%; 95% CI, –0.66% to 2.54%; P = 0.48). No increases in other psychotropic medication use (anxiolytics, antidepressants, hypnotics) or behavioral disturbances (physically abusive behavior, verbally abusive behavior, and rejecting care) were observed during the OASIS training and implementation phases.

Conclusion. The OASIS communication training program reduced the prevalence of antipsychotic use in NHs during its implementation phase, but its effect was not sustained in the subsequent maintenance phase. The use of other psychotropic medications and behavior disturbances did not increase during the implementation of OASIS program. The findings from this study provided further support for utilizing nonpharmacologic programs to treat behavioral and psychological symptoms of dementia in older adults who reside in NHs.

Commentary

The use of both conventional and atypical antipsychotic medications is associated with a dose-related, approximately 2-fold increased risk of sudden cardiac death in older adults [1,2]. In 2006, the FDA issued a public health advisory stating that both conventional and atypical anti-psychotic medications are associated with an increased risk of mortality in elderly patients treated for dementia-related psychosis. Despite this black box warning and growing recognition that antipsychotic medications are not indicated for the treatment of dementia-related psychosis, the off-label use of antipsychotic medications to treat behavioral and psychological symptoms of dementia in older adults remains a common practice in nursing homes [3]. Thus, there is an urgent need to assess and develop effective interventions that reduce the practice of antipsychotic medication prescribing in long-term care. To that effect, the study reported by Tjia et al appropriately investigated the impact of the OASIS communication training program, a nonpharmacologic intervention, on the reduction of antipsychotic use in NHs.


Study Overview

Objective. To evaluate the effectiveness of OASIS, a large-scale, statewide communication training program, on the reduction of antipsychotic use in nursing homes (NHs).

Design. Quasi-experimental longitudinal study with external controls.

Setting and participants. The participants were residents living in NHs between 1 March 2011 and 31 August 2013. The intervention group consisted of NHs in Massachusetts enrolled in the OASIS intervention; the control group consisted of NHs in Massachusetts and New York. Centers for Medicare & Medicaid Services Minimum Data Set (MDS) 3.0 data were analyzed to determine medication use and behavior of NH residents. Residents were excluded if they had a US Food and Drug Administration (FDA)-approved indication for antipsychotic use (eg, schizophrenia), were short-term residents (length of stay < 90 days), or had missing data on psychopharmacological medication use or behavior.

Intervention. OASIS is an educational program targeting both direct care and non-direct care staff in NHs to help them meet the needs and challenges of caring for long-term care residents. Using a train-the-trainer model, OASIS program coordinators and champions from each intervention NH participated in an 8-hour in-person training session focused on enhancing communication between NH staff and residents with cognitive impairment. These trainers then delivered the OASIS program to staff at their respective NHs using a team-based care approach. Additional support for the OASIS educational program, such as telephone support, 12 webinars, 2 regional seminars, and 2 booster sessions, was provided to participating NHs.

Main outcome measures. The main outcome measure was facility-level prevalence of antipsychotic use in long-term NH residents captured by MDS in the 7 days preceding the MDS assessment. The secondary outcome measures were facility-level quarterly prevalence of psychotropic medications that may have been substituted for antipsychotic medications (ie, anxiolytics, antidepressants, and hypnotics) and behavioral disturbances (ie, physically abusive behavior, verbally abusive behavior, and rejecting care). All secondary outcomes were dichotomized in the 7 days preceding the MDS assessment and aggregated at the facility level for each quarter.

The analysis utilized an interrupted time series model of facility-level prevalence of antipsychotic medication use, other psychotropic medication use, and behavioral disturbances to evaluate the OASIS intervention’s effectiveness in participating facilities compared with control NHs. This methodology allowed assessment of changes in the trend of antipsychotic use after the OASIS intervention while controlling for historical trends. Data from the 18-month pre-intervention (baseline) period were compared with data from a 3-month training phase, a 6-month implementation phase, and a 3-month maintenance phase.

Main results. A total of 93 NHs received the OASIS intervention (27 with a high prevalence of antipsychotic use) and 831 NHs served as nonintervention controls. The intervention NHs had a higher baseline prevalence of antipsychotic use than the control NHs (34.1% vs. 22.7%, P < 0.001). Compared with controls, intervention NHs were smaller (122 beds [interquartile range {IQR}, 88–152] vs. 140 beds [IQR, 104–200]; P < 0.001) and more likely to be for profit (77.4% vs. 62.0%, P = 0.009), to have corporate ownership (93.5% vs. 74.6%, P < 0.001), and to have resident-only councils (78.5% vs. 52.9%, P < 0.001). The intervention NHs had higher registered nurse (RN) staffing hours per resident (0.8 vs. 0.7; P = 0.01) but lower certified nursing assistant (CNA) hours per resident (2.3 vs. 2.4; P = 0.04) than control NHs. There was no difference in licensed practical nurse hours per resident between groups.

All 93 intervention NHs completed the 8-hour in-person training session and attended an average of 6.5 (range, 0–12) subsequent support webinars. Thirteen NHs (14.0%) attended no regional seminars, 32 (34.4%) attended one, and 48 (51.6%) attended both. Four NHs (4.3%) attended one booster session, and 13 (14.0%) attended both. The NH staff most often trained in the OASIS training program were the directors of nursing, RNs, CNAs, and activities personnel. Support staff including housekeeping and dietary were trained in about half of the reporting intervention NHs, while physicians and nurse practitioners participated infrequently. Concurrent training programs in dementia care (Hand-in-Hand, Alzheimer Association training, MassPRO dementia care training) were implemented in 67.2% of intervention NHs.

In the intervention NHs, the prevalence of antipsychotic prescribing decreased from 34.1% at baseline to 26.5% at the study end (7.6% absolute reduction, 22.3% relative reduction). In comparison, the prevalence of antipsychotic prescribing in control NHs decreased from 22.7% to 18.8% over the same period (3.9% absolute reduction, 17.2% relative reduction). During the OASIS implementation phase, the intervention NHs had a reduction in prevalence of antipsychotic use (–1.20% [95% confidence interval {CI}, –1.85% to –0.09% per quarter]) greater than that of the control NHs (–0.23% [95% CI, –0.47% to 0.01% per quarter]), resulting in a net OASIS influence of –0.97% (95% CI, –1.85% to –0.09% per quarter; P = 0.03). The antipsychotic use reduction observed in the implementation phase was not sustained in the maintenance phase (difference of 0.93%; 95% CI, –0.66% to 2.54%; P = 0.48). No increases in other psychotropic medication use (anxiolytics, antidepressants, hypnotics) or behavioral disturbances (physically abusive behavior, verbally abusive behavior, and rejecting care) were observed during the OASIS training and implementation phases.
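The absolute and relative reductions quoted above follow directly from the baseline and study-end prevalences; a brief sketch of the arithmetic:

```python
# Absolute reduction is in percentage points; relative reduction is the
# absolute change expressed as a percentage of the baseline prevalence.
def reductions(baseline, end):
    absolute = baseline - end
    relative = 100 * absolute / baseline
    return absolute, relative

# Figures reported in the study:
print(reductions(34.1, 26.5))  # intervention NHs: ~7.6 points, ~22.3%
print(reductions(22.7, 18.8))  # control NHs: ~3.9 points, ~17.2%
```

Note that the larger relative reduction in the intervention group partly reflects its higher baseline prevalence, which is why the adjusted per-quarter trend comparison, not the raw reductions, is the study's primary effect estimate.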

Conclusion. The OASIS communication training program reduced the prevalence of antipsychotic use in NHs during its implementation phase, but the effect was not sustained in the subsequent maintenance phase. The use of other psychotropic medications and behavioral disturbances did not increase during implementation of the OASIS program. The findings provide further support for nonpharmacologic programs to treat behavioral and psychological symptoms of dementia in older adults who reside in NHs.

Commentary

The use of both conventional and atypical antipsychotic medications is associated with a dose-related, approximately 2-fold increased risk of sudden cardiac death in older adults [1,2]. In 2006, the FDA issued a public health advisory stating that both conventional and atypical antipsychotic medications are associated with an increased risk of mortality in elderly patients treated for dementia-related psychosis. Despite this black box warning and growing recognition that antipsychotic medications are not indicated for the treatment of dementia-related psychosis, off-label use of antipsychotic medications to treat behavioral and psychological symptoms of dementia in older adults remains common practice in nursing homes [3]. Thus, there is an urgent need to develop and assess effective interventions that reduce antipsychotic medication prescribing in long-term care. To that end, the study reported by Tjia et al investigated the impact of the OASIS communication training program, a nonpharmacologic intervention, on the reduction of antipsychotic use in NHs.

This study was well designed and had a number of strengths. It utilized an interrupted time series model, one of the strongest quasi-experimental approaches because of its robustness to threats to internal validity, for evaluating the longitudinal effects of an intervention intended to improve the quality of medication use. Moreover, the study included a large sample size and comparison facilities from the same geographical areas (NHs in Massachusetts and New York State) that served as external controls. Several potential weaknesses were identified. Because facility-level aggregate data from NHs were used, individual-level (long-term care resident) characteristics were not accounted for in the analysis. In addition, while the post-intervention questionnaire response rate was 65.6% (61 of 93 intervention NHs), a higher response rate would have provided better characterization of the NH staff who participated in OASIS training, of the program completion rate, and of the competing dementia care training programs concurrently implemented in these NHs.

Several earlier studies, most using provider education methods, explored whether such interventions could curb antipsychotic use in NHs, with limited success. The largest successful intervention was reported by Meador et al [4], in which a focused provider education program produced a 23% relative reduction in antipsychotic medication use compared with control NHs. However, implementing that program was time- and resource-intensive, requiring geropsychiatry education for all physicians (45 to 60 min), nurse-educator in-service programs for NH staff (5 to 6 one-hour sessions), management specialist consultation for NH administrators (4 hr), and an evening meeting for the families of NH residents. The current study by Tjia et al, the largest to date and conducted in the context of competing dementia care training programs and increased awareness of the dangers of antipsychotic use in the elderly, similarly showed a meaningful reduction in antipsychotic medication use in NHs that received the OASIS communication training program. The OASIS program appears less resource-intensive than the provider education program modeled by Meador et al, and its train-the-trainer model is likely more adaptable to the constraints (eg, low staffing and staff turnover) inherent in NHs. The beneficial effect of the OASIS program on antipsychotic prescribing was observed despite low participation by prescribers (11.5% of physicians and 11.5% of nurse practitioners). Although it is unclear why this was observed, the finding is intriguing in that a communication training program that (1) reframes challenging behavior of NH residents with cognitive impairment as communication of unmet needs, (2) trains staff to anticipate resident needs, and (3) integrates resident strengths into daily care plans can alter provider prescribing behavior.
The implication is that provider practice in managing behavioral and psychological symptoms of dementia can be improved by optimizing communication training for NH staff. Taken together, this study adds to the evidence in favor of nonpharmacologic interventions to reduce antipsychotic use in long-term care.

Applications for Clinical Practice

OASIS, a communication training program for NH staff, reduced antipsychotic medication use in NHs during its implementation phase. Future studies should investigate pragmatic methods to sustain the beneficial effect of OASIS after the implementation phase.

 

—Fred Ko, MD, MS, Icahn School of Medicine at Mount Sinai, New York, NY

References

1. Ray WA, Chung CP, Murray KT, et al. Atypical antipsychotic drugs and the risk of sudden cardiac death. N Engl J Med 2009;360:225–35.

2. Wang PS, Schneeweiss S, Avorn J, et al. Risk of death in elderly users of conventional vs. atypical antipsychotic medications. N Engl J Med 2005;353:2335–41.

3. Chen Y, Briesacher BA, Field TS, et al. Unexplained variation across US nursing homes in antipsychotic prescribing rates. Arch Intern Med 2010;170:89–95.

4. Meador KG, Taylor JA, Thapa PB, et al. Predictors of antipsychotic withdrawal or dose reduction in a randomized controlled trial of provider education. J Am Geriatr Soc 1997;45:207–10.

Issue
Journal of Clinical Outcomes Management - August 2017, Vol. 24, No 8
Display Headline
Implementation of a Communication Training Program Is Associated with Reduction of Antipsychotic Medication Use in Nursing Homes