Official Newspaper of the American College of Surgeons
Should clopidogrel be discontinued prior to open vascular procedures?
The continued use of perioperative clopidogrel is appropriate
Surgeons have always worried about bleeding risks for the procedures we perform. Complex vascular procedures are further complicated by the myriad of available antiplatelet agents, which are designed to reduce ischemic events from cardiovascular disease at the expense of potential bleeding complications if the medications are continued. Rather than relying on anecdotal reports and historical vignettes, let’s look at the evidence.
There is probably no other drug in our vascular toolbox that has been studied more in the last 20 years than clopidogrel. Multiple randomized, double-blind studies such as CASPAR1 and CHARISMA2 have amplified what has been known since the early CAPRIE trial in the 1990s: clopidogrel is safe when used as a single medication or as a dual agent with aspirin (dual antiplatelet therapy [DAPT]).
But not all our patients need DAPT. A large meta-analysis of six primary prevention trials encompassing more than 95,000 patients found no level 1 evidence supporting any antiplatelet therapy for the primary prevention of cardiovascular events in patients deemed at low or moderate risk of cardiovascular disease.3
If our patients do present with vascular disease, current ACCP guidelines recommend, with grade 1A evidence, single-agent antiplatelet medication (either ASA or clopidogrel) for symptomatic peripheral arterial disease (PAD), whether lower extremity revascularization is planned via bypass or by endovascular means.4 This works fine for single-focus vascular disease; each antiplatelet agent has its proponents, but either works well.
That’s great, but what about all those sick cardiac patients we see the most of? First, a CHARISMA subgroup analysis of patients with preexisting coronary and/or cerebrovascular disease demonstrated a 7.1% risk reduction in MI, cerebrovascular events, and cardiac ischemic deaths with DAPT over aspirin alone, and a similar risk reduction was found in PAD patients for the endpoints of MI and ischemic cardiovascular events. Second, there was no significant difference in severe, fatal, or moderate bleeding between DAPT and aspirin alone; only minor bleeding was increased with DAPT. Third, real-life practice echoes the trial experience, as the Vascular Study Group of New England confirmed in a review of 16 centers, 66 surgeons, and more than 10,000 patients, of whom approximately 39% underwent major aortic or lower extremity bypass operations.
No statistical difference could be found between DAPT and aspirin alone for reoperation (P = .74) or transfusion (P = .1), regardless of operative type.5 This finding was echoed by Saadeh and Sfeir, whose prospective study of 647 major arterial procedures over 7 years found no significant difference in reoperation for bleeding or bleeding-related mortality between DAPT and aspirin alone.6
So can we stop bashing clopidogrel as an evil agent of bleeding, as Dr. Dalsing wishes to do? After all, he is on record as stating, “I don’t know if our bleeding risk is worse or better … something we have to do to keep our grafts going.” The evidence tells us that the benefits of continuing DAPT, seen in the risk reduction in primary cardiovascular outcomes, far outweigh the risk of minor bleeding associated with continued use.
Let the science dictate practice. Patients at low or moderate risk for cardiovascular disease need no antiplatelet medication unless undergoing PAD treatment, where a single agent, either aspirin or clopidogrel alone, is sufficient. In patients with a large cardiovascular disease burden, the combination of aspirin and clopidogrel confers a survival benefit and reduces ischemic events without a significant risk of reoperation, transfusion, or bleeding-related mortality. As many of our patients require DAPT for drug-eluting coronary stents, withholding clopidogrel preoperatively increases overall risk beyond acceptable limits. Improving surgical skills and paying attention to hemostasis during the operation will allow naysayers to achieve improved patient survival without fear of bleeding while continuing best medical therapy such as DAPT.
Gary Lemmon, MD, is professor of vascular surgery at Indiana University, Indianapolis, and chief, vascular surgery, Indianapolis VA Medical Center. He reported no relevant conflicts.
References
1. J Vasc Surg. 2010;52:825-33
2. Eur Heart J. 2009;30:192-201
3. Lancet. 2009;373:1849-60
4. Chest. 2012;141:e669s-90s
5. J Vasc Surg. 2011;54:779-84
6. J Vasc Surg. 2013;58:1586-92
The continued use of perioperative clopidogrel is debatable!
There are cases in which clopidogrel should not be discontinued for a needed vascular intervention. If your patient has recently received a coronary stent, delaying the operation or maintaining clopidogrel through it is warranted unless you are willing to accept an acute coronary thrombosis.
However, in other cases, for example infrainguinal grafts, the risk of increased bleeding when adding clopidogrel to aspirin may outweigh the potential improvement in graft patency. This is especially true of below-knee vein bypass grafts, where the data do not support improved patency. In the CASPAR trial, clopidogrel did appear to benefit prosthetic graft patency, but only in subgroup analysis.1
It is true that severe bleeding was not increased (intracranial hemorrhage or hemodynamic compromise: 1% vs. 2.7%, P = NS), but moderate bleeding (transfusion required: 0.7% vs. 3.7%, P = .012) and mild bleeding (5.4% vs. 12.1%, P = .004) were increased when this agent was used, especially in vein graft surgery. This bleeding risk was present even when clopidogrel was begun 2 or more days after surgery.1
To complicate this decision, a Cochrane review did not consider the subgroup analysis statistically valid, and so its authors concluded that infrainguinal graft patency was not improved with clopidogrel while bleeding risk was increased. One might even question the use of acetylsalicylic acid (ASA) for vein graft bypasses based on the results of this meta-analysis.2

Carotid endarterectomy is a common vascular surgery procedure in which antiplatelet use has been evaluated in the real-world setting and with large cohorts. As is always the case when dealing with patient issues, the addition of one agent does not tell the entire story, and patient demographics can have a significant influence on the outcome. A report from the Vascular Quality Initiative (VQI) database controlled for patient differences by propensity matching, with more than 4,500 patients in each of the two groups (ASA vs. ASA + clopidogrel), and demonstrated that major bleeding, defined as return to the OR for bleeding, was statistically more common with dual therapy (1.3% vs. 0.7%, P = .004).3
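For readers who want to see where a P value like that comes from, the difference can be sanity-checked with a standard two-proportion z-test. The sketch below is illustrative only: it assumes round group sizes of 4,500 per arm (the report says only “more than 4,500”), so the event counts are approximations reconstructed from the quoted percentages, not the study’s actual data.

```python
from math import erf, sqrt

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF (via the error function)
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p

# Assumed counts: ~1.3% of 4,500 on dual therapy vs. ~0.7% of 4,500 on ASA alone
z, p = two_proportion_z(59, 4500, 32, 4500)
print(f"z = {z:.2f}, p = {p:.4f}")  # p lands near the reported .004
```

With these assumed counts the test reproduces a P value in the neighborhood of the published .004, which is a useful check that the reported difference is not an artifact of rounding the percentages.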
The addition of clopidogrel did statistically decrease the risk of ipsilateral TIA or stroke (0.8% vs. 1.2%, P = .02) but not the risk of death (0.2% vs. 0.3%, P = .3) or postoperative MI (1% vs. 0.8%, P = .4). Reoperation for bleeding is not inconsequential: patients requiring this intervention had significantly worse outcomes with regard to stroke (3.7% vs. 0.8%, P = .001), MI (6.2% vs. 0.8%, P = .001), and death (2.5% vs. 0.2%, P = .001). A further propensity score–matched analysis stratified by symptom status (asymptomatic vs. symptomatic) was quite interesting: only in asymptomatic patients did the addition of clopidogrel demonstrate a statistically significant reduction in TIA or stroke, any stroke, or composite stroke/death. Symptomatic patients taking dual therapy demonstrated a slight reduction in TIA or stroke (1.4% vs. 1.7%, P = .6), any stroke (1.1% vs. 1.2%, P = .9), and composite stroke/death (1.2% vs. 1.5%, P = .5), but in no instance was statistical significance reached. The use of protamine did help decrease the risk of bleeding.
Regarding the use of dual therapy during open aortic operations, an earlier report from the VQI database demonstrated no statistically significant difference in bleeding risk, but if one delves deeper, the data indicate something different. In the majority of cases, vascular surgeons do not feel comfortable performing this extensive dissection on dual therapy: of the cases reported, 1,074 were performed either free of both drugs or on ASA alone, while 42 were on dual therapy and only 12 on clopidogrel alone. In fact, in their conclusions, the authors note that they do not believe conclusions regarding clopidogrel use in patients undergoing open abdominal aortic aneurysm repair can be drawn from their results, since the potential for a type II error was too great.4
It may be that our current level of sophistication is not sufficiently mature to determine the actual effect that clopidogrel is having on our patients. Clopidogrel, a thienopyridine, inhibits platelet activation by blocking ADP binding at the P2Y12 receptor. Over 85% of the ingested drug is metabolized into inactive metabolites, while 15% is metabolized by the liver, via a two-step oxidative process, into the active thiol metabolite. Interindividual variability in the antiplatelet response to thienopyridines has been noted and is partially caused by genetic variants of the CYP isoenzymes. Platelet reactivity testing is possible, but most of the work has been conducted in patients requiring coronary artery revascularization. Results of tailoring intervention to maximize therapeutic benefit and decrease the risk of bleeding have been inconsistent but, in some studies, appear promising.5 Such testing may ultimately prove useful for determining how effective clopidogrel actually is in a particular case, with some insight into the bleeding risk as well. With this determination, the decision whether to hold clopidogrel perioperatively can be made with some science behind it.
Clearly, a blanket statement that the risk of bleeding should be accepted or ignored because of the demonstrated benefits of clopidogrel in patients requiring vascular surgery is not accurate. In some cases there is no clear benefit, so eliminating the bleeding risk may well be the appropriate decision. The astute vascular surgeon reads the details of the written word in order to make an educated decision, and recognizes that new information, such as platelet reactivity testing, may provide more clarity to such decisions in the future.
Michael C. Dalsing, MD, is chief of vascular surgery at Indiana University, Indianapolis. He reported no relevant conflicts.
References
1. J Vasc Surg. 2010;52:825-33
2. Cochrane Database Syst Rev. 2015, Issue 2. Art. No.: CD000535
Study: One hour with patients means two hours on EHR
Physicians are spending twice as much time on electronic health records as they are face to face with patients, according to a new study by the American Medical Association.
Researchers observed 57 physicians in four specialties (family medicine, internal medicine, cardiology, and orthopedics) and found that for every hour of direct clinical face time with patients, nearly 2 additional hours are spent on EHR and desk work within the clinic day. Additionally, based on diaries kept by 21 of the participating physicians, another 1-2 hours of personal time were spent each night doing additional computer and clerical work, according to the study published Sept. 5 in Annals of Internal Medicine (2016. doi: 10.7326/M16-0961).
“Over the years, doctors have recognized that more and more of their time was spent on nonpatient care activities, but probably haven’t recognized the magnitude of that change,” Christine Sinsky, MD, vice president of professional satisfaction at the AMA and lead author on the study, said in an interview. “Our study was able to help to quantify that and paint that picture.”
Overall, physicians spent 27% of their day dealing directly with patients, while 49% of the time was spent on EHR and desk work. In the examination room with patients, physicians spent 53% of time on direct clinical face time and 37% on EHR and desk work.
The situation “is the cumulative effect of many, many well-intended efforts that individually might have made sense, but taken collectively have paradoxically made it harder for physicians to deliver quality of care and harder for patients to get the quality of care they deserve,” she said.
EHR development should be focused on reducing the time-cost of providing care on their platforms, Dr. Sinsky recommended.
She noted that for her practice, it takes 32 clicks to order and record a flu shot. “I think vendors have a responsibility to minimize time, to minimize clicks involved in a task.”
She added that “regulators have a responsibility to not just add more and more regulations without first identifying the time-cost of complying with that regulation and without adding up the total cost of complying with regulation.”
Future regulations on EHRs must add flexibility when it comes to who is entering information into the system, she said. “Many regulations are either written with the explicit statement – or it is implied or an institution might overinterpret the regulation – that the physician is the one who must do the keyboarding into the record,” she said, noting that although not primarily studied in the research, preliminary data suggests that doctors who had documentation support were able to spend more time with their patients.
Finally, physicians themselves need to be stronger advocates for the changes they need to enable them to better serve their patients.
In addition to Dr. Sinsky, three other study authors are employed by AMA, which funded the study. No other financial conflicts were reported.
FROM ANNALS OF INTERNAL MEDICINE
10 tips to mitigate legal risks of opioid prescribing
Opioid-related lawsuits against physicians are on the rise. Common allegations include unnecessary prescribing, failing to heed contraindications, and missing warning signs of a likely overdose, said Ericka L. Adler, a Chicago-based health law attorney. To mitigate your risk of getting sued, legal and clinical experts offer the following advice.
1. Identify at-risk patients
Consider the full range of patient risk factors before prescribing or continuing opioids, said Ilene R. Robeck, MD, director of virtual pain education at Richmond VA Medical Center in St. Petersburg, Fla., and cochair of the National VA PACT Pain Champions Initiative.
“When we look at the overdose data, there used to be a perception that people who overdosed were not taking their medication as prescribed, and that’s not true all the time,” she said. “In fact, in some [studies], half the people who overdose take their medication exactly as prescribed. The problems are related to dose, mixing the opioids with other medications, [patient] age, and underlying medical problems.”
Ensure that therapy considerations related to opioids address the full patient picture, Dr. Robeck advised. For example, patients with liver disease, sleep apnea, chronic obstructive pulmonary disease (COPD), asthma, and kidney disease are more prone to overdose. In addition, while nonmedical use of prescription drugs is highest in patients aged 18-25 years, opioid overdose rates are highest among patients aged 25-54 years, according to the Centers for Disease Control and Prevention.
2. Monitor midlevel providers
Closely monitor and limit opioid prescribing by the midlevel providers you supervise, said pharmacologist and consultant James T. O’Donnell, PharmD. A fair share of medical malpractice lawsuits result from failing to supervise a physician assistant (PA) or nurse practitioner (NP) prescribing or treating pain patients.
“Excessive or inappropriate opiate prescribing will result in legal actions against the supervising physician,” Dr. O’Donnell said. “Close monitoring of midlevel providers requires establishing guidelines for opiate prescribing.”
Develop practice protocols that track and regulate midlevel prescribing and regularly discuss prescribing dangers with staff. Know your state law; the extent to which an NP or PA can prescribe varies widely.
3. Document
Keep detailed records of patient encounters that include specifics of what the patient tells you, said Ms. Adler.
Clear documentation about prior conditions, interactions with other health care providers, and past and current treatments helps protect the doctor should liability later arise. In the case of a dishonest patient, clear record keeping could help show that a patient lied or omitted facts if the notes later become evidence in a lawsuit, she said.
“I also think [doctors] should document their policies, so there is clarity and understanding in the relationship,” Ms. Adler said. “Showing a policy where national standards/recommendations are followed will help protect the practice.”
4. Restrict refills
Require prescriber review before patients who run out of medicine before their next appointment can obtain refills or new prescriptions, Dr. O’Donnell said.
“Excessive or early refills for opiate prescriptions are signs of abuse,” he said. “This creates risk to the patient and malpractice risk to the physician.”
It’s also helpful to limit the number of pharmacies used for opioid prescriptions, Ms. Adler said. This makes it easier to track medications and narrows the pathway between prescription and obtaining the drug.
5. Partner with pharmacists
Work closely with other health providers, such as psychiatrists, therapists, and pharmacists to ensure safe prescribing decisions. Pharmacists have a corresponding responsibility in dispensing opiates, Dr. O’Donnell said.
“Take the pharmacists’ calls regarding your opiates prescriptions,” he advised. “The pharmacist will know what other medicines the patient is taking and may advise of dangerous dosage or interactions.”
6. Require patient agreements
Opioid treatment agreements aid in patient accountability and promote education of drug risks, Dr. Robeck noted. In such contracts, patients agree to fully comply with all aspects of the treatment program and acknowledge that they will not use medication with harmful substances. Other terms can include that patients:
• Only obtain opioid prescriptions from one provider.
• Agree to keep all scheduled medical appointments.
• Promise to undergo urine drug screens as indicated.
• Agree not to share or sell medication.
• Agree not to drive or operate heavy machinery until medication-related drowsiness clears.
Contracts can help patients remain informed about the dangers and benefits of medications, while protecting the physician’s right to terminate treatment if the patient violates the agreement, Ms. Adler said.
7. Involve family members
Family members and caregivers are critical to a patient’s opioid therapy plan, Dr. Robeck said. Discuss with patients ahead of time the potential for family member involvement. Family or the patient’s support system should be educated about the patient’s medications, the risks, and how to respond in an emergency.
“It may be life saving,” Dr. Robeck said. “It’s very important for the physician to communicate with the family. There may be times you want the family to come [to appointments].”
Such communication can ensure that family members’ concerns about a patient are conveyed to physicians. Family and caregivers can also have a role in improving home conditions to assist with pain management for the patient, she said. Family education in using a naloxone rescue kit in the event of a possible overdose is essential.
“Giving the patient naloxone to prevent an overdose is more than just handing them the nasal spray or the self-injector, it’s also about educating them and the family about the true risks of these medications,” Dr. Robeck said.
8. Watch for red flags
Be cognizant of warning signs that patients may be addicted, Ms. Adler advised. Patients who demand medications, act impatiently about waiting for refills, or refuse to answer questions about their history should raise alarm bells, she said. Patients who travel long distances for pain medication also should raise questions, Dr. O’Donnell added.
“If the patient is in too much of a hurry to wait for records or to get a urine test, that’s a red flag,” Ms. Adler said. “When doctors are not sure, the safest bet is to refer to a pain specialist.”
Consider the criteria for opioid use disorder, Dr. Robeck noted. The condition is defined as a problematic pattern of opioid use leading to clinically significant impairment or distress. Signs of opioid use disorder include recurrent use by patients resulting in a failure to fulfill major role obligations at work or home, continued opioid use despite having persistent or recurrent social or interpersonal problems caused or exacerbated by opioids, and spending a great deal of time in activities necessary to obtain the opioid, use the opioid, or recover from use, according to the American Psychiatric Association.
9. Develop an exit plan
Before starting a patient on opioid therapy, have a plan in place in case something goes awry, Dr. Robeck said. Create an exit strategy that includes both pharmacological and nonpharmacological resources to draw from should problems arise. Make sure you have a plan for tapering patients off opioids when necessary. This may include getting help from other clinicians in the community, she said.
“Those patients need very careful follow-up,” Dr. Robeck said. “The rate of the taper needs to occur based on level of risk. Whenever possible, we try to taper patients slowly.”
10. Do your research
Always check your state’s prescription drug–monitoring program (PDMP) when prescribing an opiate to a new patient, Dr. O’Donnell advised.
Currently, 49 states and Guam have operational PDMP databases. The PDMP Training and Technical Assistance Center offers information about each PDMP, state pharmacy and practitioner data, drug schedules monitored, patient information data, and legislation dates and citations.
Perhaps most importantly, know best prescribing practices, Dr. Robeck said. Earlier this year, the CDC released guidelines for prescribing opioids for chronic pain.
“Thoroughly understand the CDC guidelines,” she said. “These are the keys to understanding where opioids fit into your plan of medications and nonpharmacological therapies for pain.”
On Twitter @legal_med
Opioid-related lawsuits against physicians are on the rise. Common allegations include unnecessary prescribing, failing to heed contraindications, and missing warning signs of a likely overdose, said Ericka L. Adler, a Chicago-based health law attorney. To mitigate your risk of getting sued, legal and clinical experts offer the following advice.
1. Identify at-risk patients
Consider the full range of patient risk factors before prescribing or continuing opioids, said Ilene R. Robeck, MD, director of virtual pain education at Richmond VA Medical Center in St. Petersburg, Fla., and cochair of the National VA PACT Pain Champions Initiative.
“When we look at the overdose data, there used to be a perception that people who overdosed were not taking their medication as prescribed, and that’s not true all the time,” she said. “In fact, in some [studies], half the people who overdose take their medication exactly as prescribed. The problems are related to dose, mixing the opioids with other medications, [patient] age, and underlying medical problems.”
Ensure that therapy considerations related to opioids address the full patient picture, Dr. Robeck advised. For example, patients with liver disease, sleep apnea, chronic obstructive pulmonary disease (COPD), asthma, and kidney disease are more prone to overdose. In addition, while nonmedical use of prescription drugs is highest in patients aged 18-25 years, opioid overdose rates are highest among patients aged 25-54 years, according to the Centers for Disease Control and Prevention.
2. Monitor midlevel providers
Closely monitor and limit opioid prescribing by the midlevel providers you supervise, said pharmacologist and consultant James T. O’Donnell, PharmD. A fair share of medical malpractice lawsuits result from failing to supervise a physician assistant (PA) or nurse practitioner (NP) prescribing or treating pain patients.
“Excessive or inappropriate opiate prescribing will result in legal actions against the supervising physician,” Dr. O’Donnell said. “Close monitoring of midlevel providers requires establishing guidelines for opiate prescribing.”
Develop practice protocols that track and regulate midlevel prescribing and regularly discuss prescribing dangers with staff. Know your state law; the extent to which an NP or PA can prescribe varies widely.
3. Document
Keep detailed records of patient encounters that include specifics of what the patient tells you, said Ms. Adler.
Clear documentation about prior conditions, interactions with other health care providers, and past and current treatments help protect the doctor should liability later arise. In the case of a dishonest patient, clear record keeping could help show that a patient lied or omitted facts if the notes later become evidence in a lawsuit, she said.
“I also think [doctors] should document their policies, so there is clarity and understanding in the relationship,” Ms. Adler said. “Showing a policy where national standards/recommendations are followed will help protect the practice.”
4. Restrict refills
Require prescriber review before patients can obtain refills or new prescriptions for patients who run out of medicine before their next appointment, Dr. O’Donnell said.
“Excessive or early refills for opiate prescriptions are signs of abuse,” he said. “This creates risk to the patient and malpractice risk to the physician.”
It’s also helpful to limit the number of pharmacies used for opioid prescriptions, Ms. Adler said. This makes it easier track medications and narrows the pathway between prescription and drug obtainment.
5. Partner with pharmacists
Work closely with other health providers, such as psychiatrists, therapists, and pharmacists to ensure safe prescribing decisions. Pharmacists have a corresponding responsibility in dispensing opiates, Dr. O’Donnell said.
“Take the pharmacists’ calls regarding your opiates prescriptions,” he advised. “The pharmacist will know what other medicines the patient is taking and may advise of dangerous dosage or interactions.”
6. Require patient agreements
Opioid treatment agreements aid in patient accountability and promote education of drug risks, Dr. Robeck noted. In such contracts, patients agree to fully comply with all aspects of the treatment program and acknowledge that they will not use medication with harmful substances. Other terms can include that patients:
• Only obtain opioid prescriptions from one provider.
• Agree to keep all scheduled medical appointments.
• Promise to undergo urine drug screens as indicated.
• Agree not to share or sell medication.
• Agree not to drive or operate heavy machinery until medication-related drowsiness clears.
Contracts can help patients remain informed about the dangers and benefits of medications, while protecting the physician’s right to terminate treatment if the patient violates the agreement, Ms. Adler said. One sample agreement can be found here.
7. Involve family members
Family members and caregivers are critical to a patient’s opioid therapy plan, Dr. Robeck said. Discuss with patients ahead of time the potential for family member involvement. Family or the patient’s support system should be educated about the patient’s medications, the risks, and how to respond in an emergency.
“It may be life saving,” Dr. Robeck said. “It’s very important for the physician to communicate with the family. There may be times you want the family to come [to appointments].”
Such communication can ensure that family members’ concerns about a patient are conveyed to physicians. Family and caregivers can also have a role in improving home conditions to assist with pain management for the patient, she said. Family education in using a naloxone rescue kit in the event of a possible overdose is essential.
“Giving the patient naloxone to prevent an overdose is more than just handing them the nasal spray or the self-injector, it’s also about educating them and the family about the true risks of these medications,” Dr. Robeck said.
8. Watch for red flags
Be cognizant of warning signs that patients may be addicted, Ms. Adler advised. Patients who demand medications, act impatiently about waiting for refills, or refuse to answer questions about their history should raise alarm bells, she said. Patients who travel long distances for pain medication also should raise question marks, Dr. O’Donnell adds.
“If the patient is in too much of a hurry to wait for records or to get a urine test, that’s a red flag,” Ms. Adler said. “When doctors are not sure, the safest bet is to refer to a pain specialist.”
Consider the criteria for opioid use disorder, Dr. Robeck noted. The condition is defined as a problematic pattern of opioid use leading to clinically significant impairment or distress. Signs of opioid use disorder include recurrent use by patients resulting in a failure to fulfill major role obligations at work or home, continued opioid use despite having persistent or recurrent social or interpersonal problems caused or exacerbated by opioids, and spending a great deal of time in activities necessary to obtain the opioid, use the opioid, or recover from use, according to the American Psychiatric Association.
9. Develop an exit plan
Before starting a patient on opioid therapy, have a plan in place in case something goes awry, Dr. Robeck said. Create an exit strategy that includes both pharmacological and nonpharmacological resources from which to draw from should problems arise. Make sure you have a plan for tapering patients off opioids when necessary. This may include getting help from other clinicians in the community, she said.
“Those patients need very careful follow-up,” Dr. Robeck. “The rate of the taper needs to occur based on level of risk. Whenever possible, we try to taper patients slowly.”
10. Do your research
Always check your state’s prescription drug–monitoring program (PDMP) when prescribing an opiate to a new patient, Mr. O’Donnell advised.
Currently, 49 states and Guam have operational PDMP databases. The PDMP Training and Technical Assistance Center offers information about each PDMP, state pharmacy and practitioner data, drug schedules monitored, patient information data, and legislation dates and citations.
Perhaps most importantly, know best prescribing practices, Dr. Robeck said. Earlier this year, the CDC released guidelines for prescribing opioids for chronic pain.
“Thoroughly understand the CDC guidelines,” she said. “These are the keys to understanding where opioids fit into your plan of medications and nonpharmacological therapies for pain.”
On Twitter @legal_med
Opioid-related lawsuits against physicians are on the rise. Common allegations include unnecessary prescribing, failing to heed contraindications, and missing warning signs of a likely overdose, said Ericka L. Adler, a Chicago-based health law attorney. To mitigate your risk of getting sued, legal and clinical experts offer the following advice.
1. Identify at-risk patients
Consider the full range of patient risk factors before prescribing or continuing opioids, said Ilene R. Robeck, MD, director of virtual pain education at Richmond VA Medical Center in St. Petersburg, Fla., and cochair of the National VA PACT Pain Champions Initiative.
“When we look at the overdose data, there used to be a perception that people who overdosed were not taking their medication as prescribed, and that’s not true all the time,” she said. “In fact, in some [studies], half the people who overdose take their medication exactly as prescribed. The problems are related to dose, mixing the opioids with other medications, [patient] age, and underlying medical problems.”
Ensure that therapy considerations related to opioids address the full patient picture, Dr. Robeck advised. For example, patients with liver disease, sleep apnea, chronic obstructive pulmonary disease (COPD), asthma, and kidney disease are more prone to overdose. In addition, while nonmedical use of prescription drugs is highest in patients aged 18-25 years, opioid overdose rates are highest among patients aged 25-54 years, according to the Centers for Disease Control and Prevention.
2. Monitor midlevel providers
Closely monitor and limit opioid prescribing by the midlevel providers you supervise, said pharmacologist and consultant James T. O’Donnell, PharmD. A fair share of medical malpractice lawsuits result from failing to supervise a physician assistant (PA) or nurse practitioner (NP) prescribing or treating pain patients.
“Excessive or inappropriate opiate prescribing will result in legal actions against the supervising physician,” Dr. O’Donnell said. “Close monitoring of midlevel providers requires establishing guidelines for opiate prescribing.”
Develop practice protocols that track and regulate midlevel prescribing and regularly discuss prescribing dangers with staff. Know your state law; the extent to which an NP or PA can prescribe varies widely.
3. Document
Keep detailed records of patient encounters that include specifics of what the patient tells you, said Ms. Adler.
Clear documentation about prior conditions, interactions with other health care providers, and past and current treatments help protect the doctor should liability later arise. In the case of a dishonest patient, clear record keeping could help show that a patient lied or omitted facts if the notes later become evidence in a lawsuit, she said.
“I also think [doctors] should document their policies, so there is clarity and understanding in the relationship,” Ms. Adler said. “Showing a policy where national standards/recommendations are followed will help protect the practice.”
4. Restrict refills
Require prescriber review before patients can obtain refills or new prescriptions for patients who run out of medicine before their next appointment, Dr. O’Donnell said.
“Excessive or early refills for opiate prescriptions are signs of abuse,” he said. “This creates risk to the patient and malpractice risk to the physician.”
It’s also helpful to limit the number of pharmacies used for opioid prescriptions, Ms. Adler said. This makes it easier track medications and narrows the pathway between prescription and drug obtainment.
5. Partner with pharmacists
Work closely with other health providers, such as psychiatrists, therapists, and pharmacists to ensure safe prescribing decisions. Pharmacists have a corresponding responsibility in dispensing opiates, Dr. O’Donnell said.
“Take the pharmacists’ calls regarding your opiates prescriptions,” he advised. “The pharmacist will know what other medicines the patient is taking and may advise of dangerous dosage or interactions.”
6. Require patient agreements
Opioid treatment agreements aid in patient accountability and promote education of drug risks, Dr. Robeck noted. In such contracts, patients agree to fully comply with all aspects of the treatment program and acknowledge that they will not use medication with harmful substances. Other terms can include that patients:
• Only obtain opioid prescriptions from one provider.
• Agree to keep all scheduled medical appointments.
• Promise to undergo urine drug screens as indicated.
• Agree not to share or sell medication.
• Agree not to drive or operate heavy machinery until medication-related drowsiness clears.
Contracts can help patients remain informed about the dangers and benefits of medications, while protecting the physician’s right to terminate treatment if the patient violates the agreement, Ms. Adler said. One sample agreement can be found here.
7. Involve family members
Family members and caregivers are critical to a patient’s opioid therapy plan, Dr. Robeck said. Discuss with patients ahead of time the potential for family member involvement. Family or the patient’s support system should be educated about the patient’s medications, the risks, and how to respond in an emergency.
“It may be life saving,” Dr. Robeck said. “It’s very important for the physician to communicate with the family. There may be times you want the family to come [to appointments].”
Such communication can ensure that family members’ concerns about a patient are conveyed to physicians. Family and caregivers can also have a role in improving home conditions to assist with pain management for the patient, she said. Family education in using a naloxone rescue kit in the event of a possible overdose is essential.
“Giving the patient naloxone to prevent an overdose is more than just handing them the nasal spray or the self-injector, it’s also about educating them and the family about the true risks of these medications,” Dr. Robeck said.
8. Watch for red flags
Be cognizant of warning signs that patients may be addicted, Ms. Adler advised. Patients who demand medications, act impatiently about waiting for refills, or refuse to answer questions about their history should raise alarm bells, she said. Patients who travel long distances for pain medication should also raise questions, Dr. O’Donnell added.
“If the patient is in too much of a hurry to wait for records or to get a urine test, that’s a red flag,” Ms. Adler said. “When doctors are not sure, the safest bet is to refer to a pain specialist.”
Consider the criteria for opioid use disorder, Dr. Robeck noted. The condition is defined as a problematic pattern of opioid use leading to clinically significant impairment or distress. Signs of opioid use disorder include recurrent use by patients resulting in a failure to fulfill major role obligations at work or home, continued opioid use despite having persistent or recurrent social or interpersonal problems caused or exacerbated by opioids, and spending a great deal of time in activities necessary to obtain the opioid, use the opioid, or recover from use, according to the American Psychiatric Association.
9. Develop an exit plan
Before starting a patient on opioid therapy, have a plan in place in case something goes awry, Dr. Robeck said. Create an exit strategy that includes both pharmacological and nonpharmacological resources to draw from should problems arise. Make sure you have a plan for tapering patients off opioids when necessary. This may include getting help from other clinicians in the community, she said.
“Those patients need very careful follow-up,” Dr. Robeck said. “The rate of the taper needs to occur based on level of risk. Whenever possible, we try to taper patients slowly.”
10. Do your research
Always check your state’s prescription drug–monitoring program (PDMP) when prescribing an opiate to a new patient, Dr. O’Donnell advised.
Currently, 49 states and Guam have operational PDMP databases. The PDMP Training and Technical Assistance Center offers information about each PDMP, state pharmacy and practitioner data, drug schedules monitored, patient information data, and legislation dates and citations.
Perhaps most importantly, know best prescribing practices, Dr. Robeck said. Earlier this year, the CDC released guidelines for prescribing opioids for chronic pain.
“Thoroughly understand the CDC guidelines,” she said. “These are the keys to understanding where opioids fit into your plan of medications and nonpharmacological therapies for pain.”
CMS offers lower-stress reporting options for MACRA in 2017
Physicians will have options for when they can start meeting the requirements for the Merit-based Incentive Payment System (MIPS) track under regulations that implement the Medicare Access and CHIP Reauthorization Act.
The options are designed to allow physicians a variety of ways to get started with the new Quality Payment Program – the term CMS has given the MACRA-legislated reforms – and provide more limited ways to participate in 2017.
Option 1: Test the quality payment program in 2017 by submitting data without facing any negative payment adjustments. This will give physicians the year to make sure their processes are in place and ready for broader participation in 2018 and beyond.
Option 2: Delay the start of the performance period and participate for just part of 2017. Depending on how long a physician delays reporting quality information back to CMS, they could still qualify for a smaller bonus payment.
Option 3: Participate for the entire calendar year as called for by the law and be eligible for the full participation bonuses.
Option 4: For those who qualify, participate in an Advanced Alternative Payment Model (APM) beginning next year.
The final regulations for implementing MACRA will be released on Nov. 1, CMS Acting Administrator Andy Slavitt wrote in a blog post published Sept. 8.
“However you choose to participate in 2017, we will have resources available to assist you and walk you through what needs to be done,” Mr. Slavitt wrote.
ACOs score slight bump in bonus payments
Accountable care organizations participating in the Medicare Shared Savings Program generated $466 million in savings in 2015, up from $411 million in 2014, the Centers for Medicare & Medicaid Services announced.
Despite the growth in savings, there was little growth in the number of ACOs that qualified for bonus payments based on the savings they were able to generate.
Of 392 participants in the Medicare Shared Savings Program and 12 Pioneer ACO Model participants, 31% (125) received bonus payments in 2015, compared with 27% (97 organizations from a pool of 20 Pioneer ACOs and 333 Shared Savings Program participants) in 2014, according to a CMS report.
The agency noted that another 83 ACOs in the Shared Savings Program and two Pioneer ACOs generated savings in 2015 but did not qualify for bonus payments. Of the four Pioneer ACOs that recorded losses, only one incurred losses great enough to require payment to CMS.
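The bonus-payment rates above follow directly from the participant counts CMS reported; as a quick arithmetic check (a minimal sketch using only figures from this article):

```python
def bonus_rate(winners, participants):
    """Share of participating ACOs that earned a bonus, as a rounded percent."""
    return round(100 * winners / participants)

# 2015: 392 Shared Savings Program ACOs + 12 Pioneer ACOs, 125 bonus earners
rate_2015 = bonus_rate(125, 392 + 12)
# 2014: 333 Shared Savings Program ACOs + 20 Pioneer ACOs, 97 bonus earners
rate_2014 = bonus_rate(97, 333 + 20)
print(rate_2015, rate_2014)  # 31 27, matching the percentages reported
```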
On the quality side, the mean quality score among Pioneer ACOs increased to 92% in 2015, the fourth year of the program, up from 87% in 2014. Quality scores have risen each year, with a growth of 21% from the first year.
Participants in the Shared Savings Program that reported quality measures in both 2014 and 2015 improved on 84% of the quality measures that were reported in both years. In four measures – screening risk for future falls, depression screening and follow-up, blood pressure screening and follow-up, and administering pneumonia vaccine – the average quality performance improvement was more than 15% year-over-year.
The National Association of ACOs said it was “disappointed” in the small bump in financial bonuses.
“The results are not as strong as we, and many of our ACO members, had hoped for,” NAACOS President and CEO Clif Gaus, ScD, said in a statement. “But overall, we are pleased to see the results show a positive trend for the program,” noting that despite being only a few years old, the participating ACOs “have accomplished a lot to reduce cost and improve quality.”
Better survival with primary surgery in stage IIIC ovarian ca
Although the use of neoadjuvant chemotherapy for treatment of women with advanced ovarian cancer has grown significantly in recent years, a new study shows that it is associated with worse overall survival for women with stage IIIC disease, compared with primary cytoreductive surgery.
Among 594 women with advanced ovarian cancer treated at one of six major comprehensive cancer centers, median overall survival (OS) for women with stage IIIC cancers treated with neoadjuvant chemotherapy (NACT) was 33 months, compared with 43 months for women treated with primary cytoreductive surgery (PCS), reported Larissa A. Meyer, MD, of the University of Texas M.D. Anderson Cancer Center in Houston, and her colleagues.
There were no significant survival differences between chemotherapy and surgery for women with stage IV disease, however, and for these patients neoadjuvant chemotherapy was associated with fewer morbidities, and may be a better therapeutic option, the investigators reported.
“Although additional biases may persist despite propensity-score matching, our results suggest that in carefully selected patients with stage IIIC disease, PCS is associated with a survival advantage, with overall low rates of surgical morbidity. In contrast, for patients with stage IV disease, our results confirm that NACT is noninferior to PCS for survival, with fewer ICU admissions and rehospitalizations, which suggests that NACT may be preferable for patients with stage IV ovarian cancer,” they wrote in the Journal of Clinical Oncology (2016. doi: 10.1200/JCO.2016.68.1239).
The increase in the use of NACT in women with advanced ovarian cancer in the United States was spurred by two randomized clinical trials, the investigators noted. The first, published in 2010, showed that survival was similar for women with stage IIIC or IV ovarian cancer treated with either neoadjuvant chemotherapy followed by interval debulking surgery or with primary surgery followed by chemotherapy. The second study, published in 2015, found that in women with stage III or IV ovarian cancer, survival with primary chemotherapy was noninferior to primary surgery; in this study population, the researchers stated that “giving primary chemotherapy before surgery is an acceptable standard of care for women with advanced ovarian cancer.”
To see what effect these trials had on clinical practice and outcomes in the United States, the authors conducted an observational study of patients treated at six National Cancer Institute–designated cancer centers, looking at NACT use in 1,538 women diagnosed with ovarian cancer from 2003 through 2012, and at OS, morbidity, and postoperative residual disease in a propensity score–matched sample of 594 patients.
They found that for women with stage IIIC disease, NACT use increased from 16% during the period 2003-2010, to 34% during 2011-2012. For women with stage IV disease, NACT use grew from 41% to 62% during the respective time periods (P for trend for both comparisons = .001).
As noted before, median overall survival among women with stage IIIC disease in the propensity score–matched sample was significantly shorter for those treated with primary NACT vs. PCS.
For women with stage IV disease, however, there was no significant difference in OS between those treated with NACT (median 31 months) vs. those treated with PCS (median 36 months, hazard ratio 1.16, not significant).
Women with stages IIIC and IV disease who received NACT were less likely to have one or more centimeters of residual disease postoperatively and were less likely to have an ICU admission or rehospitalization (P for all comparisons = .04). However, overall survival was lower among women with stage IIIC disease who had only microscopic residual disease or residual disease measuring 1 cm or less (HR, 1.49; P = .04).
“Future studies should prospectively consider the efficacy of NACT by extent of residual disease in unselected patients,” the authors recommended.
The study was supported by grants from the National Cancer Institute and Cancer Prevention and Research Institute of Texas. Dr. Meyer and multiple coauthors disclosed honoraria, research funding, and/or advising/consulting with various pharmaceutical companies.
FROM JOURNAL OF CLINICAL ONCOLOGY
Key clinical point: Neoadjuvant chemotherapy was associated with lower overall survival of stage IIIC but not stage IV ovarian cancer.
Major finding: Median OS was 33 months with neoadjuvant chemotherapy vs. 43 months with primary cytoreductive surgery.
Data source: Observational study of 1,538 patients with ovarian cancer, and propensity score–matched sample of 594 patients for clinical outcomes.
Disclosures: The study was supported by grants from the National Cancer Institute and Cancer Prevention and Research Institute of Texas. Larissa A. Meyer and multiple coauthors disclosed honoraria, research funding, and/or advising/consulting with various pharmaceutical companies.
The new NOACs are generally the best bet
New NOACs have largely replaced the need for vitamin K antagonists
The discovery of oral anticoagulants began in 1924, when Schofield linked the death of grazing cattle from internal hemorrhage to the consumption of spoiled sweet clover hay.1 It was not until 1941, however, that Campbell and Link, while investigating this observation, identified dicoumarol, the anticoagulant formed during the spoiling process.2 Ultimately, after noting that vitamin K reversed the effect of dicoumarol, synthesis of the first class of oral anticoagulants, known as vitamin K antagonists (VKAs), began. Despite the numerous challenges associated with managing patients on this class of anticoagulants, VKAs have been the mainstay of oral anticoagulation therapy for the past 70 years. Over the past 5 years, however, new oral anticoagulants (NOACs) have emerged and are changing clinical practice. Mechanistically, these medications are targeted therapies and work as either direct thrombin inhibitors (dabigatran etexilate) or direct factor Xa inhibitors (rivaroxaban, apixaban, and edoxaban). Given their favorable pharmacologic design, NOACs have the potential to replace VKAs, as they not only have an encouraging safety profile but also are therapeutically equivalent or even superior to VKAs when used in certain patient populations.
Pharmacologic design
The targeted drug design of NOACs provides many pharmacologic advantages. Compared with VKAs, NOACs have a notably more predictable pharmacologic profile and relatively wide therapeutic window, which allows for fixed dosing, a rapid onset and offset, and fewer drug interactions.3 These characteristics eliminate the need for the routine dose monitoring and serial dose adjustments frequently associated with VKAs. Additionally, NOACs less commonly require bridging therapy with parenteral unfractionated heparin or low molecular weight heparins (LMWH) while awaiting therapeutic drug levels, as these levels are reached sooner and more predictably than with VKAs.4 As with any medication, however, appropriate consideration should be given to specific patient populations, such as those who are older or have significant comorbidities that may influence drug effect and clearance.
Lastly, it should be mentioned that the pharmacologic benefits of NOACs are not only beneficial from a patient perspective, but also from a health care systems standpoint as their use may provide an opportunity to deliver more cost-effective care. Specifically, economic models using available clinical trial data for stroke prevention in nonvalvular atrial fibrillation have shown that NOACs (apixaban, dabigatran, and rivaroxaban) are cost-effective alternatives when compared with warfarin.5 Although the results from such economic analyses are limited by the modeling assumptions they rely upon, these findings suggest that, at least initially, cost should not be used as a prohibitive reason for adopting these new therapeutics.
Patient selection
The decision to institute oral anticoagulation therapy depends on each patient’s individualized balance of bleeding risk against the benefit of ischemia prevention. A major determinant of this balance is the clinical indication for which anticoagulation is begun. Numerous phase III clinical trials have been conducted comparing the use of NOACs versus VKAs or placebos for the management of nonvalvular atrial fibrillation (AF), venous thromboembolism (VTE), and as adjunctive therapy for patients with acute coronary syndrome.6 Meta-analyses of randomized trials have shown the most significant benefit to be in patients with nonvalvular atrial fibrillation, in whom NOACs produce significant reductions in stroke, intracranial hemorrhage, and all-cause mortality compared with warfarin, while displaying variable effects with regard to gastrointestinal bleeding.6,7
In patients with VTE, NOACs have been found to have similar efficacy, compared with VKAs, with regard to the prevention of VTE or VTE-related death, and have been noted to have a better safety profile.6 Lastly, when studied as an adjunctive agent to dual antiplatelet therapy in patients with acute coronary syndrome, it should be noted that NOACs have been associated with an increased bleeding risk without a significant decrease in thrombosis risk.6 Taken together, these data suggest that the primary indication for instituting NOAC therapy should be considered strongly when deciding upon the class of anticoagulant to use.
Overcoming challenges
Since the introduction of NOACs, there has been concern over the lack of specific antidotes to therapy, especially when administered in patients with impaired clearance, a high likelihood of need for an urgent or emergent procedure, or those presenting with life-threatening bleeding complications. Most recently, however, interim analysis from clinical trial data has shown complete reversal of the direct thrombin inhibitor dabigatran with the humanized monoclonal antibody idarucizumab within minutes of administration in greater than 88% of patients studied.8 Similarly, agents such as PER977 are currently in phase II clinical trials, as they have been shown to form noncovalent hydrogen bonds and charge-charge interactions with oral factor Xa inhibitors as well as oral thrombin inhibitors, leading to their reversal.9 Given these promising findings, it likely will not be long until reversal agents for NOACs become clinically available. Until that time, it is encouraging that the bleeding profile of these drugs has been found to be favorable, compared with VKAs, and their short half-life allows for a relatively expeditious natural reversal of their anticoagulant effect as the drug is eliminated.
Conclusions
Unlike the serendipitous path that led to the discovery of the first class of oral anticoagulants (VKAs), NOACs have been specifically designed to provide targeted anticoagulation and to address the shortcomings of VKAs. Accordingly, NOACs are becoming increasingly important in the management of specific clinical conditions, such as nonvalvular atrial fibrillation and venous thromboembolism, in which they provide a larger net clinical benefit than the available alternatives. Furthermore, with economic analyses indicating that NOACs are cost-effective for the health care system and clinical trial results suggesting progress in the development of reversal antidotes, it is likely that, with growing experience, these agents will replace VKAs as the mainstay of prophylactic and therapeutic oral anticoagulation in targeted patient populations.
Madhukar S. Patel, MD, and Elliot L. Chaikof, MD, are from the department of surgery, Beth Israel Deaconess Medical Center, Boston. They reported having no conflicts of interest.
References
1. J Am Vet Med Assoc 1924;64:553-575
3. Hematology Am Soc Hematol Educ Program 2013;2013:464-470
4. Eur Heart J 2013;34:2094-2106
6. Nat Rev Cardiol 2014;11:693-703
8. N Engl J Med 2015;373:511-520
9. N Engl J Med 2014;371:2141-2142
What the doctor didn’t order: unintended consequences and pitfalls of NOACs
Recently, several new oral anticoagulants (NOACs) have gained FDA approval as alternatives to warfarin, capturing the attention of the popular media. These include dabigatran, rivaroxaban, apixaban, and edoxaban. Dabigatran targets activated factor II (factor IIa), while rivaroxaban, apixaban, and edoxaban target activated factor X (factor Xa). Taken as a once- or twice-daily pill with no cumbersome monitoring, they represent a seemingly ideal treatment for the chronically anticoagulated patient. All four agents are currently FDA approved in the United States for the treatment of acute VTE and AF.
Dabigatran and edoxaban
Similar to warfarin, dabigatran and edoxaban require an LMWH or UFH “bridge” when therapy is begun, while rivaroxaban and apixaban are instituted as monotherapy without such a bridge. Dabigatran etexilate (Pradaxa®, Boehringer Ingelheim) has the longest half-life of all the NOACs, at 12-17 hours, and this half-life is prolonged with increasing age and decreasing renal function.1 It is the only new agent that can be at least partially reversed with dialysis.2 Edoxaban (Savaysa®, Daiichi Sankyo) carries a boxed warning stating that it is less effective in AF patients with a creatinine clearance greater than 95 mL/min and that kidney function should be assessed before starting treatment: Such patients have a greater risk of stroke, compared with similar patients treated with warfarin. Edoxaban is the only agent specifically tested at a lower dose in patients at significantly increased risk of bleeding complications (low body weight and/or decreased creatinine clearance).3
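The practical significance of these half-life differences can be sketched with simple first-order elimination kinetics (an idealized model; the time points below are illustrative, not dosing guidance):

```python
def fraction_remaining(hours_elapsed: float, half_life_hours: float) -> float:
    """First-order elimination: fraction of drug remaining after a given time."""
    return 0.5 ** (hours_elapsed / half_life_hours)

# Dabigatran's half-life spans roughly 12-17 hours depending on age and renal
# function, so the residual drug 48 hours after the last dose varies severalfold:
for t_half in (12, 17):
    print(f"t1/2 = {t_half} h: {fraction_remaining(48, t_half):.1%} remaining")
```

With a 12-hour half-life, 48 hours is four half-lives (about 6% remaining); with a 17-hour half-life, appreciably more drug persists, which is why impaired clearance prolongs the anticoagulant effect.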
Rivaroxaban and apixaban
Rivaroxaban (Xarelto®, Bayer and Janssen) and apixaban (Eliquis®, Bristol-Myers Squibb), uniquely among the NOACs, have been tested for extended therapy of acute deep vein thrombosis after 6-12 months of initial treatment. They produced a significant decrease in recurrent VTE without an increase in major bleeding, compared with placebo.4,5 Rivaroxaban is dosed once daily and apixaban twice daily; both are given as immediate monotherapy, making them quite convenient for patients. Apixaban is the only NOAC associated with a slight decrease in gastrointestinal bleeding, compared with warfarin.6
Consequences and pitfalls with NOACs
Problems with these new drugs, which may temper enthusiasm for them to totally replace warfarin, include the inability to reliably measure their levels or reverse their anticoagulant effects, the lack of data on bridging when other procedures must be performed, their short half-lives, and the lack of data on their anti-inflammatory effects. With regard to monitoring of anticoagulation, the International Society on Thrombosis and Haemostasis (ISTH) has published the circumstances in which it may be useful to obtain drug levels. These include:
• When a patient is bleeding.
• Before surgery or an invasive procedure when the patient has taken the drug in the previous 24 hours, or longer if creatinine clearance (CrCl) is less than 50 mL/min.
• Identification of subtherapeutic or supratherapeutic levels in patients taking other drugs that are known to affect pharmacokinetics.
• Identification of subtherapeutic or supratherapeutic levels in patients at body weight extremes.
• Patients with deteriorating renal function.
• During perioperative management.
• During reversal of anticoagulation.
• When there is suspicion of overdose.
• Assessment of compliance in patients suffering thrombotic events while on treatment.7
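Several of these triggers hinge on creatinine clearance, which is commonly estimated at the bedside with the Cockcroft-Gault equation. A minimal sketch (the equation is standard; the example patient is hypothetical):

```python
def cockcroft_gault_crcl(age_years: float, weight_kg: float,
                         serum_creatinine_mg_dl: float, female: bool) -> float:
    """Estimate creatinine clearance (mL/min) with the Cockcroft-Gault equation."""
    crcl = ((140 - age_years) * weight_kg) / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl  # 0.85 correction factor for women

# Hypothetical 78-year-old, 60-kg woman with serum creatinine 1.4 mg/dL:
crcl = cockcroft_gault_crcl(78, 60, 1.4, female=True)
print(f"Estimated CrCl: {crcl:.0f} mL/min")  # falls below the 50 mL/min threshold
```

A result below 50 mL/min would extend the pre-procedure window during which drug levels are worth checking, per the ISTH list above.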
Currently, no commercially available reversal agent exists for any of the NOACs, and existing reversal agents for traditional anticoagulants are of limited, if any, use. Drugs under development include agents for the factor Xa inhibitors and for the thrombin inhibitor. Until specific reversal agents exist, supportive care is the mainstay of therapy. In cases of trauma or severe or life-threatening bleeding, administration of concentrated clotting factors (prothrombin complex concentrate) or dialysis (dabigatran only) may be utilized; however, data from large clinical trials are lacking. A recent study of 90 patients receiving an antibody directed against dabigatran showed that the anticoagulant effects of dabigatran were reversed safely within minutes of administration; however, drug levels were not consistently suppressed at 24 hours in 20% of the cohort.8
Currently there are no national guidelines or large-scale studies to guide bridging of NOACs for procedures. The relatively short half-life of these agents makes it likely that traditional bridging, as practiced for warfarin, is not necessary.9 However, this represents a double-edged sword; withholding anticoagulation for two doses (for example, if a patient becomes ill or a clinician is overly cautious around the time of a procedure) may leave the patient unprotected.
The final question with the new agents concerns their anti-inflammatory effects. We know that heparin and LMWH have significant pleiotropic effects that are not necessarily related to their anticoagulant effects; these effects help decrease the inflammatory nature of the thrombus and its impact on the vein wall. We do not know whether the new oral agents have similar effects, as this has never been fully tested. Given that two of the agents are used as monotherapy without any heparin/LMWH bridge, the anti-inflammatory properties of these new agents should be defined to make sure that such a bridge is not necessary.
In summary, although these agents have much to offer, many questions remain to be addressed before they totally replace traditional approaches to anticoagulation in the realm of VTE. It must not be overlooked that, despite all their benefits, each carries a risk of bleeding, as all target portions of the coagulation mechanism. We caution that, as with any “gift horse,” physicians should examine the data closely and proceed with caution.
Thomas Wakefield, MD, is the Stanley Professor of Vascular Surgery; head, section of vascular surgery; and director, Samuel and Jean Frankel Cardiovascular Center. Andrea Obi, MD, is a vascular surgery fellow, and Dawn Coleman, MD, is the program director, section of vascular surgery, all at the University of Michigan, Ann Arbor. They reported having no conflicts of interest.
References
1. N Engl J Med 2009;361:2342-2352
2. J Vasc Surg: Venous and Lymphatic Disorders 2013;1:418-426
3. N Engl J Med 2013;369:1406-1415
4. N Engl J Med 2010;363:2499-2510
5. N Engl J Med 2013;368:699-708
6. Arterioscler Thromb Vasc Biol 2015;35:1056-1065
7. J Thromb Haemost 2013;11:756-760
What the doctor didn’t order: unintended consequences and pitfalls of NOACs
Recently, several new oral anticoagulants (NOACs) have gained FDA approval to replace warfarin, capturing the attention of popular media. These include dabigatran, rivaroxaban, apixaban, and edoxaban. Dabigatran targets activated factor II (factor IIa), while rivaroxaban, apixaban, and edoxaban target activated factor X (factor Xa). Easy to take with a once or twice daily pill, with no cumbersome monitoring, they represent a seemingly ideal treatment for the chronically anticoagulated patient. All agents are currently FDA approved in the United States for treatment of acute VTE and AF.
Dabigatran and edoxaban
Similar to warfarin, dabigatran and edoxaban require the use of a LMWH or UFH “bridge” when therapy is beginning, while rivaroxaban and apixaban are instituted as monotherapy without such a bridge. Dabigatran etexilate (PradaxaR, Boehringer Ingelheim) has the longest half-life of all of the NOACs at 12-17 hours, and this half-life is prolonged with increasing age and decreasing renal function.1 It is the only new agent which can be at least partially reversed with dialysis.2 Edoxaban (SavaysaR, Daiichi Sankyo) carries a boxed warning stating that this agent is less effective in AF patients with a creatinine clearance greater than 95 mL/min, and that kidney function should be assessed prior to starting treatment: Such patients have a greater risk of stroke, compared with similar patients treated with warfarin. Edoxaban is the only agent specifically tested at a lower dose in patients at significantly increased risk of bleeding complications (low body weight and/or decreased creatinine clearance).3
Rivaroxaban and apixaban
Rivaroxaban (XareltoR, Bayer and Janssen), and apixaban (EliquisR, Bristol Myers-Squibb), unique amongst the NOACs, have been tested for extended therapy of acute deep vein thrombosis after treatment of 6-12 months. They were found to result in a significant decrease in recurrent VTE without an increase in major bleeding, compared with placebo.4,5 Rivaroxaban has once-daily dosing and apixaban has twice-daily dosing; both are immediate monotherapy, making them quite convenient for patients. Apixaban is the only agent among the NOACs to have a slight decrease in gastrointestinal bleeding, compared with warfarin.6
Consequences and pitfalls with NOACs
Problems with these new drugs, which may diminish our current level of enthusiasm for these agents to totally replace warfarin, include the inability to reliably follow their levels or reverse their anticoagulant effects, the lack of data available on bridging when other procedures need to be performed, their short half-lives, and the lack of data on their anti-inflammatory effects. With regard to monitoring of anticoagulation, the International Society of Thrombosis and Hemostasis (ISTH) has published the times when it might be useful to obtain levels. These times include:
• When a patient is bleeding.
• Before surgery or an invasive procedure when the patient has taken the drug in the previous 24 hours, or longer if creatinine clearance (CrCl) is less than 50 mL/min.
• Identification of subtherapeutic or supratherapeutic levels in patients taking other drugs that are known to affect pharmacokinetics.
• Identification of subtherapeutic or supratherapeutic levels in patients at body weight extremes.
• Patients with deteriorating renal function.
• During perioperative management.
• During reversal of anticoagulation.
• When there is suspicion of overdose.
• Assessment of compliance in patients suffering thrombotic events while on treatment.7
Currently, no commercially available reversal agent exists for any of the NOACs, and existing reversal agents for traditional anticoagulants are of limited, if any, use. Drugs under development include agents directed against the factor Xa inhibitors and against the thrombin inhibitor. Until specific reversal agents exist, supportive care is the mainstay of therapy. In cases of trauma or severe or life-threatening bleeding, administration of concentrated clotting factors (prothrombin complex concentrate) or dialysis (dabigatran only) may be utilized. However, data from large clinical trials are lacking. A recent study of 90 patients receiving an antibody directed against dabigatran found that the anticoagulant effects of dabigatran were reversed safely within minutes of administration; however, drug levels were not consistently suppressed at 24 hours in 20% of the cohort.8
Currently there are no national guidelines or large scale studies to guide bridging NOACs for procedures.
The relatively short half-life of these agents makes it likely that traditional bridging, as practiced for warfarin, is not necessary.9 However, this represents a double-edged sword: withholding anticoagulation for even two doses (for example, if a patient becomes ill or a clinician is overly cautious around the time of a procedure) may leave the patient unprotected.
The final question with the new agents concerns their anti-inflammatory effects. We know that heparin and LMWH have significant pleiotropic effects that are not necessarily related to their anticoagulant effects. These effects are important in decreasing the inflammatory nature of the thrombus and its effect on the vein wall. We do not know whether the new oral agents have similar effects, as this has never been fully tested. Given that two of the agents are used as monotherapy without any heparin/LMWH bridge, the anti-inflammatory properties of these new agents should be defined to make sure that such a bridge is not necessary.
In summary, although these agents have much to offer, many questions remain to be addressed before they totally replace traditional approaches to anticoagulation in the realm of VTE. It must not be overlooked that, despite all their benefits, each carries a risk of bleeding, as all target portions of the coagulation mechanism. We caution that, as with any “gift horse,” physicians should examine the data closely and proceed with caution.
Thomas Wakefield, MD, is the Stanley Professor of Vascular Surgery; head, section of vascular surgery; and director, Samuel and Jean Frankel Cardiovascular Center. Andrea Obi, MD, is a vascular surgery fellow, and Dawn Coleman, MD, is the program director, section of vascular surgery, all at the University of Michigan, Ann Arbor. They reported having no conflicts of interest.
References
1. N Engl J Med. 2009;361:2342-2352.
2. J Vasc Surg Venous Lymphat Disord. 2013;1:418-426.
3. N Engl J Med. 2013;369:1406-1415.
4. N Engl J Med. 2010;363:2499-2510.
5. N Engl J Med. 2013;368:699-708.
6. Arterioscler Thromb Vasc Biol. 2015;35:1056-1065.
7. J Thromb Haemost. 2013;11:756-760.
Commentary: INR instability in the NOAC era
Progress in the development of new oral anticoagulants (NOACs), as well as agents for their reversal, has lowered the threshold to use these therapeutics as first line agents for the management of nonvalvular atrial fibrillation and venous thromboembolism.1,2 Despite this increase in adoption, however, debate persists as to whether patients chronically maintained on vitamin K antagonists (VKAs), such as warfarin, should be switched to NOACs. The recently published research letter by Pokorney et al. assessed the stability of international normalized ratios (INRs) in patients on long-term warfarin therapy in order to address this question.3
Specifically, prospective registry data from 3,749 patients with at least three INR values in the first 6 months of therapy, as well as six or more in the following year, were included. Patients were deemed stable if 80% or more of their INRs fell within the therapeutic range (INR, 2-3).3 During the initiation period, only one in four patients taking warfarin had a stable INR.3 Furthermore, stability in the first 6 months had limited ability to predict stability in the subsequent year (concordance index, 0.61). With regard to time in therapeutic range (TTR), only 32% of patients had a TTR of greater than 80% during the first 6 months, with less than half (42%) of these patients able to maintain this in the following year.
Findings from Pokorney et al. add to the growing body of literature demonstrating the difficulty of achieving and maintaining a therapeutic INR while on warfarin therapy.4-7 Clinically, these findings are important, as deviations from TTR have been shown to be associated with increased risk of bleeding and thrombosis as well as increased health care costs.8-10 Mechanistically, patient factors such as differences in vitamin K consumption, comorbid conditions, drug-drug interactions, and medication compliance, as well as genetic differences that impact drug metabolism undoubtedly contribute to the variation of INR noted in patients on warfarin therapy.
Attempts to improve stability have included the administration of low-dose oral vitamin K. However, recent data from a multicenter randomized controlled trial suggest that while such therapy may help decrease extreme variations in INR, it does not lead to an increased TTR.11 Furthermore, while significant work has been conducted in identifying specific gene variants, such as CYP2C9 and VKORC1, which encode cytochrome P450 and vitamin K epoxide reductase enzymes, respectively, economic analyses suggest that testing for these gene variants would not be cost-effective.12 Additionally, clinical prediction tools, which incorporate important patient factors to help guide anticoagulation, explain less than 10% of TTR variability.4
Nonetheless, some caution is warranted in interpreting the results reported by Pokorney and his colleagues. The TTR among registry patients treated with warfarin was much lower than that previously reported in the pivotal U.S. trials of NOACs (55%-68%) and significantly lower than in a recent nationwide Swedish registry involving 40,449 patients.13
In the Swedish registry, the mean individual TTR was 70% with more than half the patients having a TTR of 70% or more, emphasizing the importance of health care system effects. Moreover, regardless of whether a patient is on warfarin or a NOAC, patients with a lower TTR have higher rates of diabetes, chronic obstructive pulmonary disease, heart failure, and renal failure, which may contribute to the need for additional therapies that may influence TTR.
For example, INR may be increased by ciprofloxacin or omeprazole when taken with warfarin, and CYP3A4 and P-glycoprotein (P-gp) inducers and inhibitors can result in an increased or decreased anticoagulation effect when used with NOACs. Recent reports have also highlighted variability in the safety of NOACs, particularly among patients with renal or liver insufficiency, African Americans, or patients with a prior history of GI bleeding.14-16 For these subgroups, determining NOAC activity to improve clinical safety of these agents is difficult.
PT or INR testing is largely insensitive or otherwise highly variable, and the timing of the blood draw relative to the most recent dose significantly influences the measured level of anti-Xa activity. Importantly, socioeconomic factors and family support systems also influence TTR, as important determinants of access to needed drugs and the ability to sustain related costs over time.
Taken together, prior INR stability on warfarin therapy does not ensure continued stability, and, as a consequence, long-term warfarin therapy requires close monitoring to remain effective. To this end, further development of point-of-care coagulometers for self-testing and self-management, which have been found to be acceptable and preferred by patients, should be pursued.17 Similarly, research on optimizing computer-assisted dosing programs to decrease INR variability remains warranted.18 NOACs offer an advantage over warfarin therapy in that they have a more predictable pharmacokinetic profile, which precludes the need for routine monitoring of anticoagulation parameters. However, many of the same factors that influence TTR for warfarin also do so for NOACs; NOACs carry an increased bleeding risk in comparison to warfarin for a number of demographic groups; and the high cost of NOACs may influence patient compliance.
Accordingly, until further data are available, the decision to convert a patient on warfarin with a low TTR to a NOAC should be individualized.
Madhukar S. Patel, MD, is a general surgeon at the Department of Surgery, Massachusetts General Hospital, Boston, and Elliot L. Chaikof, MD, is Surgeon-in-Chief, Beth Israel Deaconess Medical Center, and Chairman, Roberta and Stephen R. Weiner Department of Surgery, Johnson and Johnson Professor of Surgery, Harvard Medical School. Dr. Chaikof is also an associate editor for Vascular Specialist. They have no relevant conflicts.
References
2. Nat Rev Cardiol. 2014;11:693-703.
5. J Thromb Haemost. 2010;8:2182-91.
6. Thromb Haemost. 2009;101:552-6.
7. Am J Cardiovasc Drugs. 2015;15:205-11.
8. Circ Cardiovasc Qual Outcomes. 2008;1:84-91.
10. J Med Econ. 2015;18:333-40.
11. Thromb Haemost. 2016;116:480-5.
12. Ann Intern Med. 2009;150:73-83.
13. JAMA Cardiol. 2016;1:172-80.
14. N Engl J Med. 2013;369:2093-104.
15. JAMA Intern Med. 2015;175:18-24.
16. J Am Coll Cardiol. 2014;63:891-900.
Antibiotic susceptibility differs in transplant recipients
Antibiotic susceptibility in bacteria cultured from transplant recipients at a single hospital differed markedly from that in hospital-wide antibiograms, according to a report published in Diagnostic Microbiology and Infectious Disease.
Understanding the differences in antibiotic susceptibility among these highly immunocompromised patients can help guide treatment when they develop infection, and reduce the delay before they begin receiving appropriate antibiotics, said Rossana Rosa, MD, of Jackson Memorial Hospital, Miami, and her associates.
The investigators examined the antibiotic susceptibility of 1,889 isolates from blood and urine specimens taken from patients who had received solid-organ transplants at a single tertiary-care teaching hospital and then developed bacterial infections during a 2-year period. These patients included both children and adults who had received kidney, pancreas, liver, heart, lung, or intestinal transplants and were treated in numerous, “geographically distributed” units throughout the hospital. Their culture results were compared with those from 10,439 other patients with bacterial infections, which comprised the hospital-wide antibiograms developed every 6 months during the study period.
The Escherichia coli, Klebsiella pneumoniae, and Pseudomonas aeruginosa isolates from the transplant recipients showed markedly less susceptibility to first-line antibiotics than would have been predicted by the hospital-wide antibiograms. In particular, in the transplant recipients, E. coli infections were resistant to trimethoprim-sulfamethoxazole, levofloxacin, and ceftriaxone; K. pneumoniae infections were resistant to every antibiotic except amikacin; and P. aeruginosa infections were resistant to levofloxacin, cefepime, and amikacin (Diag Microbiol Infect Dis. 2016 Aug 25. doi: 10.1016/j.diagmicrobio.2016.08.018).
“We advocate for the development of antibiograms specific to solid-organ transplant recipients. This may allow intrahospital comparisons and intertransplant-center monitoring of trends in antimicrobial resistance over time,” Dr. Rosa and her associates said.
FROM DIAGNOSTIC MICROBIOLOGY AND INFECTIOUS DISEASE
Key clinical point: Antibiotic susceptibility in bacteria cultured from transplant recipients differs markedly from that in hospital-wide antibiograms.
Major finding: In the transplant recipients, E. coli infections were resistant to trimethoprim-sulfamethoxazole, levofloxacin, and ceftriaxone; K. pneumoniae infections were resistant to every antibiotic except amikacin; and P. aeruginosa infections were resistant to levofloxacin, cefepime, and amikacin.
Data source: A single-center study comparing the antibiotic susceptibility of 1,889 bacterial isolates from transplant recipients with 10,439 isolates from other patients.
Disclosures: This study was not supported by funding from any public, commercial, or not-for-profit entities. Dr. Rosa and her associates reported having no relevant financial disclosures.
Robot-assisted laparoscopic surgery performed mostly by and for white males
BOSTON – Patients who receive robot-assisted laparoscopic surgery (RALS), an increasingly widespread facet of surgical medicine, tend to be higher-income white males, according to an extensive new study presented at Minimally Invasive Surgery Week.
“We wanted to look at how the technology is rolling out ... and what some of those characteristics are that are occurring, not only with the types of patients that are picking up these surgeries but also who the surgeons are that are performing these surgeries,” the study’s lead investigator, Michael A. Palese, MD, of Mount Sinai Health System, New York, explained during a video interview.
A total of 63,725 RALS cases were included, all of which occurred during 2009-2015. In addition to affluent white males being the predominant recipients of this type of surgery, younger white male surgeons tended to be the ones more likely to perform RALS. Across specialties, RALS use increased substantially over the study period, with the largest increases seen among cardiothoracic surgeons (from 197 cases [3.1% of all cases per year] to 1,159 [8.7%]). Among general surgeons, RALS use increased from 98 cases (3.2%) to 2,559 cases (19.1%), and among orthopedic surgeons, from 55 (0.8%) to 985 (7.4%).
Dr. Palese discussed the genesis of the study, the importance of the study’s findings, and where he foresees RALS heading in the near future. He did not report any relevant financial disclosures.
The video associated with this article is no longer available on this site. Please view all of our videos on the MDedge YouTube channel
AT MINIMALLY INVASIVE SURGERY WEEK