The Official Newspaper of the American Association for Thoracic Surgery


Should clopidogrel be discontinued prior to open vascular procedures?


The continued use of perioperative clopidogrel is appropriate

Surgeons have always worried about bleeding risk in the procedures we perform. Complex vascular procedures are further complicated by the myriad of antiplatelet agents designed to reduce ischemic events from cardiovascular disease at the expense of potential bleeding complications if those medications are continued. Rather than relying on anecdotal reports and historical vignettes, let’s look at the evidence.

There is probably no drug in our vascular toolbox that has been studied more over the last 20 years than clopidogrel. Multiple randomized, double-blind studies such as CASPAR1 and CHARISMA2 have amplified what has been known since the early CAPRIE trial in the 1990s: clopidogrel is safe whether used as a single agent or together with aspirin (dual antiplatelet therapy [DAPT]).

Dr. Gary Lemmon

But not all our patients need DAPT. A large meta-analysis of six primary prevention trials encompassing more than 95,000 patients found no level 1 evidence of a need for any antiplatelet therapy in the primary prevention of cardiovascular events among patients deemed at low or moderate risk of cardiovascular disease.3

If our patients do present with vascular disease, current ACCP guidelines recommend single-agent antiplatelet therapy (either ASA or clopidogrel) for symptomatic peripheral arterial disease (PAD), whether lower extremity revascularization is planned via open bypass or endovascular means, with grade 1A evidence.4 This works fine for single-focus vascular disease; each antiplatelet agent has its proponents, but either works well.

That’s great, but what about all those sick cardiac patients, whom we see the most of? First, the CHARISMA subgroup analysis of patients with preexisting coronary and/or cerebrovascular disease demonstrated a 7.1% risk reduction in MI, cerebrovascular events, and cardiac ischemic deaths when continuing DAPT over aspirin alone, and a similar risk reduction was found in PAD patients for the endpoints of MI and ischemic cardiovascular events. Second, there was no significant difference in severe, fatal, or moderate bleeding between those receiving DAPT and aspirin alone; only minor bleeding was increased with DAPT. Third, real-life practice echoes the trial experience, as the Vascular Study Group of New England confirmed in a review of 16 centers and 66 surgeons with more than 10,000 patients, approximately 39% of whom underwent major aortic or lower extremity bypass operations.

No statistically significant difference could be found for reoperation (P = .74), transfusion (P = .1), or operative type between DAPT and aspirin alone.5 Saadeh and Sfeir confirmed this once again in their prospective study of 647 major arterial procedures over 7 years, finding no significant difference in reoperation for bleeding or in bleeding-related mortality between DAPT and aspirin alone.6

So can we stop bashing clopidogrel as an evil agent of bleeding, as Dr. Dalsing wishes to do? After all, he is on record as stating, “I don’t know if our bleeding risk is worse or better … something we have to do to keep our grafts going.” The evidence tells us that the benefits of continuing DAPT, seen in the risk reduction in primary cardiovascular outcomes, far outweigh the risk of minor bleeding associated with continued use.

Let the science dictate practice. Patients at low or moderate risk for cardiovascular disease need no antiplatelet medication unless undergoing PAD treatment, where a single agent, either aspirin or clopidogrel alone, is sufficient. In patients carrying a large cardiovascular burden of disease, the combination of aspirin and clopidogrel confers a survival benefit and reduces ischemic events without a significant increase in reoperation, transfusion, or bleeding-related mortality. As many of our patients require DAPT for drug-eluting coronary stents, withholding clopidogrel preoperatively increases overall risk beyond acceptable limits. Improving surgical skills and paying attention to hemostasis during the operation will allow the naysayers to achieve improved patient survival, without fear of bleeding, while continuing best medical therapy such as DAPT.

Gary Lemmon, MD, is professor of vascular surgery at Indiana University, Indianapolis, and chief, vascular surgery, Indianapolis VA Medical Center. He reported no relevant conflicts.

References

1. J Vasc Surg. 2010;52:825-33

2. Eur Heart J. 2009;30:192-201

3. Lancet. 2009;373:1849-60

4. Chest. 2012;141:e669S-90S

5. J Vasc Surg. 2011;54:779-84

6. J Vasc Surg. 2013;58:1586-92

The continued use of perioperative clopidogrel is debatable!

There are cases in which clopidogrel should not be discontinued for a needed vascular intervention. If your patient received a coronary stent recently, delaying the operation or maintaining clopidogrel through it is warranted, unless you are willing to accept an acute coronary thrombosis.

However, in other cases, for example infrainguinal grafts, the risk of increased bleeding when adding clopidogrel to aspirin may outweigh the potential improvement in graft patency. This is especially true of below-knee vein bypass grafts, where the data do not support improved patency. In the CASPAR trial, prosthetic grafts did appear to benefit in patency, but only in subgroup analysis.1

Dr. Michael C. Dalsing

It is true that severe bleeding was not increased (intracranial hemorrhage or hemodynamic compromise: 1% vs. 2.7%, P = NS), but moderate bleeding (transfusion required: 0.7% vs. 3.7%, P = .012) and mild bleeding (5.4% vs. 12.1%, P = .004) were increased when this agent was used, especially in vein graft surgery. This bleeding risk was present even when clopidogrel was begun 2 or more days after surgery.1

To complicate this decision, a Cochrane review did not consider the subgroup analysis statistically valid, and so its authors concluded that infrainguinal graft patency was not improved with clopidogrel while bleeding risk was increased. One might even question the use of acetylsalicylic acid (ASA) for vein graft bypasses based on the results of this meta-analysis.2

Carotid endarterectomy is a common vascular surgery procedure in which antiplatelet use has been evaluated in real-world settings and with large cohorts. As is always the case when dealing with patient issues, the addition of one agent does not tell the entire story, and patient demographics can have a significant influence on outcome. A report from the Vascular Quality Initiative (VQI) database controlled for patient differences by propensity matching, with more than 4,500 patients in each of the two groups (ASA vs. ASA + clopidogrel), and demonstrated that major bleeding, defined as return to the OR for bleeding, was statistically more common with dual therapy (1.3% vs. 0.7%, P = .004).3
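As a sanity check, the reported VQI bleeding proportions can be run through a standard two-proportion z-test. The sketch below (plain Python; the group size of 4,500 is an approximation, since the article states only "more than 4,500 patients in each group") reproduces a P value of about .004:

```python
import math

def two_proportion_z_test(p1, n1, p2, n2):
    """Two-sided z-test for a difference in proportions, using the pooled estimate."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = abs(p1 - p2) / se
    # Two-sided p-value from the standard normal tail: P(|Z| > z) = erfc(z / sqrt(2))
    p_value = math.erfc(z / math.sqrt(2))
    return z, p_value

# Return to the OR for bleeding: 1.3% on dual therapy vs. 0.7% on ASA alone,
# assuming 4,500 patients per propensity-matched group.
z, p = two_proportion_z_test(0.013, 4500, 0.007, 4500)
print(f"z = {z:.2f}, P = {p:.3f}")  # P comes out near the reported .004
```

This is only an illustration of the arithmetic behind the headline figure; the published analysis may have used a different test or exact group sizes.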

The addition of clopidogrel did statistically decrease the risk of ipsilateral TIA or stroke (0.8% vs. 1.2%, P = .02) but not the risk of death (0.2% vs. 0.3%, P = .3) or postoperative MI (1% vs. 0.8%, P = .4). Reoperation for bleeding is not inconsequential: patients requiring this intervention had significantly worse outcomes with regard to stroke (3.7% vs. 0.8%, P = .001), MI (6.2% vs. 0.8%, P = .001), and death (2.5% vs. 0.2%, P = .001). A further propensity score–matched analysis stratified by symptom status was quite interesting in that only in asymptomatic patients did the addition of clopidogrel demonstrate a statistically significant reduction in TIA or stroke, any stroke, or composite stroke/death. Symptomatic patients taking dual therapy demonstrated a slight reduction in TIA or stroke (1.4% vs. 1.7%, P = .6), any stroke (1.1% vs. 1.2%, P = .9), and composite stroke/death (1.2% vs. 1.5%, P = .5), but in no instance was statistical significance reached. The use of protamine did help to decrease the risk of bleeding.

Regarding the use of dual therapy during open aortic operations, an earlier report from the VQI database demonstrated no statistically significant difference in bleeding risk, but if one delves deeper the data indicate something different. In the majority of cases, vascular surgeons do not feel comfortable performing this extensive dissection on dual therapy: of the cases reported, 1,074 were performed either free of both drugs or on ASA alone, while 42 were on dual therapy and only 12 on clopidogrel alone. In fact, in their conclusions, the authors note that they do not believe conclusions regarding clopidogrel use in patients undergoing open abdominal aortic aneurysm repair can be drawn from their results, since the potential for a type II error was too great.4

It may be that our current level of sophistication is not sufficiently mature to determine the actual effect clopidogrel is having on our patients. Clopidogrel, a thienopyridine, inhibits platelet activation by blocking the ADP-binding site of the P2Y12 receptor. Over 85% of the ingested drug is metabolized into inactive metabolites, while 15% is metabolized by the liver, via a two-step oxidative process, into the active thiol metabolite. Interindividual variability in the antiplatelet response to thienopyridines is recognized and is partially caused by genetic mutations in the CYP isoenzymes. Platelet reactivity testing is possible, but most of that work has been conducted in patients requiring coronary artery revascularization. Results of tailoring intervention to maximize therapeutic benefit and decrease bleeding risk have been inconsistent but, in some studies, appear promising.5 Such testing may ultimately let us determine how effective clopidogrel actually is in a particular case, with some insight into the bleeding risk as well. With that determination in hand, the decision of whether to hold clopidogrel perioperatively can be made with some science behind it.

Clearly, a blanket statement that the risk of bleeding should be accepted or ignored because of the demonstrated benefits of clopidogrel in patients requiring vascular surgery is not accurate. In some cases there is no clear benefit, so eliminating the bleeding risk may well be the appropriate decision. The astute vascular surgeon studies the details of the published data in order to make an educated decision, and understands that new information, such as platelet reactivity testing, may bring more clarity to such decisions in the future.

Michael C. Dalsing, MD, is chief of vascular surgery at Indiana University, Indianapolis. He reported no relevant conflicts.

References

1. J Vasc Surg. 2010;52:825-33

2. Cochrane Database Syst Rev. 2015;(2):CD000535

3. J Vasc Surg. 2016;63:1262-70

4. J Vasc Surg. 2011;54:779-84

5. Vascul Pharmacol. 2016;77:19-27

References

Author and Disclosure Information

Publications
Topics
Sections
Author and Disclosure Information

Author and Disclosure Information

The continued use of perioperative clopidogrel is appropriate

Surgeons have always worried about bleeding risks for procedures we do. Complex vascular procedures are further complicated by the myriad of available antiplatelet agents designed to reduce ischemic events from cardiovascular disease burden at the expense of potential bleeding complications if antiplatelet medications are continued. Rather than relying on anecdotal reports by historical vignettes, let’s look at the evidence.

There probably is no other drug available in our vascular toolbox which has been studied more in the last 20 years than clopidogrel. Multiple randomized and double blinded studies such as CASPAR1 and CHARISMA2 have amplified what was known since the early CAPRIE trial in the 1990’s and that is that clopidogrel is safe when used as a single medication or as a dual agent with aspirin (duel antiplatelet therapy [DAPT]).

Dr. Gary Lemmon

But not all our patients need DAPT. There is no level 1 evidence demonstrating the need for any antiplatelet therapy in the primary prevention of cardiovascular events for patients deemed at low or moderate risk of cardiovascular disease from a large meta-analysis review of six primary prevention trials encompassing over 95,000 patients.3

If our patients do present with vascular disease, current ACCP guidelines recommend single-agent antiplatelet medication (either ASA or clopidogrel) for symptomatic peripheral arterial disease (PAD) whether planning LE revascularization with bypass or via endovascular means with grade 1A evidence.4 This works fine for single-focus vascular disease and each antiplatelet agent have proponents but either works well.

That’s great, but what about all those sick cardiac patients we see the most of? First, CHARISMA subgroup analysis of patients with preexisting coronary and/or cerebrovascular disease demonstrate a 7.1% risk reduction in MI, cerebrovascular events, and cardiac ischemic deaths when continuing DAPT over aspirin alone, and similar risk reduction is found in PAD patients for endpoints of MI and ischemic cardiovascular events. Second, there was no significant difference in severe, fatal, or moderate bleeding in those receiving DAPT vs. aspirin alone with only minor bleeding increased using DAPT. Third, real-life practice echoes multiple trial experiences such as the Vascular Study Group of New England study group confirmed in reviewing 16 centers and 66 surgeons with more than 10,000 patients. Approximately 39% underwent major aortic or lower extremity bypass operations.

No statistical difference could be found for reoperation (P = .74), transfusion (P = .1) or operative type between DAPT or aspirin use alone.5 This is rediscovered once again by Saadeh and Sfeir in their prospective study of 647 major arterial procedures over 7 years finding no significant difference in reoperation for bleeding or bleeding mortality between DAPT vs. aspirin alone.6

So can we stop bashing clopidogrel as an evil agent of bleeding as Dr. Dalsing wishes to do? After all, he has been on record as stating, “I don’t know if our bleeding risk is worse or better … something we have to do to keep our grafts going.” Evidence tells us the benefits for continuing DAPT as seen in risk reduction in primary cardiovascular outcomes far outweigh the risk of minor bleeding associated with continued use.

Let the science dictate practice. Patients with low or moderate risk for cardiovascular disease need no antiplatelet medication unless undergoing PAD treatment where a single agent, either aspirin or clopidogrel alone, is sufficient. In those patients having a large cardiovascular burden of disease, combination of aspirin and clopidogrel improves survival benefit and reduces ischemic events without a significant risk of reoperation, transfusion, or bleeding-related mortality. As many of our patients require DAPT for drug eluting coronary stents, withholding clopidogrel preoperatively increases overall risk beyond acceptable limits. Improving surgical skills and paying attention to hemostasis during the operation will allow naysayers to achieve improved patient survival without fear of bleeding when continuing best medical therapy such as DAPT.

Gary Lemmon, MD, is professor of vascular surgery at Indiana University, Indianapolis, and chief, vascular surgery, Indianapolis VA Medical Center. He reported no relevant conflicts.

References

1. J Vasc Surg. 2010;52:825-33

2. Eur Heart J. 2009;30:192-201

3. Lancet. 2009;373:1849-604. Chest. 2012;141:e669s-90s

5. J Vasc Surg. 2011;54: 779-84

6. J Vasc Surg. 2013;58: 1586-92

The continued use of perioperative clopidogrel is debatable!

There are cases in which clopidogrel should not be discontinued for a needed vascular intervention. Delaying operation or maintaining clopidogrel during operation if your patient required a recent coronary stent is warranted unless you are willing to accept an acute coronary thrombosis.

However, in other cases, for example infrainguinal grafts, the risk of potential increased bleeding when adding clopidogrel to aspirin may outweigh potential improvements in graft patency. This is especially true of below-knee vein bypass grafts where data do not support improved patency. However, in the CASPAR trial, prosthetic graft patency did appear to be beneficial, but only in subgroup analysis.1

 

 

Dr. Michael C. Dalsing

It is true that severe bleeding was not increased (intracranial hemorrhage, or hemodynamic compromise: 1 vs 2.7%, P = NS) but moderate bleeding (transfusion required: 0.7 vs 3.7%, P = .012) and mild bleeding (5.4 vs 12.1%, P = .004) was increased when this agent was used especially in vein graft surgery. This risk of bleeding was present even when clopidogrel was begun 2 or more days after surgery.1

To complicate this decision, a Cochrane review did not consider subgroup analysis as statistically valid and so the authors considered infrainguinal graft patency as not improved with clopidogrel but bleeding risk was increased. One might even question the use of acetylsalicylic acid (ASA) for vein graft bypasses based on the results of this metanalysis.2 Carotid endarterectomy is a common vascular surgery procedure in which antiplatelet use has been evaluated in the real-world situation and with large cohorts. As is always the case when dealing with patient issues, the addition of one agent does not tell the entire story and patient demographics can have a significant influence on the outcome. A report from the Vascular Quality Initiative (VQI) database controlled for patient differences by propensity matching with more than 4,500 patients in each of the two groups; ASA vs. ASA + clopidogrel; demonstrated that major bleeding, defined as return to the OR for bleeding, was statistically more common with dual therapy (1.3% vs. 0.7%, P = .004).3

The addition of clopidogrel did statistically decrease the risk of ipsilateral TIA or stroke (0.8% vs. 1.2%, P = .02) but not the risk of death (0.2% vs. 0.3%, P = .3) or postoperative MI (1% vs. 0.8%, P = .4). Reoperation for bleeding is not inconsequential since in patients requiring this intervention, there is a significantly worse outcome in regard to stroke (3.7% vs. 0.8%, P = .001), MI (6.2% vs. 0.8%, P = .001), and death (2.5% vs. 0.2%,P = .001). Further drill down involving propensity score–matched analysis stratified by symptom status (asymptomatic vs. symptomatic) was quite interesting in that in only asymptomatic patients did the addition of clopidogrel actually demonstrate a statistically significant reduction in TIA or stroke, any stroke, or composite stroke/death. Symptomatic patients taking dual therapy demonstrated a slight reduction in TIA or stroke (1.4% vs. 1.7%, P = .6), any stroke (1.1% vs. 1.2%, P = .9) and composite stroke/death (1.2% vs. 1.5%, P = .5) but in no instance was statistical significance reached. The use of protamine did help to decrease the risk of bleeding.

Regarding the use of dual therapy during open aortic operations, an earlier report of the VQI database demonstrated no significant difference in bleeding risk statistically, but if one delves deeper the data indicate something different. In the majority of cases, vascular surgeons do not feel comfortable preforming this extensive dissection on dual therapy. Of the cases reported, 1,074 were preformed either free of either drug or only on ASA while 42 were on dual therapy and only 12 on clopidogrel only. In fact, in the conclusions, the authors note that they do not believe that conclusions regarding clopidogrel use in patient undergoing open abdominal aortic aneurysm repair can be drawn based on their results since the potential for a type II error was too great.4

It may be that our current level of sophistication is not sufficiently mature to determine the actual effect that clopidogrel is having on our patients. Clopidogrel, a thienopyridine, inhibits platelet activation by blocking the ADP-binding site for the P2Y12 receptor. Over 85% of ingested drug is metabolized into inactive metabolites while 15% is metabolized by the liver via a two-step oxidative process into the active thiol metabolite. Inter-individual variability in the antiplatelet response to thienopyridines is noted and partially caused by genetic mutations in the CP isoenzymes. Platelet reactivity testing is possible but most of the work has been conducted for those patients requiring coronary artery revascularization. Results of tailoring intervention to maximize therapeutic benefit and decrease the risk of bleeding have been inconsistent but, in some studies, appear to be promising.5 This approach may ultimately be found superior to determining how effective clopidogrel actually is in a particular case with some insight into the bleeding risk as well. With this determination, whether or not to hold clopidogrel perioperatively can be made with some science behind the decision.

Clearly, a blanket statement that the risk of bleeding should be accepted or ignored because of the demonstrated benefits of clopidogrel in patients requiring vascular surgery is not accurate. In some cases, there is no clear benefit, so eliminating the bleeding risk may well be the appropriate decision. The astute vascular surgeon understands the details of the written word in order to make an educated decision and understands that new information such as determining platelet reactivity may provide more clarity to such decisions in the future.

 

 

Michael C. Dalsing, MD, is chief of vascular surgery at Indiana University, Indianapolis. He reported no relevant conflicts.

References

1. J Vasc Surg. 2010;52:825-33

2. Cochrane Database Syst Rev. 2015, Issue 2. Art. No.: CD000535

3. J Vasc Surg. 2016;63:1262-70

4.J Vasc Surg. 2011;54:779-84

5. Vascul Pharmacol. 2016;77:19-27

The continued use of perioperative clopidogrel is appropriate

Surgeons have always worried about bleeding risks for procedures we do. Complex vascular procedures are further complicated by the myriad of available antiplatelet agents designed to reduce ischemic events from cardiovascular disease burden at the expense of potential bleeding complications if antiplatelet medications are continued. Rather than relying on anecdotal reports by historical vignettes, let’s look at the evidence.

There probably is no other drug available in our vascular toolbox which has been studied more in the last 20 years than clopidogrel. Multiple randomized and double blinded studies such as CASPAR1 and CHARISMA2 have amplified what was known since the early CAPRIE trial in the 1990’s and that is that clopidogrel is safe when used as a single medication or as a dual agent with aspirin (duel antiplatelet therapy [DAPT]).

Dr. Gary Lemmon

But not all our patients need DAPT. There is no level 1 evidence demonstrating the need for any antiplatelet therapy in the primary prevention of cardiovascular events for patients deemed at low or moderate risk of cardiovascular disease from a large meta-analysis review of six primary prevention trials encompassing over 95,000 patients.3

If our patients do present with vascular disease, current ACCP guidelines recommend single-agent antiplatelet medication (either ASA or clopidogrel) for symptomatic peripheral arterial disease (PAD) whether planning LE revascularization with bypass or via endovascular means with grade 1A evidence.4 This works fine for single-focus vascular disease and each antiplatelet agent have proponents but either works well.

That’s great, but what about all those sick cardiac patients we see the most of? First, CHARISMA subgroup analysis of patients with preexisting coronary and/or cerebrovascular disease demonstrate a 7.1% risk reduction in MI, cerebrovascular events, and cardiac ischemic deaths when continuing DAPT over aspirin alone, and similar risk reduction is found in PAD patients for endpoints of MI and ischemic cardiovascular events. Second, there was no significant difference in severe, fatal, or moderate bleeding in those receiving DAPT vs. aspirin alone with only minor bleeding increased using DAPT. Third, real-life practice echoes multiple trial experiences such as the Vascular Study Group of New England study group confirmed in reviewing 16 centers and 66 surgeons with more than 10,000 patients. Approximately 39% underwent major aortic or lower extremity bypass operations.

No statistical difference could be found for reoperation (P = .74), transfusion (P = .1) or operative type between DAPT or aspirin use alone.5 This is rediscovered once again by Saadeh and Sfeir in their prospective study of 647 major arterial procedures over 7 years finding no significant difference in reoperation for bleeding or bleeding mortality between DAPT vs. aspirin alone.6

So can we stop bashing clopidogrel as an evil agent of bleeding as Dr. Dalsing wishes to do? After all, he has been on record as stating, “I don’t know if our bleeding risk is worse or better … something we have to do to keep our grafts going.” Evidence tells us the benefits for continuing DAPT as seen in risk reduction in primary cardiovascular outcomes far outweigh the risk of minor bleeding associated with continued use.

Let the science dictate practice. Patients with low or moderate risk for cardiovascular disease need no antiplatelet medication unless undergoing PAD treatment where a single agent, either aspirin or clopidogrel alone, is sufficient. In those patients having a large cardiovascular burden of disease, combination of aspirin and clopidogrel improves survival benefit and reduces ischemic events without a significant risk of reoperation, transfusion, or bleeding-related mortality. As many of our patients require DAPT for drug eluting coronary stents, withholding clopidogrel preoperatively increases overall risk beyond acceptable limits. Improving surgical skills and paying attention to hemostasis during the operation will allow naysayers to achieve improved patient survival without fear of bleeding when continuing best medical therapy such as DAPT.

Gary Lemmon, MD, is professor of vascular surgery at Indiana University, Indianapolis, and chief, vascular surgery, Indianapolis VA Medical Center. He reported no relevant conflicts.

References

1. J Vasc Surg. 2010;52:825-33

2. Eur Heart J. 2009;30:192-201

3. Lancet. 2009;373:1849-604. Chest. 2012;141:e669s-90s

5. J Vasc Surg. 2011;54: 779-84

6. J Vasc Surg. 2013;58: 1586-92

The continued use of perioperative clopidogrel is debatable!

There are cases in which clopidogrel should not be discontinued for a needed vascular intervention. Delaying operation or maintaining clopidogrel during operation if your patient required a recent coronary stent is warranted unless you are willing to accept an acute coronary thrombosis.

However, in other cases, for example infrainguinal grafts, the risk of potential increased bleeding when adding clopidogrel to aspirin may outweigh potential improvements in graft patency. This is especially true of below-knee vein bypass grafts where data do not support improved patency. However, in the CASPAR trial, prosthetic graft patency did appear to be beneficial, but only in subgroup analysis.1

 

 

Dr. Michael C. Dalsing

It is true that severe bleeding was not increased (intracranial hemorrhage, or hemodynamic compromise: 1 vs 2.7%, P = NS) but moderate bleeding (transfusion required: 0.7 vs 3.7%, P = .012) and mild bleeding (5.4 vs 12.1%, P = .004) was increased when this agent was used especially in vein graft surgery. This risk of bleeding was present even when clopidogrel was begun 2 or more days after surgery.1

To complicate this decision, a Cochrane review did not consider subgroup analysis as statistically valid and so the authors considered infrainguinal graft patency as not improved with clopidogrel but bleeding risk was increased. One might even question the use of acetylsalicylic acid (ASA) for vein graft bypasses based on the results of this metanalysis.2 Carotid endarterectomy is a common vascular surgery procedure in which antiplatelet use has been evaluated in the real-world situation and with large cohorts. As is always the case when dealing with patient issues, the addition of one agent does not tell the entire story and patient demographics can have a significant influence on the outcome. A report from the Vascular Quality Initiative (VQI) database controlled for patient differences by propensity matching with more than 4,500 patients in each of the two groups; ASA vs. ASA + clopidogrel; demonstrated that major bleeding, defined as return to the OR for bleeding, was statistically more common with dual therapy (1.3% vs. 0.7%, P = .004).3

The addition of clopidogrel did statistically decrease the risk of ipsilateral TIA or stroke (0.8% vs. 1.2%, P = .02) but not the risk of death (0.2% vs. 0.3%, P = .3) or postoperative MI (1% vs. 0.8%, P = .4). Reoperation for bleeding is not inconsequential since in patients requiring this intervention, there is a significantly worse outcome in regard to stroke (3.7% vs. 0.8%, P = .001), MI (6.2% vs. 0.8%, P = .001), and death (2.5% vs. 0.2%,P = .001). Further drill down involving propensity score–matched analysis stratified by symptom status (asymptomatic vs. symptomatic) was quite interesting in that in only asymptomatic patients did the addition of clopidogrel actually demonstrate a statistically significant reduction in TIA or stroke, any stroke, or composite stroke/death. Symptomatic patients taking dual therapy demonstrated a slight reduction in TIA or stroke (1.4% vs. 1.7%, P = .6), any stroke (1.1% vs. 1.2%, P = .9) and composite stroke/death (1.2% vs. 1.5%, P = .5) but in no instance was statistical significance reached. The use of protamine did help to decrease the risk of bleeding.

Regarding the use of dual therapy during open aortic operations, an earlier report of the VQI database demonstrated no significant difference in bleeding risk statistically, but if one delves deeper the data indicate something different. In the majority of cases, vascular surgeons do not feel comfortable preforming this extensive dissection on dual therapy. Of the cases reported, 1,074 were preformed either free of either drug or only on ASA while 42 were on dual therapy and only 12 on clopidogrel only. In fact, in the conclusions, the authors note that they do not believe that conclusions regarding clopidogrel use in patient undergoing open abdominal aortic aneurysm repair can be drawn based on their results since the potential for a type II error was too great.4

It may be that our current level of sophistication is not sufficiently mature to determine the actual effect that clopidogrel is having on our patients. Clopidogrel, a thienopyridine, inhibits platelet activation by blocking the ADP-binding site for the P2Y12 receptor. Over 85% of ingested drug is metabolized into inactive metabolites while 15% is metabolized by the liver via a two-step oxidative process into the active thiol metabolite. Inter-individual variability in the antiplatelet response to thienopyridines is noted and partially caused by genetic mutations in the CP isoenzymes. Platelet reactivity testing is possible but most of the work has been conducted for those patients requiring coronary artery revascularization. Results of tailoring intervention to maximize therapeutic benefit and decrease the risk of bleeding have been inconsistent but, in some studies, appear to be promising.5 This approach may ultimately be found superior to determining how effective clopidogrel actually is in a particular case with some insight into the bleeding risk as well. With this determination, whether or not to hold clopidogrel perioperatively can be made with some science behind the decision.

Clearly, a blanket statement that the risk of bleeding should be accepted or ignored because of the demonstrated benefits of clopidogrel in patients requiring vascular surgery is not accurate. In some cases there is no clear benefit, so eliminating the bleeding risk may well be the appropriate decision. The astute vascular surgeon reads the evidence closely in order to make an educated decision, and understands that new information, such as platelet reactivity testing, may bring more clarity to such decisions in the future.

Michael C. Dalsing, MD, is chief of vascular surgery at Indiana University, Indianapolis. He reported no relevant conflicts.

References

1. J Vasc Surg. 2010;52:825-33.

2. Cochrane Database Syst Rev. 2015;(2):CD000535.

3. J Vasc Surg. 2016;63:1262-70.

4. J Vasc Surg. 2011;54:779-84.

5. Vascul Pharmacol. 2016;77:19-27.

Ticagrelor slashes first stroke risk after MI

ROME – Adding ticagrelor at 60 mg twice daily in patients on low-dose aspirin due to a prior MI reduced their risk of a first stroke by 25% in a secondary analysis of the landmark PEGASUS-TIMI 54 trial, Marc P. Bonaca, MD, reported at the annual congress of the European Society of Cardiology.

PEGASUS-TIMI 54 was a randomized, double-blind, placebo-controlled clinical trial conducted in more than 21,000 stable patients on low-dose aspirin with a history of an acute MI 1-3 years earlier. The significant reduction in secondary cardiovascular events seen in this study during a median 33 months of follow-up (N Engl J Med. 2015 May 7;372[19]:1791-800) led to approval of ticagrelor (Brilinta) at 60 mg twice daily for long-term secondary prevention.

 


But while PEGASUS-TIMI 54 was a secondary prevention study in terms of cardiovascular events, it was actually a primary prevention study in terms of stroke, since patients with a history of stroke weren’t eligible for enrollment. And in this trial, recipients of ticagrelor at 60 mg twice daily experienced a 25% reduction in the risk of stroke relative to placebo, from 1.94% at 3 years to 1.47%. This benefit was driven by fewer ischemic strokes, with no increase in hemorrhagic strokes seen with ticagrelor. And therein lies a clinical take-home point: “When evaluating the overall benefits and risks of long-term ticagrelor in patients with prior MI, stroke reduction should also be considered,” according to Dr. Bonaca of Brigham and Women’s Hospital, Boston.
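As an aside on the arithmetic, the 25% figure is a relative, not absolute, reduction (the trial's 25% reflects the hazard ratio; the raw 3-year rates give roughly 24%). A quick sketch of relative versus absolute risk reduction using the rates as reported:

```python
placebo_rate = 0.0194     # 3-year stroke risk on placebo, as reported
ticagrelor_rate = 0.0147  # 3-year stroke risk on ticagrelor 60 mg twice daily

# Relative risk reduction: proportional shrinkage of the event rate
rrr = 1 - ticagrelor_rate / placebo_rate

# Absolute risk reduction and number needed to treat over 3 years
arr = placebo_rate - ticagrelor_rate
nnt = 1 / arr

print(f"RRR ~{rrr:.0%}, ARR {arr:.2%}, NNT ~{nnt:.0f}")
# prints: RRR ~24%, ARR 0.47%, NNT ~213
```

Framed this way, roughly 213 post-MI patients would need 3 years of treatment to prevent one first stroke, a figure worth weighing against bleeding risk when considering long-term dual therapy.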

All strokes were adjudicated and subclassified by a blinded central committee. A total of 213 stroke events occurred during follow-up: 81% ischemic, 7% hemorrhagic, 4% ischemic with hemorrhagic conversion, and 8% unknown; 18% of the strokes were fatal. Another 15% resulted in moderate or severe disability at 30 days. All PEGASUS-TIMI 54 participants were on aspirin and more than 90% were on statin therapy.

The strokes that occurred in patients on ticagrelor were generally less severe than in controls. The risk of having a modified Rankin score of 3-6, which encompasses outcomes ranging from moderate disability to death, was reduced by 43% in stroke patients on ticagrelor relative to those on placebo, the cardiologist continued.

To ensure that the stroke benefit with ticagrelor seen in PEGASUS-TIMI 54 wasn’t a fluke, Dr. Bonaca and his coinvestigators performed a meta-analysis of four placebo-controlled randomized trials of more intensive versus less intensive antiplatelet therapy in nearly 45,000 participants with coronary disease in the CHARISMA, DAPT, PEGASUS-TIMI 54, and TRA 2*P-TIMI 50 trials. A total of 532 strokes occurred in this enlarged analysis. More intensive antiplatelet therapy – typically adding a second drug to low-dose aspirin – resulted in a 34% reduction in ischemic stroke, compared with low-dose aspirin and placebo.

Excluding from the meta-analysis the large subgroup of patients in TRA 2*P-TIMI 50 who were on triple-drug antiplatelet therapy, investigators were left with 32,348 participants in the four trials who were randomized to dual-antiplatelet therapy or monotherapy with aspirin. In this population, there was no increase in the risk of hemorrhagic stroke associated with more intensive antiplatelet therapy, according to Dr. Bonaca.

Session co-chair Keith A.A. Fox, MD, of the University of Edinburgh, noted that various studies have shown monotherapy with aspirin or another antiplatelet agent reduces stroke risk by about 15%, and now PEGASUS-TIMI 54 shows that ticagrelor plus aspirin decreases stroke risk by 25%. He posed a direct question: “How much is too much?”

“More and more antiplatelet therapy begets more bleeding, so I think that more than two agents may be approaching too much, although it really depends on what agents you’re using and in what dosages,” Dr. Bonaca replied.

He reported serving as a consultant to AstraZeneca, Merck, and Bayer.

Simultaneous with Dr. Bonaca’s presentation at ESC 2016 in Rome, the new report from PEGASUS-TIMI 54, including the four-trial meta-analysis, was published online (Circulation. 2016 Aug 30. doi: 10.1161/CIRCULATIONAHA.116.024637).

[email protected]

Article Source: AT THE ESC CONGRESS 2016

Vitals

Key clinical point: Ticagrelor reduced the risk of a first stroke by 25% in patients with a prior MI.

Major finding: Ticagrelor, at the approved dose of 60 mg twice daily for long-term secondary cardiovascular prevention, reduced the risk of a first stroke by 25% in patients with a prior MI.

Data source: This secondary analysis of a randomized, double-blind, placebo-controlled trial included 14,112 stable patients with an MI 1-3 years earlier who were randomized to ticagrelor at 60 mg twice daily or placebo and followed prospectively for a median of 33 months.

Disclosures: PEGASUS-TIMI 54 was supported by AstraZeneca. The presenter of the updated analysis reported serving as a consultant to AstraZeneca, Merck, and Bayer.

CMS offers lower-stress reporting options for MACRA in 2017

Physicians will have options for when they can start meeting the requirements for the Merit-based Incentive Payment System (MIPS) track under regulations that implement the Medicare Access and CHIP Reauthorization Act.

The options are designed to allow physicians a variety of ways to get started with the new Quality Payment Program – the term CMS has given the MACRA-legislated reforms – and provide more limited ways to participate in 2017.

Option 1: Test the quality payment program in 2017 by submitting data without facing any negative payment adjustments. This will give physicians the year to make sure their processes are in place and ready for broader participation in 2018 and beyond.

Option 2: Delay the start of the performance period and participate for just part of 2017. Depending on how long physicians delay reporting quality information back to CMS, they could still qualify for a smaller bonus payment.

Option 3: Participate for the entire calendar year as called for by the law and be eligible for the full participation bonuses.

Option 4: For those who qualify, participate in an Advanced Alternative Payment Model (APM) beginning next year.

The final regulations for implementing MACRA will be released on Nov. 1, CMS Acting Administrator Andy Slavitt wrote in a blog post published Sept. 8.

“However you choose to participate in 2017, we will have resources available to assist you and walk you through what needs to be done,” Mr. Slavitt wrote.

[email protected]


Data are mixed on cancerous transformation of cardiac mucosa in Barrett’s esophagus

CHICAGO – If scouring data is what makes a gastroenterologist feel good about risk assessment, there may be a lot of unhappy gastroenterologists out there, at least when it comes to the risk of cancer arising from cardiac mucosa in Barrett’s esophagus, according to Nicholas J. Shaheen, MD.

The risk arising from this nonintestinal metaplasia growth is probably quite low in real life, but the extant literature gives doctors a lot of contradictions, he said at the meeting sponsored by the American Gastroenterological Association.

“The risk of cancer with cardiac mucosa is unclear,” said Dr. Shaheen of the University of North Carolina at Chapel Hill. “Some data do suggest that, at least when present in the tubular esophagus in patients with gastroesophageal reflux symptoms, there may be a risk of adenocarcinoma close to what’s seen in patients with intestinal metaplasia. Other data suggest the risk is quite low, perhaps even approximating that of the general population.”

The reasons for what Dr. Shaheen called “remarkable variability” in these data probably arise more from sampling error than real life. The studies are retrospective, and many lack long-term follow-up data, are plagued with insufficient numbers, and – perhaps most importantly – are not grounded in any standard clinical methodology.

“People who do endoscopy for a living understand that the stuff you read about systematic biopsy protocols is hardly ever honored in the breach. None of these studies ever reports the biopsy protocol from which the samples were taken.”

This lack of protocol means that studies on the cancer risk of columnar-lined esophagus (CLE) that is negative for intestinal metaplasia are probably flawed from the beginning.

“The truth is that most gastroenterologists do a lousy job of biopsying Barrett’s, so there is probably a lot of sampling error in these studies, and they are contaminated with a high rate of intestinal metaplasia [IM],” said Dr. Shaheen.

And these studies do not report on the length of the CLE segment from which the biopsy was taken. “The likelihood of finding goblet cells [a characteristic of intestinal metaplasia] increases with the length of Barrett’s. None of the studies is normalized for Barrett’s length. When we see studies saying the cancer risk is higher in the presence of goblet cells, length could be a partially confounding association.”

A 2009 study with a small sample of 68 CLE patients found that abnormal DNA was just as likely in IM-negative samples as in IM-positive ones. All of the samples were significantly different from the control samples, suggesting that any metaplasia in the CLE may already be on the path to cancer, Dr. Shaheen said (Am J Gastroenterol. 2009;104:816-24).

In fact, a 2007 Scandinavian study supported the idea that IM isn’t even necessary for neoplastic progression of CLE (Scand J Gastroenterol. 2007;42:1271-4). The investigators followed 712 patients for 12 years and found that the adenocarcinoma rate was about 0.4% per patient per year whether the sample was IM positive or not.

“This study was enough to put a little shudder in the endoscopy community. If IM doesn’t matter, you’re talking about increasing the work in the endoscopy lab by 100%, because there are twice as many non-IM patients as those with IM.”

A 2008 study seemingly found something similar – but with a caveat, Dr. Shaheen said. The CLE patients in this study were followed for 3.5 years, and the cancer rate was virtually identical. But as the follow-up progressed, more and more biopsies turned up IM positive. “A first negative biopsy looked like it was associated with disease-free survival, but almost all IM-negative samples eventually became IM positive, so this didn’t really answer our question.”

Other studies have found that non-IM CLE has a very low neoplastic risk, and that IM is almost always a prerequisite for cancer to develop. The largest of these was conducted in the Northern Ireland Barrett’s Esophagus Registry in 2011. It followed more than 8,000 patients for 7 years. Patients with IM were 3.5 times more likely to develop a related adenocarcinoma than were those without IM (J Natl Cancer Inst. 2011;103:1049-57).

The contradictory evidence leads Dr. Shaheen to suggest a specific biopsy protocol for patients with Barrett’s esophagus.

“In my opinion, if you see a long segment of Barrett’s – more than 2 cm – and the biopsy is negative for IM, there is a good chance that you have a sampling error there, and a second endoscopy and biopsy are not unreasonable. If you see a short segment of Barrett’s and the biopsy is negative for IM, the cancer risk is unclear, but in general it’s probably pretty low, whether there are goblet cells there or not. I would say retaining these patients under endoscopic surveillance is of dubious value. [With] the likely low absolute risk of cancer in this patient population, no blanket recommendation for surveillance is advisable.”

Dr. Shaheen had no relevant financial disclosures.

[email protected]

On Twitter @Alz_Gal

Meeting/Event
Publications
Topics
Legacy Keywords
2016 Freston Conference, Barretts esopahgus, cancer
Sections
Meeting/Event
Meeting/Event

CHICAGO – If scouring data is what makes a gastroenterologist feel good about risk assessment, there may be a lot of unhappy gastroenterologists out there, at least when it comes to the risk of cancer arising from cardiac mucosa in Barrett’s esophagus, according to Nicholas J. Shaheen, MD.

The risk arising from this nonintestinal metaplasia growth is probably quite low in real life, but the extant literature gives doctors a lot of contradictions, he said at the meeting sponsored by the American Gastroenterological Association.

“The risk of cancer with cardiac mucosa is unclear,” said Dr. Shaheen of the University of North Carolina at Chapel Hill. “Some data do suggest that, at least when present in the tubular esophagus in patients with gastroesophageal reflux symptoms, there may be a risk of adenocarcinoma close to what’s seen in patients with intestinal metaplasia. Other data suggest the risk is quite low, perhaps even approximating that of the general population.”

The reasons for what Dr. Shaheen called “remarkable variability” in these data probably arise more from sampling error than real life. The studies are retrospective, and many lack long-term follow-up data, are plagued with insufficient numbers, and – perhaps most importantly – are not grounded in any standard clinical methodology.

“People who do endoscopy for a living understand that the stuff you read about systematic biopsy protocols is hardly ever honored in the breach. None of these studies ever reports the biopsy protocol from which the samples were taken.”

This lack of protocol means that studies on the cancer risk of columnar lined esophagus (CLE), which is negative for intestinal metaplasia are probably flawed from the beginning.

“The truth is that most gastroenterologists do a lousy job of biopsying Barrett’s, so there is probably a lot of sampling error in these studies, and they are contaminated with a high rate of intestinal metaplasia [IM],” said Dr. Shaheen.

And these studies do not report on the length of the CLE segment from which the biopsy was taken. “The likelihood of finding goblet cells [a characteristic of cardiac mucosa] increases with the length of Barrett’s. None of the studies is normalized for Barrett’s length. When we see studies saying the cancer risk is higher in the presence of goblet cells, length could be a partially confounding association.”

A 2009 study with a small sample size of 68 CLE patients found that abnormal DNA was just as likely in IM-negative samples as IM-positive ones. All of the samples were significantly different from the control samples, suggesting that any metaplasia in the CLE may already be on the path to cancer, Dr. Shaheen said (Am J Gastro. 2009;104:816-24)

In fact, a 2007 Scandinavian study supported the idea that IM isn’t even necessary for neoplastic progression of CLE (Scand J Gastroenterol 2007;42:1271-4). The investigators followed 712 patients for 12 years, and found that the adenocarcinoma rate was about 0.4 per patient per year whether the sample was IM positive or not.

“This study was enough to put a little shudder in the endoscopy community. If IM doesn’t matter, you’re talking about increasing the work in the endoscopy lab by 100%, because there are twice as many non-IM patients as those with IM.”

A 2008 study seemingly found something similar – but with a caveat, Dr. Shaheen said. The CLE patients in this study were followed for 3.5 years, and the cancer rate was virtually identical. But as the follow-up progressed, more and more biopsies turned up IM positive. “A first negative biopsy looked like it was associated with disease-free survival, but almost all IM-negative samples eventually became IM positive, so this didn’t really answer our question.”

Other studies have found that non-IM CLE has a very low neoplastic risk, and that IM is almost always a prerequisite for cancer to develop. The largest of these was conducted in the Northern Ireland Barrett’s Esophagus Registry in 2011. It followed more than 8,000 patients for 7 years. Patients with IM were 3.5 times more likely to develop a related adenocarcinoma than were those without IM (J Natl Cancer Inst. 2011;103:1049-57).

The contradictory evidence leads Dr. Shaheen to suggest a specific biopsy protocol for patients with Barrett’s esophagus.

“In my opinion, if you see a long segment of Barrett’s – more than 2 cm – and the biopsy is negative for IM, there is a good chance that you have a sampling error there, and a second endoscopy and biopsy are not unreasonable. If you see a short segment of Barrett’s and the biopsy is negative for IM, the cancer risk is unclear, but in general it’s probably pretty low, whether there are goblet cells there or not. I would say retaining these patients under endoscopic surveillance is of dubious value. [With] the likely low absolute risk of cancer in this patient population, no blanket recommendation for surveillance is advisable.”

 

 

Dr. Shaheen had no relevant financial disclosures.

[email protected]

On Twitter @Alz_Gal

CHICAGO – If scouring data is what makes a gastroenterologist feel good about risk assessment, there may be a lot of unhappy gastroenterologists out there, at least when it comes to the risk of cancer arising from cardiac mucosa in Barrett’s esophagus, according to Nicholas J. Shaheen, MD.

The risk arising from this nonintestinal metaplasia growth is probably quite low in real life, but the extant literature gives doctors a lot of contradictions, he said at the meeting sponsored by the American Gastroenterological Association.

“The risk of cancer with cardiac mucosa is unclear,” said Dr. Shaheen of the University of North Carolina at Chapel Hill. “Some data do suggest that, at least when present in the tubular esophagus in patients with gastroesophageal reflux symptoms, there may be a risk of adenocarcinoma close to what’s seen in patients with intestinal metaplasia. Other data suggest the risk is quite low, perhaps even approximating that of the general population.”

The reasons for what Dr. Shaheen called “remarkable variability” in these data probably lie more in sampling error than in real-life differences. The studies are retrospective; many lack long-term follow-up data, are plagued by insufficient numbers, and – perhaps most importantly – are not grounded in any standard clinical methodology.

“People who do endoscopy for a living understand that the stuff you read about systematic biopsy protocols is hardly ever honored in the breach. None of these studies ever reports the biopsy protocol from which the samples were taken.”

This lack of protocol means that studies on the cancer risk of columnar-lined esophagus (CLE) that is negative for intestinal metaplasia are probably flawed from the beginning.

“The truth is that most gastroenterologists do a lousy job of biopsying Barrett’s, so there is probably a lot of sampling error in these studies, and they are contaminated with a high rate of intestinal metaplasia [IM],” said Dr. Shaheen.

And these studies do not report the length of the CLE segment from which the biopsy was taken. “The likelihood of finding goblet cells [a characteristic of intestinal metaplasia] increases with the length of Barrett’s. None of the studies is normalized for Barrett’s length. When we see studies saying the cancer risk is higher in the presence of goblet cells, length could be a partial confounder of that association.”

A 2009 study with a small sample of 68 CLE patients found that abnormal DNA was just as likely in IM-negative samples as in IM-positive ones. All of the samples differed significantly from control samples, suggesting that any metaplasia in the CLE may already be on the path to cancer, Dr. Shaheen said (Am J Gastroenterol. 2009;104:816-24).

In fact, a 2007 Scandinavian study supported the idea that IM isn’t even necessary for neoplastic progression of CLE (Scand J Gastroenterol. 2007;42:1271-4). The investigators followed 712 patients for 12 years and found that the adenocarcinoma rate was about 0.4% per patient per year whether or not the sample was IM positive.
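Reading the reported rate as roughly 0.4% per patient-year, the way such a small annual rate compounds over a follow-up period can be sketched as follows (illustrative arithmetic only, not a calculation from the study itself):

```python
def cumulative_risk(annual_rate: float, years: int) -> float:
    """Probability of at least one event over `years`, assuming a
    constant, independent annual event rate."""
    return 1 - (1 - annual_rate) ** years

# A 0.4% annual rate compounded over 12 years of follow-up:
print(f"{cumulative_risk(0.004, 12):.1%}")  # → 4.7%
```

Even over a decade of surveillance, the absolute cumulative risk implied by a rate of this size remains in the single digits.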

“This study was enough to put a little shudder in the endoscopy community. If IM doesn’t matter, you’re talking about increasing the work in the endoscopy lab by 100%, because there are twice as many non-IM patients as those with IM.”

A 2008 study seemingly found something similar – but with a caveat, Dr. Shaheen said. The CLE patients in this study were followed for 3.5 years, and the cancer rate was virtually identical whether or not the initial biopsy showed IM. But as the follow-up progressed, more and more biopsies turned up IM positive. “A first negative biopsy looked like it was associated with disease-free survival, but almost all IM-negative samples eventually became IM positive, so this didn’t really answer our question.”

Other studies have found that non-IM CLE has a very low neoplastic risk, and that IM is almost always a prerequisite for cancer to develop. The largest of these was conducted in the Northern Ireland Barrett’s Esophagus Registry in 2011. It followed more than 8,000 patients for 7 years. Patients with IM were 3.5 times more likely to develop a related adenocarcinoma than were those without IM (J Natl Cancer Inst. 2011;103:1049-57).

The contradictory evidence leads Dr. Shaheen to suggest a specific biopsy protocol for patients with Barrett’s esophagus.

“In my opinion, if you see a long segment of Barrett’s – more than 2 cm – and the biopsy is negative for IM, there is a good chance that you have a sampling error there, and a second endoscopy and biopsy are not unreasonable. If you see a short segment of Barrett’s and the biopsy is negative for IM, the cancer risk is unclear, but in general it’s probably pretty low, whether there are goblet cells there or not. I would say retaining these patients under endoscopic surveillance is of dubious value. [With] the likely low absolute risk of cancer in this patient population, no blanket recommendation for surveillance is advisable.”


Dr. Shaheen had no relevant financial disclosures.

[email protected]

On Twitter @Alz_Gal

Display Headline
Data are mixed on cancerous transformation of cardiac mucosa in Barrett’s esophagus

Article Source

EXPERT ANALYSIS FROM THE 2016 JAMES W. FRESTON CONFERENCE

The new NOACs are generally the best bet


New NOACs have largely replaced the need for vitamin K antagonists

The discovery of oral anticoagulants began in 1924, when Schofield linked the deaths of grazing cattle from internal hemorrhage to their consumption of spoiled sweet clover hay.1 It was not until 1941, however, that Campbell and Link, while working to understand this observation, identified the anticoagulant dicoumarol, which formed as a result of the spoiling process.2 After it was noted that vitamin K reversed the effect of dicoumarol, synthesis of the first class of oral anticoagulants, the vitamin K antagonists (VKAs), began. Despite the numerous challenges associated with managing patients on this class of drugs, VKAs have been the mainstay of oral anticoagulation therapy for the past 70 years. Over the past 5 years, however, new oral anticoagulants (NOACs) have emerged and are changing clinical practice. Mechanistically, these medications are targeted therapies that act either as direct thrombin inhibitors (dabigatran etexilate) or as direct factor Xa inhibitors (rivaroxaban, apixaban, and edoxaban). Given their favorable pharmacologic design, NOACs have the potential to replace VKAs: they not only have an encouraging safety profile but are also therapeutically equivalent, or even superior, to VKAs in certain patient populations.

Pharmacologic design

Dr. Elliot Chaikof

The targeted drug design of NOACs provides many pharmacologic advantages. Compared with VKAs, NOACs have a notably more predictable pharmacologic profile and a relatively wide therapeutic window, which allows for fixed dosing, a rapid onset and offset of action, and fewer drug interactions.3 These characteristics eliminate the need for the routine dose monitoring and serial dose adjustments frequently associated with VKAs. Additionally, NOACs less commonly require bridging therapy with parenteral unfractionated heparin or low molecular weight heparins (LMWH) while awaiting therapeutic drug levels, as these levels are reached sooner and more predictably than with VKAs.4 As with any medication, however, appropriate consideration should be given to specific patient populations, such as those who are older or have significant comorbidities that may influence drug effect and clearance.

Lastly, it should be mentioned that the pharmacologic benefits of NOACs are not only beneficial from a patient perspective, but also from a health care systems standpoint as their use may provide an opportunity to deliver more cost-effective care. Specifically, economic models using available clinical trial data for stroke prevention in nonvalvular atrial fibrillation have shown that NOACs (apixaban, dabigatran, and rivaroxaban) are cost-effective alternatives when compared with warfarin.5 Although the results from such economic analyses are limited by the modeling assumptions they rely upon, these findings suggest that, at least initially, cost should not be used as a prohibitive reason for adopting these new therapeutics.
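The economic models referenced above typically summarize value as an incremental cost-effectiveness ratio (ICER). A minimal sketch of that calculation follows; the numbers are invented for demonstration and are not figures from the cited analyses:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost per quality-adjusted life-year (QALY) gained:
    (difference in cost) / (difference in QALYs)."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# A hypothetical therapy costing $6,000 more that yields 0.15 extra QALYs:
print(round(icer(18000, 12000, 5.75, 5.60)))  # → 40000
```

A therapy is then judged cost-effective when its ICER falls below whatever willingness-to-pay threshold the model adopts.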

Patient selection

The decision to institute oral anticoagulation therapy depends on each patient’s individualized ratio of bleeding risk to the benefit of ischemia prevention. A major determinant of this ratio is the clinical indication for which anticoagulation is begun. Numerous phase III clinical trials have compared NOACs with VKAs or placebo for the management of nonvalvular atrial fibrillation (AF) and venous thromboembolism (VTE), and as adjunctive therapy for patients with acute coronary syndrome.6 Meta-analyses of randomized trials have shown the most significant benefit to be in patients with nonvalvular atrial fibrillation, in whom NOACs achieve significant reductions in stroke, intracranial hemorrhage, and all-cause mortality compared with warfarin, while displaying variable effects with regard to gastrointestinal bleeding.6,7

In patients with VTE, NOACs have been found to have similar efficacy, compared with VKAs, with regard to the prevention of VTE or VTE-related death, and have been noted to have a better safety profile.6 Lastly, when studied as an adjunctive agent to dual antiplatelet therapy in patients with acute coronary syndrome, it should be noted that NOACs have been associated with an increased bleeding risk without a significant decrease in thrombosis risk.6 Taken together, these data suggest that the primary indication for instituting NOAC therapy should be considered strongly when deciding upon the class of anticoagulant to use.

Overcoming challenges

Since the introduction of NOACs, there has been concern over the lack of specific antidotes, especially for patients with impaired clearance, those likely to need an urgent or emergent procedure, and those presenting with life-threatening bleeding complications. Most recently, however, interim analysis of clinical trial data has shown complete reversal of the direct thrombin inhibitor dabigatran with the humanized monoclonal antibody idarucizumab within minutes of administration in greater than 88% of patients studied.8 Similarly, agents such as PER977 are currently in phase II clinical trials, as they have been shown to form noncovalent hydrogen bonds and charge-charge interactions with oral factor Xa inhibitors as well as oral thrombin inhibitors, leading to their reversal.9 Given these promising findings, it likely will not be long until reversal agents for NOACs become clinically available. Until that time, it is encouraging that the bleeding profile of these drugs has been found to be favorable compared with VKAs, and that their short half-life allows for a relatively expeditious natural reversal of their anticoagulant effect as the drug is eliminated.

Conclusions

Unlike the serendipitous path leading to the discovery of the first class of oral anticoagulants (VKAs), NOACs have been specifically designed to provide targeted anticoagulation and to address the shortcomings of VKAs. To this end, NOACs are becoming increasingly important in the management of patients with specific clinical conditions such as nonvalvular atrial fibrillation and venous thromboembolism where they have been shown to provide a larger net clinical benefit relative to the available alternatives. Furthermore, with economic analyses providing evidence that NOACs are cost-effective for the health care system and clinical trial results suggesting progress in the development of antidotes for reversal, it is likely that with growing experience, these agents will replace VKAs as the mainstay for prophylactic and therapeutic oral anticoagulation in targeted patient populations.

Madhukar S. Patel, MD, and Elliot L. Chaikof, MD, are from the department of surgery, Beth Israel Deaconess Medical Center, Boston. They reported having no conflicts of interest.

References

1. J Am Vet Med Assoc 1924;64:553-575

2. J Biol Chem 1941;138:21-33

3. Hematology Am Soc Hematol Educ Program 2013;2013:464-470

4. Eur Heart J 2013;34:2094-2106

5. Stroke 2013;44:1676-1681

6. Nat Rev Cardiol 2014;11:693-703

7. Lancet 2014;383:955-962

8. N Engl J Med 2015;373:511-520

9. N Engl J Med 2014;371:2141-2142

What the doctor didn’t order: unintended consequences and pitfalls of NOACs

Recently, several new oral anticoagulants (NOACs) have gained FDA approval to replace warfarin, capturing the attention of the popular media. These include dabigatran, rivaroxaban, apixaban, and edoxaban. Dabigatran targets activated factor II (factor IIa), while rivaroxaban, apixaban, and edoxaban target activated factor X (factor Xa). Easy to take as a once- or twice-daily pill, with no cumbersome monitoring, they represent a seemingly ideal treatment for the chronically anticoagulated patient. All of these agents are currently FDA approved in the United States for the treatment of acute VTE and AF.

Dabigatran and edoxaban

Dr. Thomas Wakefield

Similar to warfarin, dabigatran and edoxaban require an LMWH or UFH “bridge” when therapy is begun, while rivaroxaban and apixaban are instituted as monotherapy without such a bridge. Dabigatran etexilate (Pradaxa®, Boehringer Ingelheim) has the longest half-life of all of the NOACs at 12-17 hours, and this half-life is prolonged with increasing age and decreasing renal function.1 It is the only new agent that can be at least partially reversed with dialysis.2 Edoxaban (Savaysa®, Daiichi Sankyo) carries a boxed warning stating that this agent is less effective in AF patients with a creatinine clearance greater than 95 mL/min, and that kidney function should be assessed prior to starting treatment: such patients have a greater risk of stroke compared with similar patients treated with warfarin. Edoxaban is the only agent specifically tested at a lower dose in patients at significantly increased risk of bleeding complications (low body weight and/or decreased creatinine clearance).3
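The practical meaning of a 12-17 hour half-life can be sketched with simple first-order elimination arithmetic. This is an idealized model; as noted above, real clearance varies with age and renal function:

```python
def fraction_remaining(hours, half_life_hours):
    """Fraction of drug remaining after `hours`, assuming ideal
    first-order (exponential) elimination."""
    return 0.5 ** (hours / half_life_hours)

# 24 hours after the last dose, across the quoted 12-17 h range:
for t_half in (12, 17):
    print(f"t1/2 = {t_half} h: {fraction_remaining(24, t_half):.0%} remaining")
```

Under this model, a full day after the last dose roughly a quarter to a third of the drug effect remains, which is why timing of the last dose matters so much before procedures.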

Rivaroxaban and apixaban

Rivaroxaban (Xarelto®, Bayer and Janssen) and apixaban (Eliquis®, Bristol-Myers Squibb), uniquely among the NOACs, have been tested for extended therapy of acute deep vein thrombosis after 6-12 months of treatment. They were found to produce a significant decrease in recurrent VTE without an increase in major bleeding, compared with placebo.4,5 Rivaroxaban has once-daily dosing and apixaban twice-daily dosing; both are immediate monotherapy, making them quite convenient for patients. Apixaban is the only agent among the NOACs to show a slight decrease in gastrointestinal bleeding compared with warfarin.6

Consequences and pitfalls with NOACs

Several problems with these new drugs may temper enthusiasm for them totally replacing warfarin: the inability to reliably measure their levels or reverse their anticoagulant effects, the lack of data on bridging when other procedures need to be performed, their short half-lives, and the lack of data on their anti-inflammatory effects. With regard to monitoring of anticoagulation, the International Society on Thrombosis and Haemostasis (ISTH) has published the circumstances in which it might be useful to obtain drug levels. These include:

• When a patient is bleeding.

• Before surgery or an invasive procedure when the patient has taken the drug in the previous 24 hours, or longer if creatinine clearance (CrCl) is less than 50 mL/min.

• Identification of subtherapeutic or supratherapeutic levels in patients taking other drugs that are known to affect pharmacokinetics.

• Identification of subtherapeutic or supratherapeutic levels in patients at body weight extremes.

• Patients with deteriorating renal function.

• During perioperative management.

• During reversal of anticoagulation.

• When there is suspicion of overdose.

• Assessment of compliance in patients suffering thrombotic events while on treatment.7
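The checklist above can be sketched as a simple decision helper. All function and parameter names below are illustrative inventions, not part of the ISTH guidance itself:

```python
def level_measurement_useful(*, bleeding=False, preprocedure=False,
                             hours_since_dose=None, crcl_ml_min=None,
                             interacting_drugs=False, weight_extreme=False,
                             worsening_renal_function=False,
                             perioperative=False, reversing=False,
                             overdose_suspected=False,
                             thrombosis_on_treatment=False) -> bool:
    """Return True when the checklist suggests obtaining a NOAC level."""
    if (bleeding or interacting_drugs or weight_extreme
            or worsening_renal_function or perioperative or reversing
            or overdose_suspected or thrombosis_on_treatment):
        return True
    if preprocedure and hours_since_dose is not None:
        if hours_since_dose < 24:   # drug taken in the previous 24 hours
            return True
        if crcl_ml_min is not None and crcl_ml_min < 50:
            return True             # longer window when CrCl < 50 mL/min
    return False

# e.g., a patient 12 hours from the last dose before surgery:
print(level_measurement_useful(preprocedure=True, hours_since_dose=12))  # → True
```

The point of the sketch is simply that most of the criteria are categorical triggers, while the preprocedure criterion depends jointly on dose timing and renal clearance.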

Currently, there exists no commercially available reversal agent for any of the NOACs, and existing reversal agents for traditional anticoagulants are of limited, if any, use. Drugs under development include agents directed against the factor Xa inhibitors and against the thrombin inhibitor. Until specific reversal agents exist, supportive care is the mainstay of therapy. In cases of trauma or severe or life-threatening bleeding, administration of concentrated clotting factors (prothrombin complex concentrate) or dialysis (dabigatran only) may be utilized; however, data from large clinical trials are lacking. A recent study of 90 patients receiving an antibody directed against dabigatran revealed that the anticoagulant effects of dabigatran were reversed safely within minutes of administration; however, drug levels were not consistently suppressed at 24 hours in 20% of the cohort.8

Currently, there are no national guidelines or large-scale studies to guide bridging NOACs for procedures. The relatively short half-life of these agents makes it likely that traditional bridging, as practiced for warfarin, is not necessary.9 However, this is a double-edged sword: withholding anticoagulation for even two doses (for instance, if a patient becomes ill or a clinician is overly cautious around the time of a procedure) may leave the patient unprotected.

The final question with the new agents concerns their anti-inflammatory effects. We know that heparin and LMWH have significant pleiotropic effects that are not necessarily related to their anticoagulant effects. These effects are important in decreasing the inflammatory nature of the thrombus and its effect on the vein wall. We do not know whether the new oral agents have similar effects, as this has never been fully tested. Given that two of the agents are used as monotherapy without any heparin/LMWH bridge, the anti-inflammatory properties of these new agents should be defined to make sure that such a bridge is not necessary.

In summary, although these agents have much to offer, many questions remain to be answered before they totally replace traditional approaches to anticoagulation in the realm of VTE. It must not be overlooked that, despite all their benefits, each carries a risk of bleeding, as all target portions of the coagulation mechanism. We caution that, as with any “gift horse,” physicians should examine the data closely and proceed with caution.

Thomas Wakefield, MD, is the Stanley Professor of Vascular Surgery; head, section of vascular surgery; and director, Samuel and Jean Frankel Cardiovascular Center. Andrea Obi, MD, is a vascular surgery fellow, and Dawn Coleman, MD, is the program director, section of vascular surgery, all at the University of Michigan, Ann Arbor. They reported having no conflicts of interest.

References

1. N Engl J Med 2009;361:2342-2352

2. J Vasc Surg Venous Lymphat Disord 2013;1:418-426

3. N Engl J Med 2013;369:1406-1415

4. N Engl J Med 2010;363:2499-2510

5. N Engl J Med 2013;368:699-708

6. Arterioscler Thromb Vasc Biol 2015;35:1056-1065

7. J Thromb Haemost 2013;11:756-760

8. N Engl J Med 2015;373:511-520

9. Curr Opin Anaesthesiol 2014;27:409-419

References

Author and Disclosure Information

Publications
Topics
Sections
Author and Disclosure Information

Author and Disclosure Information

New NOACs have largely replaced the need for vitamin K antagonists

The discovery of oral anticoagulants began in 1924, when Schofield linked the death of grazing cattle from internal hemorrhage to the consumption of spoiled sweet clover hay.1 It was not until 1941, however, while trying to understand this observation that Campbell and Link were able to identify the dicoumarol anticoagulant, which formed as a result of the spoiling process.2 Ultimately, after noting that vitamin K led to reversal of the dicoumarol effect, synthesis of the first class of oral anticoagulants, known as vitamin K antagonists (VKAs) began. Despite the numerous challenges associated with managing patients using this class of anticoagulants, VKAs have become the mainstay of oral anticoagulation therapy for the past 70 years. Over the past 5 years, however, new oral anticoagulants (NOACs) have emerged and are changing clinical practice. Mechanistically, these medications are targeted therapies and work as either direct thrombin inhibitors (dabigatran etexilate) or direct factor Xa inhibitors (rivaroxaban, apixaban, and edoxaban). Given their favorable pharmacologic design, NOACs have the potential to replace VKAs as they not only have an encouraging safety profile, but also are therapeutically equivalent or even superior to VKAs when used in certain patient populations.

Pharmacologic design

Dr. Elliot Chaikof

The targeted drug design of NOACs provides many pharmacologic advantages. Compared with VKAs, NOACs have a notably more predictable pharmacologic profile and relatively wide therapeutic window, which allows for fixed dosing, a rapid onset and offset, and fewer drug interactions.3 These characteristics eliminate the need for the routine dose monitoring and serial dose adjustments frequently associated with VKAs. Additionally, NOACs less commonly require bridging therapy with parenteral unfractionated heparin or low molecular weight heparins (LMWH) while awaiting therapeutic drug levels, as these levels are reached sooner and more predictably than with VKAs.4 As with any medication, however, appropriate consideration should to be given to specific patient populations such as those who are older or have significant comorbidities which may influence drug effect and clearance.

Lastly, it should be mentioned that the pharmacologic benefits of NOACs are not only beneficial from a patient perspective, but also from a health care systems standpoint as their use may provide an opportunity to deliver more cost-effective care. Specifically, economic models using available clinical trial data for stroke prevention in nonvalvular atrial fibrillation have shown that NOACs (apixaban, dabigatran, and rivaroxaban) are cost-effective alternatives when compared with warfarin.5 Although the results from such economic analyses are limited by the modeling assumptions they rely upon, these findings suggest that, at least initially, cost should not be used as a prohibitive reason for adopting these new therapeutics.

Patient selection

The decision to institute oral anticoagulation therapy depends on each patient’s individualized bleeding risk to benefit of ischemia prevention ratio. A major determinant of this ratio is the clinical indication for which anticoagulation is begun. Numerous phase III clinical trials have been conducted comparing the use of NOACs versus VKAs or placebos for the management of nonvalvular atrial fibrillation (AF), venous thromboembolism (VTE), and as adjunctive therapy for patients with acute coronary syndrome.6 Meta-analyses of randomized trials have shown the most significant benefit to be in patients with nonvalvular atrial fibrillation where NOACs have significant reductions in stroke, intracranial hemorrhage, and all-cause mortality, compared with warfarin while displaying variable effects with regards to gastrointestinal bleeding.6,7

In patients with VTE, NOACs have been found to have similar efficacy, compared with VKAs, with regard to the prevention of VTE or VTE-related death, and have been noted to have a better safety profile.6 Lastly, when studied as an adjunctive agent to dual antiplatelet therapy in patients with acute coronary syndrome, it should be noted that NOACs have been associated with an increased bleeding risk without a significant decrease in thrombosis risk.6 Taken together, these data suggest that the primary indication for instituting NOAC therapy should be considered strongly when deciding upon the class of anticoagulant to use.

Overcoming challenges

Since the introduction of NOACs, there has been concern over the lack of specific antidotes to therapy, especially when administered in patients with impaired clearance, a high likelihood of need for an urgent or emergent procedure, or those presenting with life-threatening bleeding complications. Most recently, however, interim analysis from clinical trial data has shown complete reversal of the direct thrombin inhibitor dabigatran with the humanized monocolonal antibody idarucizumab within minutes of administration in greater than 88% of patients studied.8 Similarly, agents such as a PER977 are currently in phase II clinical trials as they have been shown to form noncovalent hydrogen bonds and charge-charge interactions with oral factor Xa inhibitors as well as oral thrombin inhibitors leading to their reversal.9 Given these promising findings, it likely will not be long until reversal agents for NOACs become clinically available. Until that time, it is encouraging that the bleeding profile of these drugs has been found to be favorable, compared with VKAs, and their short half-life allows for a relatively expeditious natural reversal of their anticoagulant effect as the drug is eliminated.

 

 

Conclusions

Unlike the serendipitous path leading to the discovery of the first class of oral anticoagulants (VKAs), NOACs have been specifically designed to provide targeted anticoagulation and to address the shortcomings of VKAs. To this end, NOACs are becoming increasingly important in the management of patients with specific clinical conditions such as nonvalvular atrial fibrillation and venous thromboembolism where they have been shown to provide a larger net clinical benefit relative to the available alternatives. Furthermore, with economic analyses providing evidence that NOACs are cost-effective for the health care system and clinical trial results suggesting progress in the development of antidotes for reversal, it is likely that with growing experience, these agents will replace VKAs as the mainstay for prophylactic and therapeutic oral anticoagulation in targeted patient populations.

Madhukar S. Patel, MD, and Elliot L. Chaikof, MD, are from the department of surgery, Beth Israel Deaconess Medical Center, Boston. They reported having no conflicts of interest.

References

1. J Am Vet Med Assoc 1924;64:553-575

2. J Biol Chem 1941;138:21-33

3. Hematology Am Soc Hematol Educ Program 2013;2013:464-470

4. Eur Heart J 2013;34:2094-2106

5. Stroke 2013;44:1676-1681

6. Nat Rev Cardiol 2014;11:693-703

7. Lancet 2014;383:955-962

8. N Engl J Med 2015;373:511-520

9. N Engl J Med 2014;371:2141-2142

What the doctor didn’t order: unintended consequences and pitfalls of NOACs

Recently, several new oral anticoagulants (NOACs) have gained FDA approval to replace warfarin, capturing the attention of popular media. These include dabigatran, rivaroxaban, apixaban, and edoxaban. Dabigatran targets activated factor II (factor IIa), while rivaroxaban, apixaban, and edoxaban target activated factor X (factor Xa). Easy to take with a once or twice daily pill, with no cumbersome monitoring, they represent a seemingly ideal treatment for the chronically anticoagulated patient. All agents are currently FDA approved in the United States for treatment of acute VTE and AF.

Dabigatran and edoxaban

Dr. Thomas Wakefield

Similar to warfarin, dabigatran and edoxaban require the use of a LMWH or UFH “bridge” when therapy is beginning, while rivaroxaban and apixaban are instituted as monotherapy without such a bridge. Dabigatran etexilate (PradaxaR, Boehringer Ingelheim) has the longest half-life of all of the NOACs at 12-17 hours, and this half-life is prolonged with increasing age and decreasing renal function.1 It is the only new agent which can be at least partially reversed with dialysis.2 Edoxaban (SavaysaR, Daiichi Sankyo) carries a boxed warning stating that this agent is less effective in AF patients with a creatinine clearance greater than 95 mL/min, and that kidney function should be assessed prior to starting treatment: Such patients have a greater risk of stroke, compared with similar patients treated with warfarin. Edoxaban is the only agent specifically tested at a lower dose in patients at significantly increased risk of bleeding complications (low body weight and/or decreased creatinine clearance).3

Rivaroxaban and apixaban

Rivaroxaban (XareltoR, Bayer and Janssen), and apixaban (EliquisR, Bristol Myers-Squibb), unique amongst the NOACs, have been tested for extended therapy of acute deep vein thrombosis after treatment of 6-12 months. They were found to result in a significant decrease in recurrent VTE without an increase in major bleeding, compared with placebo.4,5 Rivaroxaban has once-daily dosing and apixaban has twice-daily dosing; both are immediate monotherapy, making them quite convenient for patients. Apixaban is the only agent among the NOACs to have a slight decrease in gastrointestinal bleeding, compared with warfarin.6

Consequences and pitfalls with NOACs

Problems with these new drugs, which may diminish our current level of enthusiasm for these agents to totally replace warfarin, include the inability to reliably follow their levels or reverse their anticoagulant effects, the lack of data available on bridging when other procedures need to be performed, their short half-lives, and the lack of data on their anti-inflammatory effects. With regard to monitoring of anticoagulation, the International Society of Thrombosis and Hemostasis (ISTH) has published the times when it might be useful to obtain levels. These times include:

• When a patient is bleeding.

• Before surgery or an invasive procedure when the patient has taken the drug in the previous 24 hours, or longer if creatinine clearance (CrCl) is less than 50 mL min.

• Identification of subtherapeutic or supratherapeutic levels in patients taking other drugs that are known to affect pharmacokinetics.

• Identification of subtherapeutic or supratherapeutic levels in patients at body weight extremes.

• Patients with deteriorating renal function.

• During perioperative management.

• During reversal of anticoagulation.

• When there is suspicion of overdose.

• Assessment of compliance in patients suffering thrombotic events while on treatment.7


New NOACs have largely replaced the need for vitamin K antagonists

The discovery of oral anticoagulants began in 1924, when Schofield linked the death of grazing cattle from internal hemorrhage to the consumption of spoiled sweet clover hay.1 It was not until 1941, however, that Campbell and Link, seeking to explain this observation, identified dicoumarol, the anticoagulant formed during the spoiling process.2 After noting that vitamin K reversed the effect of dicoumarol, investigators synthesized the first class of oral anticoagulants, known as vitamin K antagonists (VKAs). Despite the numerous challenges associated with managing patients on this class of anticoagulants, VKAs have been the mainstay of oral anticoagulation therapy for the past 70 years. Over the past 5 years, however, new oral anticoagulants (NOACs) have emerged and are changing clinical practice. Mechanistically, these medications are targeted therapies that act either as direct thrombin inhibitors (dabigatran etexilate) or as direct factor Xa inhibitors (rivaroxaban, apixaban, and edoxaban). Given their favorable pharmacologic design, NOACs have the potential to replace VKAs, as they not only have an encouraging safety profile but also are therapeutically equivalent or even superior to VKAs in certain patient populations.

Pharmacologic design


The targeted drug design of NOACs provides many pharmacologic advantages. Compared with VKAs, NOACs have a notably more predictable pharmacologic profile and a relatively wide therapeutic window, which allow for fixed dosing, rapid onset and offset, and fewer drug interactions.3 These characteristics eliminate the routine dose monitoring and serial dose adjustments frequently required with VKAs. Additionally, NOACs less commonly require bridging therapy with parenteral unfractionated heparin or low molecular weight heparin (LMWH) while awaiting therapeutic drug levels, as these levels are reached sooner and more predictably than with VKAs.4 As with any medication, however, appropriate consideration should be given to specific patient populations, such as patients who are older or have significant comorbidities that may influence drug effect and clearance.

Lastly, the pharmacologic advantages of NOACs matter not only from the patient's perspective but also from a health care systems standpoint, as their use may provide an opportunity to deliver more cost-effective care. Specifically, economic models using available clinical trial data for stroke prevention in nonvalvular atrial fibrillation have shown that NOACs (apixaban, dabigatran, and rivaroxaban) are cost-effective alternatives to warfarin.5 Although the results of such economic analyses are limited by the modeling assumptions they rely upon, these findings suggest that, at least initially, cost should not be a prohibitive reason for adopting these new therapeutics.

Patient selection

The decision to institute oral anticoagulation therapy depends on each patient's individualized balance of bleeding risk against the benefit of ischemia prevention. A major determinant of this balance is the clinical indication for which anticoagulation is begun. Numerous phase III clinical trials have compared NOACs with VKAs or placebo for the management of nonvalvular atrial fibrillation (AF), venous thromboembolism (VTE), and as adjunctive therapy for patients with acute coronary syndrome.6 Meta-analyses of randomized trials have shown the most significant benefit in patients with nonvalvular atrial fibrillation, in whom NOACs achieve significant reductions in stroke, intracranial hemorrhage, and all-cause mortality, compared with warfarin, while displaying variable effects with regard to gastrointestinal bleeding.6,7

In patients with VTE, NOACs have been found to have efficacy similar to that of VKAs with regard to the prevention of VTE or VTE-related death, along with a better safety profile.6 Lastly, when studied as adjuncts to dual antiplatelet therapy in patients with acute coronary syndrome, NOACs have been associated with an increased bleeding risk without a significant decrease in thrombosis risk.6 Taken together, these data suggest that the primary indication for anticoagulation should weigh heavily in the choice of anticoagulant class.

Overcoming challenges

Since the introduction of NOACs, there has been concern over the lack of specific antidotes, especially for patients with impaired clearance, those likely to need an urgent or emergent procedure, and those presenting with life-threatening bleeding complications. Most recently, however, an interim analysis of clinical trial data has shown complete reversal of the direct thrombin inhibitor dabigatran with the humanized monoclonal antibody idarucizumab within minutes of administration in greater than 88% of patients studied.8 Similarly, agents such as PER977, which has been shown to form noncovalent hydrogen bonds and charge-charge interactions with both oral factor Xa inhibitors and oral thrombin inhibitors, leading to their reversal, are currently in phase II clinical trials.9 Given these promising findings, it likely will not be long until reversal agents for NOACs become clinically available. Until that time, it is encouraging that the bleeding profile of these drugs has been found to be favorable, compared with VKAs, and that their short half-life allows for a relatively expeditious natural reversal of the anticoagulant effect as the drug is eliminated.

Conclusions

Unlike the serendipitous path that led to the discovery of the first class of oral anticoagulants (VKAs), NOACs have been specifically designed to provide targeted anticoagulation and to address the shortcomings of VKAs. They are becoming increasingly important in the management of specific clinical conditions such as nonvalvular atrial fibrillation and venous thromboembolism, for which they have been shown to provide a larger net clinical benefit than the available alternatives. Furthermore, with economic analyses providing evidence that NOACs are cost-effective for the health care system and clinical trial results suggesting progress in the development of reversal antidotes, it is likely that with growing experience these agents will replace VKAs as the mainstay of prophylactic and therapeutic oral anticoagulation in targeted patient populations.

Madhukar S. Patel, MD, and Elliot L. Chaikof, MD, are from the department of surgery, Beth Israel Deaconess Medical Center, Boston. They reported having no conflicts of interest.

References

1. J Am Vet Med Assoc. 1924;64:553-575.

2. J Biol Chem. 1941;138:21-33.

3. Hematology Am Soc Hematol Educ Program. 2013;2013:464-470.

4. Eur Heart J. 2013;34:2094-2106.

5. Stroke. 2013;44:1676-1681.

6. Nat Rev Cardiol. 2014;11:693-703.

7. Lancet. 2014;383:955-962.

8. N Engl J Med. 2015;373:511-520.

9. N Engl J Med. 2014;371:2141-2142.

What the doctor didn’t order: unintended consequences and pitfalls of NOACs

Recently, several new oral anticoagulants (NOACs) have gained FDA approval as alternatives to warfarin, capturing the attention of the popular media. These include dabigatran, rivaroxaban, apixaban, and edoxaban. Dabigatran targets activated factor II (factor IIa), while rivaroxaban, apixaban, and edoxaban target activated factor X (factor Xa). Easy to take as a once- or twice-daily pill, with no cumbersome monitoring, they represent a seemingly ideal treatment for the chronically anticoagulated patient. All four agents are currently FDA approved in the United States for the treatment of acute venous thromboembolism (VTE) and atrial fibrillation (AF).

Dabigatran and edoxaban


Similar to warfarin, dabigatran and edoxaban require a low molecular weight heparin (LMWH) or unfractionated heparin (UFH) “bridge” when therapy is begun, while rivaroxaban and apixaban are instituted as monotherapy without such a bridge. Dabigatran etexilate (Pradaxa®, Boehringer Ingelheim) has the longest half-life of all the NOACs, at 12-17 hours, and this half-life is prolonged with increasing age and decreasing renal function.1 It is the only one of the new agents that can be at least partially reversed with dialysis.2 Edoxaban (Savaysa®, Daiichi Sankyo) carries a boxed warning stating that this agent is less effective in AF patients with a creatinine clearance greater than 95 mL/min, and that kidney function should be assessed prior to starting treatment: such patients have a greater risk of stroke, compared with similar patients treated with warfarin. Edoxaban is the only agent specifically tested at a lower dose in patients at significantly increased risk of bleeding complications (low body weight and/or decreased creatinine clearance).3

Rivaroxaban and apixaban

Rivaroxaban (Xarelto®, Bayer and Janssen) and apixaban (Eliquis®, Bristol-Myers Squibb) are unique among the NOACs in having been tested for extended therapy of acute deep vein thrombosis after an initial 6-12 months of treatment; both significantly decreased recurrent VTE without increasing major bleeding, compared with placebo.4,5 Rivaroxaban is dosed once daily and apixaban twice daily, and both are begun as immediate monotherapy, making them quite convenient for patients. Apixaban is the only NOAC associated with a slight decrease in gastrointestinal bleeding, compared with warfarin.6

Consequences and pitfalls with NOACs

Problems with these new drugs, which may temper enthusiasm for having them totally replace warfarin, include the inability to reliably measure their levels or reverse their anticoagulant effects, the lack of data on bridging when other procedures must be performed, their short half-lives, and the absence of data on their anti-inflammatory effects. With regard to monitoring of anticoagulation, the International Society on Thrombosis and Haemostasis (ISTH) has published the circumstances in which it may be useful to obtain drug levels. These include:

• When a patient is bleeding.

• Before surgery or an invasive procedure when the patient has taken the drug within the previous 24 hours, or longer if creatinine clearance (CrCl) is less than 50 mL/min.

• Identification of subtherapeutic or supratherapeutic levels in patients taking other drugs that are known to affect pharmacokinetics.

• Identification of subtherapeutic or supratherapeutic levels in patients at body weight extremes.

• Patients with deteriorating renal function.

• During perioperative management.

• During reversal of anticoagulation.

• When there is suspicion of overdose.

• Assessment of compliance in patients suffering thrombotic events while on treatment.7

Currently, no commercially available reversal agent exists for any of the NOACs, and existing reversal agents for traditional anticoagulants are of limited, if any, use. Drugs under development include agents directed against the factor Xa inhibitors and the thrombin inhibitor. Until specific reversal agents are available, supportive care is the mainstay of therapy. In cases of trauma or severe or life-threatening bleeding, administration of concentrated clotting factors (prothrombin complex concentrate) or dialysis (dabigatran only) may be utilized; however, data from large clinical trials are lacking. A recent study of 90 patients receiving an antibody directed against dabigatran found that the anticoagulant effects of dabigatran were reversed safely within minutes of administration; however, drug levels were not consistently suppressed at 24 hours in 20% of the cohort.8

Currently, there are no national guidelines or large-scale studies to guide bridging of NOACs for procedures. The relatively short half-life of these agents makes it likely that traditional bridging, as practiced for warfarin, is unnecessary.9 However, this represents a double-edged sword: withholding anticoagulation for even two doses (for example, if a patient becomes ill or a clinician is overly cautious around the time of a procedure) may leave the patient unprotected.

The final question with the new agents concerns their anti-inflammatory effects. Heparin and LMWH are known to have significant pleiotropic effects that are not necessarily related to their anticoagulant activity; these effects are important in decreasing the inflammatory nature of the thrombus and its effect on the vein wall. Whether the new oral agents have similar effects has never been fully tested. Given that two of the agents are used as monotherapy without any heparin/LMWH bridge, the anti-inflammatory properties of these new agents should be defined to confirm that such a bridge is not necessary.

In summary, although these agents have much to offer, many questions remain to be addressed before they totally replace traditional approaches to anticoagulation in the realm of VTE. It must not be overlooked that, despite all their benefits, each carries a risk of bleeding, as all target portions of the coagulation mechanism. We caution that, as with any “gift horse,” physicians should examine the data closely and proceed with caution.

Thomas Wakefield, MD, is the Stanley Professor of Vascular Surgery; head, section of vascular surgery; and director, Samuel and Jean Frankel Cardiovascular Center. Andrea Obi, MD, is a vascular surgery fellow, and Dawn Coleman, MD, is the program director, section of vascular surgery, all at the University of Michigan, Ann Arbor. They reported having no conflicts of interest.

References

1. N Engl J Med. 2009;361:2342-2352.

2. J Vasc Surg Venous Lymphat Disord. 2013;1:418-426.

3. N Engl J Med. 2013;369:1406-1415.

4. N Engl J Med. 2010;363:2499-2510.

5. N Engl J Med. 2013;368:699-708.

6. Arterioscler Thromb Vasc Biol. 2015;35:1056-1065.

7. J Thromb Haemost. 2013;11:756-760.

8. N Engl J Med. 2015;373:511-520.

9. Curr Opin Anaesthesiol. 2014;27:409-419.

Display Headline
The new NOACs are generally the best bet

Whole brain radiotherapy not beneficial for NSCLC metastasis

Large, well designed trial has limitations
Article Type
Changed
Fri, 01/04/2019 - 13:20
Display Headline
Whole brain radiotherapy not beneficial for NSCLC metastasis

Whole brain radiotherapy, a standard treatment for patients with metastatic non–small-cell lung cancer, provided no clinical benefit in a noninferiority trial specifically designed to assess both patient survival and quality of life.

The findings were published online Sept. 4 in the Lancet.

Whole brain radiotherapy, with or without concomitant steroid treatment, has been widely used for decades in that patient population, even though no sufficiently powered, definitive studies support the approach. It is likely that patients and clinicians alike continue to embrace it because of the absence of alternative treatment options.

The Quality of Life After Treatment for Brain Metastases (QUARTZ) trial was intended to assess whether any improvement in survival offered by whole brain radiotherapy is balanced by deterioration in quality of life, said Paula Mulvenna, MBBS, of the Northern Center for Cancer Care, Newcastle (England) Hospitals, and her associates (Lancet 2016 Sep 4. doi: 10.1016/S0140-6736(16)30825-X).

QUARTZ involved 538 adults seen during a 7-year period who had NSCLC with brain metastases and who were not suited for either brain surgery or stereotactic radiotherapy. The median age was 66 years (range, 35-85 years), and 38% had a Karnofsky Performance Status score of less than 70.

The participants were randomly assigned to receive either optimal supportive care plus whole brain radiotherapy (269 patients) or optimal supportive care alone (269 patients) at 69 U.K. and 3 Australian medical centers. They reported on 20 symptoms and adverse effects, as well as health-related quality of life, approximately once per week.

The primary outcome measure – quality-adjusted life-years (QALYs), which combine overall survival and quality of life – was 46.4 days with radiotherapy and 41.7 days without it.

Symptoms, adverse effects, and quality of life (QOL) were similar between the two study groups at 4 weeks, except that the radiotherapy group reported more moderate or severe episodes of drowsiness, hair loss, nausea, and dry or itchy scalp. The number and severity of serious adverse events were similar through 12 weeks of follow-up.

The percentage of patients whose QOL was either maintained or improved over time was similar between the two groups at 4 weeks (54% vs. 57%), 8 weeks (44% vs. 51%), and 12 weeks (44% vs. 49%). Changes in Karnofsky scores also were similar.

The study refuted the widely held belief that whole brain radiotherapy allows patients to reduce or discontinue steroid treatment, averting the associated adverse effects. Steroid doses were not significantly different between the two study groups through the first 8 weeks of treatment, which “challenges the dogma that whole brain radiotherapy can be seen as a steroid-sparing modality,” the investigators said.

Taken together, the findings “suggest that whole brain radiotherapy can be omitted and patients treated with optimal supportive care alone, without an important reduction in either overall survival or quality of life,” Dr. Mulvenna and her associates said.

The approximately 5-day difference between the two study groups in median overall survival highlights both the limited benefit offered by radiotherapy and the poor prognosis of this patient population, the researchers added.

Whole brain radiotherapy did offer a small survival benefit to the youngest patients who had good performance status and a “controlled” primary NSCLC. “For all other groups, [it] does not significantly affect QALY or overall survival,” they said.

Cancer Research U.K., the Medical Research Council in the U.K., the Trans Tasman Radiation Oncology Group, and the National Health and Medical Research Council Australia supported the study. Dr. Mulvenna and her associates reported having no relevant financial disclosures.


Managing brain metastases from NSCLC is a challenge, because the lesions may well produce life-threatening symptoms and serious impairment, which could be ameliorated with whole brain radiotherapy.

This is a large and well designed trial, but it was limited in that the maximal benefit of radiotherapy is believed to occur 6 weeks after the end of treatment. Given that median overall survival was only 8 weeks and considering the time it took to deliver the treatment, approximately half of the patients in this study died before an optimal assessment of symptoms could be done.

This might also explain why radiotherapy didn’t have an effect on steroid use in this study. Many patients didn’t live long enough for radiotherapy’s steroid-sparing effect to be observed.

Cécile Le Pechoux, MD, is in the department of radiation oncology at Gustave Roussy Cancer Campus in Villejuif, France. She and her associates reported having no relevant financial disclosures. They made these remarks in a comment accompanying the report on the QUARTZ trial (Lancet 2016 Sep 4. doi: 10.1016/S0140-6736[16]31391-5).



Vitals

Key clinical point: Whole brain radiotherapy provided no clinically significant benefit for most patients with metastatic NSCLC.

Major finding: The primary outcome measure, quality-adjusted life-years, was 46.4 days with radiotherapy and 41.7 days without it.

Data source: An international, randomized, phase III noninferiority trial involving 538 patients treated during a 7-year period.

Disclosures: Cancer Research U.K., the Medical Research Council in the U.K., the Trans Tasman Radiation Oncology Group, and the National Health and Medical Research Council Australia supported the study. Dr. Mulvenna and her associates reported having no relevant financial disclosures.

Antibiotic susceptibility differs in transplant recipients

Article Type
Changed
Fri, 01/18/2019 - 16:11
Display Headline
Antibiotic susceptibility differs in transplant recipients

Antibiotic susceptibility in bacteria cultured from transplant recipients at a single hospital differed markedly from that in hospital-wide antibiograms, according to a report published in Diagnostic Microbiology and Infectious Disease.

Understanding the differences in antibiotic susceptibility among these highly immunocompromised patients can help guide treatment when they develop infection, and reduce the delay before they begin receiving appropriate antibiotics, said Rossana Rosa, MD, of Jackson Memorial Hospital, Miami, and her associates.

The investigators examined the antibiotic susceptibility of 1,889 isolates from blood and urine specimens taken from patients who had received solid-organ transplants at a single tertiary-care teaching hospital and then developed bacterial infections during a 2-year period. These patients included both children and adults who had received kidney, pancreas, liver, heart, lung, or intestinal transplants and were treated in numerous, “geographically distributed” units throughout the hospital. Their culture results were compared with those from 10,439 other patients with bacterial infections, which comprised the hospital-wide antibiograms developed every 6 months during the study period.

 

The Escherichia coli, Klebsiella pneumoniae, and Pseudomonas aeruginosa isolates from the transplant recipients showed markedly less susceptibility to first-line antibiotics than would have been predicted by the hospital-wide antibiograms. In particular, in the transplant recipients E. coli infections were resistant to trimethoprim-sulfamethoxazole, levofloxacin, and ceftriaxone; K. pneumoniae infections were resistant to every antibiotic except amikacin; and P. aeruginosa infections were resistant to levofloxacin, cefepime, and amikacin (Diagn Microbiol Infect Dis. 2016 Aug 25. doi: 10.1016/j.diagmicrobio.2016.08.018).

“We advocate for the development of antibiograms specific to solid-organ transplant recipients. This may allow intrahospital comparisons and intertransplant-center monitoring of trends in antimicrobial resistance over time,” Dr. Rosa and her associates said.

Publications
Topics
Article Type
Display Headline
Antibiotic susceptibility differs in transplant recipients
Article Source

FROM DIAGNOSTIC MICROBIOLOGY AND INFECTIOUS DISEASE

Disallow All Ads
Alternative CME
Vitals

Key clinical point: Antibiotic susceptibility in bacteria cultured from transplant recipients differs markedly from that in hospital-wide antibiograms.

Major finding: In the transplant recipients, E. coli infections were resistant to trimethoprim-sulfamethoxazole, levofloxacin, and ceftriaxone; K. pneumoniae infections were resistant to every antibiotic except amikacin; and P. aeruginosa infections were resistant to levofloxacin, cefepime, and amikacin.

Data source: A single-center study comparing the antibiotic susceptibility of 1,889 bacterial isolates from transplant recipients with 10,439 isolates from other patients.

Disclosures: This study was not supported by funding from any public, commercial, or not-for-profit entities. Dr. Rosa and her associates reported having no relevant financial disclosures.

USPSTF: Screen for tuberculosis in those at greatest risk

Article Type
Changed
Fri, 01/18/2019 - 16:11
Display Headline
USPSTF: Screen for tuberculosis in those at greatest risk

Screening for latent tuberculosis infection (LTBI) can help prevent progression to active disease, and the availability of effective tests supports screening asymptomatic adults aged 18 years and older at increased risk for infection, according to new recommendations from the U.S. Preventive Services Task Force.

The recommendations were published online Sept. 6 in JAMA.

“The USPSTF concludes with moderate certainty that the net benefit of screening for LTBI in persons at increased risk for tuberculosis is moderate,” wrote lead author Kirsten Bibbins-Domingo, MD, PhD, of the University of California, San Francisco, and her colleagues (JAMA 2016 Sep 6;316[9]:962-9).

Dr. Kirsten Bibbins-Domingo

TB infection spreads through the coughing or sneezing of someone with active disease. Individuals at high risk for TB include immunocompromised persons, residents of long-term care or correctional facilities, and homeless individuals, as well as those born in countries known to have a high incidence of TB, including China, India, Mexico, and Vietnam.

Other populations at increased risk for TB are contacts of patients with active TB, health care workers, and workers in high-risk settings, the researchers noted.

TB remains a preventable disease in the United States, where the prevalence of latent infection is approximately 5%, the researchers said. Of the two most effective screening tests, the tuberculin skin test (TST) demonstrated a sensitivity of 79% and a specificity of 97%, while interferon-gamma release assays (IGRA) demonstrated a sensitivity of at least 80% and a specificity of at least 95%.

The recommendations are supported by an evidence review, also published in JAMA (2016 Sep 6;316[9]:970-83). The review included 72 studies and 51,711 adults.

The studies in the evidence review did not assess the benefits vs. harms of TB screening, compared with no screening, noted Leila C. Kahwati, MD, of RTI International in Research Triangle Park, N.C., and her colleagues.

“The applicability of the evidence on accuracy and reliability of screening tests to primary care practice settings and populations is uncertain for several reasons,” the investigators said. However, the findings suggest that “treatment reduced the risk of active TB among the populations included in this review.”

The researchers had no financial conflicts to disclose.

References

Click for Credit Link
Author and Disclosure Information

Publications
Topics
Legacy Keywords
USPSTF, tuberculosis, TB, TB test
Sections
Article Type
Display Headline
USPSTF: Screen for tuberculosis in those at greatest risk
Click for Credit Status
Active
Article Source

FROM JAMA

PURLs Copyright

Inside the Article

Disallow All Ads
Vitals

Key clinical point: Latent tuberculosis infection is a significant problem, and both the tuberculin skin test (TST) and interferon-gamma release assays (IGRA) were moderately sensitive and highly specific in areas with a low tuberculosis burden.

Major finding: Approximately 5%-10% of individuals with latent TB progress to active disease, according to the USPSTF, and treatment reduces the risk of progression.

Data source: An evidence review including 72 studies and 51,711 individuals.

Disclosures: The researchers had no financial conflicts to disclose.

Pneumonitis with nivolumab treatment shows common radiographic patterns

Article Type
Changed
Fri, 01/04/2019 - 13:20
Display Headline
Pneumonitis with nivolumab treatment shows common radiographic patterns

A study of cancer patients enrolled in trials of the programmed cell death-1 inhibiting medicine nivolumab found that among a minority who developed pneumonitis during treatment, distinct radiographic patterns were significantly associated with the level of pneumonitis severity.

Investigators found that the cryptogenic organizing pneumonia (COP) pattern was the most common, though not the most severe. Led by Mizuki Nishino, MD, of Brigham and Women’s Hospital, Boston, the researchers looked at the 20 patients out of a cohort of 170 (11.8%) who had developed pneumonitis, and found that radiologic patterns indicating acute interstitial pneumonia/acute respiratory distress syndrome (n = 2) had the highest severity grade on a scale of 1-5 (median 3), followed by those with the COP pattern (n = 13, median grade 2), hypersensitivity pneumonitis (n = 2, median grade 1), and nonspecific interstitial pneumonia (n = 3, median grade 1). The pattern was significantly associated with severity (P = .0006).

The study cohort included patients being treated with nivolumab for lung cancer, melanoma, and lymphoma; the COP pattern was the most common across tumor types and was observed in patients receiving monotherapy and combination therapy alike. Therapy with nivolumab was suspended for all 20 pneumonitis patients, and most (n = 17) received treatment for pneumonitis with corticosteroids with or without infliximab, for a median treatment time of 6 weeks. Seven patients were able to restart nivolumab, though pneumonitis recurred in two, the investigators reported (Clin Cancer Res. 2016 Aug 17. doi: 10.1158/1078-0432.CCR-16-1320).

“Time from initiation of therapy to the development of pneumonitis had a wide range (0.5-11.5 months), indicating an importance of careful observation and follow-up for signs and symptoms of pneumonitis throughout treatment,” Dr. Nishino and colleagues wrote in their analysis, adding that shorter times were observed for lung cancer patients, possibly because of their higher pulmonary burden, a lower threshold for performing chest scans in these patients, or both. “In most patients, clinical and radiographic improvements were noted after treatment, indicating that [PD-1 inhibitor-related pneumonitis], although potentially serious, is treatable if diagnosed and managed appropriately. The observation emphasizes the importance of timely recognition, accurate diagnosis, and early intervention.”

The lead author and several coauthors disclosed funding from Bristol-Myers Squibb, which sponsored the trial, as well as from other manufacturers.

Publications
Topics
Sections

Article Type
Display Headline
Pneumonitis with nivolumab treatment shows common radiographic patterns
Article Source

FROM CLINICAL CANCER RESEARCH

Disallow All Ads
Alternative CME
Vitals

Key clinical point: Pneumonitis related to treatment with PD-1 inhibitors showed distinct radiographic patterns associated with severity; most cases resolved with corticosteroid treatment.

Major finding: Of 20 patients in nivolumab trials who developed pneumonitis, a COP pattern was seen in 13, and other patterns in 7; the patterns were significantly associated with pneumonitis severity (P = .0006).

Data source: 170 patients with melanoma, lung cancer, or lymphoma enrolled in a single-site, open-label clinical trial of nivolumab.

Disclosures: The lead author and several coauthors disclosed funding from Bristol-Myers Squibb, which sponsored the trial, as well as from other manufacturers.

VIDEO: Coronary DES outperform BMS mostly on restenosis

NORSTENT results won’t change practice
Article Type
Changed
Tue, 07/21/2020 - 14:18
Display Headline
VIDEO: Coronary DES outperform BMS mostly on restenosis

ROME – The difference between contemporary drug-eluting coronary stents and bare-metal stents is not very great, a large Norwegian coronary stent trial showed.

Today’s drug-eluting stents (DES), often called second-generation DES, largely do only what they were designed to do, compared with bare-metal stents (BMS): reduce the rate of stent restenosis and the need for target-lesion revascularization.

“The long-term benefit of contemporary DES over BMS was less than expected,” Kaare H. Bønaa, MD, reported at the annual congress of the European Society of Cardiology.

 

Mitchel L. Zoler/Frontline Medical News
Dr. Kaare H. Bønaa

Results from the Norwegian Coronary Stent Trial (NORSTENT), run with 9,013 patients, showed that patients who received one or more drug-eluting stents had, during nearly 5 years of follow-up, a 5% absolute drop in target-lesion revascularizations (a 53% relative risk reduction), and a 3.3% reduction in all revascularizations (a 24% relative risk reduction), compared with patients who received bare-metal stents, said Dr. Bønaa.

The results also showed that patients who received DES had a 0.4% reduced rate of stent thrombosis (a 36% relative risk reduction), compared with patients treated with BMS during nearly 5 years of follow-up. All three differences were statistically significant.

But the NORSTENT findings also documented that the patients who received either DES or BMS had virtually identical rates of all-cause deaths and nonfatal myocardial infarctions. And, on average, the two different types of coronary stents produced identical improvements in patients’ quality of life, reported Dr. Bønaa, a professor and researcher in the Clinic for Heart Disease at St. Olav’s University Hospital in Trondheim, Norway.

The study’s primary endpoint was the combined rate of death or nonfatal MI, and so the nonsignificant difference in that outcome between the two study arms meant that, formally, the NORSTENT trial produced a neutral result. Concurrently with his report, the results appeared in an article online (N Engl J Med. 2016 Aug 30. doi: 10.1056/NEJMoa1607991).

“The difference between the two stent types is not as great as we thought. Patients who get DES do not live longer or better” than those who receive BMS, Dr. Bønaa said. “We suggest that both contemporary DES and BMS can be recommended for contemporary revascularization. The results open up use of BMS for certain patients,” such as those scheduled for surgery or patients who cannot tolerate or afford the drugs used for dual antiplatelet therapy following coronary stent placement.

 

Mitchel L. Zoler/Frontline Medical News
Dr. Stefan James

But the designated discussant for the study, Stefan James, MD, insisted that recent-generation DES “should remain recommended over BMS,” particularly the specific DES that underwent testing in randomized trials that used hard clinical endpoints. The 2014 revascularization guidelines of the European Society of Cardiology recommend new-generation DES over BMS, he noted.

In addition, “BMS should not be specifically recommended for patients at high risk of stent thrombosis or for patients who do not tolerate dual-antiplatelet therapy,” said Dr. James, professor of cardiology at Uppsala University in Sweden.

NORSTENT ran at eight centers in Norway during 2008-2011, and enrolled patients had either acute coronary syndrome (71% of those in the study) or stable coronary disease. Patients averaged 63 years old. The trial excluded patients with prior stents or bifurcated coronary lesions. Enrolled patients received, on average, 1.7 stents. The specific stent in each class that patients received was left to the discretion of each operator, and 95% of patients in the DES arm received a second-generation device. All patients in both arms of the study received dual-antiplatelet therapy for 9 months.

The finding that DES cut the rate of revascularization procedures by 3.3%, compared with patients treated with BMS, means that, on average, clinicians would need to treat 30 patients with DES to avoid the need for one additional repeat revascularization procedure that would occur if BMS were used instead.

That number needed to treat of 30 to avoid one repeat revascularization may seem high, but the money saved that way would still counterbalance the incremental cost of a DES over a BMS, which today in Europe would be about 50-100 euros, noted one cardiologist.

If you multiply 30 procedures by 100 extra euros per stent and by an average of 1.7 stents per patient, you may spend 5,100 euros, less than the cost of a repeat revascularization procedure, commented Carlo Di Mario, MD, a professor of cardiology and an interventional cardiologist at Royal Brompton & Harefield Hospitals in London.
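The back-of-the-envelope arithmetic above (the number needed to treat derived from the 3.3% absolute risk reduction, then Dr. Di Mario's cost estimate) can be sketched as follows; the euro figure uses the upper end of the 50-100 euro incremental-cost range he quoted:

```python
# Rough check of the figures quoted above. Inputs from the article:
# 3.3% absolute reduction in all revascularizations with DES, a
# 50-100 euro incremental cost per DES, 1.7 stents per patient.

abs_risk_reduction = 0.033               # absolute risk reduction, all revascularizations
nnt = round(1 / abs_risk_reduction)      # number needed to treat

extra_cost_per_stent = 100               # euros, upper end of the quoted range
stents_per_patient = 1.7
extra_spend = round(nnt * extra_cost_per_stent * stents_per_patient)

print(nnt)          # 30 patients treated with DES per repeat procedure avoided
print(extra_spend)  # 5100 euros, matching the estimate in the text
```

At the 50-euro end of the quoted range the same calculation gives 2,550 euros, so the incremental spend stays below the cost of a single repeat revascularization procedure in either case.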

In a video interview, Steen D. Kristensen, MD, of Aarhus University, Denmark, discussed the NORSTENT findings and their implications.

 

 

NORSTENT received no commercial support. Dr. Bønaa and Dr. Di Mario had no disclosures. Dr. James has been a consultant to Boston Scientific and has received research support from Boston Scientific and Abbott Vascular.

 


 

[email protected]

On Twitter @mitchelzoler

Body

NORSTENT was a very well-performed trial. It produced a neutral result for its primary endpoint, but for the secondary endpoint of repeat revascularization, there were significantly more events with bare-metal stents. This is a major finding, and NORSTENT’s design makes the results very generalizable.

It may be slightly surprising that the newer drug-eluting stents did not perform better for the primary endpoint of reducing deaths and MIs during 5 years of follow-up, but seeing a difference in the revascularization rate is not surprising; that is what we would expect. We use DES to reduce the problem of restenosis. Results from several earlier studies that had compared DES with BMS had suggested other benefits from DES, and that is also what the European Society of Cardiology guidelines say.

I will not go home now and start using BMS in my own practice. I will continue to use DES, because they have an advantage. I use BMS in patients who cannot tolerate long-term treatment with dual antiplatelet therapy. The results are encouraging for centers where there is a large price difference between DES and BMS, but that is not the case where I practice in Denmark.

Steen D. Kristensen, MD, is a professor of interventional cardiology at Aarhus University, Denmark. He made these comments in an interview. He had no relevant disclosures.

NORSTENT results won’t change practice

ROME – The difference between contemporary drug-eluting coronary stents and bare-metal stents is not very great, a large Norwegian coronary stent trial showed.

Today’s drug-eluting stents (DES), often called second-generation DES, largely do only what they were designed to do, compared with bare-metal stents (BMS): reduce the rate of stent restenosis and the need for target-lesion revascularization.

“The long-term benefit of contemporary DES over BMS was less than expected,” Kaare H. Bønaa, MD, reported at the annual congress of the European Society of Cardiology.

 

Mitchel L. Zoler/Frontline Medical News
Dr. Kaare H. Bønaa

Results from the Norwegian Coronary Stent Trial (NORSTENT), run with 9,013 patients, showed that patients who received one or more drug-eluting stents had, during nearly 5 years of follow-up, a 5% absolute drop in target-lesion revascularizations (a 53% relative risk reduction), and a 3.3% reduction in all revascularizations (a 24% relative risk reduction), compared with patients who received bare-metal stents, said Dr. Bønaa.

The results also showed that patients who received DES had a 0.4% reduced rate of stent thrombosis (a 36% relative risk reduction), compared with patients treated with BMS during nearly 5 years of follow-up. All three differences were statistically significant.
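The absolute and relative reductions quoted above are linked by the event rate in the comparator arm. A minimal back-of-envelope sketch, using the target-lesion revascularization figures (the per-arm rates are not stated in the article and are back-calculated here, so they are approximations):

```python
# Link between absolute and relative risk reduction for target-lesion
# revascularization. Per-arm event rates are NOT given in the article;
# they are implied by the ratio RRR = ARR / control_rate.
arr = 0.05                     # absolute risk reduction with DES
rrr = 0.53                     # relative risk reduction with DES
control_rate = arr / rrr       # implied BMS-arm rate, about 9.4%
des_rate = control_rate - arr  # implied DES-arm rate, about 4.4%
print(f"BMS arm ~{control_rate:.1%}, DES arm ~{des_rate:.1%}")
```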

But the NORSTENT findings also documented that the patients who received either DES or BMS had virtually identical rates of all-cause deaths and nonfatal myocardial infarctions. And, on average, the two different types of coronary stents produced identical improvements in patients’ quality of life, reported Dr. Bønaa, a professor and researcher in the Clinic for Heart Disease at St. Olav’s University Hospital in Trondheim, Norway.

The study’s primary endpoint was the combined rate of death or nonfatal MI, so the nonsignificant difference in that outcome between the two study arms meant that, formally, the NORSTENT trial produced a neutral result. Concurrently with his report, the results appeared online (N Engl J Med. 2016 Aug 30. doi: 10.1056/NEJMoa1607991).

“The difference between the two stent types is not as great as we thought. Patients who get DES do not live longer or better” than those who receive BMS, Dr. Bønaa said. “We suggest that both contemporary DES and BMS can be recommended for contemporary revascularization. The results open up use of BMS for certain patients,” such as those scheduled for surgery or patients who cannot tolerate or afford the drugs used for dual antiplatelet therapy following coronary stent placement.

 

Mitchel L. Zoler/Frontline Medical News
Dr. Stefan James

But the designated discussant for the study, Stefan James, MD, insisted that recent-generation DES “should remain recommended over BMS,” particularly the specific DES that underwent testing in randomized trials that used hard clinical endpoints. The 2014 revascularization guidelines of the European Society of Cardiology recommend new-generation DES over BMS, he noted.

In addition, “BMS should not be specifically recommended for patients at high risk of stent thrombosis or for patients who do not tolerate dual-antiplatelet therapy,” said Dr. James, professor of cardiology at Uppsala University in Sweden.

NORSTENT ran at eight centers in Norway during 2008-2011 and enrolled patients with either acute coronary syndrome (71% of the study population) or stable coronary disease. Patients averaged 63 years of age. The trial excluded patients with prior stents or bifurcated coronary lesions. Enrolled patients received, on average, 1.7 stents. The specific stent within each class was left to the discretion of each operator, and 95% of patients in the DES arm received a second-generation device. All patients in both arms received dual-antiplatelet therapy for 9 months.

The finding that DES cut the rate of revascularization procedures by 3.3%, compared with patients treated with BMS, means that, on average, clinicians would need to treat 30 patients with DES to avoid the need for one additional repeat revascularization procedure that would occur if BMS were used instead.

That number needed to treat of 30 to avoid one repeat revascularization may seem high, but the money saved that way would still counterbalance the incremental cost of a DES over a BMS, which today in Europe would be about 50-100 euros, noted one cardiologist.

If you multiply 30 procedures by 100 extra euros per stent and by an average of 1.7 stents per patient, you may spend 5,100 euros, less than the cost of a repeat revascularization procedure, commented Carlo Di Mario, MD, a professor of cardiology and an interventional cardiologist at Royal Brompton & Harefield Hospitals in London.
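Dr. Di Mario’s estimate can be reproduced with a quick sketch. The figures are the approximations quoted in the article (3.3% absolute reduction in all revascularizations, roughly 100 euros of incremental cost per DES, 1.7 stents per patient), not trial-level cost data:

```python
# Sketch of the quoted cost arithmetic: how much extra is spent on DES
# to avoid one repeat revascularization that BMS would have incurred.
arr_revasc = 0.033           # absolute reduction, all revascularizations
nnt = round(1 / arr_revasc)  # number needed to treat, about 30
extra_cost_per_stent = 100   # euros, upper end of the quoted 50-100 range
stents_per_patient = 1.7     # average in NORSTENT

# Extra DES spend incurred per repeat revascularization avoided
extra_spend = nnt * stents_per_patient * extra_cost_per_stent
print(f"NNT = {nnt}, extra spend ~ {extra_spend:.0f} euros")
```

At roughly 5,100 euros, the extra stent spend sits below the typical cost of one repeat revascularization procedure, which is the point of the comparison.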

In a video interview, Steen D. Kristensen, MD, of Aarhus University, Denmark, discussed the NORSTENT findings and their implications.

 

 

NORSTENT received no commercial support. Dr. Bønaa and Dr. Di Mario had no disclosures. Dr. James has been a consultant to Boston Scientific and has received research support from Boston Scientific and Abbott Vascular.

 

The video associated with this article is no longer available on this site. Please view all of our videos on the MDedge YouTube channel

 

[email protected]

On Twitter @mitchelzoler


Display Headline
VIDEO: Coronary DES outperform BMS mostly on restenosis
Article Source

AT THE ESC CONGRESS 2016

Vitals

Key clinical point: The benefit from coronary revascularization with drug-eluting stents, compared with bare-metal stents, was mostly in a reduced need for repeat revascularization, with no difference in mortality or MIs during 5 years of follow-up.

Major finding: Thirty patients need to be treated with drug-eluting stents to prevent one repeat revascularization, compared with bare-metal stents.

Data source: NORSTENT, a randomized, multicenter trial with 9,013 patients.

Disclosures: NORSTENT received no commercial support. Dr. Bønaa and Dr. Di Mario had no disclosures. Dr. James has been a consultant to Boston Scientific and has received research support from Boston Scientific and Abbott Vascular.