Adverse effects of PD-1/PD-L1 inhibitors varied by tumor type in systematic review
The immune-related adverse effects of inhibitors of programmed cell death protein 1 (PD-1) and its ligand varied by tumor type in a large systematic review and meta-analysis.
Patients with melanoma were significantly more likely to develop colitis (odds ratio, 4.2; 95% confidence interval, 1.3 to 14.0), diarrhea (OR, 1.9), pruritus (OR, 2.4), and rash (OR, 1.8) compared with patients with non–small cell lung cancer, who were significantly more likely to develop pneumonitis, reported Leila Khoja, MBChB, PhD, of AstraZeneca UK, Melbourn, England, and associates. Patients with melanoma also were significantly more likely to develop arthralgia, hypothyroidism, rash, pruritus, and diarrhea compared with patients with renal cell carcinoma, who were more likely to develop pneumonitis and dyspnea.
“In light of this study, we should be mindful that different tumor types may have different immune-related adverse effect patterns when treated with the same immune checkpoint inhibitor,” the reviewers noted (Ann Oncol. 2017 Aug 8. doi: 10.1093/annonc/mdx286).
The review included 48 trials of nearly 7,000 patients with solid tumors who received CTLA-4 inhibitors (26 studies), PD-1 inhibitors (17 studies), PD-1 ligand (PD-L1) inhibitors (two trials), or both CTLA-4 and PD-1 inhibitors (three trials). The reviewers identified the studies by searching the Medline, EMBASE, and Cochrane databases for prospective trials published from 2003 through November 2015.
Severe or life-threatening immune-related adverse effects developed in 31% of patients who received CTLA-4 inhibitors and 10% of patients who received PD-1 inhibitors. Inhibitors of CTLA-4 were significantly more likely to cause all grades of colitis (OR, 8.7), hypophysitis (OR, 6.5), and rash (OR, 2.0), while PD-1 inhibitors were more strongly linked with pneumonitis (OR, 6.4), hypothyroidism (OR, 4.3), arthralgia (OR, 3.5), and vitiligo (OR, 3.5).
The reviewers also looked for significant predictors of immune-related colitis and pneumonitis, because these are potentially fatal. They found that pneumonitis was significantly linked to PD-1/PD-L1 inhibitor therapy (P less than .001) and colitis to CTLA-4 treatment (P = .04), even after accounting for therapeutic dose and tumor type. No other factors reached significance in this multivariable model.
“Clearly, a more thorough understanding of the mechanisms of immune-related adverse effects is needed, which may lead to the identification of biomarkers to predict the occurrence of toxicity in patients or predict those who have immune-related adverse effects that are unlikely to respond to corticosteroids,” the reviewers concluded. Researchers should also study whether clinical factors such as treatment history or comorbidities affect the risk of immune-related adverse effects from immune checkpoint inhibitors, they said.
The reviewers reported having no funding sources and no relevant conflicts of interest.
FROM ANNALS OF ONCOLOGY
Key clinical point: Immune-related adverse effects varied by tumor type in patients receiving programmed cell death protein 1 (PD-1) and PD-L1 inhibitors.
Major finding: Patients with melanoma who received PD-1/PD-L1 inhibitors were significantly more likely to develop colitis (odds ratio, 4.2; 95% confidence interval, 1.3 to 14.0), diarrhea (OR, 1.9), pruritus (OR, 2.4), and rash (OR, 1.8), compared with patients with non–small cell lung cancer, who were significantly more likely to develop pneumonitis.
Data source: A systematic review and meta-analysis of 48 prospective trials of immune checkpoint inhibitors in 6,938 adults with solid tumors.
Disclosures: The reviewers reported having no funding sources and no relevant conflicts of interest.
Forgo supplemental oxygen in adequately perfused patients with acute MI, study suggests
Supplemental oxygen did not prevent mortality or rehospitalization among patients with suspected myocardial infarction whose oxygen saturation on room air exceeded 90%, investigators reported.
Rates of all-cause mortality at 1 year were 5% among patients who received supplemental oxygen through an open face mask (6 liters per minute for 6-12 hours) and 5.1% among patients who breathed room air, said Robin Hofmann, MD, of Karolinska Institutet, Stockholm, and his associates. In addition, rehospitalization for MI occurred in 3.8% of patients who received supplemental oxygen and 3.3% of those who breathed room air. The findings of the randomized registry-based trial of 6,629 patients were presented at the annual congress of the European Society of Cardiology and published simultaneously in the New England Journal of Medicine.
Guidelines recommend oxygen supplementation in MI, and the practice has persisted for more than a century, but adequately powered trials assessing hard clinical endpoints are lacking. Above-normal oxygen saturation can potentially worsen reperfusion injury by causing coronary vasoconstriction and increasing production of reactive oxygen species, the researchers noted.
Notably, the Australian Air Versus Oxygen in Myocardial Infarction (AVOID) trial found that oxygen supplementation was associated with larger infarct sizes in patients with ST-segment elevation myocardial infarction, and a recent Cochrane report did not support routine oxygen supplementation for MI.
The current trial enrolled patients aged 30 years and older who had chest pain or shortness of breath lasting less than 6 hours, an oxygen saturation of at least 90% on pulse oximetry, and either electrocardiographic evidence of ischemia or elevated cardiac troponin T or I levels (N Engl J Med. 2017 Aug 28. doi: 10.1056/NEJMoa1706222).
Oxygen therapy lasted a median of 11.6 hours, after which median oxygen saturation levels were 99% in the intervention group and 97% in the control group.
A total of 62 patients (2%) who received oxygen developed hypoxemia, as did 254 patients (8%) who breathed room air. Median highest troponin levels during hospitalization were 946.5 ng per L and 983.0 ng per L, respectively. A total of 166 (5%) patients in the oxygen group and 168 (5.1%) control patients died from any cause by a year after treatment (hazard ratio, 0.97; P = .8). Likewise, supplemental oxygen did not prevent rehospitalization with MI within 1 year (HR, 1.13; P = .3).
“Because power for evaluation of the primary endpoint was lower than anticipated, we cannot completely rule out a small beneficial or detrimental effect of oxygen on mortality,” the researchers wrote. But clinical differences were unlikely, based on the superimposable time-to-event curves through 12 months, the consistent results across subgroups, and the neutral findings on secondary clinical endpoints, they added.
The Swedish Research Council, the Swedish Heart-Lung Foundation, and the Swedish Foundation for Strategic Research funded the study. Dr. Hofmann disclosed research grants from these entities.
The study by Hofmann and coworkers provides definitive evidence for a lack of benefit of supplemental oxygen therapy in patients with acute myocardial infarction who have normal oxygen saturation. Although the mechanisms underlying physiological and biochemical adaptation to myocardial ischemia are complex, the answer to the question is straightforward, and its implications for coronary care are indisputable: Supplemental oxygen provides no benefit to patients with acute coronary syndromes who do not have hypoxemia. It is clearly time for clinical practice to change to reflect this definitive evidence.
Joseph Loscalzo, MD, PhD, is in the department of medicine, Brigham and Women’s Hospital, Boston. He is an editor-at-large for the New England Journal of Medicine. He had no other disclosures. These comments are from his accompanying editorial (N Engl J Med. 2017 Aug 28. doi: 10.1056/NEJMe1709250).
FROM THE ESC CONGRESS 2017
Key clinical point: Supplemental oxygen did not benefit patients with suspected myocardial infarction who did not have hypoxemia.
Major finding: At 1 year, rates of all-cause mortality were 5% among patients who received supplemental oxygen and 5.1% among those who received no oxygen.
Data source: A registry-based, randomized clinical trial of 6,629 patients with suspected myocardial infarction without hypoxemia.
Disclosures: The Swedish Research Council, the Swedish Heart-Lung Foundation, and the Swedish Foundation for Strategic Research funded the study. Dr. Hofmann disclosed research grants from these entities.
Undiagnosed AF common in higher-risk patients
Over an 18-month period, small, insertable cardiac monitors detected atrial fibrillation in 29% of previously undiagnosed patients who were at high risk of both AF and stroke, and in 40% of patients over 30 months, according to investigators. The study was presented at the annual congress of the European Society of Cardiology and simultaneously published in JAMA Cardiology.
More than half (56%) of patients consequently started oral anticoagulation therapy, noted James A. Reiffel, MD, of Columbia University College of Physicians and Surgeons, New York, with his associates, for the REVEAL AF investigators.
“The incidence of previously undiagnosed atrial fibrillation may be substantial in patients with risk factors for AF and stroke,” they concluded. “Atrial fibrillation would have gone undetected in most patients had monitoring been limited to 30 days. Further trials regarding the value of detecting subclinical AF and of prophylactic therapies are warranted.”
Atrial fibrillation affects millions worldwide and is associated with older age, hypertension, diabetes, and heart failure, all of which also independently increase the risk of stroke. Minimally invasive prolonged electrocardiographic monitoring with insertable cardiac monitors might help hasten detection and treatment of AF, but diagnostic yield in high-risk patients has been unclear.
In this single-arm, multicenter, prospective study, researchers inserted Reveal XT or Reveal LINQ (Medtronic) cardiac monitors in 385 adults who had either CHADS2 scores of 3, or CHADS2 scores of 2 and one additional risk factor for AF, such as coronary artery disease, sleep apnea, chronic obstructive pulmonary disease, or renal insufficiency. The primary endpoint was AF lasting at least 6 minutes (JAMA Cardiol. 2017 Aug 26. doi: 10.1001/jamacardio.2017.3180). Median follow-up time was 22.5 months. Rates of detecting AF were 6% at 30 days compared with 20% at 6 months, 27% at 12 months, 34% at 24 months, and 40% at 30 months. Patients typically had their first AF episode about 4 months (median, 123 days) after the device was inserted. Among patients who had experienced AF by 18 months, 10% had one or more episodes lasting at least 24 hours, and 72 (56%) were prescribed oral anticoagulation therapy.
The recent PREDATE AF and ASSERT-II studies also found that previously undiagnosed AF was common among high-risk patients, the researchers noted. However, whether anticoagulating patients who have only brief episodes of AF significantly reduces their risk of stroke remains unclear, they added. Three trials (ARTESiA, NOAH, and LOOP) are underway to assess whether oral anticoagulation therapy improves outcomes in patients with device-detected AF.
Medtronic funded the study. Dr. Reiffel and five coinvestigators disclosed consulting for and receiving “modest honoraria” from Medtronic. Two other coinvestigators reported employment with and stock ownership in Medtronic.
The availability of safe and effective oral anticoagulant therapy makes the findings of REVEAL AF highly relevant. This high rate of incident AF makes ICM-based screenings of high-risk individuals a potentially attractive stroke prevention strategy. More detailed subgroup analyses may help identify a patient population with an even higher risk of developing AF. It is also conceivable that this population could have a sufficiently high risk of AF and stroke that a strategy of empiric oral anticoagulation, without the need for AF monitoring, could prove beneficial.
However, both intervention studies and economic evaluations are needed before either strategy is routinely adopted.
The REVEAL AF study has shown that AF is extremely common among older individuals with stroke risk factors. Over the next 3-4 years, subgroup analyses, economic evaluations, and randomized clinical trials will help determine if this insight can be translated into a cost-effective stroke prevention strategy for high-risk individuals.
Jeff S. Healey, MD, MSc, is at the Population Health Research Institute, McMaster University, Hamilton, Ont. He is the principal investigator of the ASSERT-II and ARTESiA trials, and had no other relevant disclosures. These comments are from his editorial (JAMA Cardiol. 2017 Aug 26. doi: 10.1001/jamacardio.2017.3203).
The availability of safe and effective oral anticoagulant therapy makes the findings of REVEAL AF highly relevant. This high rate of incident AF makes ICM-based screenings of high-risk individuals a potentially attractive stroke prevention strategy. More detailed subgroup analyses may help identify a patient population with an even higher risk of developing AF. It is also conceivable that this population could have a sufficiently high risk of AF and stroke that a strategy of empiric oral anticoagulation, without the need for AF monitoring, could prove beneficial.
However; both intervention studies and economic evaluations are needed before either strategy should be routinely adopted.
The REVEAL AF study has shown that AF is extremely common among older individuals with stroke risk factors. Over the next 3-4 years, subgroup analyses, economic evaluations, and randomized clinical trials will help determine if this insight can be translated into a cost-effective stroke prevention strategy for high-risk individuals.
Jeff S. Healey, MD, MSc, is at the Population Health Research Institute, McMaster University, Hamilton, Ont. He is the principal investigator of the ASSERT-II and ARTESiA trials, and had no other relevant disclosures. These comments are from his editorial (JAMA Cardiol. 2017 Aug 26. doi: 10.1001/jamacardio.2017.3203).
The availability of safe and effective oral anticoagulant therapy makes the findings of REVEAL AF highly relevant. This high rate of incident AF makes ICM-based screenings of high-risk individuals a potentially attractive stroke prevention strategy. More detailed subgroup analyses may help identify a patient population with an even higher risk of developing AF. It is also conceivable that this population could have a sufficiently high risk of AF and stroke that a strategy of empiric oral anticoagulation, without the need for AF monitoring, could prove beneficial.
However; both intervention studies and economic evaluations are needed before either strategy should be routinely adopted.
The REVEAL AF study has shown that AF is extremely common among older individuals with stroke risk factors. Over the next 3-4 years, subgroup analyses, economic evaluations, and randomized clinical trials will help determine if this insight can be translated into a cost-effective stroke prevention strategy for high-risk individuals.
Jeff S. Healey, MD, MSc, is at the Population Health Research Institute, McMaster University, Hamilton, Ont. He is the principal investigator of the ASSERT-II and ARTESiA trials, and had no other relevant disclosures. These comments are from his editorial (JAMA Cardiol. 2017 Aug 26. doi: 10.1001/jamacardio.2017.3203).
Over an 18-month period, small, insertable cardiac monitors detected atrial fibrillation in 29% of previously undiagnosed patients who were at high risk of both AF and stroke, and in 40% of patients over 30 months, according to investigators. The study was presented at the annual congress of the European Society of Cardiology and simultaneously published in JAMA Cardiology.
More than half (56%) of patients consequently started oral anticoagulation therapy, noted James A. Reiffel, MD, of Columbia University College of Physicians and Surgeons, New York, with his associates, for the REVEAL AF investigators.
Over an 18-month period, small, insertable cardiac monitors detected atrial fibrillation in 29% of previously undiagnosed patients who were at high risk of both AF and stroke, and in 40% of patients over 30 months, according to investigators. The study was presented at the annual congress of the European Society of Cardiology and simultaneously published in JAMA Cardiology.
More than half (56%) of patients consequently started oral anticoagulation therapy, noted James A. Reiffel, MD, of Columbia University College of Physicians and Surgeons, New York, with his associates, for the REVEAL AF investigators.
“The incidence of previously undiagnosed atrial fibrillation may be substantial in patients with risk factors for AF and stroke,” they concluded. “Atrial fibrillation would have gone undetected in most patients had monitoring been limited to 30 days. Further trials regarding the value of detecting subclinical AF and of prophylactic therapies are warranted.”
Atrial fibrillation affects millions worldwide and is associated with older age, hypertension, diabetes, and heart failure, all of which also independently increase the risk of stroke. Minimally invasive prolonged electrocardiographic monitoring with insertable cardiac monitors might help hasten detection and treatment of AF, but diagnostic yield in high-risk patients has been unclear.
In this single-arm, multicenter, prospective study, researchers inserted Reveal XT or Reveal LINQ (Medtronic) cardiac monitors in 385 adults who had either CHADS2 scores of at least 3, or CHADS2 scores of at least 2 plus at least one additional risk factor for AF, such as coronary artery disease, sleep apnea, chronic obstructive pulmonary disease, or renal insufficiency. The primary endpoint was AF lasting at least 6 minutes (JAMA Cardiol. 2017 Aug 26. doi: 10.1001/jamacardio.2017.3180). Median follow-up time was 22.5 months. Rates of detecting AF were 6% at 30 days compared with 20% at 6 months, 27% at 12 months, 34% at 24 months, and 40% at 30 months. Patients typically had their first AF episode about 4 months (median, 123 days) after the device was inserted. Among patients who had experienced AF by 18 months, 10% had one or more episodes lasting at least 24 hours, and 72 (56%) were prescribed oral anticoagulation therapy.
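For readers unfamiliar with the enrollment criterion, the CHADS2 index assigns one point each for congestive heart failure, hypertension, age 75 years or older, and diabetes, and two points for prior stroke or transient ischemic attack. A minimal sketch of the scoring arithmetic (the function name and signature are illustrative, not taken from the trial protocol):

```python
# Sketch of CHADS2 scoring: 1 point each for congestive heart failure,
# hypertension, age >= 75, and diabetes; 2 points for prior stroke/TIA.
def chads2(chf: bool, hypertension: bool, age: int,
           diabetes: bool, prior_stroke_or_tia: bool) -> int:
    score = 0
    score += 1 if chf else 0
    score += 1 if hypertension else 0
    score += 1 if age >= 75 else 0
    score += 1 if diabetes else 0
    score += 2 if prior_stroke_or_tia else 0
    return score

# A 76-year-old with hypertension and diabetes scores 3, meeting the
# trial's primary enrollment threshold without any added risk factor.
print(chads2(chf=False, hypertension=True, age=76,
             diabetes=True, prior_stroke_or_tia=False))  # prints 3
```

A score of 2 alone would not qualify; such patients additionally needed one of the listed AF risk factors.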
The recent PREDATE AF and ASSERT-II studies also found that previously undiagnosed AF was common among high-risk patients, the researchers noted. However, whether anticoagulating patients who have only brief episodes of AF significantly reduces their risk of stroke remains unclear, they added. Three trials (ARTESiA, NOAH, and LOOP) are underway to assess whether oral anticoagulation therapy improves outcomes in patients with device-detected AF.
Medtronic funded the study. Dr. Reiffel and five coinvestigators disclosed consulting for and receiving “modest honoraria” from Medtronic. Two other coinvestigators reported employment with and stock ownership in Medtronic.
FROM THE ESC CONGRESS 2017
Key clinical point: Undiagnosed atrial fibrillation is common in high-risk patients.
Major finding: At 18 months, 29% of previously undiagnosed, high-risk patients had experienced atrial fibrillation lasting 6 or more minutes.
Data source: A single-arm, prospective, multicenter study of 446 patients with a CHADS2 score of at least 3, or a CHADS2 score of at least 2 plus at least one other risk factor (coronary artery disease, sleep apnea, chronic obstructive pulmonary disease, or renal insufficiency).
Disclosures: Medtronic funded the study. Dr. Reiffel and five coinvestigators disclosed consulting for and receiving “modest honoraria” from Medtronic. Two other coinvestigators reported employment with and stock ownership in Medtronic.
Despite global decline, rheumatic heart disease persists in poorest regions
Global mortality due to rheumatic heart disease fell by about 48% during a recent 25-year period, but some of the poorest areas of the world were left behind, according to a report in the New England Journal of Medicine.
Those regions included Oceania, South Asia, and central sub-Saharan Africa, where rheumatic heart disease remains endemic, wrote David A. Watkins, MD, MPH, of the University of Washington, Seattle, and his coinvestigators. “We estimate that 10 persons per 1,000 population living in South Asia and central sub-Saharan Africa and 15 persons per 1,000 population in Oceania were living with rheumatic heart disease in the year 2015,” they wrote. “Improvements in the measurement of the burden of rheumatic heart disease will assist in planning for its control and will help identify countries where further investments are needed.”
Rheumatic heart disease is a sequela of untreated streptococcal pharyngitis, which is associated with poverty, overcrowding, poor sanitation, and other social predictors of poor health. In high-income countries, treatment with penicillin G and improved sanitation had nearly eliminated rheumatic heart disease by the late 20th century, but local studies pointed to ongoing morbidity and mortality in lower-income regions.
To better define the problem, Dr. Watkins and his associates analyzed epidemiologic studies of rheumatic heart disease from 1990 through 2015. They used the Cause of Death Ensemble model, which estimates mortality more reliably than older methods, and DisMod-MR (version 2.1), which sums epidemiologic data from multiple sources and corrects for gaps and inconsistencies (N Engl J Med. 2017;377:713-22).
Worldwide, about 319,400 individuals died of rheumatic heart disease in 2015, the researchers reported. Age-adjusted death rates fell by about 48% (95% confidence interval, 45%-51%), from 9.2 deaths per 100,000 population in 1990 to 4.8 deaths per 100,000 population in 2015. But this global trend masked striking regional disparities. In 1990, 77% of deaths from rheumatic heart disease occurred in endemic areas of Africa, South Asia, Oceania, and the Caribbean; by 2015, 82% of deaths occurred in endemic regions. Oceania, South Asia, and central sub-Saharan Africa had the highest death rates and were the only regions where the 95% confidence intervals for 1990 and 2015 overlapped, the investigators noted.
In 2015, age-standardized death rates exceeded 10 deaths per 100,000 population in the Solomon Islands, Pakistan, Papua New Guinea, Kiribati, Vanuatu, Fiji, India, Federated States of Micronesia, Marshall Islands, Central African Republic, and Lesotho, they reported. Estimated fatalities were highest in India (119,100 deaths), China (72,600), and Pakistan (18,900). They estimated that in 2015, there were 33.2 million cases of rheumatic heart disease and 10.5 million associated disability-adjusted life-years globally.
The study excluded “borderline” or subclinical rheumatic heart disease, which is detected by echocardiography and whose management remains unclear. “Better data for low-income and middle-income countries are needed to guide policies for the control of rheumatic heart disease,” the investigators wrote. They recommended studying death certificate misclassifications, disease prevalence among adults, and longitudinal trends in nonfatal outcomes and excess mortality.
Funders of the study included the Bill and Melinda Gates Foundation and the Medtronic Foundation. Dr. Watkins disclosed grants from the Medtronic Foundation during the conduct of the study and grants from the Bill and Melinda Gates Foundation outside the submitted work.
Rheumatic heart disease ranks as one of the most serious cardiovascular scourges of the past century. As a result of improvements in living conditions and the introduction of penicillin, the disease was almost eradicated in the developed world by the 1980s. However, it remains a force to be reckoned with in the developing world, as demonstrated by an assessment from the 2015 Global Burden of Disease study (GBD 2015), painstakingly performed by Dr. Watkins and his colleagues.
Several key messages emerge from this important study. It confirms the marked global heterogeneity of the burden of rheumatic heart disease, with near-zero prevalence in developed countries sharply contrasting with substantial prevalence and mortality in developing areas. In addition, however, the study documents the scarcity of accurately measured data in many locations, especially in areas with the highest prevalence (such as sub-Saharan Africa).
Although the “headline news” of a global decline in the prevalence of rheumatic heart disease described by Watkins et al. may give cause for optimism, the burden remains great for those parts of the world least able to afford it. Without sustained re-engagement of clinicians, researchers, funders, and public health bodies, the menace of rheumatic heart disease is unlikely to be eliminated in the near future. Rheumatic heart disease remains a problematic iceberg, yet undissolved, in warm tropical waters.
Eloi Marijon, MD, PhD, and Xavier Jouven, MD, PhD, are at European Georges Pompidou Hospital, Paris. David S. Celermajer, PhD, is at Sydney (Australia) Medical School. They reported having no conflicts of interest. Their editorial accompanied the report by Dr. Watkins and his colleagues (N Engl J Med. 2017;377:780-1).
FROM NEW ENGLAND JOURNAL OF MEDICINE
Key clinical point: Mortality from rheumatic heart disease has fallen sharply worldwide, but the disease remains endemic in the world's poorest regions.
Major finding: Globally, age-adjusted death rates fell by about 48% between 1990 and 2015. Oceania, South Asia, and central sub-Saharan Africa had the highest death rates in 2015, and were the only regions where the 95% confidence intervals overlapped with those for 1990.
Data source: A systematic review and analysis of morbidity and mortality data from 1990 through 2015.
Disclosures: Funders included the Bill and Melinda Gates Foundation and the Medtronic Foundation. Dr. Watkins disclosed grants from the Medtronic Foundation during the conduct of the study and grants from the Bill and Melinda Gates Foundation outside the submitted work.
PVC phlebitis rates varied widely, depending on assessment tool
Rates of phlebitis associated with peripheral venous catheters (PVCs) ranged from less than 1% to 34%, depending on which assessment tool researchers used, in a large cross-sectional study.
Rates also varied within individual instruments because they included several possible case definitions, Katarina Göransson, PhD, and her associates reported in Lancet Haematology. “We find it concerning that our study shows variation of the proportion of PVCs causing phlebitis both within and across the instruments investigated,” they wrote. “How to best measure phlebitis outcomes is still unclear, since no universally accepted instrument exists that has had rigorous testing. From a work environment and patient safety perspective, clinical staff engaged in PVC management should be aware of the absence of adequately validated instruments for phlebitis assessment.”
There are many tools to measure PVC-related phlebitis, but no consensus on which to use, and past studies have reported rates of anywhere from 2% to 62%. Hypothesizing that instrument variability contributed to this discrepancy, the researchers tested 17 instruments in 1,032 patients who had 1,175 PVCs placed at 12 inpatient units in Sweden. Eight tools used clinical definitions, seven used severity rating systems, and two used scoring systems (Lancet Haematol. 2017 doi: 10.1016/S2352-3026[17]30122-9).
Rates of PVC-induced phlebitis reached 12% (137 cases) when the researchers used case definition tools, up to 31% when they used scoring systems (P less than .0001), and up to 34% when they used severity rating systems (P less than .0001, compared with the 12% rate). “The proportion within instruments ranged from less than 1% to 28%,” they added. “We [also] identified face validity issues, such as use of indistinct or complex measurements and inconsistent measurements or definitions.”
The investigators did not perform a systematic review to identify these instruments, and they did not necessarily use the most recent versions, they noted. Nevertheless, the findings have direct implications for hospital quality control measures, which require using a single validated instrument over time to generate meaningful results, they said. Hence, the investigators recommended developing a joint research program to develop reliable measures of PVC-related adverse events and better support clinicians who are trying to decide whether to remove PVCs.
The investigators reported having no funding sources and no competing interests.
FROM LANCET HAEMATOLOGY
Key clinical point: Rates of PVC-induced phlebitis varied widely within and between assessment instruments.
Major finding: Rates were as high as 12% (137 cases) based on the case definition tools, up to 31% based on the scoring systems (P less than .0001), and up to 34% based on the severity rating systems (P less than .0001, compared with the case definition rate).
Data source: A cross-sectional study of 17 instruments used to identify phlebitis associated with peripheral venous catheters.
Disclosures: The investigators reported having no funding sources and no competing interests.
Inotuzumab ozogamicin tied to sinusoidal obstruction syndrome in ALL
Inotuzumab ozogamicin therapy significantly increased the risk of sinusoidal obstruction syndrome (veno-occlusive disease) among adults with relapsed or refractory B-cell precursor acute lymphoblastic leukemia (ALL), especially when they also received follow-up hematopoietic stem cell transplantation (HSCT), according to a safety analysis from the INO-VATE trial.
After a median of 9 weeks of treatment, 13% of 164 patients who received inotuzumab ozogamicin (Besponsa, Wyeth/Pfizer) developed sinusoidal obstruction syndrome, compared with less than 1% of 143 patients who received standard care, reported Hagop M. Kantarjian, MD, of the University of Texas MD Anderson Cancer Center in Houston, and his associates (Lancet Haematol. 2017 Jul 4. doi: 10.1016/S2352-3026[17]30103-5).
Follow-up treatment with HSCT increased the risk of sinusoidal obstruction syndrome in both the intervention (22%) and standard-care (3%) groups. Among patients who did not undergo HSCT, rates of this adverse event were 3% and 0%, respectively. Five patients died from sinusoidal obstruction syndrome, all of whom received both inotuzumab ozogamicin and HSCT. The findings earned the newly approved regimen a boxed warning for severe hepatotoxicity.
The open-label, phase 3, multicenter INO-VATE study included 326 adults with CD22-positive, Philadelphia chromosome–negative or Philadelphia chromosome–positive relapsed or refractory B-cell precursor ALL; the safety analysis included 305 patients. Rates of all-grade treatment-emergent hepatotoxicity were 51% with inotuzumab ozogamicin and 34% with standard care. Most adverse hepatic events were grade 1-2 liver-related laboratory abnormalities, but 8% of inotuzumab ozogamicin recipients developed grade 3 or higher sinusoidal obstruction syndrome, versus less than 1% of the control group.
“After follow-up HSCT, the frequency of sinusoidal obstruction syndrome was 50% or higher in the following subgroups: patients aged 65 years or older, patients with last available pre-HSCT serum bilirubin concentration more than or equal to the upper limit of normal, and patients who received conditioning regimens with two alkylating agents,” Dr. Kantarjian and his fellow investigators wrote. Conditioning regimens that included thiotepa markedly increased the risk of sinusoidal obstruction syndrome. Additional risk factors included HSCT before study enrollment, history of liver disease, and a final pre-HSCT platelet count of less than 100 × 109 platelets per L.
Rates of sinusoidal obstruction syndrome were 42% with four to six cycles of inotuzumab ozogamicin, 23% with three cycles, 19% with two cycles, and 8% with one cycle, said the investigators. In multivariate analysis, conditioning with two alkylating agents (P = .02 compared with one alkylating agent) and pre-HSCT bilirubin of at least the upper limit of normal (P = .01) significantly increased the risk of sinusoidal obstruction syndrome during or after treatment with inotuzumab ozogamicin.
Notably, inotuzumab ozogamicin did not significantly increase the chances of survival compared with standard care among patients who also received follow-up HSCT (hazard ratio, 1.3; 97.5% confidence interval, 0.66 to 2.3; P = 0.77). Among HSCT recipients, the chances of surviving to 24 months were 39% (95% CI, 28%-50%) with inotuzumab ozogamicin and 29% (11%-49%) with standard care. Nonetheless, HSCT “offers possibility of cure in the relapsed or recurrent [ALL] setting,” the researchers wrote. Clinicians should be especially wary of sinusoidal obstruction syndrome if patients are 65 years or older, received HSCT before inotuzumab ozogamicin treatment, or have a baseline history of liver disease, they said. Strategies to minimize risk include shortening the duration of inotuzumab ozogamicin treatment and avoiding conditioning regimens that contain two alkylating agents.
Pfizer funded and collaborated in the trial. Dr. Kantarjian disclosed ties to Pfizer and numerous other pharmaceutical companies.
Inotuzumab ozogamicin therapy significantly increased the risk of sinusoidal obstruction syndrome (veno-occlusive disease) among adults with relapsed or refractory B-cell precursor acute lymphoblastic leukemia (ALL), especially when they also received follow-up hematopoietic stem cell transplantation, according to a safety analysis from the INO-VATE trial.
After a median of 9 weeks of treatment, 13% of 164 patients who received inotuzumab ozogamicin (Besponsa, Wyeth/Pfizer) developed sinusoidal obstruction syndrome, compared with less than 1% of 143 patients who received standard care, reported Hagop M. Kantarjian, MD, of the University of Texas MD Anderson Cancer Center in Houston, and his associates (Lancet Haematol. 2017 Jul 4. doi: 10.1016/S2352-3026(17)30103-5).
Follow-up treatment with HSCT increased the risk of sinusoidal obstruction syndrome in both the intervention (22%) and standard-care (3%) groups. Among patients who did not undergo HSCT, rates of this adverse event were 3% and 0%, respectively. Five patients died from sinusoidal obstruction syndrome, all of whom received both inotuzumab ozogamicin and HSCT. The findings earned the newly approved regimen a boxed warning for severe hepatotoxicity.
The open-label, phase 3, multicenter INO-VATE study included 326 adults with CD22-positive, Philadelphia chromosome–negative or Philadelphia chromosome–positive relapsed or refractory B-cell precursor ALL. The safety analysis included 305 patients. Rates of treatment-emergent hepatotoxicities, of all grades, were 51% with inotuzumab ozogamicin and 34% with standard care. Most adverse hepatic events were grade 1-2 liver-related laboratory abnormalities, but 8% of inotuzumab ozogamicin recipients developed grade 3 or higher sinusoidal obstruction syndrome, versus less than 1% of the control group.
“After follow-up HSCT, the frequency of sinusoidal obstruction syndrome was 50% or higher in the following subgroups: patients aged 65 years or older, patients with last available pre-HSCT serum bilirubin concentration more than or equal to the upper limit of normal, and patients who received conditioning regimens with two alkylating agents,” Dr. Kantarjian and his fellow investigators wrote. Conditioning regimens that included thiotepa markedly increased the risk of sinusoidal obstruction syndrome. Additional risk factors included HSCT before study enrollment, history of liver disease, and a final pre-HSCT platelet count of less than 100 × 10⁹ platelets per L.
Rates of sinusoidal obstruction syndrome were 42% with four to six cycles of inotuzumab ozogamicin, 23% with three cycles, 19% with two cycles, and 8% with one cycle, said the investigators. In multivariate analysis, conditioning with two alkylating agents (P = .02 compared with one alkylating agent) and pre-HSCT bilirubin of at least the upper limit of normal (P = .01) significantly increased the risk of sinusoidal obstruction syndrome during or after treatment with inotuzumab ozogamicin.
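Odds ratios and confidence intervals like those reported throughout these findings come from standard 2×2-table arithmetic. The sketch below is purely illustrative: the counts are hypothetical, not the trial's patient-level data, and the Wald interval shown is only one of several CI methods the investigators may have used.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% Wald CI from a 2x2 table:
    a/b = events/non-events in the exposed group,
    c/d = events/non-events in the comparison group."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = exp(log(or_) - z * se)
    hi = exp(log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 12/32 events vs. 4/44 events
or_, lo, hi = odds_ratio_ci(12, 20, 4, 40)
```

An interval whose lower bound exceeds 1.0 corresponds to a statistically significant increase in risk at the chosen level.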
Notably, inotuzumab ozogamicin did not significantly increase the chances of survival compared with standard care among patients who also received follow-up HSCT (hazard ratio, 1.3; 97.5% confidence interval, 0.66 to 2.3; P = .77). Among HSCT recipients, the chances of surviving to 24 months were 39% (95% CI, 28%-50%) with inotuzumab ozogamicin and 29% (11%-49%) with standard care. Nonetheless, HSCT “offers possibility of cure in the relapsed or recurrent [ALL] setting,” the researchers wrote. Clinicians should be especially wary of sinusoidal obstruction syndrome if patients are 65 years or older, received HSCT before inotuzumab ozogamicin treatment, or have a baseline history of liver disease, they said. Strategies to minimize risk include shortening the duration of inotuzumab ozogamicin treatment and avoiding conditioning regimens that contain two alkylating agents.
Pfizer funded and collaborated in the trial. Dr. Kantarjian disclosed ties to Pfizer and numerous other pharmaceutical companies.
FROM LANCET HAEMATOLOGY
Key clinical point: Treatment with inotuzumab ozogamicin (Besponsa, Wyeth/Pfizer) led to sinusoidal obstruction syndrome (veno-occlusive disease), especially after follow-up hematopoietic stem cell transplantation, compared with standard care for relapsed or refractory acute lymphoblastic leukemia.
Major finding: After a median of 9 weeks of treatment, rates of sinusoidal obstruction syndrome were 13% among inotuzumab ozogamicin recipients overall, 22% among those who also received HSCT, and less than 1% in the standard-care group.
Data source: A prespecified safety analysis of INO-VATE, an open-label, phase 3, multicenter trial of 326 adults with Philadelphia chromosome–negative or Philadelphia chromosome–positive relapsed or refractory B-cell precursor ALL.
Disclosures: Pfizer funded and collaborated in the trial. Dr. Kantarjian disclosed ties to Pfizer and numerous other pharmaceutical companies.
NASH did not increase risk of poor liver transplantation outcomes
Adults with nonalcoholic steatohepatitis (NASH) fared as well on key outcome measures as other liver transplant recipients, despite having significantly more comorbidities, according to the results of a single-center retrospective cohort study.
Major morbidity, mortality, and rates of graft survival after 90 days were similar between patients who underwent transplantation for NASH and those who underwent it for another cirrhotic liver condition, wrote Eline H. van den Berg, MD, of University Medical Center Groningen (the Netherlands) with her associates. “These results are comforting, considering the expected increase of patients with NASH cirrhosis in the near future,” the researchers concluded. “Future analysis regarding the recurrence of nonalcoholic fatty liver disease, development of long-term complications, long-term graft patency, and occurrence of comorbid diseases after LT [liver transplantation] is mandatory to better understand the natural history and risk profile of NASH patients and to prevent and treat its complications.” The findings were published online in Digestive and Liver Disease (Dig Liver Dis. 2017 Aug 11. doi: 10.1016/j.dld.2017.08.022).
Nonalcoholic fatty liver disease begins as steatosis and can progress to NASH, fibrosis, and cirrhosis. The global obesity epidemic is amplifying its incidence, and about 26% of patients who develop NASH ultimately develop cirrhosis. Cirrhosis itself increases the risk of in-hospital death or prolonged length of postoperative stay, but patients with NASH also have obesity and cardiovascular disease, which might “tremendously increase” the risk of poor postoperative outcomes, the researchers said. Because prior research had focused mainly on mortality and had reported conflicting results, they used the Clavien-Dindo classification system to retrospectively study rates of complications among 169 adults who underwent liver transplantation at their center from 2009 through 2015, including 34 (20%) patients with NASH cirrhosis.
Patients with NASH were significantly older than other transplant recipients (59 versus 55 years; P = .01) and had markedly higher rates of obesity (62% versus 8%; P less than .01), diabetes mellitus (74% versus 20%; P less than .01), metabolic syndrome (83% versus 38%; P less than .01), hypertension (61% versus 30%; P less than .01), and cardiovascular disease (29% versus 11%; P less than .01). Despite these differences, the groups had statistically similar rates of postoperative mortality (3% in both groups), 90-day graft survival posttransplantation (94% and 90%, respectively), and major postoperative complications, including biopsy-proven acute cellular rejection (3% and 7%), hepatic artery thrombosis (0% and 7%), relaparotomy (15% and 24%), primary nonfunction (0% and 1.6%), retransplantation (6% and 7%), sepsis (12% and 13%), gastrointestinal infection (24% and 36%), fever of unknown origin (18% and 14%), and renal replacement therapy (15% and 24%).
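Group comparisons of rates like these are commonly tested with a two-proportion z-test. The sketch below is illustrative only: the counts are hypothetical approximations (e.g., roughly 21 of 34 NASH patients versus 11 of 135 others obese), and the study's actual test choices are not stated here.

```python
from math import erf, sqrt

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for a difference in proportions,
    e.g., comparing a comorbidity rate between two cohorts."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                     # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))    # pooled standard error
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical obesity counts: 21/34 NASH vs. 11/135 non-NASH
z, p_value = two_proportion_z(21, 34, 11, 135)
```

A p-value below .01 here matches the "P less than .01" shorthand used in the article.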
After accounting for age, sex, transplant year, and donor characteristics, NASH patients were at significantly increased risk of grade 2 urogenital infections, compared with other patients (odds ratio, 3.4; 95% confidence interval, 1.1 to 10.6; P = .03). Grade 1 complications also were more common with NASH than otherwise (77% versus 59%), and the difference remained statistically significant in the multivariable analysis (OR, 1.6; 95% CI, 1.03 to 2.63; P = .04).
The study used a strict, internationally accepted definition of NASH – all patients either had cases confirmed by biopsy, had metabolic syndrome, or had obesity and type 2 diabetes mellitus, and, further, none had hepatitis or alcoholic liver disease. None of the patients in the study received transplants for acute liver failure or noncirrhotic liver disease, and none were 70 years or older, which is the cutoff age for liver transplantation in the Netherlands.
The investigators received no funding for the study and reported having no conflicts of interest.
FROM DIGESTIVE AND LIVER DISEASE
Key clinical point: Adults with nonalcoholic steatohepatitis (NASH) fared as well on key outcome measures as other liver transplant recipients, despite having significantly more comorbidities.
Major finding: Patients with and without NASH had statistically similar rates of postoperative mortality (3% in both groups), 90-day graft survival (94% and 90%, respectively), and major postoperative complications.
Data source: A single-center retrospective cohort study of 169 adult liver transplant recipients, of whom 20% were transplanted for NASH cirrhosis.
Disclosures: The investigators received no funding for the study and reported having no conflicts of interest.
Study advances noninvasive prenatal testing for hemophilia
Droplet digital PCR (ddPCR) was an accurate and noninvasive method to detect mutations leading to hemophilia A and B in maternal plasma DNA in 15 at-risk pregnancies of 8 to 42 weeks’ gestation, researchers reported.
Additionally, the researchers showed for the first time that targeted massively parallel sequencing (MPS) accurately detected the clinically important int22h-related inversion mutations in maternal plasma DNA from pregnant hemophilia carriers from three families with the disorder.
As costs of sequencing continue to fall, larger studies of pregnant carriers of F8 int22h-related inversions can help make MPS “an essential part in noninvasive prenatal testing of hemophilia carriers,” Irena Hudecova, PhD, of Li Ka Shing Institute of Health Sciences, Hong Kong, and her associates wrote (Blood. 2017 Jul 20;130[3]:340-7).
Diagnosing hemophilia during pregnancy helps optimize care and allows mothers to make informed decisions about whether to terminate pregnancies. But for male fetuses, invasive testing has been the only option. In a prior small study, researchers used noninvasive microfluidics PCR to detect sequence variants of F8, which encodes factor VIII, and F9, which encodes factor IX. The assay uses a chip that can accommodate about 9,000 reaction wells, making noninvasive screening much more feasible and affordable. But technical difficulties had precluded detection of int22h-related inversions, the inversion mutations of intron 22 in F8 on chromosome X that affect about half of individuals with severe hemophilia (Blood. 2011 Mar 31;117[13]:3684-91).
For the current study, the researchers first designed family-specific ddPCR assays to test for relevant maternal sequence variants scattered across the F8 and F9 genes. Tests of 15 male singleton fetuses produced three unclassified samples, but no misclassifications.
“Because of the scalability of ddPCR, the protocol performed reliably even in cases with fetal DNA fraction lower than 10%,” the researchers wrote. “When an unclassified result is encountered, one either performs more digital analyses on the sample to accumulate more data points, or when the sample is consumed, one may resort to an additional blood draw, possibly at a later gestational age with higher fetal DNA fraction.”
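The fetal DNA fraction mentioned here, and ddPCR quantification generally, rest on simple counting math: because a positive droplet may contain more than one template molecule, positive-droplet counts are Poisson-corrected, and a fetus-specific allele's share of total molecules gives a rough fetal fraction. The sketch below illustrates that standard arithmetic with made-up droplet counts; it is not the authors' actual classification procedure (which used relative mutation dosage analysis).

```python
from math import log

def ddpcr_copies_per_droplet(positive, total):
    """Poisson-corrected mean target copies per droplet.
    A positive droplet may hold >1 molecule, so the mean
    occupancy is lambda = -ln(1 - positives/total)."""
    p = positive / total
    return -log(1 - p)

def fetal_fraction(fetal_specific, shared):
    """Rough fetal DNA fraction from an allele present only in the
    fetus (e.g., paternally inherited): the heterozygous fetus
    contributes it on one of two chromosomes, so f = 2 * minor share."""
    return 2 * fetal_specific / (fetal_specific + shared)

# Hypothetical counts: 1,000 positive of 20,000 droplets;
# 500 fetus-specific vs. 9,500 shared molecules -> ~10% fetal fraction
lam = ddpcr_copies_per_droplet(1000, 20000)
f = fetal_fraction(500, 9500)
```

Below roughly 10% fetal fraction, sampling noise in these counts grows, which is why the authors suggest accumulating more data points or redrawing blood later in gestation.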
Next, the investigators used MPS to create detailed fetal haplotype maps of the 7.6-Mb region of F8 where int22h-related inversions occur. This approach yielded an “accurate and robust measurement of maternally inherited fetal haplotypes,” they wrote. “Our data suggest it is feasible to apply targeted MPS to interrogate maternally inherited F8 int22h-related inversions, whereas ddPCR [is] an affordable approach for the identification of F8 and F9 sequence variants in maternal plasma.”
The study was funded by the Research Grants Council of the Hong Kong SAR Government and the Vice Chancellor’s One-Off Discretionary Fund of The Chinese University of Hong Kong. Dr. Hudecova reported having no financial disclosures. Several coauthors disclosed patents for plasma nucleic acid analysis and ties to Sequenom, Illumina, Xcelom, and Cirina.
The use of droplet digital PCR for prenatal diagnosis of hemophilia is a major improvement over current invasive methods, such as chorionic villus sampling, amniocentesis, and cordocentesis.
Knowledge of a hemophilia diagnosis before birth provides an opportunity for early hemostatic intervention before procedures such as circumcision are performed, or to prevent morbidity and mortality by cesarean delivery to reduce the risk of intracranial hemorrhage when the birth of a child with severe hemophilia is anticipated. Prenatal testing is also important to ensure hemostatic support for the mother, for whom it may be necessary to prevent bleeding with perinatal anesthesia and/or postpartum bleeding. These prenatal assays depend on knowledge of the mother’s carrier genotype, which is potentially more accurate than factor levels, which may increase with hormone use or the increasing hormone levels of pregnancy and mask carrier diagnosis.
The development of these assays is timely in view of the ongoing My Life, Our Future (MLOF) genome project in hemophilia and underscores the need for carrier testing and genetic counseling of female members from hemophilia kindreds.
Margaret V. Ragni, MD, is with the University of Pittsburgh Medical Center. She reported having no relevant disclosures. These comments are adapted from an accompanying editorial (Blood. 2017 Jul 20;130[3]:240-1).
The use of digital droplet PCR for prenatal diagnosis of hemophilia is a major improvement over current invasive methods, such as chorionic villus sampling, amniocentesis, and cordocentesis.
Knowledge of a hemophilia diagnosis before birth provides an opportunity for early hemostatic intervention before procedures such as circumcision are performed, or to prevent morbidity and mortality by cesarean delivery to reduce the risk of intracranial hemorrhage when the birth of a child with severe hemophilia is anticipated. Prenatal testing is also important to ensure hemostatic support for the mother, for whom it may be necessary to prevent bleeding with perinatal anesthesia and/or postpartum bleeding. These prenatal assays depend on knowledge of the mother’s carrier genotype, which is potentially more accurate than factor levels, which may increase with hormone use or the increasing hormone levels of pregnancy and mask carrier diagnosis.
The development of these assays is timely in view of the ongoing My Life, Our Future (MLOF) genome project in hemophilia and underscores the need for carrier testing and genetic counseling of female members from hemophilia kindreds.
Margaret V. Ragni, MD, is with the University of Pittsburgh Medical Center. She reported having no relevant disclosures. These comments are adapted from an accompanying editorial (Blood. 2017 Jul 20;130[3]:240-1).
The use of digital droplet PCR for prenatal diagnosis of hemophilia is a major improvement over current invasive methods, such as chorionic villus sampling, amniocentesis, and cordocentesis.
Knowledge of a hemophilia diagnosis before birth provides an opportunity for early hemostatic intervention before procedures such as circumcision are performed, or to prevent morbidity and mortality by cesarean delivery to reduce the risk of intracranial hemorrhage when the birth of a child with severe hemophilia is anticipated. Prenatal testing is also important to ensure hemostatic support for the mother, for whom it may be necessary to prevent bleeding with perinatal anesthesia and/or postpartum bleeding. These prenatal assays depend on knowledge of the mother’s carrier genotype, which is potentially more accurate than factor levels, which may increase with hormone use or the increasing hormone levels of pregnancy and mask carrier diagnosis.
The development of these assays is timely in view of the ongoing My Life, Our Future (MLOF) genome project in hemophilia and underscores the need for carrier testing and genetic counseling of female members from hemophilia kindreds.
Margaret V. Ragni, MD, is with the University of Pittsburgh Medical Center. She reported having no relevant disclosures. These comments are adapted from an accompanying editorial (Blood. 2017 Jul 20;130[3]:240-1).
Digital droplet PCR (ddPCR) was an accurate and noninvasive method to detect mutations leading to hemophilia A and B in maternal plasma DNA in 15 at-risk pregnancies of 8 to 42 weeks’ gestation, researchers reported.
Additionally, the researchers showed for the first time that targeted massively parallel sequencing (MPS) accurately detected the clinically important int22h-related inversion mutations in maternal plasma DNA from pregnant hemophilia carriers from three families with the disorder.
As costs of sequencing continue to fall, larger studies of pregnant carriers of F8 int22h-related inversions can help make MPS “an essential part in noninvasive prenatal testing of hemophilia carriers,” Irena Hudecova, PhD, of Li Ka Shing Institute of Health Sciences, Hong Kong, and her associates wrote (Blood. 2017 Jul 20;130[3]:340-7).
Diagnosing hemophilia during pregnancy helps optimize care and allows mothers to make informed decisions about whether to terminate pregnancies. But for male fetuses, invasive testing has been the only option. In a prior small study, researchers used noninvasive microfluidics PCR to detect sequence variants of F8, which encodes factor VIII, and F9, which encodes Factor IX. The assay uses a chip that can accommodate about 9,000 reaction wells, making noninvasive screening much more feasible and affordable. But technical difficulties had precluded detection of int22h-related inversions, the inversion mutations of intron 22 in F8 on chromosome X that affect about half of individuals with severe hemophilia (Blood. 2011 Mar 31;117[13]:3684-91).
For the current study, the researchers first designed family-specific ddPCR assays to test for relevant maternal sequence variants scattered across the F8 and F9 genes. Tests of 15 male singleton fetuses produced three unclassified samples, but no misclassifications.
“Because of the scalability of ddPCR, the protocol performed reliably even in cases with fetal DNA fraction lower than 10%,” the researchers wrote. “When an unclassified result is encountered, one either performs more digital analyses on the sample to accumulate more data points, or when the sample is consumed, one may resort to an additional blood draw, possibly at a later gestational age with higher fetal DNA fraction.”
Next, the investigators used MPS to create detailed fetal haplotype maps of the 7.6-Mb region of F8 where int22h-related inversions occur. This approach yielded an “accurate and robust measurement of maternally inherited fetal haplotypes,” they wrote. “Our data suggest it is feasible to apply targeted MPS to interrogate maternally inherited F8 int22h-related inversions, whereas ddPCR [is] an affordable approach for the identification of F8 and F9 sequence variants in maternal plasma.”
The study was funded by the Research Grants Council of the Hong Kong SAR Government and the Vice Chancellor’s One-Off Discretionary Fund of The Chinese University of Hong Kong. Dr. Hudecova reported having no financial disclosures. Several coauthors disclosed patents for plasma nucleic acid analysis and ties to Sequenom, Illumina, Xcelom, and Cirina.
FROM BLOOD
Key clinical point:
Major finding: Droplet digital PCR (ddPCR) detected relevant F8 and F9 gene mutations. Targeted massively parallel sequencing (MPS) determined fetal inheritance of F8 int22h-related inversions, which up to half of individuals with severe hemophilia carry.
Data source: ddPCR of 15 singleton male fetuses from at-risk mothers and MPS of the maternal plasma of pregnant carriers from three hemophilia families.
Disclosures: The study was funded by the Research Grants Council of the Hong Kong SAR Government and the Vice Chancellor’s One-Off Discretionary Fund of The Chinese University of Hong Kong. Dr. Hudecova reported having no financial disclosures. Some of the coauthors disclosed patents for plasma nucleic acid analysis and ties to Sequenom, Illumina, Xcelom, and Cirina.
Researchers identify ‘congenital NAD deficiency disorders’
Mutations that disrupt de novo synthesis of nicotinamide adenine dinucleotide (NAD) were associated with multiple congenital malformations in humans and mice, and supplementing niacin during gestation prevented these malformations in mice, new research suggests.
The malformations include vertebral defects, anal atresia, cardiac defects, tracheoesophageal fistula, renal anomalies, and limb abnormalities (VACTERL), “a nonrandom combination of congenital defects without a known cause,” wrote Hongjun Shi, PhD, of Victor Chang Cardiac Research Institute, New South Wales, Australia, and colleagues (N Engl J Med. 2017;377:544-52).
Numerous genetic and environmental factors can potentially cause NAD deficiency during gestation, and the investigators suggested collectively referring to the resulting malformations as “congenital NAD deficiency disorders.”
Congenital defects can occur together in newborns more often than would be expected by chance, but “in many such cases, it has proved difficult to identify a genetic cause,” the investigators noted. Using genomic sequencing, they looked for possible pathogenic gene variants within four unrelated families in which a person was born with multiple congenital malformations. Next, they evaluated the function of the variants by testing in vitro enzyme activity and measuring relevant plasma metabolites. Finally, they used the CRISPR (clustered regularly interspaced short palindromic repeats)–Cas9 system to create mouse models with similar variants.
This approach identified variants in two genes encoding enzymes of the kynurenine pathway: 3-hydroxyanthranilic acid 3,4-dioxygenase (HAAO) and kynureninase (KYNU). Three patients had homozygous variants associated with loss-of-function changes in these proteins. A fourth patient had heterozygous variants in the gene encoding KYNU.
“The mutant enzymes had greatly reduced activity in vitro,” the researchers wrote. Patients had decreased circulating levels of NAD, which is synthesized from tryptophan through the kynurenine pathway. Notably, mouse embryos lacking the mouse equivalents of HAAO or KYNU also had congenital defects associated with NAD deficiency. Preventing NAD deficiency during gestation averted these defects in mice.
“The NAD de novo synthesis pathway catabolizes tryptophan,” the researchers added. “Although metabolite levels upstream of the block are elevated, and the metabolites have postnatal functions, we found that it is the deficiency in embryonic NAD, downstream of the block, that is disrupting embryogenesis.”
The study was supported by the Australian and New South Wales governments and foundations. The investigators reported having no other financial disclosures.
Shi et al. report that a deficiency of nicotinamide adenine dinucleotide (NAD) causes congenital malformations, suggesting that interventions to raise NAD levels during fetal and early postnatal development might further reduce the incidence of congenital anomalies.
Regardless of how NAD depletion leads to congenital malformations (whether by compromising the detection of DNA damage by PARP proteins, reducing the supply of nucleotides, or both), dietary supplementation with NAD precursors merits further study. At high doses, niacin can cause flushing and gastrointestinal symptoms, but it has few side effects at lower doses.
Nicotinamide mononucleotide, nicotinamide riboside, and nicotinamide itself are better tolerated than niacin and are generally considered to be safe as dietary supplements, but the doses of NAD precursors required to reduce the risk of congenital malformations in humans are not known. Also unknown is the extent to which raising dietary levels of NAD would limit cognitive impairment in infants with congenital malformations.
Matthew G. Vander Heiden, MD, PhD, is with the Massachusetts Institute of Technology, Cambridge, Mass., and the Dana-Farber Cancer Institute, Boston. He reported receiving personal fees from Agios Pharmaceuticals and Aeglea Biotherapeutics outside the submitted work. These comments are adapted from an editorial (N Engl J Med. 2017;377:509-11).
FROM THE NEW ENGLAND JOURNAL OF MEDICINE
Key clinical point:
Major finding: Major congenital defects affecting unrelated families were associated with variants in genes encoding 3-hydroxyanthranilic acid 3,4-dioxygenase (HAAO) and kynureninase (KYNU).
Data source: Genomic sequencing of four unrelated families in which a person was born with multiple congenital malformations, plus in vitro measurements of enzyme activity and plasma metabolites and studies of mouse models created with the CRISPR–Cas9 system.
Disclosures: The study was supported by the Australian and New South Wales governments and foundations. The investigators reported having no other financial disclosures.
AGA Guideline: Therapeutic drug monitoring in IBD
Physicians should perform reactive therapeutic drug monitoring to guide changes in anti–tumor necrosis factor (TNF) therapy in patients with active inflammatory bowel disease and should consider target trough concentrations of at least 5 mcg/mL for infliximab, at least 7.5 mcg/mL for adalimumab, and at least 20 mcg/mL for certolizumab pegol, according to a guideline from the AGA Institute, published in the September 2017 issue of Gastroenterology (Gastroenterology. doi: 10.1053/j.gastro.2017.07.032).
Therapeutic drug monitoring can help guide whether to ramp up a dose (if the trough level is below the threshold) or switch therapy (if the trough level is above the threshold) when patients are not responding adequately to maintenance treatment. A nonresponder with optimal trough concentrations might need to switch drug classes, the guideline noted. A patient with low trough levels and no antidrug antibodies is probably experiencing rapid drug clearance in the setting of high inflammation. A patient with low or undetectable trough levels and high antidrug antibody titers has developed neutralizing antidrug antibodies. However, trough concentrations can vary for many other reasons, ranging from disease severity and inflammation to body mass index and sex. Therefore, target levels also vary and can be challenging to set.
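The interpretation pattern above amounts to a small decision table. The sketch below restates that logic in code, purely as an illustration: the trough targets are the guideline figures quoted earlier, but the function name and the wording of the returned interpretations are invented here, and nothing in it should be read as clinical advice.

```python
# Sketch of the reactive TDM interpretation described above, for a patient
# with active IBD not responding to anti-TNF maintenance therapy.
# Illustrative only: the function name and return strings are hypothetical.

TROUGH_TARGETS_MCG_ML = {
    "infliximab": 5.0,
    "adalimumab": 7.5,
    "certolizumab pegol": 20.0,
}

def interpret_reactive_tdm(drug, trough_mcg_ml, antibodies_detected):
    """Map a trough level and antidrug-antibody status to the
    interpretation pattern described in the guideline text."""
    target = TROUGH_TARGETS_MCG_ML[drug]
    if trough_mcg_ml >= target:
        # Adequate exposure without response suggests the drug class
        # itself is failing.
        return "adequate trough: consider switching drug class"
    if antibodies_detected:
        # Low or undetectable trough plus high antibody titers indicates
        # neutralizing antidrug antibodies.
        return "neutralizing antidrug antibodies"
    # Low trough without antibodies points to rapid drug clearance
    # in the setting of high inflammation.
    return "low trough: consider dose escalation"
```

For example, `interpret_reactive_tdm("infliximab", 2.0, False)` flags the low trough for possible dose escalation, matching the first branch of the narrative above.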
The AGA makes no recommendation about routine, proactive TDM in patients with quiescent IBD who are on anti-TNF agents. While proactive TDM can shed light on endoscopic response and drug clearance, it might also trigger a premature switch of therapies; this is particularly likely because physicians have sparse data on either target trough levels for asymptomatic patients or the clinical significance of “low-titer” antidrug antibodies. The optimal frequency of proactive TDM also remains unclear.
Pending better data, the AGA recommended checking infliximab or adalimumab trough levels as close to the next dose as possible – that is, within 24 hours. Drug trough levels are consistent across commercial assays, but antidrug antibody titers are not, and there are no uniform thresholds for clinically relevant antidrug antibody titers. “Therefore, it may be beneficial to utilize the same assay when checking for trough concentration and antidrug antibodies,” the guideline stated.
For patients on a thiopurine, routine testing of thiopurine methyltransferase (TPMT) enzyme or genotype is recommended to guide dosing. In three pooled studies comprising 1,145 patients, only two patients were homozygous; further, rates of hematologic adverse events, clinical remission, and treatment discontinuation did not differ based on TPMT testing itself. However, using TPMT testing to guide dosing was associated with an 89% decrease in the risk of hematologic adverse events among patients who had a homozygous genotype or had low or absent TPMT enzymatic activity. “While this risk may be mitigated by routine laboratory CBC checking, adherence to regular monitoring in clinical practice is suboptimal,” the guideline stated. “It is important to continue to perform routine lab monitoring [of] CBC and liver enzymes after starting a thiopurine, regardless of the TPMT testing results.”
The AGA also conditionally supported reactive monitoring of thiopurine metabolites to guide treatment changes if patients develop breakthrough symptoms or treatment-related adverse effects. For active IBD symptoms in spite of thiopurine monotherapy, a target 6-thioguanine (6-TGN) cutoff between 230 and 450 pmol per 8 x 10^8 RBC is recommended. Again, supporting evidence is of “very low quality” – in a retrospective, observational study, patients who received treatment according to a TDM algorithm were five times more likely to respond to a change in therapy (relative risk, 5.2). The guideline recommended against monitoring thiopurine metabolites in quiescent IBD. Studies did not support this practice, compared with standard dosing, although no study of thiopurine metabolites included patients on thiopurine/anti-TNF combination therapy, the guideline’s authors noted.
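The 6-TGN window quoted above can likewise be expressed as a simple range check. The helper below is an illustrative sketch only: its name and labels are hypothetical, and the bounds are the 230-450 pmol per 8 × 10^8 RBC figures from the guideline text.

```python
# Illustrative range check against the 6-TGN target window quoted above
# (230-450 pmol per 8 x 10^8 RBC). Hypothetical helper, not clinical advice.

TGN_LOW, TGN_HIGH = 230, 450  # pmol per 8 x 10^8 RBC

def classify_6tgn(level):
    """Classify a 6-TGN metabolite level relative to the target window."""
    if level < TGN_LOW:
        return "below target"   # room to escalate the thiopurine dose
    if level > TGN_HIGH:
        return "above target"   # active symptoms despite adequate exposure
    return "within target"
```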
The guideline includes clinical-decision support tools on when to perform TDM and how to interpret results when patients are taking an anti-TNF agent or a thiopurine. The guideline does not cover vedolizumab or ustekinumab because data are sparse. Other knowledge gaps include when best to measure trough concentrations; whether empiric dose escalation or TDM is preferred if response to induction is suboptimal; how target trough concentrations vary based on disease phenotype, disease state, or treatment goals; which levels and durations of antidrug antibody titers are clinically significant; and whether to suppress antidrug antibodies before changing therapy. Future studies should compare routine proactive and reactive TDM, investigate how often to perform proactive TDM, and characterize TDM of newly approved biologic agents, the guideline concluded.
The authors of the guideline document disclosed no conflicts related to the guideline topic.
FROM GASTROENTEROLOGY