Norepinephrine shortage linked to mortality in patients with septic shock
A national shortage of norepinephrine in the United States was associated with higher rates of mortality among patients hospitalized with septic shock, investigators reported.
Rates of in-hospital mortality in 2011 were 40% during quarters when hospitals were facing shortages and 36% when they were not, Emily Vail, MD, and her associates said at the International Symposium on Intensive Care and Emergency Medicine. The report was published simultaneously in JAMA.
The link between norepinephrine shortage and death from septic shock persisted even after the researchers accounted for numerous clinical and demographic factors (adjusted odds ratio, 1.2; 95% confidence interval, 1.01 to 1.30; P = .03), wrote Dr. Vail of Columbia University, New York (JAMA. 2017 Mar 21. doi: 10.1001/jama.2017.2841).
Drug shortages are common in the United States, but few studies have explored their effects on patient outcomes. The investigators compared mortality rates among affected patients during 3-month intervals when hospitals were and were not using at least 20% less norepinephrine than baseline. They used the Premier Healthcare Database, which includes both standard claims and detailed, dated logs of all services billed to patients or insurance, with minimal missing data.
A total of 77% of patients admitted with septic shock received norepinephrine before the shortage. During the lowest point of the shortage, 56% of patients received it, the researchers reported. Clinicians most often used phenylephrine instead, prescribing it to up to 54% of patients during the worst of the shortage. The absolute increase in mortality during the quarters of shortage was 3.7% (95% CI, 1.5%-6.0%).
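As a rough check on how these rates relate to the reported odds ratio, the unadjusted figures can be worked back directly. The sketch below (Python, illustrative only) uses the 40% and 36% in-hospital mortality rates; the published estimate of 1.2 came from a model adjusted for clinical and demographic factors, so the numbers will not match exactly.

```python
# Back-of-the-envelope check of the unadjusted odds ratio implied by the
# reported in-hospital mortality rates (illustrative only; the published
# odds ratio was adjusted for clinical and demographic covariates).
p_shortage = 0.40   # mortality during shortage quarters
p_normal = 0.36     # mortality during non-shortage quarters

odds_shortage = p_shortage / (1 - p_shortage)
odds_normal = p_normal / (1 - p_normal)

unadjusted_or = odds_shortage / odds_normal
absolute_difference = p_shortage - p_normal

print(f"Unadjusted odds ratio: {unadjusted_or:.2f}")                 # ~1.19
print(f"Unadjusted absolute difference: {absolute_difference:.1%}")  # ~4.0%, vs 3.7% adjusted
```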
Several factors might explain the link between norepinephrine shortage and mortality, said the investigators. The vasopressors chosen to replace norepinephrine might result directly in worse outcomes, but a decrease in norepinephrine use also might be a proxy for relevant variables such as delayed use of vasopressors, lack of knowledge of how to optimally dose vasopressors besides norepinephrine, or the absence of a pharmacist dedicated to helping optimize the use of limited supplies.
The study did not uncover a dose-response association between greater decreases in norepinephrine use and increased mortality, the researchers noted. “This may be due to a threshold effect of vasopressor shortage on mortality, or lack of power due to relatively few hospital quarters at the extreme levels of vasopressor shortage,” they wrote.
Because the deaths captured included only those that occurred in-hospital, “the results may have underestimated mortality, particularly for hospitals that tend to transfer patients early to other skilled care facilities,” the researchers noted.
The cohort of patients was limited to those who received vasopressors for 2 or more days and excluded patients who died on the first day of vasopressor treatment, the researchers said.
The Herbert and Florence Irving Scholars Program at Columbia University provided funding. One coinvestigator disclosed grant funding from the National Institutes of Health and personal fees from UpToDate. The other investigators reported having no conflicts of interest.
FROM ISICEM
Key clinical point: The 2011 norepinephrine shortage was associated with increased in-hospital mortality among patients hospitalized with septic shock.
Major finding: Rates of in-hospital mortality were 36% during quarters of normal norepinephrine use and 40% during quarters of decreased use (adjusted odds ratio, 1.2; P = .03).
Data source: A retrospective cohort study of 27,835 patients at 26 hospitals in the United States that were affected by the shortage.
Disclosures: The Herbert and Florence Irving Scholars Program at Columbia University provided funding. One coinvestigator disclosed grant funding from the National Institutes of Health and personal fees from UpToDate. The other investigators reported having no conflicts of interest.
Pretreatment Imaging May Help Prevent Hodgkin Lymphoma Recurrence
Advances in radiation treatment have led to better targeting, minimizing the dose to healthy tissue. For patients with Hodgkin lymphoma (HL), pretreatment scanning with positron emission tomography and computed tomography (PET/CT) has become the gold standard for determining the extent of disease, say researchers from the University of Florida. Because HL may recur at the site of the original cancer, the scans are important to accurately capture the scope of the disease. Moreover, the researchers say pretreatment PET/CT may reduce disease progression.
In their study of 37 patients with stage I or II HL, 31 had PET/CT before chemotherapy. Two of the remaining 6 had PET/CT done within 5 days after chemotherapy was started. Median follow-up was 46 months.
The 4-year rate of relapse-free survival was 92%. Patients who did not receive pretreatment PET/CT were more likely to have a relapse (67%). Of 4 recurrences, 3 were within 12 months of follow-up; 1 developed 5 years after treatment.
Among the 6 patients who did not have a baseline PET/CT scan, all 3 recurrences were in lymph node regions outside of, but adjacent to, the radiation field. None of the 6 experienced an in-field treatment failure.
Long-term survivors of HL are vulnerable to late adverse effects, the researchers note, and that fact is “the impetus behind efforts to reduce radiation exposure to organs at risk.” They cite studies that have found that PET/CT scans, compared with using only pretreatment contrast-enhanced CT scans, can alter the staging in 10% to 30% of patients with HL. Their study, the researchers add, helps support the National Comprehensive Cancer Network guidelines that advise prechemotherapy PET/CT imaging in staging all HL patients. Not doing complete staging, the researchers say, puts patients at “unnecessary, and in some instances preventable, risk for recurrence.”
Source:
Figura N, Flampouri S, Mendenhall NP, et al. Adv Radiat Oncol. 2017;1-16.
Study supports use of rivaroxaban to prevent VTE recurrence
WASHINGTON, DC—Results of the EINSTEIN CHOICE study suggest rivaroxaban is more effective than, and just as safe as, aspirin for long-term anticoagulation in patients with venous thromboembolism (VTE).
In this phase 3 study, patients who had completed 6 to 12 months of anticoagulant therapy were randomized to receive rivaroxaban or aspirin.
Those who received rivaroxaban had a significantly lower risk of recurrent VTE, and the rates of major bleeding were similar between the treatment arms.
“How best to extend anticoagulant use beyond the initial treatment window has been a constant source of debate, with physicians carefully balancing patients’ risk of another VTE with the risk of anticoagulant-related bleeding,” said study investigator Philip S. Wells, MD, of The Ottawa Hospital in Ontario, Canada.
“With EINSTEIN CHOICE, for the first time, we have clinical evidence confirming rivaroxaban is superior to aspirin in reducing recurrent VTE, with no significant impact on safety. These important results have the potential to trigger a paradigm shift in how physicians manage their patients and protect them from VTE recurrence over the long term.”
These results were presented at the American College of Cardiology’s 66th Annual Scientific Session and published in NEJM. The research was funded by Bayer Pharmaceuticals.
The study enrolled 3365 patients with confirmed deep vein thrombosis or pulmonary embolism who were initially treated with anticoagulant therapy for 6 to 12 months.
Patients who required extended anticoagulation at therapeutic doses were not included in this trial, as the objective was to investigate those patients for whom the treating physician was uncertain about the need for continuing anticoagulant therapy.
Patients were randomized in a 1:1:1 ratio to receive a prophylactic dose of rivaroxaban (10 mg once daily), a treatment dose of rivaroxaban (20 mg once daily), or aspirin (100 mg once daily) for up to 12 months. Sixty percent of patients completed the full 12 months of treatment.
Efficacy
Both rivaroxaban doses were superior to aspirin in preventing fatal or non-fatal recurrent VTE, the study’s primary efficacy endpoint.
The rate of recurrent VTE was 1.2% in the 10 mg rivaroxaban arm (hazard ratio [HR]=0.26; 95% CI, 0.14 to 0.47; P<0.001), 1.5% in the 20 mg rivaroxaban arm (HR=0.34; 95% CI, 0.20 to 0.59; P<0.001), and 4.4% in the aspirin arm. Fatal VTE occurred in 0%, 0.2%, and 0.2%, respectively.
For the primary endpoint, the HR for the comparison between the 20 mg and 10 mg rivaroxaban arms was 1.34 (95% CI, 0.65 to 2.75; P=0.42). However, the researchers noted that this comparison was not powered to detect a significant difference.
The researchers also found that rivaroxaban reduced patients’ risk of experiencing one of the following events: recurrent VTE, heart attack, ischemic stroke, systemic embolism, or venous thrombosis in another location.
This endpoint occurred in 1.9% of patients in the 10 mg rivaroxaban group (HR=0.33; 95% CI, 0.20 to 0.54; P<0.001), 2.0% of patients in the 20 mg rivaroxaban group (HR=0.35; 95% CI, 0.22 to 0.57; P<0.001), and 5.6% of patients in the aspirin group.
Recurrent VTE or all-cause mortality occurred in 1.3% of patients in the 10 mg rivaroxaban group (HR=0.27; 95% CI, 0.15 to 0.47; P<0.001), 2.1% of patients in the 20 mg rivaroxaban group (HR=0.42; 95% CI, 0.26 to 0.68; P<0.001), and 4.9% of patients in the aspirin group.
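For readers who prefer absolute effect sizes, the recurrence rates above can be converted into an approximate number needed to treat. The calculation below is illustrative only; it is not a statistic reported by the investigators and ignores censoring over the 12-month treatment period.

```python
# Approximate absolute risk reduction (ARR) and number needed to treat (NNT)
# implied by the reported recurrence rates (illustrative, not from the paper).
recurrence = {
    "rivaroxaban 10 mg": 0.012,
    "rivaroxaban 20 mg": 0.015,
    "aspirin 100 mg": 0.044,
}

for arm in ("rivaroxaban 10 mg", "rivaroxaban 20 mg"):
    arr = recurrence["aspirin 100 mg"] - recurrence[arm]
    nnt = 1 / arr
    print(f"{arm}: ARR = {arr:.1%}, NNT ≈ {nnt:.0f}")
# rivaroxaban 10 mg: ARR = 3.2%, NNT ≈ 31
# rivaroxaban 20 mg: ARR = 2.9%, NNT ≈ 34
```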
Safety
The primary safety endpoint was major bleeding as defined by the International Society on Thrombosis and Haemostasis.
The rate of major bleeding was 0.4% for the 10 mg rivaroxaban group (HR=1.64; 95% CI, 0.39 to 6.84; P=0.50), 0.5% for the 20 mg rivaroxaban group (HR=2.01; 95% CI, 0.50 to 8.04; P=0.32), and 0.3% for the aspirin group.
Rates of clinically relevant non-major bleeding were 2.0%, 2.7%, and 1.8%, respectively (no significant difference).
“We know from previous studies that only about 40% of venous thromboembolism patients are actually on long-term blood thinners,” Dr Wells said.
“We hope that this study, which shows the blood thinner rivaroxaban is as safe as aspirin but much more effective at preventing future clots, will convince patients and their physicians to continue life-long medication that can prevent potentially dangerous blood clots.”
Proteins may be therapeutic targets for TKI-resistant CML, ALL
Researchers say they have identified 2 signaling proteins that enable resistance to tyrosine kinase inhibitors (TKIs) and could be therapeutic targets for acute lymphoblastic leukemia (ALL) and chronic myeloid leukemia (CML).
The team found that by deleting these proteins—c-Fos and Dusp1—they could eradicate BCR-ABL-induced B-cell ALL in mice.
And treatment combining c-Fos and Dusp1 inhibitors with the TKI imatinib was able to cure mice with BCR-ABL-driven CML.
The researchers reported these findings in Nature Medicine.
“We think that, within the next 5 years, our data will change the way people think about cancer development and targeted therapy,” said study author Mohammad Azam, PhD, of Cincinnati Children’s Hospital Medical Center in Ohio.
“This study identifies a potential Achilles’ heel of kinase-driven cancers, and what we propose is intended to be curative, not just treatment.”
The potential Achilles’ heel is a common point of passage in cells—a signaling node—that appears to be required to generate cancer cells. The node is formed by the signaling proteins c-Fos and Dusp1, according to the researchers.
The team identified c-Fos and Dusp1 by conducting global gene-expression analysis of mouse leukemia cells and human CML cells. Analysis of the human cells revealed extremely high levels of c-FOS and DUSP1 in BCR-ABL-positive, TKI-resistant cells.
Dr Azam and his colleagues found that signaling from tyrosine kinase and growth factor proteins that support cell expansion (such as IL-3 and IL-6) converges to elevate c-Fos and Dusp1 levels in leukemia cells.
Working together, these molecules maintain the survival of leukemia stem cells (LSCs), which translates to minimal residual disease (MRD) after treatment.
Dr Azam said Dusp1 and c-Fos support the survival of LSCs by increasing the toxic threshold needed to kill them. This means imatinib and other TKIs cannot eliminate the residual LSCs.
After describing the roles of c-Fos and Dusp1, Dr Azam and his colleagues put their ideas to the test in mouse models of CML.
The team tested several treatments in these mice, including:
- monotherapy with imatinib
- inhibitors of c-Fos and Dusp1
- treatment with imatinib and inhibitors of c-Fos and Dusp1.
As suspected, treatment with imatinib alone initially stopped CML progression, but mice ultimately relapsed.
Treatment with c-Fos and Dusp1 inhibitors significantly slowed CML progression and prolonged survival in a majority of mice, but this treatment wasn’t curative.
However, a month of treatment with c-Fos and Dusp1 inhibitors as well as imatinib cured about 90% of mice with CML, and there were no signs of MRD.
The researchers also found that simply deleting c-Fos and Dusp1 was sufficient to block the development of B-cell ALL in mice.
The team said they are following up this study by testing c-Fos and Dusp1 as treatment targets for different kinase-fueled cancers.
Allo-HSCT cures adult with congenital dyserythropoietic anemia
Physicians have reported what they believe is the first case of an allogeneic hematopoietic stem cell transplant (allo-HSCT) curing an adult with congenital dyserythropoietic anemia (CDA).
The patient, David Levy, was previously transfusion-dependent and suffered from iron overload, severe pain, and other adverse effects of his illness.
Levy was denied a transplant for years, but, in 2014, he received a non-myeloablative allo-HSCT from a matched, unrelated donor.
Now, Levy no longer requires transfusions, iron chelation, or immunosuppression, and says he is able to live a normal life.
Damiano Rondelli, MD, of the University of Illinois at Chicago, and his colleagues described Levy’s case in a letter to Bone Marrow Transplantation.
Levy was diagnosed with CDA at 4 months of age and was treated with regular blood transfusions for most of his life. He was 24 when the pain from his illness became so severe that he had to withdraw from graduate school.
“I spent the following years doing nothing—no work, no school, no social contact—because all I could focus on was managing my pain and getting my health back on track,” Levy said.
By age 32, Levy required transfusions every 2 to 3 weeks, had undergone a splenectomy, had an enlarged liver, and was suffering from fatigue, heart palpitations, and iron overload.
“It was bad,” Levy said. “I had been through enough pain. I was angry and depressed, and I wanted a cure. That’s why I started emailing Dr Rondelli.”
Dr Rondelli said that because of Levy’s range of illnesses and inability to tolerate chemotherapy and radiation, several institutions had denied him the possibility of a transplant.
However, Dr Rondelli and his colleagues had reported success with chemotherapy-free allo-HSCT in patients with sickle cell disease. So Dr Rondelli performed Levy’s transplant in 2014.
Levy received a peripheral blood stem cell transplant from an unrelated donor who was a 10/10 HLA match but ABO incompatible. He received conditioning with rabbit anti-thymocyte globulin, fludarabine, cyclophosphamide, and total body irradiation.
Levy also received graft-vs-host disease (GVHD) prophylaxis consisting of high-dose cyclophosphamide, mycophenolate mofetil, and sirolimus. And he received standard antibacterial, antifungal, antiviral, and anti-Pneumocystis jiroveci prophylaxis.
Levy experienced platelet engraftment on day 20 and neutrophil engraftment on day 21. Whole-blood donor-cell chimerism was 98.7% on day 30 and 100% on day 60 and beyond.
Levy did develop transient hemolytic anemia due to the ABO incompatibility. He received a total of 10 units of packed red blood cells through day 78.
Levy was tapered off all immunosuppression at 12 months and has shown no signs of acute or chronic GVHD.
At 24 months after HSCT, Levy’s hemoglobin was 13.7 g/dL, and his ferritin was 376 ng/mL. He has had no iron chelation since the transplant.
“The transplant was hard, and I had some complications, but I am back to normal now,” said Levy, who is now 35.
“I still have some pain and some lingering issues from the years my condition was not properly managed, but I can be independent now. That is the most important thing to me.”
Levy is finishing his doctorate in psychology and running group therapy sessions at a behavioral health hospital.
Dr Rondelli said the potential of this treatment approach is promising.
“The use of this transplant protocol may represent a safe therapeutic strategy to treat adult patients with many types of congenital anemias—perhaps the only possible cure,” he said.
“For many adult patients with a blood disorder, treatment options have been limited because they are often not sick enough to qualify for a risky procedure, or they are too sick to tolerate the toxic drugs used alongside a standard transplant. This procedure gives some adults the option of a stem cell transplant, which was not previously available.”
New insight into high-hyperdiploid ALL
New research appears to explain how 10q21.2 influences the risk of high-hyperdiploid acute lymphoblastic leukemia (HD-ALL).
Previous research indicated that variation in the gene ARID5B at 10q21.2 is associated with HD-ALL.
Now, researchers have reported that the 10q21.2 risk locus for HD-ALL is mediated through the single nucleotide polymorphism (SNP) rs7090445, which disrupts RUNX3 transcription factor binding.
Specifically, the rs7090445-C allele confers an increased risk of HD-ALL through reduced RUNX3-mediated expression of ARID5B.
The researchers described these findings in Nature Communications.
“This study expands our understanding of how genetic risk factors can influence the development of acute lymphoblastic leukemia . . .,” said study author Richard Houlston, MD, PhD, of The Institute of Cancer Research in London, UK.
Dr Houlston and his colleagues focused this research on 10q21.2 because it had previously been implicated in HD-ALL, but it wasn’t clear how the region affects the risk of HD-ALL.
The team said they found that a SNP in the region, rs7090445, is “highly associated” with HD-ALL.
Further investigation revealed that variation at rs7090445 disrupts RUNX3 binding and reduces the expression of ARID5B, as RUNX3 regulates ARID5B expression.
The researchers also discovered that the rs7090445-C risk allele, which is associated with reduced ARID5B expression, is amplified in HD-ALL. The risk allele is “preferentially retained” on additional copies of chromosome 10 in HD-ALL blasts.
“We implicate reduced expression of a gene called ARID5B in the production and release of the immature ‘blast’ cells that characterize [HD-ALL],” Dr Houlston said. “Our study gives a new insight into the causes of the disease and may open up new strategies for prevention.”
Reinforcing mesh at ostomy site prevents parastomal hernia
For patients undergoing elective permanent colostomy, prophylactic augmentation of the abdominal wall using mesh at the ostomy site prevents the development of parastomal hernia, according to a report published in the April issue of Annals of Surgery.
The incidence of parastomal hernia is expected to rise because of the increasing number of cancer patients surviving with a colostomy and the rising number of obese patients, whose elevated intra-abdominal pressure and larger abdominal radius put greater tension on the abdominal wall. Researchers in the Netherlands, led by Dr. Henk-Thijs Brandsma, performed a prospective randomized study, the PREVENT trial, to assess whether augmenting the abdominal wall at the ostomy site with a lightweight mesh would be safe, feasible, and effective at preventing parastomal hernia. They reported their findings after 1 year of follow-up; the study will continue until longer-term results are available at 5 years.
In the intervention group, a retromuscular space was created to accommodate the mesh by dissecting the muscle from the posterior fascia or peritoneum to the lateral border via a median laparotomy. An incision was made in the center of the mesh to allow passage of the colon, and the mesh was placed on the posterior rectus sheath and anchored laterally with two absorbable sutures. “On the medial side, the mesh was incorporated in the running suture closing the fascia, thus preventing contact between the mesh and the viscera,” the investigators said (Ann Surg. 2017;265:663-9).
The primary end point, parastomal hernia at 1 year, occurred in 3 patients (4.5%) in the intervention group and 16 (24.2%) in the control group, a significant difference. There were no mesh-related complications such as infection, strictures, or adhesions. “The majority of the parastomal hernias that required surgical repair were in the control group, which supports the concept that if a hernia develops in a patient with mesh, it is smaller and less likely to cause complaints,” Dr. Brandsma and his associates said.
Significantly fewer patients in the mesh group (9%) than in the control group (21%) reported stoma-related complaints such as pain, leakage, and skin problems. Scores on measures of quality of life and pain severity were no different between the two study groups.
“Prophylactic augmentation of the abdominal wall with a retromuscular polypropylene mesh at the ostomy site is a safe and feasible procedure with no adverse events. It significantly reduces the incidence of parastomal hernia,” the investigators concluded.
This study was supported by Canisius Wilhelmina Hospital’s surgery research fund, the Netherlands Organization for Health Research and Development, and Covidien/Medtronic. Dr. Brandsma and his associates reported having no relevant financial disclosures.
FROM THE ANNALS OF SURGERY
Key clinical point: For patients undergoing permanent colostomy, prophylactic augmentation of the abdominal wall using mesh at the ostomy site prevents the development of parastomal hernia.
Major finding: The primary end point, parastomal hernia at 1 year, occurred in 3 patients (4.5%) in the intervention group and 16 (24.2%) in the control group.
Data source: A prospective, multicenter, randomized controlled trial comparing prophylactic mesh against standard care in 133 adults undergoing elective end-colostomy during a 3-year period.
Disclosures: This study was supported by Canisius Wilhelmina Hospital’s surgery research fund, the Netherlands Organization for Health Research and Development, and Covidien/Medtronic. Dr. Brandsma and his associates reported having no relevant financial disclosures.
Anti-TNF agents show clinical benefit in refractory sarcoidosis
FROM SEMINARS IN ARTHRITIS & RHEUMATISM
Around two-thirds of patients with severe or refractory sarcoidosis show a significant clinical response to tumor necrosis factor (TNF) antagonists, according to findings from a retrospective, multicenter cohort study.
Biologic agents targeting TNF, such as etanercept, infliximab, and adalimumab, have been introduced as a third-line option for patients with disease that is refractory to other treatments. However, Yvan Jamilloux, MD, of the Hospices Civils de Lyon (France), and his coauthors reported that there are still insufficient data on the efficacy and safety of these drugs in the context of sarcoidosis.
Dr. Jamilloux and his colleagues analyzed data from 132 sarcoidosis patients who received TNF antagonists, 122 (92%) of whom had severe sarcoidosis (Semin Arthritis Rheum. 2017 Mar 8. doi: 10.1016/j.semarthrit.2017.03.005).
Overall, 64% of patients showed clinical improvements in response to TNF antagonists; 18% had a complete response, and 46% had a partial response. However, 33 (25%) patients showed no change, and 14 (11%) had continued disease progression despite treatment with TNF antagonists. In another 16 patients who received a second TNF antagonist, 10 (63%) had a complete or partial clinical response. The investigators could find no differences in response between anti-TNF agents or between monotherapy and a combination with an immunosuppressant.
Pulmonary involvement was associated with a significantly lower clinical response, but none of the other factors examined in a multivariate analysis (sex, age, ethnicity, organ involvement, disease duration, steroid dosage, or prior immunosuppressant use) distinguished responders and nonresponders.
The authors noted that these response rates were lower than those seen in the literature and suggested this may be attributable to the multicenter design, more patients with longer-lasting and more refractory disease, and longer times under biologic therapy (median 12 months).
The researchers reported significant improvements in central nervous system, peripheral nervous system, heart, skin, and upper respiratory tract involvements based on declines in Extrapulmonary Physician Organ Severity Tool (ePOST) scores. There were also improvements in the eye, muscle, and lung, but these were not statistically significant.
TNF-antagonist therapy was associated with a high rate of adverse events. Around half of all patients (52%) experienced adverse events, such as pneumonia, urinary tract infections, bacterial sepsis, and herpes zoster. In 31 patients (23%), these led to treatment cessation.
Nine patients also had severe allergic reactions, four had paradoxical granulomatous reactions, three developed neutralizing antibodies against anti-TNF agents, two patients had demyelinating lesions, and one had a serum sickness-like reaction. All of these events led to discontinuation.
Overall, 128 (97%) of the patients in the study had received corticosteroids as first-line therapy, and 125 (95%) had received at least one second-line immunosuppressive drug over a median duration of 16 months. Most were treated with infliximab (91%) as the first-line TNF antagonist, followed by adalimumab (6%), etanercept (2%), and certolizumab pegol (1%).
Treatment with TNF antagonists was associated with significant reductions in corticosteroid use; the mean daily prednisone dose decreased from 23 mg/day to 11 mg/day over the median 20.5-month follow-up. This was seen even in the 33 patients who showed no change in their disease course after TNF-antagonist therapy.
No conflicts of interest were declared.
This uncontrolled, unblinded, retrospective observational study reports the outcomes of anti-TNF therapy in a heterogeneous group of patients with refractory sarcoidosis; only 12% of the severe sarcoidosis population studied had lung involvement as the indication for treatment. Further, it is notable that the patients with primarily pulmonary involvement had a poorer response to anti-TNF therapy. Over half of the patients had an adverse event related to the treatment, and nearly a quarter had to discontinue therapy. Given the limitations of this type of study, the low numbers of pulmonary sarcoidosis patients included, the lack of an efficacy signal in pulmonary sarcoidosis, and the high rate of serious adverse events, the role of anti-TNF agents in pulmonary sarcoidosis remains unclear and limited.
Key clinical point: TNF antagonists produced complete or partial clinical responses in about two-thirds of patients with severe or refractory sarcoidosis, but adverse events were common.
Major finding: A total of 18% had a complete response, and 46% had a partial response, to TNF antagonists.
Data source: A retrospective, multicenter study in 132 sarcoidosis patients who received TNF antagonists.
Disclosures: No conflicts of interest were declared.
HIV vaccine could prevent 30 million cases by 2035
Global cases of HIV from 2015 to 2035 would be reduced by over 50% if the Joint United Nations Program on HIV/AIDS 95/95/95 target is met and a moderately effective HIV vaccine is introduced by 2020, according to new research published in Proceedings of the National Academy of Sciences.
A custom model based on current rates of diagnosis and treatment in 127 countries predicts that a total of 49 million new cases of HIV would occur globally from 2015 to 2035, investigators said. Achieving the UNAIDS goal of diagnosing 95% of HIV infections, treating 95% of those diagnosed with antiretroviral therapy, and achieving viral suppression in 95% of those treated by 2030 would avert 25 million cases by 2035. Achieving the more modest 90/90/90 target would avert 22 million cases within the same time period.
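As a rough consistency check, using only the figures reported here (the modeled scenarios overlap, so averted cases are not simply additive), the implied reductions in the 49 million projected new cases are:

\[
\frac{30\ \text{million (headline figure, vaccine plus treatment scale-up)}}{49\ \text{million}} \approx 0.61,
\qquad
\frac{25\ \text{million (95/95/95 target alone)}}{49\ \text{million}} \approx 0.51,
\]

both consistent with a reduction of more than 50% in new HIV cases from 2015 to 2035.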
“Recent results from the HVTN 100 vaccine trial have bolstered optimism for the development and deployment of an HIV vaccine in the near term,” the investigators said. “HIV vaccination would enable a strategic shift from reactive to proactive control, as suggested by our finding that an HIV vaccine with even moderate efficacy rolled out in 2020 could avert 17 million new infections by 2035 relative to expectations under status quo interventions.”
Find the full study in PNAS (doi: 10.1073/pnas.1620788114).