Local treatment boosts survival for some with oligometastatic lung cancer
Adding local treatment to systemic therapy may extend survival among certain patients with oligometastatic non–small cell lung cancer (NSCLC), according to a retrospective look at more than 34,000 patients.
Surgical resection provided the greatest survival benefit, followed by external beam radiotherapy or thermal ablation (EBRT/TA), reported lead author Johannes Uhlig, MD, of University Medical Center Göttingen (Germany) and colleagues.
NSCLC patients with five or fewer metastatic sites (oligometastatic disease) are thought to achieve better outcomes than patients with more widely disseminated disease, the investigators noted in JAMA Network Open, but the benefit of local therapy for this population is unclear.
“A recent randomized, prospective study of 74 patients with oligometastatic NSCLC identified superior progression-free survival with local control after hypofractionated radiotherapy or surgical resection and radiotherapy compared with systemic therapy alone, suggesting an important application of local treatment options for patients with metastatic disease,” the investigators wrote.
To build on these findings, the investigators retrospectively evaluated 34,887 patients with stage IV NSCLC who had up to one distant metastatic lesion in the liver, lung, brain, or bone, as documented in the National Cancer Database. Patients were divided into three treatment groups: systemic therapy alone, surgical resection plus systemic therapy, or EBRT/TA plus systemic therapy. Multivariable Cox proportional hazards models were used to compare overall survival among the three groups.
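The registry comparison described above hinges on multivariable Cox proportional hazards modeling. As a rough, self-contained sketch of how such a three-group survival comparison might be set up, the snippet below fits a Cox model to synthetic data with the Python lifelines package; the group assignments, the age covariate, and the simulated survival times are all assumptions for illustration, not the National Cancer Database variables or adjustments the authors used.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 300

# Hypothetical cohort: treatment group, age, and exponential survival times
# whose event rate is lowered for the local-therapy groups (purely illustrative).
group = rng.choice(["systemic", "surgery", "ebrt_ta"], size=n)
age = rng.normal(65, 8, size=n)
base_rate = 1 / 18.0                                    # ~18-month mean survival
rate = base_rate * np.where(group == "surgery", 0.6,
                   np.where(group == "ebrt_ta", 0.95, 1.0))
months = rng.exponential(1 / rate)
died = (months < 40).astype(int)                        # censor at 40 months
months = np.minimum(months, 40)

df = pd.DataFrame({
    "months": months,
    "died": died,
    "age": age,
    "surgery": (group == "surgery").astype(int),        # vs. systemic alone
    "ebrt_ta": (group == "ebrt_ta").astype(int),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")
cph.print_summary()   # the exp(coef) column gives adjusted hazard ratios
```

In this framing, an exp(coef) below 1 for the surgery or EBRT/TA indicator corresponds to the kind of mortality-risk reduction reported in the study.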
With a median follow-up of 39.4 months, patients who underwent surgery plus systemic therapy fared best. Adding surgery reduced mortality risk by 38% and 41%, compared with EBRT/TA plus systemic therapy and systemic therapy alone, respectively (P less than .001 for both). Compared with systemic therapy alone, adding EBRT/TA reduced mortality risk by 5% (P = .002).
The impact of EBRT/TA varied among subgroups. For those with squamous cell carcinoma who had limited nodal disease, adding EBRT/TA resulted in a clear benefit, reducing mortality risk by 32% (P less than .001). Compared with systemic therapy alone, this benefit translated to higher survival rates for up to 3 years. Conversely, adding EBRT/TA increased risk of death by 39% among patients with extended local and distant adenocarcinoma (P less than .001). In this subgroup, survival rates over the next 3 years were higher among patients treated with systemic therapy alone.
“The present study supports a combined approach of local therapy in addition to systemic treatment for select patients with oligometastatic NSCLC,” the investigators concluded.
The study was funded by the U.S. Department of Defense. The investigators disclosed additional relationships with Bayer, AstraZeneca, Bristol-Myers Squibb, and others.
SOURCE: Uhlig J et al. JAMA Netw Open. 2019 Aug 21. doi: 10.1001/jamanetworkopen.2019.9702.
FROM JAMA NETWORK OPEN
Key clinical point: Adding local treatment to systemic therapy may extend survival among certain patients with oligometastatic non–small cell lung cancer (NSCLC).
Major finding: Patients treated with a combination of surgical resection and systemic therapy had better overall survival than patients treated with systemic therapy alone (hazard ratio, 0.59).
Study details: A retrospective analysis of 34,887 patients with stage IV NSCLC.
Disclosures: The study was funded by the U.S. Department of Defense. The investigators disclosed additional relationships with Bayer, AstraZeneca, Bristol-Myers Squibb, and others.
Source: Uhlig J et al. JAMA Netw Open. 2019 Aug 21. doi: 10.1001/jamanetworkopen.2019.9702.
Oncologists agree with AI treatment decisions about half the time
When it comes to treatment recommendations for high-risk breast cancer, oncologists agree with a leading artificial intelligence platform about half the time, according to investigators.
In the first study of its kind, involving 10 Chinese oncologists, recommendation concordance with the Watson for Oncology treatment advisory tool (WfO) was generally lower for hormone receptor–positive and metastatic cancers than hormone receptor–negative and nonmetastatic cases, reported Fengrui Xu, MD, of the Academy of Military Medical Sciences in Beijing, and colleagues. Refinement could enable broad use of Watson, not to dictate treatment decisions, but instead to propose alternate treatment approaches and offer point-of-care access to relevant evidence.
“[WfO] is an example of a quantitative oncology clinical decision support that leverages the clinical expertise of oncologists at Memorial Sloan Kettering Cancer Center [MSKCC],” the investigators wrote in JCO Clinical Cancer Informatics. The platform uses machine-learning software to interpret patient scenarios in light of MSKCC training cases, MSKCC treatment guidelines, and more than 300 medical textbooks and journals.
To compare WfO with real-world decision makers, the investigators recruited three chief physicians, four attending physicians, and three fellows to provide treatment recommendations for 1,977 patients with complex breast cancer who were treated at 10 hospitals in China. Participating physicians shared the workload; each evaluated an average of 198 different cases.
On average, oncologists and WfO made the same treatment recommendations 56% of the time. Among the physician groups, fellows were most likely to agree with WfO, with a 68% concordance rate, compared with 54% for chief physicians and 49% for attending physicians. Across all physicians, concordance was lowest for hormone receptor–positive/HER2-positive disease (48%) and highest for triple-negative cases (71%). Adjuvant and metastatic therapies were also evaluated, with high concordance for adjuvant endocrine therapy (78%) and targeted therapy (100%), compared with moderate concordance for first-line (52%) and second-line metastatic therapy (50%). The investigators described the concordance results as generally “modest”; however, they noted that such levels are promising.
“This degree of concordance is encouraging because therapeutic decisions in these cases are often difficult as a result of the current limits of medical knowledge for treating complex breast cancers and the presence of local contextual factors that affect physician treatment choices,” the investigators wrote. “It is important to note that nonconcordance does not imply that one treatment is correct for a given patient and another is not, nor does it necessarily diminish the potential value of a decision support system that provides access to supporting evidence and insight into its reasoning process.”
The study was funded by Zefei Jiang. The investigators reported affiliations with IBM Watson Health, Pharmaceutical Manufacturer Institution, Merck, and others.
SOURCE: Xu F et al. JCO Clin Cancer Inform. 2019 Aug 16. doi: 10.1200/CCI.18.00159.
FROM JCO CLINICAL CANCER INFORMATICS
Adding chemo beats standard gefitinib for EGFR-mutated lung cancer
For patients with EGFR-mutated, advanced non–small cell lung cancer (NSCLC), adding pemetrexed and carboplatin to standard gefitinib therapy markedly extended progression-free survival in a recent phase 3 trial, but at the cost of twice as many serious toxicities.
Two previous phase 2 trials (J Clin Oncol. 2016 Sep 20;34[27]:3258-66 and Ann Oncol. 2015 Feb 10;26[5]:888-94) suggested that adding chemotherapy could improve outcomes over gefitinib alone, but this is the first study to clearly demonstrate better overall survival, reported lead author Vanita Noronha, MD, of Tata Memorial Hospital in Mumbai, India, and colleagues. They noted that this is the second regimen to demonstrate better overall survival than standard gefitinib for EGFR-mutated lung cancer, with dacomitinib being the first, as shown by the ARCHER 1050 trial.
The present study involved 350 patients with advanced, EGFR-mutated NSCLC who had an Eastern Cooperative Oncology Group (ECOG) performance status of 0-2 and were candidates for first-line palliative therapy. Approximately one-fifth of patients (21%) had a performance status of 2, and almost as many (18%) had brain metastases. After stratification for performance status and mutation type, patients were randomized in a 1:1 ratio to receive either gefitinib monotherapy (250 mg once daily) or gefitinib plus a chemotherapy combination of pemetrexed (500 mg/m2) and carboplatin (dosed to an area under the curve of 5 using the Calvert formula) on day 1 of four 21-day cycles. Subsequently, nonprogressing patients in the chemotherapy group received maintenance therapy with pemetrexed at the same dose and frequency. Treatment was continued until progression, toxicity, or withdrawal of consent. The primary endpoint was progression-free survival (PFS). Secondary outcomes included overall survival (OS), response rate, quality of life, and toxicity.
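The Calvert formula mentioned in the dosing description is the standard way to turn a target carboplatin exposure (AUC) into a milligram dose: dose (mg) = target AUC x (GFR + 25). A minimal worked example is below; the renal function value is hypothetical and chosen only to show the arithmetic.

```python
def carboplatin_dose_mg(target_auc: float, gfr_ml_per_min: float) -> float:
    """Calvert formula: dose (mg) = target AUC (mg/mL x min) x (GFR + 25)."""
    return target_auc * (gfr_ml_per_min + 25)

# Hypothetical patient with a creatinine clearance of 90 mL/min,
# dosed to an AUC of 5 as in the trial's chemotherapy arm.
print(carboplatin_dose_mg(target_auc=5, gfr_ml_per_min=90))  # 575.0 mg
```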
After a median follow-up of 17 months, the investigators found that adding chemotherapy to gefitinib resulted in a clear benefit, with estimated median PFS increasing from 8 months to 16 months (P less than .001). Estimated median overall survival also increased, with a figure not reached in the chemotherapy/gefitinib group, compared with 17 months among those who received gefitinib alone. Response rates echoed these findings, with more patients in the chemotherapy/gefitinib group achieving complete (2.9% vs. 0.6%) and partial remission (72.4% vs. 61.9%).
“[T]he PFS attained in our study is noteworthy, considering that 21% of our study patients had a [performance status] of 2, whereas the FLAURA study, [which demonstrated a PFS of 18.9 months with osimertinib], only included patients with a [performance status] of 1 or lower,” the investigators wrote. Their report is in the Journal of Clinical Oncology.
Still, introducing chemotherapy was not without negative consequences. Compared with the gefitinib monotherapy group, patients who also received chemotherapy more often had grade 3 or higher adverse events (75% vs. 49.4%), and twice as many had clinically significant, serious toxicities (50.6% vs. 25.3%). The additional toxicities were predominantly due to myelosuppression and nephrotoxicity.
Despite these drawbacks, the investigators concluded that combination therapy was superior to gefitinib alone. “The combination of gefitinib, pemetrexed, and carboplatin represents a new standard first-line therapy for EGFR-mutant NSCLC,” the investigators concluded.
The study was funded by Tata Memorial Center Research Administration Council, Fresenius Kabi India, Lung Cancer Consortium India, and others. The investigators reported relationships with Roche, Biocon, Amgen, and others.
SOURCE: Noronha V et al. J Clin Oncol. 2019 Aug 14. doi: 10.1200/JCO.19.01154.
FROM THE JOURNAL OF CLINICAL ONCOLOGY
New biomarker model outmatches conventional risk factors for predicting mortality
A new model using 14 biomarkers may be more accurate at predicting longer-term mortality than a model comprising conventional risk factors, based on the largest metabolomics study to date.
The prognostic model was more accurate at predicting 5- and 10-year mortality across all ages, reported Joris Deelen, PhD, of Leiden (the Netherlands) University Medical Center and colleagues.
“These results suggest that metabolic biomarker profiling could potentially be used to guide patient care, if further validated in relevant clinical settings,” the investigators wrote in Nature Communications.
“There is no consensus on the ultimate set of predictors of longer-term [5-10 years] mortality risk, since the predictive power of the currently used risk factors is limited, especially at higher ages,” the investigators wrote. “However, it is especially this age group and follow-up time window for which a robust tool would aid clinicians in assessing whether treatment is still sensible.”
The current study was a survival meta-analysis of 44,168 individuals from 12 cohorts aged between 18 and 109 years at baseline. First, the investigators looked for associations between 226 metabolic biomarkers and all-cause mortality in the 5,512 people who died during follow-up. This revealed associations between mortality and 136 biomarkers, which increased to 159 biomarkers after adjusting for recently reported all-cause mortality associations with albumin, very low-density lipoprotein (VLDL) particle size, citrate, and glycoprotein acetyls. Because of strong correlations between many of the biomarkers evaluated, the investigators pared the field down to 63 biomarkers, then used a forward-backward procedure to ultimately identify 14 biomarkers independently associated with mortality. Of the four recently described biomarkers, citrate was excluded from the final model because of its minimal contribution to mortality estimates.
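The forward-backward procedure referred to here is a generic stepwise-selection loop: repeatedly add the candidate that most improves a fit criterion, then drop any already-selected candidate whose removal improves it further. The sketch below shows that loop in schematic Python; the `score` callable, the tolerance, and the toy example at the end are placeholders, not the specific mortality-association criterion or biomarker panel used in the study.

```python
def stepwise_select(candidates, score, tol=1e-4):
    """Greedy forward-backward covariate selection.

    score(subset) must accept a list of candidate names (including the empty
    list) and return a goodness-of-fit value where higher is better.
    """
    selected = []
    best = score(selected)
    improved = True
    while improved:
        improved = False
        # Forward step: add the single candidate that most improves the score.
        remaining = [c for c in candidates if c not in selected]
        if remaining:
            c_add = max(remaining, key=lambda c: score(selected + [c]))
            s = score(selected + [c_add])
            if s > best + tol:
                selected.append(c_add)
                best, improved = s, True
        # Backward step: drop any selected covariate whose removal helps.
        for c in list(selected):
            s = score([x for x in selected if x != c])
            if s > best + tol:
                selected.remove(c)
                best, improved = s, True
    return selected


# Toy usage: reward two "useful" markers, penalize model size.
def toy_score(subset):
    return sum(x in ("albumin", "glucose") for x in subset) - 0.1 * len(subset)

print(stepwise_select(["albumin", "glucose", "citrate", "lactate"], toy_score))
# -> ['albumin', 'glucose']
```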
The 14 biomarkers were total lipids in chylomicrons and extremely large VLDL cholesterol, total lipids in small HDL cholesterol, mean diameter for VLDL cholesterol particles, ratio of polyunsaturated fatty acids to total fatty acids, glucose, lactate, histidine, isoleucine, leucine, valine, phenylalanine, acetoacetate, albumin, and glycoprotein acetyls.
“The 14 identified biomarkers are involved in various processes, such as lipoprotein and fatty acid metabolism, glycolysis, fluid balance, and inflammation. Although the majority of these biomarkers have been associated with mortality before, this is the first study that shows their independent effect when combined into one model,” the researchers wrote.
Implementation of the new biomarker model led to a score that typically ranged from –2 to 3. A 1-point increase was associated with a 173% increased risk of death (hazard ratio, 2.73; P less than 1 x 10^-132). Analysis of cause-specific mortality revealed that most biomarkers were predictive of multiple causes of death. Some biomarkers were more focused; glucose, for example, was more predictive of cardiovascular-related death than of death because of cancer or nonlocalized infections. Compared with a model incorporating conventional risk factors, the biomarker model more accurately predicted 5- and 10-year mortality, with respective C-statistics of 0.837 versus 0.772 and 0.830 versus 0.790. This superiority was even more pronounced when only individuals aged older than 60 years were included.
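For readers converting between the two ways the main effect is quoted, the 173% figure follows directly from the hazard ratio of 2.73; a quick arithmetic check (variable names are illustrative):

```python
# A hazard ratio of HR corresponds to a (HR - 1) x 100% increase in risk.
hazard_ratio = 2.73
percent_increase = (hazard_ratio - 1) * 100
print(f"HR {hazard_ratio} ~ {percent_increase:.0f}% higher risk per 1-point score increase")
# -> HR 2.73 ~ 173% higher risk per 1-point score increase
```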
The study was funded by Biobanking and BioMolecular resources Research Initiative–Netherlands. The investigators reported additional relationships with Nightingale Health, Novo Nordisk, and Bayer.
SOURCE: Deelen J et al. Nat Commun. 2019 Aug 20. doi: 10.1038/s41467-019-11311-9.
FROM NATURE COMMUNICATIONS
Key clinical point: A new model using 14 biomarkers may be more accurate at predicting 5- and 10-year mortality than a model comprising conventional risk factors.
Major finding: The biomarker model better predicted 5-year mortality than the conventional model (C-statistic, 0.837 vs. 0.772).
Study details: A retrospective metabolomics study involving 44,168 individuals.
Disclosures: The study was funded by Biobanking and BioMolecular Resources Research Initiative–Netherlands. The investigators reported additional relationships with Nightingale Health, Novo Nordisk, and Bayer.
Source: Deelen J et al. Nat Commun. 2019 Aug 20. doi: 10.1038/s41467-019-11311-9.
Zanubrutinib may be poised to challenge ibrutinib for CLL
The Bruton tyrosine kinase (BTK) inhibitor zanubrutinib appears safe and effective for patients with B-cell malignancies, according to results from a phase 1 trial.
Among patients with chronic lymphocytic leukemia (CLL) or small lymphocytic lymphoma (SLL), the overall response rate was 96.2%, reported Constantine Si Lun Tam, MD, of Peter MacCallum Cancer Centre in Melbourne and colleagues.
“Zanubrutinib (BGB-3111) is a highly specific next-generation BTK inhibitor with favorable oral bioavailability, as shown in preclinical studies,” the investigators wrote in Blood. “Compared with ibrutinib, zanubrutinib has shown greater selectivity for BTK and fewer off-target effects in multiple in vitro enzymatic and cell-based assays.”
The current, open-label trial involved 144 patients with B-cell malignancies. To determine optimal dosing, the investigators recruited 17 patients with relapsed/refractory B-cell malignancies who had received at least one prior therapy. The dose expansion part of the study assessed responses in multiple cohorts, including patients with CLL/SLL, mantle cell lymphoma, and Waldenström macroglobulinemia. The primary endpoints were safety and tolerability, including maximum tolerated dose. Efficacy findings were also reported.
During dose escalation, no dose-limiting toxicities were observed, so the highest dose – 320 mg once daily or 160 mg twice daily – was selected for further testing.
The investigators highlighted efficacy and safety findings from 94 patients with CLL/SLL who were involved in dose expansion. Although nearly one-quarter (23.4%) were treatment-naive, the median number of prior therapies was two, and some patients had high-risk features, such as adverse cytogenetics, including 19.1% with a TP53 mutation and 23.3% with a 17p deletion. After a median follow-up of 13.7 months, 94.7% of these patients were still undergoing treatment.
Out of the initial 94 patients with CLL/SLL, 78 were evaluable for efficacy. The overall response rate was 96.2%, including two (2.6%) complete responses, 63 (80.8%) partial responses, and 10 (12.8%) partial responses with lymphocytosis. The median progression-free survival had not been reached, and the 12-month estimated progression-free survival was 100%.
In regard to safety, the most common adverse events were contusion (35.1%), upper respiratory tract infection (33.0%), cough (25.5%), diarrhea (21.3%), fatigue (19.1%), back pain (14.9%), hematuria (14.9%), headache (13.8%), nausea (13.8%), rash (12.8%), arthralgia (11.7%), muscle spasms (11.7%), and urinary tract infection (10.6%).
A number of other adverse events were reported, although these occurred in less than 10% of patients.
More than one-third of patients (36.2%) experienced grade 3 or higher adverse events. Neutropenia was most common (6.4%), followed by pneumonia, hypertension, and anemia (2.1% each) and, less commonly, back pain, nausea, urinary tract infection, purpura, cellulitis, and squamous cell carcinoma of the skin (1.1% each).
“In this first-in-human study, zanubrutinib demonstrated encouraging activity in patients with relapsed/refractory and treatment-naive CLL/SLL, with good tolerability,” the investigators concluded. “Two ongoing randomized studies of zanubrutinib versus ibrutinib (NCT03053440 and NCT03734016) aim to determine whether consistent, continuous BTK blockade with a selective inhibitor results in fewer off-target effects and translates into improvements in disease control.”
The study was funded by BeiGene USA, which is developing the drug. The investigators reported relationships with the study sponsor, as well as Janssen, Pharmacyclics, AbbVie, and others.
SOURCE: Tam CSL et al. Blood. 2019 Jul 24. doi: 10.1182/blood.2019001160.
FROM BLOOD
Drug-inducible gene therapy unlocks IL-12 for glioblastoma
For patients with recurrent, high-grade glioblastoma, localized, drug-inducible gene therapy could unlock the anticancer potential of interleukin-12, based on a phase 1 trial.
In 31 patients who had their tumors excised, intraoperative site injection with an IL-12 vector followed by postoperative administration of veledimex, an oral activator of the transgene, increased IL-12 levels in the brain and appeared to improve overall survival, reported E. Antonio Chiocca, MD, PhD, Harvey W. Cushing Professor of Neurosurgery at Harvard Medical School, Boston, and colleagues. Although some serious adverse events were encountered, the investigators noted that these were less common with lower doses of veledimex and were reversible upon discontinuation. These findings mark a turning point in IL-12 cancer research, which previously encountered prohibitive safety obstacles.
“There was interest in the use of recombinant IL-12 in humans with cancer, and clinical trials of systemic IL-12 were undertaken but had to be stopped because the cytokine, administered as a recombinant soluble protein, was poorly tolerated,” the investigators wrote in Science Translational Medicine.
To overcome this issue, a novel treatment approach was developed. “With the objective of minimizing systemic toxicity, a ligand-inducible expression switch [RheoSwitch Therapeutic System] was developed to locally control production of IL-12 in the tumor microenvironment. In this system, transcription of the IL-12 transgene occurs only in the presence of the activator ligand, veledimex,” they noted.
The primary aim of the study was to evaluate safety and determine the optimal dose of veledimex; four dose levels were tested: 10, 20, 30, and 40 mg. Survival outcomes also were reported.
The protocol-defined maximum tolerated dose was not reached; however, the 20-mg dose was chosen, based on observed tolerability. At this dose level, the most common grade 3 or higher adverse events were lymphopenia (20.3%), thrombocytopenia (13.3%), and hyponatremia (13.3%). Specifically for grade 3 or higher neurologic adverse events, headache was most common, occurring in 13.3% of patients. Grade 2 cytokine release syndrome occurred in about one-fourth of patients (26.7%), whereas grade 3 cytokine release syndrome occurred about half as frequently (13.3%). All adverse events, including cytokine release syndrome, were reversible upon discontinuation of veledimex.
After a mean follow-up of 13.1 months, the median overall survival among patients receiving the 20-mg dose was 12.7 months. The investigators pointed out that this compared favorably with historical controls, who had a weighted median overall survival of 8.1 months. Those who received 30- or 40-mg doses had the poorest survival, which the investigators attributed to intolerability and other subgroup factors.
Data analysis also revealed a negative correlation between dexamethasone use and survival. Among patients in the 20-mg veledimex group who received 20 mg or less of dexamethasone during active veledimex dosing, median overall survival was extended to 17.8 months. The investigators speculated that this was because of reduced immune suppression, although dexamethasone could have induced cytochrome P450 3A4, which may have increased elimination of veledimex.
“In summary, this phase 1 trial reports the use of a transcriptional switch to safely control dosing of [IL-12], highlighting that this can be accomplished across the [blood-brain barrier] to remodel the tumor microenvironment with an influx of activated immune cells,” the investigators wrote.
They noted that this strategy could potentially be applied to other types of cancer, particularly those that are immunologically cold. “These data contribute to our understanding of IL-12 as a ‘master regulator’ of the immune system and highlight that even the transient production of this cytokine may function as a match to turn tumors from cold to hot.”
The study was funded by Ziopharm Oncology and the National Institutes of Health. The investigators reported additional relationships with Advantagene, Stemgen, Sigilon Therapeutics, and others.
SOURCE: Chiocca EA et al. Sci Transl Med. 2019 Aug 14. doi: 10.1126/scitranslmed.aaw5680.
FROM SCIENCE TRANSLATIONAL MEDICINE
Key clinical point: For patients with recurrent, high-grade glioblastoma, localized, drug-inducible gene therapy could unlock the anticancer potential of interleukin-12.
Major finding: After 13.1 months of follow-up, median overall survival was 12.7 months, compared with a weighted median overall survival among historical controls of 8.1 months.
Study details: A phase 1 trial involving 31 patients with recurrent glioblastoma.
Disclosures: The study was funded by Ziopharm Oncology and the National Institutes of Health. The investigators reported additional relationships with Advantagene, Stemgen, Sigilon Therapeutics, and others.
Source: Chiocca EA et al. Sci Transl Med. 2019 Aug 14. doi: 10.1126/scitranslmed.aaw5680.
T cells and IL-2 drive acute celiac symptoms
CD4+ T-cell reactivation and interleukin (IL)–2 release are responsible for acute gastrointestinal symptoms when patients with celiac disease are exposed to gluten, according to investigators.
Although T cells have been well studied in previous celiac disease research, clinical symptoms after acute gluten exposure have never been linked with specific cytokine changes, reported lead author Gautam Goel, PhD, of Massachusetts General Hospital in Boston, and colleagues.
“If treated [celiac disease] patients, i.e., those following a strict [gluten-free diet], are exposed to gluten-containing food, they typically suffer from gastrointestinal reactions occurring 1 to 2 hours after the gluten exposure,” the investigators wrote in Science Advances. “There is currently no explanation for the acute gluten-induced symptoms seen in treated [celiac disease] patients.”
The current study was prompted by two phase 1 trials involving the therapeutic vaccine Nexvax2, which uses peptide fragments of gluten proteins to desensitize celiac patients to gluten, the investigators explained. During those trials, intradermal injections of Nexvax2 above a certain dose threshold led to gastrointestinal symptoms within 2-5 hours, but not injection site reactions, which would have been indicative of a cutaneous response to recall antigen.
“Our observations from these phase 1 studies led us to hypothesize that cytokine release occurs following natural gluten exposure and could be used to implicate which arms of the immune system drive early symptoms,” the investigators wrote.
All 28 patients in the two trials underwent intradermal testing, and 19 also participated in an oral gluten challenge. Following intradermal injection of gluten peptides, patients exhibited gastrointestinal symptoms along with coordinated elevations of at least 15 plasma cytokines, most notably IL-2, MCP-1, IL-8, IL-10, MIP-1beta, IP-10, and eotaxin. The first cytokines to respond to injection were IL-2 and IL-8, rising within 2 hours, before symptoms appeared. At 4 hours, when symptoms were present, IL-2 showed the most dramatic peak, with a 272-fold elevation, followed by IL-8 (11-fold) and IL-10 (1.2-fold).
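For readers unfamiliar with how fold elevations of this kind are derived, the minimal sketch below shows the arithmetic: each fold change is simply the post-challenge concentration divided by the baseline concentration for the same cytokine. The concentrations used here are hypothetical values chosen only for illustration and are not data from the study.

```python
# Illustrative sketch: computing fold elevations of plasma cytokines relative
# to baseline. All concentrations (pg/mL) are hypothetical, not study data.

baseline = {"IL-2": 0.5, "IL-8": 12.0, "IL-10": 5.0}
at_4_hours = {"IL-2": 136.0, "IL-8": 132.0, "IL-10": 6.0}

# Fold change = concentration at 4 hours / baseline concentration
fold_change = {
    cytokine: at_4_hours[cytokine] / baseline[cytokine]
    for cytokine in baseline
}

# Report cytokines from largest to smallest fold elevation
for cytokine, fold in sorted(fold_change.items(), key=lambda kv: -kv[1]):
    print(f"{cytokine}: {fold:.1f}-fold elevation over baseline")
```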
“IL-2 is both the earliest and most sensitive marker for the coordinated cytokine release that was almost universal in HLA-DQ2.5 + [celiac disease] patients administered gluten peptides,” the investigators wrote.
Similar to intradermal testing, oral challenge with gluten caused IL-2, IL-8, and IL-10 to rise within 2 hours and peak within 4-6 hours. Again, IL-2 was the most sensitive marker, with a 15-fold increase at 4 hours. This increase in IL-2 correlated with the IL-8 and IL-10 elevations, although the IL-2 increase was at least six times greater than that of the other two cytokines.
“Together, the serum cytokine profile following gluten ingestion is less prominent but qualitatively similar and over a corresponding time course to that after injecting gluten peptides, which is consistent with activated CD4+ T cells being the driver of cytokine release in both scenarios,” the investigators wrote.
Further testing showed that, after gluten challenge, plasma levels of IL-2, IL-8, and IL-10 negatively correlated with duodenal villous height-to-crypt depth ratios. In addition, high levels of IL-2 correlated with severe nausea and vomiting, adding to the evidence that celiac symptoms were linked with specific cytokine elevations.
“The link between immune activation and symptoms was further strengthened by showing that postdose symptoms and cytokine release were both lessened after three weekly doses and absent after 16 twice-weekly injections of gluten peptides,” the investigators wrote. “These findings are consistent with the difference in severity of symptoms after gluten ingestion compared to gluten peptide injection being related to potency of the antigen challenge and T-cell activation measured by circulating IL-2 concentration at 4 hours.”
Even though IL-2 elevations appeared to drive celiac symptoms, the source of IL-2 was initially unknown. “Activated T cells are the primary source of IL-2, but [dendritic cells] can also secrete IL-2 following ligation of specific pathogen recognition receptors; mast cells also secrete IL-2 following exposure to IL-33 or IL-9,” the investigators explained. Still, CD4+ T cells are known to be key players in celiac disease, and the timing and magnitude of IL-2 release made T cells the most likely candidates. To test this hypothesis, the investigators collected blood from patients 6 days after gluten food challenge and incubated these samples for 24 hours with gluten peptides. Results of this test suggested that gluten-specific CD4+ T cells were the most likely source of IL-2.
The connection between particular cytokines and gastrointestinal symptoms is now supported by evidence; however, the investigators pointed out that a relationship between cytokines and other symptoms of celiac disease remains to be established. “Whether cytokines elevated in blood after injecting gluten peptides or ingesting gluten have any direct extraintestinal effects is unclear,” the investigators wrote. “Fatigue, headache, and ‘brain fog’ are the commonly reported extraintestinal symptoms in [celiac disease] patients. However, symptoms being focused on the upper gastrointestinal tract suggest that cytokines increased in blood have clinical and immunological effects that selectively affect the tissue from which they originate.”
“Future studies should test whether cytokine concentrations are substantially higher in gut mucosal tissue than in blood after gluten challenge or injection of gluten peptides and determine whether alterations in local cytokine levels are matched by immune and inflammatory cell infiltration,” the investigators wrote.
The study was supported by the University of Chicago Celiac Disease Center and the University of Oslo KG Jebsen Coeliac Disease Research Centre. The investigators reported additional relationships with several government and nonprofit organizations. Multiple investigators are employees of ImmusanT, which is developing Nexvax2.
SOURCE: Goel G et al. Sci Adv. 2019 Aug 7. doi: 10.1126/sciadv.aaw7756.
FROM SCIENCE ADVANCES
Vitamin D supplementation may improve ulcerative colitis
Vitamin D supplementation may lead to significant improvements in ulcerative colitis (UC), based on a placebo-controlled trial involving 60 patients with active disease.
Those who achieved vitamin D levels greater than 40 ng/mL were most likely to benefit, reported lead author Rizwan Ahamed Z, MD, of the Postgraduate Institute of Medical Education and Research in Chandigarh, India, and colleagues. They noted that the findings contribute much-needed clinical data to a largely theoretical subject area.
“[T]he discovery of vitamin D receptors on lymphocytes, monocytes, and dendritic cells initiated various studies which have highlighted the role of vitamin D in regulating gut mucosal immunity and gut barrier,” the investigators wrote in Journal of Clinical Gastroenterology. “In experimental interleukin (IL)-10 knockout mice models, vitamin D deficiency was found to result in severe colitis, progressive wasting, and high mortality. However, vitamin D supplementation not only prevented but also ameliorated symptoms of colitis in the mice model.”
Human studies have suggested similar benefits of vitamin D supplementation in inflammatory bowel disease, such as a study by Jørgensen and colleagues that found a lower risk of relapse in Crohn’s disease, and another by Sharifi and colleagues that showed injectable vitamin D could reduce erythrocyte sedimentation rate (ESR) and C-reactive protein (CRP) in patients with UC. Still, the investigators suggested that more clinical data are needed, particularly for outcomes after vitamin D therapy. In addition to providing such data, the present trial was the first of its kind to test oral nano vitamin D3, which may have better bioavailability than conventional supplements.
The investigators initially recruited 110 patients with active UC who had an ulcerative colitis disease activity index (UCDAI) of at least 3. After screening, 50 patients were excluded because they had vitamin D levels greater than 40 ng/mL, were already taking a vitamin D supplement, had severe UC requiring hospitalization, or exhibited severe systemic illness. The remaining 60 patients were randomized in a 1:1 ratio to receive either 60,000 IU nano vitamin D3 once daily for 8 days, or placebo. Disease parameters, which were measured at baseline and then again at 4 weeks, included UCDAI, ESR, CRP, and fecal calprotectin. The primary outcome was response, defined as a UCDAI reduction of at least 3 points. Secondary outcome measures included stool frequency, stool consistency, and remission (UCDAI less than 3); in addition, the investigators evaluated histologic, endoscopic, fecal, and serum inflammatory markers.
The majority of patients in the study were men (60%), with a mean age of 36 years. Most patients had moderate UC (73.3%), while smaller proportions had severe (18%) or mild (8%) disease. All patients were taking a 5-aminosalicylic acid oral compound and some (16.6%) were also taking azathioprine. At baseline, the mean vitamin D level was 14 ng/mL. Most patients (70%) were diagnosed with vitamin D deficiency, based on measurements below 20 ng/mL. The remaining patients were diagnosed with insufficiency (13%; 20-30 ng/mL) or suboptimal levels (17%; 30-40 ng/mL).
From baseline to 4-week follow-up, the median vitamin D level in the supplement group increased from 15.4 to 40.83 ng/mL, compared with a much smaller increase in the placebo group, from 13.45 to 18.85 ng/mL. Compared with the placebo group, significantly more patients given nano vitamin D3 achieved a 3-point UCDAI reduction (53% vs 13%; P = .001), and vitamin D level correlated inversely with UCDAI (Pearson correlation coefficient [rho], –0.713). Similar, albeit weaker, inverse relationships were detected between vitamin D level and CRP (rho = −0.603) and fecal calprotectin (rho = −0.368).
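As a minimal illustration of the correlation analysis described above, the sketch below computes a Pearson correlation coefficient between paired vitamin D levels and UCDAI scores. The paired values are invented for demonstration only and are not the trial data; a negative coefficient indicates the inverse relationship the investigators reported.

```python
# Minimal sketch of a Pearson correlation between serum vitamin D (ng/mL)
# and UCDAI scores. The paired values below are hypothetical, for illustration.
from scipy.stats import pearsonr

vitamin_d = [12.0, 18.5, 25.0, 33.0, 41.0, 45.5]   # hypothetical ng/mL
ucdai     = [9,    8,    6,    5,    3,    2]      # hypothetical UCDAI scores

r, p_value = pearsonr(vitamin_d, ucdai)
# A negative r indicates that higher vitamin D levels track with lower UCDAI
print(f"Pearson r = {r:.3f}, P = {p_value:.3f}")
```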
Benefits observed in the supplement group also extended to stool frequency, stool consistency, and histologic measures. Patients who achieved a vitamin D level greater than 40 ng/mL were 4 times more likely to have a 3-point UCDAI reduction than those who did not reach that threshold (80% vs 20%; P = .038). Independent predictors of response included baseline histologic activity (odds ratio, 1.92) and, to a greater extent, vitamin D supplementation (OR, 9.17). No patients achieved remission, which the investigators attributed to the relatively short study duration.
Minor, self-limiting side effects occurred in 13.3% and 10% of patients given the vitamin D supplement and placebo, respectively.
“[T]he present study showed significant improvement in all inflammatory parameters of the disease including clinical, endoscopic, histopathologic, and serum and fecal markers of inflammation, all of which paralleled each other in showing [the benefit of] oral nano vitamin D supplementation,” the investigators concluded. They advised that larger, longer-term studies are needed before the findings can be generalized to all patients with active UC.
The investigators disclosed no external funding or conflicts of interest.
SOURCE: Ahamed R et al. J Clin Gastroenterol. 2019 Jul 24. doi: 10.1097/MCG.0000000000001233.
FROM THE JOURNAL OF CLINICAL GASTROENTEROLOGY
Large prospective trial offers reassurance for long-term PPI use
Aside from a possible increased risk of enteric infections, long-term use of the proton pump inhibitor (PPI) pantoprazole appears safe in patients with stable atherosclerotic vascular disease, according to a prospective trial involving more than 17,000 participants.
In contrast with published observational studies, the present trial found no associations between long-term PPI use and previously reported risks such as pneumonia, fracture, or cerebrovascular events, according to lead author Paul Moayyedi, MB ChB, PhD, of McMaster University in Hamilton, Ont., and colleagues.
“To our knowledge, this is the largest PPI trial for any indication and the first prospective randomized trial to evaluate the many long-term safety concerns related to PPI therapy,” the investigators wrote in Gastroenterology. “It is reassuring that there was no evidence for harm for most of these events other than an excess of enteric infections.”
“Given how commonly acid suppressive medications are used, it is important to ensure that this class of drugs is safe,” the investigators wrote. They noted that patients are often alarmed by “sensational headlines” about PPI safety. “There are balancing articles that more carefully discuss the risks and benefits of taking PPI therapy but these receive less media attention,” the investigators added.
The present, prospective trial, COMPASS, involved 17,598 participants from 33 countries with stable peripheral artery disease and cardiovascular disease. “We use the term participants, rather than patients, as not all of those taking part in this research would have been patients throughout the trial but all participated in the randomized controlled trial,” the investigators wrote.
In addition to evaluating the safety of pantoprazole, the study was initially designed to measure the efficacy of pantoprazole for preventing upper gastrointestinal events in participants taking rivaroxaban and/or aspirin, which, in combination, were recently shown to reduce cardiovascular outcomes among patients with stable cardiovascular conditions. As such, participants in the trial were randomized to one of three groups: 100-mg aspirin once daily, 5-mg rivaroxaban twice daily, or 2.5-mg rivaroxaban twice daily combined with 100-mg aspirin once daily. The primary efficacy outcomes for these three groups were stroke, myocardial infarction, and cardiovascular death. This portion of the trial was discontinued early because of evidence that showed the superiority of combination therapy over aspirin alone; however, the pantoprazole component of the trial continued, as planned, for 3 years.
At baseline, about two-thirds of participants (64%) were not taking a PPI; these participants were randomized to either 40-mg pantoprazole once daily or matching placebo. Pantoprazole safety outcomes centered on those previously reported by observational studies, including dementia, chronic kidney disease, gastric atrophy, fracture, cancer, pneumonia, diabetes mellitus, chronic obstructive lung disease, Clostridioides difficile infection, and other enteric infections. Hospitalization rates for noncardiovascular and cardiovascular events were also reported. Data were gathered via questionnaires administered every 6 months.
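The design described above amounts to a partial factorial randomization: every participant was assigned to one of the three antithrombotic arms, and only those not already taking a PPI at baseline were additionally randomized to pantoprazole or placebo. The sketch below is a hypothetical illustration of that allocation structure, not the trial's actual randomization code or stratification scheme.

```python
# Illustrative sketch of a partial factorial allocation: all participants get an
# antithrombotic arm; only those not on a PPI at baseline also get a PPI arm.
# This is a simplified, hypothetical scheme, not the trial's allocation procedure.
import random

ANTITHROMBOTIC_ARMS = [
    "aspirin 100 mg once daily",
    "rivaroxaban 5 mg twice daily",
    "rivaroxaban 2.5 mg twice daily + aspirin 100 mg once daily",
]
PPI_ARMS = ["pantoprazole 40 mg once daily", "placebo"]

def allocate(on_ppi_at_baseline: bool, rng: random.Random) -> dict:
    """Assign an antithrombotic arm to everyone; add a PPI arm only for
    participants not already taking a PPI at baseline."""
    assignment = {"antithrombotic": rng.choice(ANTITHROMBOTIC_ARMS)}
    if not on_ppi_at_baseline:
        assignment["ppi"] = rng.choice(PPI_ARMS)
    return assignment

rng = random.Random(42)
print(allocate(on_ppi_at_baseline=False, rng=rng))  # eligible for PPI randomization
print(allocate(on_ppi_at_baseline=True, rng=rng))   # antithrombotic arm only
```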
Most patients in the trial (78%) were male, and 23% were current smokers. Smaller proportions of the population were taking an NSAID (5%) and/or had a history of peptic ulcer disease (2.6%). The median follow-up was 3.01 years, ranging from 2.49 to 3.59 years. Permanent discontinuations occurred at approximately equal rates in the pantoprazole (21%) and placebo (22%) groups, after a median of 11 months (338 days). In both groups, more than 96% of participants who continued treatment took their medications as prescribed at least 80% of the time.
Analysis of cardiovascular outcomes revealed no significant differences between placebo and pantoprazole groups. Of all the evaluated safety measures, only enteric infections differed significantly between groups, occurring at a higher rate in the pantoprazole group than in the placebo group (1.4% vs. 1.0%; odds ratio, 1.33; 95% confidence interval, 1.01-1.75). Although C. difficile infection was more common among pantoprazole users, only 13 such events occurred, precluding statistical significance.
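To make the enteric infection comparison concrete, the sketch below computes an odds ratio and a Wald 95% confidence interval from a 2-by-2 table. The per-arm counts are approximations chosen only to be consistent with the reported 1.4% and 1.0% event rates in arms of roughly 8,800 participants each; they are not the exact trial counts.

```python
# Sketch: odds ratio and Wald 95% CI from a 2x2 table of enteric infections.
# Counts are hypothetical approximations of ~1.4% vs ~1.0% event rates in arms
# of about 8,800 participants each; they are not the exact trial data.
import math

events_ppi, n_ppi = 119, 8800          # hypothetical pantoprazole arm
events_placebo, n_placebo = 90, 8800   # hypothetical placebo arm

a, b = events_ppi, n_ppi - events_ppi              # events / non-events, PPI
c, d = events_placebo, n_placebo - events_placebo  # events / non-events, placebo

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f} (95% CI, {lower:.2f}-{upper:.2f})")
```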
According to the investigators, these findings should offer reassurance to PPI prescribers and users; they noted that previous findings from observational studies warrant skepticism. “A significant proportion of patients are prescribed PPI therapy inappropriately, and in these cases, it is reasonable to advocate strategies to discontinue acid suppression. However, when there is a clinical need for PPI therapy, these data suggest that the benefits are likely to outweigh any putative risks.”
In regard to the possible increased risk of enteric infection, the investigators again urged a conservative interpretation, as the increased rate of enteric infection among PPI users was still lower than rates reported by systematic reviews. “The data in the current randomized trial were not adjusted for multiple testing so this result should be interpreted with caution,” the investigators wrote. Although acid suppression may allow for increased ingestion of pathogenic organisms, which could theoretically increase the risk of enteric infection, the investigators stated that the benefits of PPIs likely outweigh their risks.
The COMPASS trial was funded by Bayer AG. The investigators disclosed additional relationships with Bayer, Allergan, Takeda, Janssen, and others.
SOURCE: Moayyedi P et al. Gastroenterology. 2019 May 29. doi: 10.1053/j.gastro.2019.05.056.
AGA patient education on GERD can help your patients better understand and manage the disorder. Post this education on your practice website or share it with your patients at https://www.gastro.org/practice-guidance/gi-patient-center/topic/gastroesophageal-reflux-disease-gerd.
Aside from a possible increased risk of enteric infections, long-term use of the proton pump inhibitor (PPI) pantoprazole appears safe in patients with stable atherosclerotic vascular disease, according to a prospective trial involving more than 17,000 participants.
In contrast with published observational studies, the present trial found no associations between long-term PPI use and previously reported risks such as pneumonia, fracture, or cerebrovascular events, according to lead author Paul Moayyedi, MB ChB, PhD, of McMaster University in Hamilton, Ont., and colleagues.
“To our knowledge, this is the largest PPI trial for any indication and the first prospective randomized trial to evaluate the many long-term safety concerns related to PPI therapy,” the investigators wrote in Gastroenterology. “It is reassuring that there was no evidence for harm for most of these events other than an excess of enteric infections.”
“Given how commonly acid suppressive medications are used, it is important to ensure that this class of drugs is safe,” the investigators wrote. They noted that patients are often alarmed by “sensational headlines” about PPI safety. “There are balancing articles that more carefully discuss the risks and benefits of taking PPI therapy but these receive less media attention,” the investigators added.
The present, prospective trial, COMPASS, involved 17,598 participants from 33 countries with stable peripheral artery disease and cardiovascular disease. “We use the term participants, rather than patients, as not all of those taking part in this research would have been patients throughout the trial but all participated in the randomized controlled trial,” the investigators wrote.
In addition to evaluating the safety of pantoprazole, the study was initially designed to measure the efficacy of pantoprazole for preventing upper gastrointestinal events in participants taking rivaroxaban and/or aspirin, which, in combination, were recently shown to reduce cardiovascular outcomes among patients with stable cardiovascular conditions. As such, participants in the trial were randomized to one of three groups: 100-mg aspirin once daily, 5-mg rivaroxaban twice daily, or 2.5-mg rivaroxaban twice daily combined with 100-mg aspirin once daily. The primary efficacy outcomes for these three groups were stroke, myocardial infarction, and cardiovascular death. This portion of the trial was discontinued early because of evidence that showed the superiority of combination therapy over aspirin alone; however, the pantoprazole component of the trial continued, as planned, for 3 years.
At baseline, about two-thirds of participants (64%) were not taking a PPI, requiring randomization to either 40-mg pantoprazole once daily or matching placebo. Pantoprazole safety outcomes centered on those previously reported by observational studies, including dementia, chronic kidney disease, gastric atrophy, fracture, cancer, pneumonia, diabetes mellitus, chronic obstructive lung disease, Clostrididoides difficile infection, and other enteric infections. Hospitalization rates for noncardiovascular and cardiovascular events were also reported. Data were gathered via questionnaires, which were conducted every 6 months.
Most patients in the trial (78%) were male, and 23% were current smokers. Smaller proportions of the population were taking an NSAID (5%) and/or had a history of peptic ulcer disease (2.6%). The median follow-up was 3.01 years, ranging from 2.49 to 3.59 years. Permanent discontinuations occurred at approximately equal rates in the pantoprazole (21%) and placebo (22%) group after a median of 11 months (338 days). In both groups, more than 96% of participants who continued treatment took their medications as prescribed at least 80% of the time.
Analysis of cardiovascular outcomes revealed no significant differences between placebo and pantoprazole groups. Of all the evaluated safety measures, only enteric infections differed significantly between groups, occurring at a higher rate in the pantoprazole group than in the placebo group (1.4% vs. 1.0%; odds ratio, 1.33; 95% confidence interval, 1.01-1.75). Although C. difficile infection was more common among pantoprazole users, only 13 such events occurred, precluding statistical significance.
According to the investigators, these findings should offer reassurance to PPI prescribers and users; they noted that previous findings from observational studies warrant skepticism. “A significant proportion of patients are prescribed PPI therapy inappropriately, and in these cases, it is reasonable to advocate strategies to discontinue acid suppression. However, when there is a clinical need for PPI therapy, these data suggest that the benefits are likely to outweigh any putative risks.”
In regard to the possible increased risk of enteric infection, the investigators again urged a conservative interpretation, as the increased rate of enteric infection among PPI users was still lower than rates reported by systematic reviews. “The data in the current randomized trial were not adjusted for multiple testing so this result should be interpreted with caution,” the investigators wrote. Although acid suppression may allow for increased ingestion of pathogenic organisms, which could theoretically increase the risk of enteric infection, the investigators stated that the benefits of PPIs likely outweigh their risks.
The COMPASS trial was funded by Bayer AG. The investigators disclosed additional relationships with Bayer, Allergan, Takeda, Janssen, and others.
SOURCE: Moayyedi P et al. Gastro. 2019 May 29. doi: 10.1053/j.gastro.2019.05.056.
AGA patient education on GERD can help your patients better understand and manage the disorder. Post this education or your practice website or share you’re your patients at https://www.gastro.org/practice-guidance/gi-patient-center/topic/gastroesophageal-reflux-disease-gerd.
FROM GASTROENTEROLOGY
Key clinical point: Aside from a possible increased risk of enteric infections, long-term use of pantoprazole appears safe in patients with stable peripheral artery and cardiovascular disease.
Major finding: Enteric infections were 33% more common in the pantoprazole group than in the placebo group.
Study details: A placebo-controlled, double-blind, randomized trial involving 17,598 patients with stable peripheral artery disease and cardiovascular disease.
Disclosures: The COMPASS trial was funded by Bayer AG. The investigators disclosed relationships with Bayer, Allergan, Takeda, Janssen, and others.
Source: Moayyedi P et al. Gastroenterology. 2019 May 29. doi: 10.1053/j.gastro.2019.05.056.
TP53 double hit predicts aggressive myeloma
Relapsed multiple myeloma becomes increasingly aggressive and difficult to treat with each additional TP53 alteration, according to investigators.
Findings from the study help illuminate the mechanics of myeloma disease progression and demonstrate the value of clonal competition assays, reported lead author Umair Munawar of the University Hospital Würzburg (Germany) and colleagues.
“The implications of mono-allelic TP53 lesions for the clinical outcome remain controversial, but clonal selection and evolution is a common feature of myeloma progression, and patients with TP53 wild-type or mono-allelic inactivation may present a double hit on relapse,” the investigators wrote in Blood. “Here, we addressed the hypothesis that sequential acquisition of TP53 hits lead to a gain of proliferative fitness of [multiple myeloma] cancer cells, inducing the expansion and domination of the affected clones within the patient’s bone marrow.”
The investigators used Sleeping Beauty transposon and CRISPR/Cas9 techniques to create double- and single-hit multiple myeloma cell lines that were stably transfected with fluorescent proteins. By observing coculture pairings of wild-type, single-hit, and double-hit cells, the investigators found a hierarchy of proliferation that depended on the number of TP53 alterations. For instance, when double-hit cells were cocultured with wild-type cells at a 1:3 ratio, the double-hit cells took 21 days to reach 50% of the total culture population. Similarly, single-hit cells outcompeted wild-type cells after 38 days, while double-hit cells took 35 days to overcome the single-hit population.
Further testing showed that comparatively smaller initial populations of TP53-aberrant cells required longer to outcompete larger wild-type populations, which could explain why deeper responses in the clinic are often followed by longer periods without disease progression, the investigators suggested.
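The timing pattern falls out of even the simplest competition model. The minimal two-clone exponential-growth sketch below uses hypothetical growth rates (the fitness advantage is tuned only so that a 1:3 seeding reproduces the reported 21 days, not measured from the study) to show how the time to reach half the culture stretches as the starting fraction of the aberrant clone shrinks.

```python
# Minimal sketch of a two-clone competition under simple exponential growth.
# Growth rates are hypothetical illustrations, not parameters from the study.
import math

def days_to_half(initial_fraction, r_mutant, r_wildtype):
    """Days for the fitter clone to reach 50% of a mixed culture.

    With exponential growth, the mutant fraction is
        f(t) = f0*exp(r_m*t) / (f0*exp(r_m*t) + (1 - f0)*exp(r_w*t)),
    which reaches 0.5 at t = ln((1 - f0)/f0) / (r_m - r_w).
    """
    return math.log((1 - initial_fraction) / initial_fraction) / (r_mutant - r_wildtype)

# Hypothetical per-day growth rates; the fitness edge is chosen only so that a
# 1:3 seeding reproduces the reported 21 days.
r_wt = 0.30
r_double_hit = r_wt + math.log(3) / 21

print(days_to_half(0.25, r_double_hit, r_wt))   # 21.0 days, matching the 1:3 coculture
print(days_to_half(0.01, r_double_hit, r_wt))   # ≈ 88 days for a 1% starting clone
```

Only the difference in growth rates matters in this model, which is why a modest fitness edge is enough for a rare clone to dominate eventually, but a smaller starting population takes proportionally longer to do so.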
A comparison of transcriptomes between wild-type cells and TP53 mutants revealed differential expression of about 900 genes involving 14 signaling pathways. Downregulated pathways included antigen processing and presentation, chemokine signaling, and oxidative phosphorylation.
“These differences on the transcriptomic level well reflect the biology of ultra–high risk disease,” the investigators wrote, referring to increased glucose uptake on PET, resistance to immunotherapies, and extramedullary disease.
“[This study] underscores the power of clonal competition assays to decipher the effect of genomic lesions in tumors to better understand their impact on progression and disease relapse in [multiple myeloma],” the investigators concluded.
The study was funded by Deutsche Forschungsgemeinschaft, the CDW Stiftung, and the IZKF Würzburg. The investigators reported additional support from the CRIS foundation, the German Cancer Aid, and the University of Würzburg.
SOURCE: Munawar U et al. Blood. 2019 Jul 24. doi: 10.1182/blood.2019000080.
FROM BLOOD