Agent Orange and Uranium Exposures Associated With Bladder Cancer Risk in Veterans
Exposure to Agent Orange and depleted uranium is associated with an increased risk of bladder cancer, according to a recent Urology meta-analysis. Bladder cancer is the fourth most commonly diagnosed cancer among veterans, with about 3200 US veterans diagnosed each year. “Identifying veterans exposed to these risk factors is crucial for implementing screening protocols and connecting them with preventive healthcare measures when possible,” the authors said.
The meta-analysis, which used narrative synthesis to incorporate diverse studies, examined the impact of exposure to Agent Orange, depleted uranium, contaminated drinking water, and other environmental contaminants. Seven studies of Agent Orange exposure, covering 2,705,283 veterans, collectively showed a statistically significant increase in bladder cancer risk (hazard ratio [HR], 1.17; 95% confidence interval [CI], 1.01-1.36; P < .001). Six studies, covering 28,899 patients, likewise showed a statistically significant association between depleted uranium exposure and bladder cancer (HR, 2.13; 95% CI, 1.31-3.48; P = .002). In 4 studies covering 370,408 veterans, exposure to contaminated drinking water also suggested an increased bladder cancer risk, although the result did not reach statistical significance (HR, 1.25; 95% CI, 0.97-1.61; P = .08).
The authors identified other factors that also contributed to increased bladder cancer risk, including smoking, occupational exposures to substances like asbestos and diesel fumes, and exposure to ionizing radiation from nuclear tests. “These findings emphasize the urgent need for enhanced clinical management strategies and preventive measures for veterans exposed to these carcinogenic agents,” the authors asserted.
The authors report no relevant conflicts of interest.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication.
Does Watch and Wait Increase Distant Metastasis Risk in Rectal Cancer?
TOPLINE:
The new study highlights the importance of timely surgical intervention to improve distant metastases–free survival rates.
METHODOLOGY:
- Organ preservation has become an attractive alternative to surgery for patients with rectal cancer who achieve a clinical complete response after neoadjuvant therapy, with the risk for local regrowth after initial clinical complete response being around 25%-30%.
- The new study aimed to compare the risk for distant metastases between patients with local regrowth after watch and wait and patients with near-complete pathologic response managed by total mesorectal excision.
- A total of 508 patients with local regrowth were included from the International Watch & Wait Database, and 893 patients with near-complete pathologic response were included from the Spanish Rectal Cancer Project.
- The primary endpoint was distant metastases–free survival at 3 years from the decision to watch and wait or total mesorectal excision, and the secondary endpoints included possible risk factors associated with distant metastases.
TAKEAWAY:
- Patients with local regrowth had a significantly higher rate of distant metastases (rate, 22.8% vs 10.2%; P ≤ .001) than those with near-complete pathologic response managed by total mesorectal excision.
- Distant metastases–free survival at 3 years was significantly worse for patients with local regrowth (rate, 75% vs 87%; P < .001).
- Independent risk factors for distant metastases included local regrowth (vs total mesorectal excision at reassessment; P = .001), ypT3-4 status (P = .016), and ypN+ status (P = .001) at the time of surgery.
- Patients with local regrowth had worse distant metastases–free survival across all pathologic stages than those managed by total mesorectal excision.
IN PRACTICE:
“Patients with local regrowth appear to have a higher risk for subsequent distant metastases development than patients with near-complete pathologic response managed by total mesorectal excision at restaging irrespective of final pathology,” the authors wrote.
SOURCE:
This study was led by Laura M. Fernandez, MD, of the Champalimaud Foundation in Lisbon, Portugal. It was published online in Journal of Clinical Oncology.
LIMITATIONS:
This study’s limitations included the heterogeneity in defining clinical complete response and the decision to watch and wait across different institutions. The majority of patients did not receive total neoadjuvant therapy regimens, which may have affected the generalizability of the findings. The study had a considerable amount of follow-up losses, which could have introduced bias.
DISCLOSURES:
This study was supported by the European Society of Surgical Oncology, the Champalimaud Foundation, the Bas Mulder Award, the Alpe d’HuZes Foundation, the Dutch Cancer Society, the European Research Council Advanced Grant, and the National Institute for Health and Care Research Manchester Biomedical Research Centre. Fernandez disclosed receiving grants from Johnson & Johnson. Additional disclosures are noted in the original article.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
AI-Aided Colonoscopy’s ‘Intelligent’ Module Ups Polyp Detection
Colin J. Rees, a professor of gastroenterology in the Faculty of Medical Sciences at Newcastle University in Newcastle upon Tyne, England, and colleagues compared the real-world clinical effectiveness of computer-aided detection (CADe)–assisted colonoscopy using an “intelligent” module with that of standard colonoscopy in a study in The Lancet Gastroenterology & Hepatology.
They found the GI Genius Intelligent Endoscopy Module (Medtronic) increased the mean number of adenomas detected per procedure and the adenoma detection rate, especially for small, flat (type 0-IIa) polyps, and sessile serrated lesions, which are more likely to be missed.
“Missed sessile serrated lesions disproportionately increase the risk of post-colonoscopy colorectal cancer, thus the adoption of GI Genius into routine colonoscopy practice could not only increase polyp detection but also reduce the incidence of post-colonoscopy colorectal cancer,” the investigators wrote.
“AI is going to have a major impact upon most aspects of healthcare. Some areas of medical practice are now well established, and some are still in evolution,” Rees, who is also president of the British Society of Gastroenterology, said in an interview. “Within gastroenterology, the role of AI in endoscopic diagnostics is also evolving. The COLO-DETECT trial demonstrates that AI increases detection of lesions, and work is ongoing to see how AI might help with characterization and other elements of endoscopic practice.”
Study Details
The multicenter, open-label, parallel-arm, pragmatic randomized controlled trial was conducted at 12 National Health Service hospitals in England. The study cohort consisted of adults (≥ 18 years) undergoing colonoscopy for colorectal cancer (CRC) screening, for gastrointestinal symptoms, or for surveillance owing to personal or family history.
Recruiting staff, participants, and colonoscopists were unmasked to allocation, whereas histopathologists, cochief investigators, and trial statisticians were masked.
CADe-assisted colonoscopy consisted of standard colonoscopy plus the GI Genius module active for at least the entire inspection phase of colonoscope withdrawal.
The primary outcome was mean adenomas per procedure (total number of adenomas detected divided by total number of procedures). The key secondary outcome was adenoma detection rate (proportion of colonoscopies with at least one adenoma).
From March 2021 to April 2023, the investigators recruited 2032 participants, 55.7% men, with a mean cohort age of 62.4 years and randomly assigned them to CADe-assisted colonoscopy (n = 1015) or to standard colonoscopy (n = 1017). Of these, 60.6% were undergoing screening and 39.4% had symptomatic indications.
Mean adenomas per procedure were 1.56 (SD, 2.82; n = 1001 participants with data) in the CADe-assisted group vs 1.21 (n = 1009) in the standard group, for an adjusted mean difference of 0.36 (95% CI, 0.14-0.57; adjusted incidence rate ratio, 1.30; 95% CI, 1.15-1.47; P < .0001).
Adenomas were detected in 555 (56.6%) of 980 participants in the CADe-assisted group vs 477 (48.4%) of 986 in the standard group, representing a proportion difference of 8.3% (95% CI, 3.9-12.7; adjusted odds ratio, 1.47; 95% CI, 1.21-1.78; P < .0001).
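Both headline metrics are simple ratios, so the per-arm detection rates quoted above can be rechecked directly from the raw counts. A minimal sketch (the helper name is illustrative, not from the study):

```python
# Recompute the per-arm adenoma detection rates (ADR) from the raw counts
# reported in the trial; the function name here is illustrative.
def detection_rate(n_with_adenoma: int, n_procedures: int) -> float:
    """ADR: proportion of colonoscopies with at least one adenoma."""
    return n_with_adenoma / n_procedures

cade = detection_rate(555, 980)      # CADe-assisted arm
standard = detection_rate(477, 986)  # standard-colonoscopy arm

print(f"CADe ADR: {cade:.1%}, standard ADR: {standard:.1%}")
```

This reproduces the 56.6% and 48.4% figures; note that the raw gap (about 8.2 percentage points) differs slightly from the 8.3% quoted above, which is the covariate-adjusted estimate.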
As to safety, adverse events were numerically comparable between the intervention and control groups (25 vs 19 overall events; 4 vs 6 serious events). On independent review, no adverse events in the CADe-assisted colonoscopy group were related to GI Genius.
Offering a US perspective on the study, Nabil M. Mansour, MD, an associate professor and director of the McNair General GI Clinic at Baylor College of Medicine in Houston, Texas, said GI Genius and other CADe systems represent a significant advance over standard colonoscopy for identifying premalignant polyps. “While the data have been mixed, most studies, particularly randomized controlled trials, have shown significant improvements with CADe in detection, both in terms of adenomas per colonoscopy and reductions in adenoma miss rate,” he said in an interview.
He added that the main utility of CADe is for asymptomatic patients undergoing average-risk screening and surveillance colonoscopy for CRC screening and prevention, as well as for those with positive stool-based screening tests, “though there is no downside to using it in symptomatic patients as well.” Though AI-assisted colonoscopy is likely used in fewer than half of endoscopy centers overall, mainly academic ones, his clinic has been using it for the past year.
The main question, Mansour cautioned, is whether increased detection of small polyps will actually reduce CRC incidence or mortality, and it will likely be several years before clear, concrete data can answer that.
“Most studies have shown the improvement in adenoma detection is mainly for diminutive polyps < 5 mm in diameter, but whether that will actually translate to substantive improvements in hard outcomes is as yet unknown,” he said. “But if gastroenterologists are interested in doing everything they can today to help improve detection rates and lower miss rates of premalignant polyps, serious consideration should be given to adopting the use of CADe in practice.”
This study was supported by Medtronic. Rees reported receiving grant funding from ARC Medical, Norgine, Medtronic, 3-D Matrix, and Olympus Medical, and has been an expert witness for ARC Medical. Other authors disclosed receiving research funding, honoraria, or travel expenses from Medtronic or other private companies. Mansour had no competing interests to declare.
A version of this article appeared on Medscape.com.
FROM THE LANCET GASTROENTEROLOGY & HEPATOLOGY
Impact of NSAID Use on Bleeding Rates for Patients Taking Rivaroxaban or Apixaban
Clinical practice has shifted from vitamin K antagonists to direct oral anticoagulants (DOACs) for atrial fibrillation treatment due to their more favorable risk-benefit profile and fewer required lifestyle modifications.1,2 However, the advantage of a lower bleeding risk with DOACs could be compromised by potentially problematic pharmacokinetic interactions like those conferred by antiplatelets or nonsteroidal anti-inflammatory drugs (NSAIDs).3,4 Treating a patient needing anticoagulation with a DOAC who has comorbidities may introduce unavoidable drug-drug interactions. This particularly happens with over-the-counter and prescription NSAIDs used for the management of pain and inflammatory conditions.5
NSAIDs primarily affect 2 cyclooxygenase (COX) enzyme isoforms, COX-1 and COX-2.6 COX-1 helps maintain gastrointestinal (GI) mucosal integrity and platelet aggregation processes, whereas COX-2 is engaged in pain signaling and inflammation mediation. COX-1 inhibition is associated with more bleeding-related adverse events (AEs), especially in the GI tract. COX-2 inhibition is thought to provide analgesic and anti-inflammatory properties without elevating bleeding risk. This premise underlies the preferential use of celecoxib, a COX-2 selective NSAID, which should confer a lower bleeding risk compared to nonselective NSAIDs such as ibuprofen and naproxen.7 NSAIDs have been documented as independent risk factors for bleeding: NSAID users are about 3 times as likely to develop GI AEs as non-NSAID users.8
Many clinicians aim to further mitigate NSAID-associated bleeding risk by coprescribing a proton pump inhibitor (PPI). PPIs provide gastroprotection against NSAID-induced mucosal injury and its sequential complication, GI bleeding. In a multicenter randomized controlled trial, patients who received concomitant PPI therapy while undergoing chronic NSAID therapy—including nonselective and COX-2 selective NSAIDs—had a significantly lower risk of GI ulcer development (placebo, 17.0%; 20 mg esomeprazole, 5.2%; 40 mg esomeprazole, 4.6%).9 Current clinical guidelines for preventing NSAID-associated bleeding complications recommend using a COX-2 selective NSAID in combination with PPI therapy for patients at high risk for GI-related bleeding, including those with concomitant use of anticoagulants.10
There is evidence suggesting an increased bleeding risk with NSAIDs when used in combination with vitamin K antagonists such as warfarin.11,12 A systematic review of warfarin and concomitant NSAID use found an increased risk of overall bleeding with NSAID use in combination with warfarin (odds ratio 1.58; 95% CI, 1.18-2.12), compared to warfarin alone.12
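The significance of a ratio estimate like this can be read off its CI: on the log scale, a 95% CI spans ±1.96 standard errors, so the SE and an approximate two-sided P value can be back-calculated from the reported bounds. The sketch below applies this standard back-calculation to the quoted odds ratio; it is an illustration of the method, not an analysis from the cited review.

```python
import math

def p_from_ratio_ci(ratio, lo, hi):
    """Approximate two-sided P value for a ratio estimate (OR or HR),
    reconstructing the log-scale SE from the reported 95% CI."""
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
    z = abs(math.log(ratio)) / se
    return math.erfc(z / math.sqrt(2))  # two-sided normal tail probability

# OR 1.58 (95% CI, 1.18-2.12) for warfarin + NSAID vs warfarin alone
p = p_from_ratio_ci(1.58, 1.18, 2.12)
print(f"approximate P = {p:.4f}")  # CI excludes 1, so P < .05
```

The same check works for the hazard ratios quoted elsewhere in the article: a 95% CI that excludes 1 corresponds to P < .05, and one that includes 1 does not.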
Post hoc analyses of randomized clinical trials have also demonstrated an increased bleeding risk with oral anticoagulation and concomitant NSAID use.13,14 In the RE-LY trial, NSAID users on warfarin or dabigatran had a statistically significant increased risk of major bleeding compared to non-NSAID users (hazard ratio [HR] 1.68; 95% CI, 1.40-2.02; P < .001).13 In the ARISTOTLE trial, patients on warfarin or apixaban who were incident NSAID users were found to have an increased risk of major bleeding (HR 1.61; 95% CI, 1.11-2.33) and clinically relevant nonmajor bleeding (HR 1.70; 95% CI, 1.16-2.48).14 These trials found a statistically significant increased bleeding risk associated with NSAID use, though the populations evaluated included both patients taking warfarin and patients taking DOACs. These trials did not evaluate the bleeding risk of concomitant NSAID use among DOACs alone.
Evidence on NSAID-associated bleeding risk with DOACs is lacking in settings where the patient population, prescribing practices, and monitoring levels are variable. Within the Veterans Health Administration, clinical pharmacist practitioners (CPPs) in anticoagulation clinics oversee DOAC therapy management. CPPs monitor safety and efficacy of DOAC therapies through a population health management tool, the DOAC Dashboard.15 The DOAC Dashboard creates alerts for patients who may require an intervention based on certain clinical parameters, such as drug-drug interactions.16 Whenever a patient on a DOAC is prescribed an NSAID, an alert is generated on the DOAC Dashboard to flag the CPPs for the potential need for an intervention. If NSAID therapy remains clinically indicated, CPPs may recommend risk reduction strategies such as a COX-2 selective NSAID or coprescribing a PPI.10
The DOAC Dashboard provides an ideal setting for investigating the effects of NSAID use, NSAID selectivity, and PPI coprescribing on DOAC bleeding rates. With an increasing population of patients receiving anticoagulation therapy with a DOAC, more guidance regarding the bleeding risk of concomitant NSAID use with DOACs is needed. Studies evaluating the bleeding risk with concomitant NSAID use in patients on a DOAC alone are limited. This is the first study to date to compare bleeding risk with concomitant NSAID use between DOACs. This study provides information on bleeding risk with NSAID use among commonly prescribed DOACs, rivaroxaban and apixaban, and the potential impacts of current risk reduction strategies.
METHODS
This single-center retrospective cohort review was performed using the electronic health records (EHRs) of patients enrolled in the US Department of Veterans Affairs (VA) Mountain Home Healthcare System who received rivaroxaban or apixaban from December 2020 to December 2022. This study received approval from the East Tennessee State University/VA Institutional Review Board committee.
Patients were identified through the DOAC Dashboard, aged 21 to 100 years, and received rivaroxaban or apixaban at a therapeutic dose: rivaroxaban 10 to 20 mg daily or apixaban 2.5 to 5 mg twice daily. Patients were excluded if they were prescribed dual antiplatelet therapy, received rivaroxaban at dosing indicated for peripheral vascular disease, were undergoing dialysis, had evidence of moderate to severe hepatic impairment or any hepatic disease with coagulopathy, were undergoing chemotherapy or radiation, or had hematological conditions with predisposed bleeding risk. These patients were excluded to mitigate the potential confounding impact from nontherapeutic DOAC dosing strategies and conditions associated with an increased bleeding risk.
Eligible patients were stratified based on NSAID use. NSAID users were defined as patients prescribed an oral NSAID, including both acute and chronic courses, at any point during the study time frame while actively on a DOAC. Bleeding events were reviewed to evaluate rates between rivaroxaban and apixaban among NSAID and non-NSAID users. Identified NSAID users were further assessed for NSAID selectivity and PPI coprescribing as a subgroup analysis for the secondary assessment.
Data Collection
Baseline data were collected, including age, body mass index, anticoagulation indication, DOAC agent, DOAC dose, and DOAC total daily dose. Baseline serum creatinine levels, liver function tests, hemoglobin levels, and platelet counts were collected from the most recent data available immediately prior to the bleeding event, if applicable.
The DOAC Dashboard was reviewed for active and dismissed drug interaction alerts to identify patients taking rivaroxaban or apixaban who were prescribed an NSAID. Patients were categorized in the NSAID group if an interacting drug alert with an NSAID was reported during the study time frame. Data available through the interacting drug alerts on NSAID use were limited to the interacting drug name and date of the reported flag. Manual EHR review was required to confirm dates of NSAID therapy initiation and NSAID discontinuation, if applicable.
Data regarding concomitant antiplatelet use were obtained through review of the active and dismissed drug interaction alerts on the DOAC Dashboard. Concomitant antiplatelet use was defined as the prescribing of a single antiplatelet agent at any point while receiving DOAC therapy. Data on concomitant antiplatelets were collected regardless of NSAID status.
Data on coprescribed PPI therapy were obtained through manual EHR review of identified NSAID users. Coprescribed PPI therapy was defined as the prescribing of a PPI at any point during NSAID therapy. Data regarding PPI use among non-NSAID users were not collected because the secondary endpoint was designed to assess PPI use only among patients coprescribed a DOAC and NSAID.
Outcomes
Bleeding events were identified through an outcomes report generated by the DOAC Dashboard based on International Classification of Diseases, Tenth Revision diagnosis codes associated with a bleeding event. The outcomes report captures diagnoses from the outpatient and inpatient care settings. Reported bleeding events were limited to patients who received a DOAC at any point in the 6 months prior to the event and excluded patients with recent DOAC initiation within 7 days of the event, as these patients are not captured on the DOAC Dashboard.
All reported bleeding events were manually reviewed in the EHR and categorized as a major or clinically relevant nonmajor bleed, according to International Society of Thrombosis and Haemostasis criteria. Validated bleeding events were then crossreferenced with the interacting drug alerts report to identify events with potentially overlapping NSAID therapy at the time of the event. Overlapping NSAID therapy was defined as the prescribing of an NSAID at any point in the 6 months prior to the event. All events with potential overlapping NSAID therapies were manually reviewed for confirmation of NSAID status at the time of the event.
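The overlap screen described above amounts to a simple window check before manual review. The sketch below is a hedged illustration of that logic only; the record layout and field names are hypothetical, not the study's actual DOAC Dashboard data.

```python
from datetime import date, timedelta

def overlapping_nsaid(event_date, nsaid_alert_dates, window_days=180):
    """Flag a validated bleed for manual chart review if any NSAID
    interaction alert falls in the ~6 months before the event."""
    window_start = event_date - timedelta(days=window_days)
    return any(window_start <= d <= event_date for d in nsaid_alert_dates)

# Hypothetical alert dates for one patient
alerts = [date(2022, 1, 10), date(2021, 3, 2)]
print(overlapping_nsaid(date(2022, 3, 1), alerts))  # True: alert within window
```

As in the study, a positive flag here only triggers manual EHR review; it does not by itself confirm active NSAID therapy at the time of the event.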
The primary endpoint was a composite of any bleeding event per International Society of Thrombosis and Haemostasis criteria. The secondary endpoint evaluated the potential impact of NSAID selectivity or PPI coprescribing on the bleeding rate among the NSAID user groups.
Statistical Analysis
Analyses were performed consistent with the methods used in the ARISTOTLE and RE-LY trials. It was determined that a sample size of 504 patients, with ≥ 168 patients in each group, would provide 80% power using a 2-sided α of 0.05. HRs with 95% CIs and respective P values were calculated using an SPSS-adapted online calculator.
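The assumptions behind the power calculation are not reported. As a hedged illustration only, a standard two-sided two-proportion sample-size formula with assumed bleeding rates lands in the same neighborhood as the prespecified 168 per group; the 12% vs 4% rates below are hypothetical, not figures from the study.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Per-group sample size for comparing two proportions
    (normal approximation, two-sided test)."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = .05
    z_b = z.inv_cdf(power)          # ~0.84 for 80% power
    var = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_a + z_b) ** 2 * var / (p1 - p2) ** 2)

# Assumed (illustrative) bleeding rates of 12% vs 4%
print(n_per_group(0.12, 0.04))  # 177 per group, near the study's 168
```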
RESULTS
The DOAC Dashboard identified 681 patients on rivaroxaban and 3225 patients on apixaban; 72 patients on rivaroxaban (10.6%) and 300 patients on apixaban (9.3%) were NSAID users. The mean age of NSAID users was 66.9 years in the rivaroxaban group and 72.4 years in the apixaban group. The mean age of non-NSAID users was 71.5 years in the rivaroxaban group and 75.6 years in the apixaban group. No appreciable differences were observed among subgroups in body mass index, renal function, hepatic function, hemoglobin, or platelet counts, and no statistically significant differences were identified (Table 1). Antiplatelet agents identified included aspirin, clopidogrel, prasugrel, and ticagrelor. Fifteen patients (20.3%) in the rivaroxaban group and 87 patients (28.7%) in the apixaban group had concomitant antiplatelet and NSAID use. Forty-five patients on rivaroxaban (60.8%) and 170 (55.9%) on apixaban were prescribed concomitant PPI and NSAID at baseline. Among non-NSAID users, there was concomitant antiplatelet use for 265 patients (43.6%) in the rivaroxaban group and 1401 patients (47.9%) in the apixaban group. Concomitant PPI use was identified among 63 patients (60.0%) taking selective NSAIDs and 182 (57.2%) taking nonselective NSAIDs.

A total of 423 courses of NSAIDs were identified: 85 NSAID courses in the rivaroxaban group and 338 NSAID courses in the apixaban group. Most NSAID courses involved a nonselective NSAID: 75.2% (n = 318) overall, comprising 71.8% (n = 61) in the rivaroxaban group and 76.0% (n = 257) in the apixaban group. The most frequent NSAID courses identified were meloxicam (26.7%; n = 113), celecoxib (24.8%; n = 105), ibuprofen (19.1%; n = 81), and naproxen (13.5%; n = 57). Data regarding NSAID therapy initiation and discontinuation dates were not readily available. As a result, the duration of NSAID courses was not captured.
There was no statistically significant difference in bleeding rates between rivaroxaban and apixaban among NSAID users (HR 1.04; 95% CI, 0.98-1.12) or non-NSAID users (HR 1.15; 95% CI, 0.80-1.66) (Table 2). Apixaban non-NSAID users had a higher rate of major bleeds (HR 0.32; 95% CI, 0.17-0.61) while rivaroxaban non-NSAID users had a higher rate of clinically relevant nonmajor bleeds (HR 1.63; 95% CI, 1.10-2.54).

The sample size for the secondary endpoint consisted of bleeding events that were confirmed to have had an overlapping NSAID prescribed at the time of the event. For this secondary assessment, there was 1 rivaroxaban NSAID user bleeding event and 4 apixaban NSAID user bleeding events. For the rivaroxaban NSAID user bleeding event, the NSAID was nonselective and a PPI was not coprescribed. For the apixaban NSAID user bleeding events, 2 NSAIDs were nonselective and 2 were selective. All patients with apixaban and NSAID bleeding events had a coprescribed PPI. There was no clinically significant difference in the bleeding rates observed for NSAID selectivity or PPI coprescribing among the NSAID user subgroups.
DISCUSSION
This study found no statistically significant difference in rates of major and clinically relevant nonmajor bleeding events between rivaroxaban and apixaban among either NSAID users or non-NSAID users. This study also did not identify a clinically significant impact on bleeding rates from NSAID selectivity or PPI coprescribing among the NSAID users.
There were notable but not statistically significant differences in baseline characteristics observed between the NSAID and non-NSAID user groups. On average, the rivaroxaban and apixaban NSAID users were younger compared with those not taking NSAIDs. NSAIDs, specifically nonselective NSAIDs, are recognized as potentially inappropriate medications for older adults given that this population is at an increased risk for GI ulcer development and/or GI bleeding.17 The non-NSAID user group likely consisted of older patients compared to the NSAID user group as clinicians may avoid prescribing NSAIDs to older adults regardless of concomitant DOAC therapy.
In addition to having an older patient population, non-NSAID users were more frequently prescribed a concomitant antiplatelet when compared with NSAID users. This prescribing pattern may be due to clinicians avoiding the use of NSAIDs in patients receiving DOAC therapy in combination with antiplatelet therapy, as these patients have been found to have an increased bleeding rate compared to DOAC therapy alone.18
Non-NSAID users had an overall higher bleeding rate for both major and nonmajor bleeding events. Based on this observation, it could be hypothesized that antiplatelet agents have a higher risk of bleeding in comparison to NSAIDs. In a subanalysis of the EXPAND study evaluating risk factors of major bleeding in patients receiving rivaroxaban, concomitant use of antiplatelet agents demonstrated a statistically significant increased risk of bleeding (HR 1.6; 95% CI, 1.2-2.3; P = .003) while concomitant use of NSAIDs did not (HR 0.8; 95% CI, 0.3-2.2; P = .67).19
In assessing PPI status at baseline, a majority of both rivaroxaban and apixaban NSAID users were coprescribed a PPI. This trend aligns with current clinical guideline recommendations for the prescribing of PPI therapy for GI protection in high-risk patients, such as those on DOAC therapy and concomitant NSAID therapy.10 Given the high proportion of NSAID users coprescribed a PPI at baseline, it may be possible that the true incidence of NSAID-associated bleeding events was higher than what this study found. This observation may reflect the impact from timely implementation of risk mitigation strategies by CPPs in the anticoagulation clinic. However, this study was not constructed to assess the efficacy of PPI use in this manner.
It is important to note the patients included in this study were followed by a pharmacist in an anticoagulation clinic using the DOAC Dashboard.15 This population management tool allows CPPs to make proactive interventions when a patient taking a DOAC receives an NSAID prescription, such as recommending the coprescribing of a PPI or use of a selective NSAID.10,16 These standards of care may have contributed to an overall reduced bleeding rate among the NSAID user group and may not be reflective of private practice.
The planned analysis of this study was modeled after the post hoc analyses of the RE-LY and ARISTOTLE trials. Both trials demonstrated an increased risk of bleeding with oral anticoagulation, including DOAC and warfarin, in combination with NSAID use. However, both trials found that NSAID use in patients treated with a DOAC was not independently associated with increased bleeding events compared with warfarin.13,14 The results of this study are comparable to the RE-LY and ARISTOTLE findings that NSAID use among patients treated with rivaroxaban or apixaban did not demonstrate a statistically significant increased bleeding risk.
Studies of NSAID use in combination with DOAC therapy have been limited to patient populations consisting of both DOAC and warfarin. Evidence from these trials outlines the increased bleeding risk associated with NSAID use in combination with oral anticoagulation; however, these patient populations include those on a DOAC and warfarin.13,14,19,20 Given the limited evidence on NSAID use among DOACs alone, it is assumed NSAID use in combination with DOACs has a similar risk of bleeding as warfarin use. This may cause clinicians to automatically exclude NSAID therapy as a treatment option for patients on a DOAC who are otherwise clinically appropriate candidates, such as those with underlying inflammatory conditions. Avoiding NSAID therapy in this patient population may lead to suboptimal pain management and increase the risk of patient harm from methods such as inappropriate opioid therapy prescribing.
DOAC therapy should not be a universal limitation to the use of NSAIDs. Although the risk of bleeding with NSAID therapy is always present, deliberate NSAID prescribing in addition to the timely implementation of risk mitigation strategies may provide an avenue for safe NSAID prescribing in patients receiving a DOAC. A population health-based approach to DOAC management, such as the DOAC Dashboard, appears to be effective at preventing patient harm when NSAIDs are prescribed in conjunction with DOACs.
Limitations
The DOAC Dashboard has been shown to be effective and efficient at monitoring DOAC therapy from a population-based approach.16 Reports generated through the DOAC Dashboard provide convenient access to patient data, which allows for timely interventions; however, there are limits to its use for data collection. Not all of the data elements necessary to properly assess bleeding risk with validated tools, such as HAS-BLED (hypertension, abnormal renal/liver function, stroke, bleeding history or predisposition, labile international normalized ratio, elderly, drugs/alcohol concomitantly), are available on DOAC Dashboard reports. Due to this constraint, bleeding risk assessments were not conducted at baseline and this study was unable to include risk modeling. Additionally, data elements like initiation and discontinuation dates and duration of therapies were not readily available. As a result, this study was unable to incorporate time as a data point.
This was a retrospective study that relied on manual review of chart documentation to verify bleeding events, but data obtained through the DOAC Dashboard were transferred directly from the EHR.15 Bleeding events available for evaluation were restricted to those that occurred at a VA facility. Additionally, the sample size within the rivaroxaban NSAID user group did not reach the predefined sample size required to reach power and may have been too small to detect a difference if one did exist. The secondary assessment had a low sample size of NSAID user bleeding events, making it difficult to fully assess its impact on NSAID selectivity and PPI coprescribing on bleeding rates. All courses of NSAIDs were equally valued regardless of the dose or therapy duration; however, this is consistent with how NSAID use was defined in the RE-LY and ARISTOTLE trials.
CONCLUSIONS
This retrospective cohort review found no statistically significant difference in the composite bleeding rates between rivaroxaban and apixaban among NSAID users and non-NSAID users. Moreover, there was no clinically significant impact observed for bleeding rates in regard to NSAID selectivity and PPI coprescribing among NSAID users. However, coprescribing of PPI therapy to patients on a DOAC who are clinically indicated for an NSAID may reduce the risk of bleeding. Population health management tools, such as the DOAC Dashboard, may also allow clinicians to safely prescribe NSAIDs to patients on a DOAC. Further large-scale observational studies are needed to quantify the real-world risk of bleeding with concomitant NSAID use among DOACs alone and to evaluate the impact from NSAID selectivity or PPI coprescribing.
- Ruff CT, Giugliano RP, Braunwald E, et al. Comparison of the efficacy and safety of new oral anticoagulants with warfarin in patients with atrial fibrillation: a meta-analysis of randomised trials. Lancet. 2014;383(9921):955-962. doi:10.1016/S0140-6736(13)62343-0
- Ageno W, Gallus AS, Wittkowsky A, Crowther M, Hylek EM, Palareti G. Oral anticoagulant therapy: antithrombotic therapy and prevention of thrombosis, 9th ed: American College of Chest Physicians Evidence-Based Clinical Practice Guidelines. Chest. 2012;141(2 Suppl):e44S-e88S. doi:10.1378/chest.11-2292
- Eikelboom J, Merli G. Bleeding with direct oral anticoagulants vs warfarin: clinical experience. Am J Med. 2016;129(11S):S33-S40. doi:10.1016/j.amjmed.2016.06.003
- Vranckx P, Valgimigli M, Heidbuchel H. The significance of drug-drug and drug-food interactions of oral anticoagulation. Arrhythm Electrophysiol Rev. 2018;7(1):55-61. doi:10.15420/aer.2017.50.1
- Davis JS, Lee HY, Kim J, et al. Use of non-steroidal antiinflammatory drugs in US adults: changes over time and by demographic. Open Heart. 2017;4(1):e000550. doi:10.1136/openhrt-2016-000550
- Schafer AI. Effects of nonsteroidal antiinflammatory drugs on platelet function and systemic hemostasis. J Clin Pharmacol. 1995;35(3):209-219. doi:10.1002/j.1552-4604.1995.tb04050.x
- Al-Saeed A. Gastrointestinal and cardiovascular risk of nonsteroidal anti-inflammatory drugs. Oman Med J. 2011;26(6):385-391. doi:10.5001/omj.2011.101
- Gabriel SE, Jaakkimainen L, Bombardier C. Risk for serious gastrointestinal complications related to use of nonsteroidal anti-inflammatory drugs. Ann Intern Med. 1991;115(10):787-796. doi:10.7326/0003-4819-115-10-787
- Scheiman JM, Yeomans ND, Talley NJ, et al. Prevention of ulcers by esomeprazole in at-risk patients using non-selective NSAIDs and COX-2 inhibitors. Am J Gastroenterol. 2006;101(4):701-710. doi:10.1111/j.1572-0241.2006.00499.x
- Freedberg DE, Kim LS, Yang YX. The risks and benefits of long-term use of proton pump inhibitors: expert review and best practice advice from the American Gastroenterological Association. Gastroenterology. 2017;152(4):706-715. doi:10.1053/j.gastro.2017.01.031
- Lamberts M, Lip GYH, Hansen ML, et al. Relation of nonsteroidal anti-inflammatory drugs to serious bleeding and thromboembolism risk in patients with atrial fibrillation receiving antithrombotic therapy: a nationwide cohort study. Ann Intern Med. 2014;161(10):690-698. doi:10.7326/M13-1581
- Villa Zapata L, Hansten PD, Panic J, et al. Risk of bleeding with exposure to warfarin and nonsteroidal anti-inflammatory drugs: a systematic review and metaanalysis. Thromb Haemost. 2020;120(7):1066-1074. doi:10.1055/s-0040-1710592
- Kent AP, Brueckmann M, Fraessdorf M, et al. Concomitant oral anticoagulant and nonsteroidal anti-inflammatory drug therapy in patients with atrial fibrillation. J Am Coll Cardiol. 2018;72(3):255-267. doi:10.1016/j.jacc.2018.04.063
- Dalgaard F, Mulder H, Wojdyla DM, et al. Patients with atrial fibrillation taking nonsteroidal antiinflammatory drugs and oral anticoagulants in the ARISTOTLE Trial. Circulation. 2020;141(1):10-20. doi:10.1161/CIRCULATIONAHA.119.041296
- Allen AL, Lucas J, Parra D, et al. Shifting the paradigm: a population health approach to the management of direct oral anticoagulants. J Am Heart Assoc. 2021;10(24):e022758. doi:10.1161/JAHA.121.022758
- Valencia D, Spoutz P, Stoppi J, et al. Impact of a direct oral anticoagulant population management tool on anticoagulation therapy monitoring in clinical practice. Ann Pharmacother. 2019;53(8):806-811. doi:10.1177/1060028019835843
- By the 2023 American Geriatrics Society Beers Criteria® Update Expert Panel. American Geriatrics Society 2023 Updated AGS Beers Criteria® for potentially inappropriate medication use in older adults. J Am Geriatr Soc. 2023;71(7):2052-2081. doi:10.1111/jgs.18372
- Kumar S, Danik SB, Altman RK, et al. Non-vitamin K antagonist oral anticoagulants and antiplatelet therapy for stroke prevention in patients with atrial fibrillation. Cardiol Rev. 2016;24(5):218-223. doi:10.1097/CRD.0000000000000088
- Sakuma I, Uchiyama S, Atarashi H, et al. Clinical risk factors of stroke and major bleeding in patients with nonvalvular atrial fibrillation under rivaroxaban: the EXPAND study sub-analysis. Heart Vessels. 2019;34(11):1839-1851. doi:10.1007/s00380-019-01425-x
- Davidson BL, Verheijen S, Lensing AWA, et al. Bleeding risk of patients with acute venous thromboembolism taking nonsteroidal anti-inflammatory drugs or aspirin. JAMA Intern Med. 2014;174(6):947-953. doi:10.1001/jamainternmed.2014.946
Clinical practice has shifted from vitamin K antagonists to direct oral anticoagulants (DOACs) for atrial fibrillation treatment due to their more favorable risk-benefit profile and less lifestyle modification required.1,2 However, the advantage of a lower bleeding risk with DOACs could be compromised by potentially problematic pharmacokinetic interactions like those conferred by antiplatelets or nonsteroidal anti-inflammatory drugs (NSAIDs).3,4 Treating a patient needing anticoagulation with a DOAC who has comorbidities may introduce unavoidable drug-drug interactions. This particularly happens with over-the-counter and prescription NSAIDs used for the management of pain and inflammatory conditions.5
NSAIDs primarily affect 2 cyclooxygenase (COX) enzyme isomers, COX-1 and COX-2.6 COX-1 helps maintain gastrointestinal (GI) mucosa integrity and platelet aggregation processes, whereas COX-2 is engaged in pain signaling and inflammation mediation. COX-1 inhibition is associated with more bleeding-related adverse events (AEs), especially in the GI tract. COX-2 inhibition is thought to provide analgesia and anti-inflammatory properties without elevating bleeding risk. This premise is responsible for the preferential use of celecoxib, a COX-2 selective NSAID, which should confer a lower bleeding risk compared to nonselective NSAIDs such as ibuprofen and naproxen.7 NSAIDs have been documented as independent risk factors for bleeding. NSAID users are about 3 times as likely to develop GI AEs compared to nonNSAID users.8
Many clinicians aim to further mitigate NSAID-associated bleeding risk by coprescribing a proton pump inhibitor (PPI). PPIs provide gastroprotection against NSAID-induced mucosal injury and sequential complication of GI bleeding. In a multicenter randomized control trial, patients who received concomitant PPI therapy while undergoing chronic NSAID therapy—including nonselective and COX-2 selective NSAIDs—had a significantly lower risk of GI ulcer development (placebo, 17.0%; 20 mg esomeprazole, 5.2%; 40 mg esomeprazole, 4.6%).9 Current clinical guidelines for preventing NSAIDassociated bleeding complications recommend using a COX-2 selective NSAID in combination with PPI therapy for patients at high risk for GI-related bleeding, including the concomitant use of anticoagulants.10
There is evidence suggesting an increased bleeding risk with NSAIDs when used in combination with vitamin K antagonists such as warfarin.11,12 A systematic review of warfarin and concomitant NSAID use found an increased risk of overall bleeding with NSAID use in combination with warfarin (odds ratio 1.58; 95% CI, 1.18-2.12), compared to warfarin alone.12
Posthoc analyses of randomized clinical trials have also demonstrated an increased bleeding risk with oral anticoagulation and concomitant NSAID use.13,14 In the RE-LY trial, NSAID users on warfarin or dabigatran had a statistically significant increased risk of major bleeding compared to non-NSAID users (hazard ratio [HR] 1.68; 95% CI, 1.40- 2.02; P < .001).13 In the ARISTOTLE trial, patients on warfarin or apixaban who were incident NSAID users were found to have an increased risk of major bleeding (HR 1.61; 95% CI, 1.11-2.33) and clinically relevant nonmajor bleeding (HR 1.70; 95% CI, 1.16- 2.48).14 These trials found a statistically significant increased bleeding risk associated with NSAID use, though the populations evaluated included patients taking warfarin and patients taking DOACs. These trials did not evaluate the bleeding risk of concomitant NSAID use among DOACs alone.
Evidence on NSAID-associated bleeding risk with DOACs is lacking in settings where the patient population, prescribing practices, and monitoring levels are variable. Within the Veterans Health Administration, clinical pharmacist practitioners (CPPs) in anticoagulation clinics oversee DOAC therapy management. CPPs monitor safety and efficacy of DOAC therapies through a population health management tool, the DOAC Dashboard.15 The DOAC Dashboard creates alerts for patients who may require an intervention based on certain clinical parameters, such as drug-drug interactions.16 Whenever a patient on a DOAC is prescribed an NSAID, an alert is generated on the DOAC Dashboard to flag the CPPs for the potential need for an intervention. If NSAID therapy remains clinically indicated, CPPs may recommend risk reduction strategies such as a COX-2 selective NSAID or coprescribing a PPI.10
The DOAC Dashboard provides an ideal setting for investigating the effects of NSAID use, NSAID selectivity, and PPI coprescribing on DOAC bleeding rates. With an increasing population of patients receiving anticoagulation therapy with a DOAC, more guidance regarding the bleeding risk of concomitant NSAID use with DOACs is needed. Studies evaluating the bleeding risk with concomitant NSAID use in patients on a DOAC alone are limited. This is the first study to date to compare bleeding risk with concomitant NSAID use between DOACs. This study provides information on bleeding risk with NSAID use among commonly prescribed DOACs, rivaroxaban and apixaban, and the potential impacts of current risk reduction strategies.
METHODS
This single-center retrospective cohort review was performed using the electronic health records (EHRs) of patients enrolled in the US Department of Veterans Affairs (VA) Mountain Home Healthcare System who received rivaroxaban or apixaban from December 2020 to December 2022. This study received approval from the East Tennessee State University/VA Institutional Review Board committee.
Patients were included if they were identified through the DOAC Dashboard, were aged 21 to 100 years, and received rivaroxaban or apixaban at a therapeutic dose: rivaroxaban 10 to 20 mg daily or apixaban 2.5 to 5 mg twice daily. Patients were excluded if they were prescribed dual antiplatelet therapy, received rivaroxaban at dosing indicated for peripheral vascular disease, were undergoing dialysis, had evidence of moderate to severe hepatic impairment or any hepatic disease with coagulopathy, were undergoing chemotherapy or radiation, or had hematological conditions with a predisposition to bleeding. These patients were excluded to mitigate the potential confounding impact of nontherapeutic DOAC dosing strategies and conditions associated with an increased bleeding risk.
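The dose-based inclusion criterion above can be expressed as a simple predicate. A minimal Python sketch with hypothetical field names of our own choosing (this is not how the DOAC Dashboard encodes regimens):

```python
def therapeutic_dose(drug, dose_mg, times_daily):
    """True if a regimen matches the study's therapeutic-dose criterion:
    rivaroxaban 10-20 mg once daily, or apixaban 2.5-5 mg twice daily."""
    if drug == "rivaroxaban":
        return times_daily == 1 and 10 <= dose_mg <= 20
    if drug == "apixaban":
        return times_daily == 2 and 2.5 <= dose_mg <= 5
    return False

print(therapeutic_dose("apixaban", 5, 2))       # True
print(therapeutic_dose("rivaroxaban", 2.5, 1))  # False: below the therapeutic range
```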
Eligible patients were stratified based on NSAID use. NSAID users were defined as patients prescribed an oral NSAID, including both acute and chronic courses, at any point during the study time frame while actively on a DOAC. Bleeding events were reviewed to evaluate rates between rivaroxaban and apixaban among NSAID and non-NSAID users. Identified NSAID users were further assessed for NSAID selectivity and PPI coprescribing as a subgroup analysis for the secondary assessment.
Data Collection
Baseline data were collected, including age, body mass index, anticoagulation indication, DOAC agent, DOAC dose, and DOAC total daily dose. Baseline serum creatinine levels, liver function tests, hemoglobin levels, and platelet counts were collected from the most recent data available immediately prior to the bleeding event, if applicable.
The DOAC Dashboard was reviewed for active and dismissed drug interaction alerts to identify patients taking rivaroxaban or apixaban who were prescribed an NSAID. Patients were categorized in the NSAID group if an interacting drug alert with an NSAID was reported during the study time frame. Data available through the interacting drug alerts on NSAID use were limited to the interacting drug name and date of the reported flag. Manual EHR review was required to confirm dates of NSAID therapy initiation and NSAID discontinuation, if applicable.
Data regarding concomitant antiplatelet use were obtained through review of the active and dismissed drug interaction alerts on the DOAC Dashboard. Concomitant antiplatelet use was defined as the prescribing of a single antiplatelet agent at any point while receiving DOAC therapy. Data on concomitant antiplatelets were collected regardless of NSAID status.
Data on coprescribed PPI therapy were obtained through manual EHR review of identified NSAID users. Coprescribed PPI therapy was defined as the prescribing of a PPI at any point during NSAID therapy. Data regarding PPI use among non-NSAID users were not collected because the secondary endpoint was designed to assess PPI use only among patients coprescribed a DOAC and NSAID.
Outcomes
Bleeding events were identified through an outcomes report generated by the DOAC Dashboard based on International Classification of Diseases, Tenth Revision diagnosis codes associated with a bleeding event. The outcomes report captures diagnoses from the outpatient and inpatient care settings. Reported bleeding events were limited to patients who received a DOAC at any point in the 6 months prior to the event and excluded patients with recent DOAC initiation within 7 days of the event, as these patients are not captured on the DOAC Dashboard.
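The two time windows described above (a DOAC received within the 6 months before the event, but not initiated within 7 days of it) amount to a date filter. A hedged Python sketch of that logic, with illustrative field names of our own choosing (the actual DOAC Dashboard report layout is not described in the text):

```python
from datetime import date, timedelta

def event_is_reportable(event_date, doac_fill_dates, doac_start_date):
    """Mirror the outcomes report's inclusion window for a candidate bleeding event.

    Include the event if any DOAC fill falls in the ~6 months (183 days)
    before it, but exclude it if the DOAC was first started within 7 days
    of the event (such patients are not captured on the dashboard).
    """
    recent_fill = any(
        timedelta(0) <= event_date - d <= timedelta(days=183)
        for d in doac_fill_dates
    )
    recent_start = timedelta(0) <= event_date - doac_start_date < timedelta(days=7)
    return recent_fill and not recent_start

# Illustrative use: fill 2 months before the event, DOAC started long before
print(event_is_reportable(date(2022, 6, 15), [date(2022, 4, 20)], date(2021, 1, 5)))
```

The same pattern, with the window anchored to NSAID prescription dates, covers the 6-month overlapping-NSAID screen described below.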
All reported bleeding events were manually reviewed in the EHR and categorized as a major or clinically relevant nonmajor bleed according to International Society on Thrombosis and Haemostasis criteria. Validated bleeding events were then cross-referenced with the interacting drug alerts report to identify events with potentially overlapping NSAID therapy at the time of the event. Overlapping NSAID therapy was defined as the prescribing of an NSAID at any point in the 6 months prior to the event. All events with potential overlapping NSAID therapy were manually reviewed to confirm NSAID status at the time of the event.
The primary endpoint was a composite of any bleeding event per International Society on Thrombosis and Haemostasis criteria. The secondary endpoint evaluated the potential impact of NSAID selectivity or PPI coprescribing on the bleeding rate among the NSAID user groups.
Statistical Analysis
Analyses were performed consistent with the methods used in the ARISTOTLE and RE-LY trials. It was determined that a sample size of 504 patients, with ≥ 168 patients in each group, would provide 80% power using a 2-sided α of 0.05. HRs with 95% CIs and respective P values were calculated using an SPSS-adapted online calculator.
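The stated target (504 patients, ≥ 168 per group, 80% power, two-sided α of 0.05) follows the usual normal-approximation formula for comparing two proportions. A sketch of that calculation in Python, using event rates we chose purely for illustration (the authors' assumed effect size is not reported in the text):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Per-group sample size for comparing two proportions
    (normal approximation, two-sided test)."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_b = z.inv_cdf(power)           # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_a + z_b) ** 2 * variance / (p1 - p2) ** 2)

# Illustrative rates only: detecting 10% vs 20% bleeding needs ~197 per group
print(n_per_group(0.10, 0.20))
```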
RESULTS
The DOAC Dashboard identified 681 patients on rivaroxaban and 3225 patients on apixaban; 72 patients on rivaroxaban (10.6%) and 300 patients on apixaban (9.3%) were NSAID users. The mean age of NSAID users was 66.9 years in the rivaroxaban group and 72.4 years in the apixaban group. The mean age of non-NSAID users was 71.5 years in the rivaroxaban group and 75.6 years in the apixaban group. No appreciable differences were observed among subgroups in body mass index, renal function, hepatic function, hemoglobin, or platelet counts, and no statistically significant differences were identified (Table 1). Antiplatelet agents identified included aspirin, clopidogrel, prasugrel, and ticagrelor. Fifteen patients (20.3%) in the rivaroxaban group and 87 patients (28.7%) in the apixaban group had concomitant antiplatelet and NSAID use. Forty-five patients on rivaroxaban (60.8%) and 170 (55.9%) on apixaban were prescribed concomitant PPI and NSAID at baseline. Among non-NSAID users, there was concomitant antiplatelet use for 265 patients (43.6%) in the rivaroxaban group and 1401 patients (47.9%) in the apixaban group. Concomitant PPI use was identified among 63 patients (60.0%) taking selective NSAIDs and 182 (57.2%) taking nonselective NSAIDs.

A total of 423 NSAID courses were identified: 85 in the rivaroxaban group and 338 in the apixaban group. Most courses involved a nonselective NSAID: 75.2% (n = 318) overall, with 71.8% (n = 61) in the rivaroxaban group and 76.0% (n = 257) in the apixaban group. The most frequent NSAIDs were meloxicam (26.7%; n = 113), celecoxib (24.8%; n = 105), ibuprofen (19.1%; n = 81), and naproxen (13.5%; n = 57). Data regarding NSAID therapy initiation and discontinuation dates were not readily available; as a result, the duration of NSAID courses was not captured.
There was no statistically significant difference in bleeding rates between rivaroxaban and apixaban among NSAID users (HR 1.04; 95% CI, 0.98-1.12) or non-NSAID users (HR 1.15; 95% CI, 0.80-1.66) (Table 2). Apixaban non-NSAID users had a higher rate of major bleeds (HR 0.32; 95% CI, 0.17-0.61) while rivaroxaban non-NSAID users had a higher rate of clinically relevant nonmajor bleeds (HR 1.63; 95% CI, 1.10-2.54).

The sample size for the secondary endpoint consisted of bleeding events that were confirmed to have had an overlapping NSAID prescribed at the time of the event. For this secondary assessment, there was 1 rivaroxaban NSAID user bleeding event and 4 apixaban NSAID user bleeding events. For the rivaroxaban NSAID user bleeding event, the NSAID was nonselective and a PPI was not coprescribed. For the apixaban NSAID user bleeding events, 2 NSAIDs were nonselective and 2 were selective. All patients with apixaban and NSAID bleeding events had a coprescribed PPI. There was no clinically significant difference in the bleeding rates observed for NSAID selectivity or PPI coprescribing among the NSAID user subgroups.
DISCUSSION
This study found no statistically significant difference in the rates of major and nonmajor bleeding events between rivaroxaban and apixaban among either NSAID users or non-NSAID users. This study also did not identify a clinically significant impact on bleeding rates from NSAID selectivity or PPI coprescribing among NSAID users.
There were notable but not statistically significant differences in baseline characteristics between the NSAID and non-NSAID user groups. On average, the rivaroxaban and apixaban NSAID users were younger than those not taking NSAIDs. NSAIDs, specifically nonselective NSAIDs, are recognized as potentially inappropriate medications for older adults given that this population is at increased risk for GI ulcer development and/or GI bleeding.17 The non-NSAID user group likely consisted of older patients than the NSAID user group, as clinicians may avoid prescribing NSAIDs to older adults regardless of concomitant DOAC therapy.
In addition to having an older patient population, non-NSAID users were more frequently prescribed a concomitant antiplatelet when compared with NSAID users. This prescribing pattern may be due to clinicians avoiding the use of NSAIDs in patients receiving DOAC therapy in combination with antiplatelet therapy, as these patients have been found to have an increased bleeding rate compared to DOAC therapy alone.18
Non-NSAID users had an overall higher bleeding rate for both major and nonmajor bleeding events. Based on this observation, it could be hypothesized that antiplatelet agents have a higher risk of bleeding in comparison to NSAIDs. In a subanalysis of the EXPAND study evaluating risk factors of major bleeding in patients receiving rivaroxaban, concomitant use of antiplatelet agents demonstrated a statistically significant increased risk of bleeding (HR 1.6; 95% CI, 1.2-2.3; P = .003) while concomitant use of NSAIDs did not (HR 0.8; 95% CI, 0.3-2.2; P = .67).19
In assessing PPI status at baseline, a majority of both rivaroxaban and apixaban NSAID users were coprescribed a PPI. This trend aligns with current clinical guideline recommendations for the prescribing of PPI therapy for GI protection in high-risk patients, such as those on DOAC therapy and concomitant NSAID therapy.10 Given the high proportion of NSAID users coprescribed a PPI at baseline, it may be possible that the true incidence of NSAID-associated bleeding events was higher than what this study found. This observation may reflect the impact from timely implementation of risk mitigation strategies by CPPs in the anticoagulation clinic. However, this study was not constructed to assess the efficacy of PPI use in this manner.
It is important to note the patients included in this study were followed by a pharmacist in an anticoagulation clinic using the DOAC Dashboard.15 This population management tool allows CPPs to make proactive interventions when a patient taking a DOAC receives an NSAID prescription, such as recommending the coprescribing of a PPI or use of a selective NSAID.10,16 These standards of care may have contributed to an overall reduced bleeding rate among the NSAID user group and may not be reflective of private practice.
The planned analysis of this study was modeled after the post hoc analyses of the RE-LY and ARISTOTLE trials. Both trials demonstrated an increased risk of bleeding with oral anticoagulation, including DOACs and warfarin, in combination with NSAID use. However, both trials found that NSAID use in patients treated with a DOAC was not independently associated with increased bleeding events compared with warfarin.13,14 The results of this study are comparable to the RE-LY and ARISTOTLE findings in that NSAID use among patients treated with rivaroxaban or apixaban did not demonstrate a statistically significant increased bleeding risk.
Studies of NSAID use in combination with DOAC therapy have been limited to patient populations consisting of both DOAC and warfarin users. Evidence from these trials outlines the increased bleeding risk associated with NSAID use in combination with oral anticoagulation; however, these patient populations include those on a DOAC and those on warfarin.13,14,19,20 Given the limited evidence on NSAID use among DOACs alone, it is often assumed that NSAID use in combination with a DOAC carries a bleeding risk similar to that with warfarin. This may cause clinicians to automatically exclude NSAID therapy as a treatment option for patients on a DOAC who are otherwise clinically appropriate candidates, such as those with underlying inflammatory conditions. Avoiding NSAID therapy in this patient population may lead to suboptimal pain management and increase the risk of patient harm from alternatives such as inappropriate opioid prescribing.
DOAC therapy should not be a universal limitation to the use of NSAIDs. Although the risk of bleeding with NSAID therapy is always present, deliberate NSAID prescribing in addition to the timely implementation of risk mitigation strategies may provide an avenue for safe NSAID prescribing in patients receiving a DOAC. A population health-based approach to DOAC management, such as the DOAC Dashboard, appears to be effective at preventing patient harm when NSAIDs are prescribed in conjunction with DOACs.
Limitations
The DOAC Dashboard has been shown to be effective and efficient at monitoring DOAC therapy from a population-based approach.16 Reports generated through the DOAC Dashboard provide convenient access to patient data, which allows for timely interventions; however, there are limits to its use for data collection. Not all of the data elements necessary to properly assess bleeding risk with validated tools, such as HAS-BLED (hypertension, abnormal renal/liver function, stroke, bleeding history or predisposition, labile international normalized ratio, elderly, drugs/alcohol concomitantly), are available on DOAC Dashboard reports. Due to this constraint, bleeding risk assessments were not conducted at baseline and this study was unable to include risk modeling. Additionally, data elements such as initiation and discontinuation dates and duration of therapies were not readily available. As a result, this study was unable to incorporate time as a data point.
This was a retrospective study that relied on manual review of chart documentation to verify bleeding events, although data obtained through the DOAC Dashboard were transferred directly from the EHR.15 Bleeding events available for evaluation were restricted to those that occurred at a VA facility. Additionally, the rivaroxaban NSAID user group did not reach the predefined sample size required for power and may have been too small to detect a difference if one existed. The secondary assessment had a small sample of NSAID user bleeding events, making it difficult to fully assess the impact of NSAID selectivity and PPI coprescribing on bleeding rates. All NSAID courses were weighted equally regardless of dose or therapy duration; however, this is consistent with how NSAID use was defined in the RE-LY and ARISTOTLE trials.
CONCLUSIONS
This retrospective cohort review found no statistically significant difference in the composite bleeding rates between rivaroxaban and apixaban among NSAID users and non-NSAID users. Moreover, there was no clinically significant impact observed for bleeding rates in regard to NSAID selectivity and PPI coprescribing among NSAID users. However, coprescribing of PPI therapy to patients on a DOAC who are clinically indicated for an NSAID may reduce the risk of bleeding. Population health management tools, such as the DOAC Dashboard, may also allow clinicians to safely prescribe NSAIDs to patients on a DOAC. Further large-scale observational studies are needed to quantify the real-world risk of bleeding with concomitant NSAID use among DOACs alone and to evaluate the impact from NSAID selectivity or PPI coprescribing.
Clinical practice has shifted from vitamin K antagonists to direct oral anticoagulants (DOACs) for atrial fibrillation treatment due to their more favorable risk-benefit profile and the fewer lifestyle modifications required.1,2 However, the advantage of a lower bleeding risk with DOACs can be compromised by potentially problematic drug-drug interactions such as those conferred by antiplatelets or nonsteroidal anti-inflammatory drugs (NSAIDs).3,4 Treating a patient who needs anticoagulation with a DOAC and who has comorbidities may introduce unavoidable drug-drug interactions. This occurs particularly with over-the-counter and prescription NSAIDs used for the management of pain and inflammatory conditions.5
NSAIDs primarily affect 2 cyclooxygenase (COX) enzyme isoforms, COX-1 and COX-2.6 COX-1 helps maintain gastrointestinal (GI) mucosal integrity and platelet aggregation, whereas COX-2 is involved in pain signaling and the mediation of inflammation. COX-1 inhibition is associated with more bleeding-related adverse events (AEs), especially in the GI tract. COX-2 inhibition is thought to provide analgesia and anti-inflammatory effects without elevating bleeding risk. This premise underlies the preferential use of celecoxib, a COX-2 selective NSAID, which should confer a lower bleeding risk compared to nonselective NSAIDs such as ibuprofen and naproxen.7 NSAIDs have been documented as independent risk factors for bleeding: NSAID users are about 3 times as likely to develop GI AEs as non-NSAID users.8
Many clinicians aim to further mitigate NSAID-associated bleeding risk by coprescribing a proton pump inhibitor (PPI). PPIs provide gastroprotection against NSAID-induced mucosal injury and the subsequent complication of GI bleeding. In a multicenter randomized controlled trial, patients who received concomitant PPI therapy while undergoing chronic NSAID therapy—including nonselective and COX-2 selective NSAIDs—had a significantly lower risk of GI ulcer development (placebo, 17.0%; 20 mg esomeprazole, 5.2%; 40 mg esomeprazole, 4.6%).9 Current clinical guidelines for preventing NSAID-associated bleeding complications recommend using a COX-2 selective NSAID in combination with PPI therapy for patients at high risk for GI-related bleeding, including those with concomitant use of anticoagulants.10
There is evidence suggesting an increased bleeding risk with NSAIDs when used in combination with vitamin K antagonists such as warfarin.11,12 A systematic review of warfarin and concomitant NSAID use found an increased risk of overall bleeding with NSAID use in combination with warfarin (odds ratio 1.58; 95% CI, 1.18-2.12), compared to warfarin alone.12
Posthoc analyses of randomized clinical trials have also demonstrated an increased bleeding risk with oral anticoagulation and concomitant NSAID use.13,14 In the RE-LY trial, NSAID users on warfarin or dabigatran had a statistically significant increased risk of major bleeding compared to non-NSAID users (hazard ratio [HR] 1.68; 95% CI, 1.40- 2.02; P < .001).13 In the ARISTOTLE trial, patients on warfarin or apixaban who were incident NSAID users were found to have an increased risk of major bleeding (HR 1.61; 95% CI, 1.11-2.33) and clinically relevant nonmajor bleeding (HR 1.70; 95% CI, 1.16- 2.48).14 These trials found a statistically significant increased bleeding risk associated with NSAID use, though the populations evaluated included patients taking warfarin and patients taking DOACs. These trials did not evaluate the bleeding risk of concomitant NSAID use among DOACs alone.
Evidence on NSAID-associated bleeding risk with DOACs is lacking in settings where the patient population, prescribing practices, and monitoring levels are variable. Within the Veterans Health Administration, clinical pharmacist practitioners (CPPs) in anticoagulation clinics oversee DOAC therapy management. CPPs monitor safety and efficacy of DOAC therapies through a population health management tool, the DOAC Dashboard.15 The DOAC Dashboard creates alerts for patients who may require an intervention based on certain clinical parameters, such as drug-drug interactions.16 Whenever a patient on a DOAC is prescribed an NSAID, an alert is generated on the DOAC Dashboard to flag the CPPs for the potential need for an intervention. If NSAID therapy remains clinically indicated, CPPs may recommend risk reduction strategies such as a COX-2 selective NSAID or coprescribing a PPI.10
The DOAC Dashboard provides an ideal setting for investigating the effects of NSAID use, NSAID selectivity, and PPI coprescribing on DOAC bleeding rates. With an increasing population of patients receiving anticoagulation therapy with a DOAC, more guidance regarding the bleeding risk of concomitant NSAID use with DOACs is needed. Studies evaluating the bleeding risk with concomitant NSAID use in patients on a DOAC alone are limited. This is the first study to date to compare bleeding risk with concomitant NSAID use between DOACs. This study provides information on bleeding risk with NSAID use among commonly prescribed DOACs, rivaroxaban and apixaban, and the potential impacts of current risk reduction strategies.
METHODS
This single-center retrospective cohort review was performed using the electronic health records (EHRs) of patients enrolled in the US Department of Veterans Affairs (VA) Mountain Home Healthcare System who received rivaroxaban or apixaban from December 2020 to December 2022. This study received approval from the East Tennessee State University/VA Institutional Review Board committee.
Patients were identified through the DOAC Dashboard, aged 21 to 100 years, and received rivaroxaban or apixaban at a therapeutic dose: rivaroxaban 10 to 20 mg daily or apixaban 2.5 to 5 mg twice daily. Patients were excluded if they were prescribed dual antiplatelet therapy, received rivaroxaban at dosing indicated for peripheral vascular disease, were undergoing dialysis, had evidence of moderate to severe hepatic impairment or any hepatic disease with coagulopathy, were undergoing chemotherapy or radiation, or had hematological conditions with predisposed bleeding risk. These patients were excluded to mitigate the potential confounding impact from nontherapeutic DOAC dosing strategies and conditions associated with an increased bleeding risk.
Eligible patients were stratified based on NSAID use. NSAID users were defined as patients prescribed an oral NSAID, including both acute and chronic courses, at any point during the study time frame while actively on a DOAC. Bleeding events were reviewed to evaluate rates between rivaroxaban and apixaban among NSAID and nonNSAID users. Identified NSAID users were further assessed for NSAID selectivity and PPI coprescribing as a subgroup analysis for the secondary assessment.
Data Collection
Baseline data were collected, including age, body mass index, anticoagulation indication, DOAC agent, DOAC dose, and DOAC total daily dose. Baseline serum creatinine levels, liver function tests, hemoglobin levels, and platelet counts were collected from the most recent data available immediately prior to the bleeding event, if applicable.
The DOAC Dashboard was reviewed for active and dismissed drug interaction alerts to identify patients taking rivaroxaban or apixaban who were prescribed an NSAID. Patients were categorized in the NSAID group if an interacting drug alert with an NSAID was reported during the study time frame. Data available through the interacting drug alerts on NSAID use were limited to the interacting drug name and date of the reported flag. Manual EHR review was required to confirm dates of NSAID therapy initiation and NSAID discontinuation, if applicable.
Data regarding concomitant antiplatelet use were obtained through review of the active and dismissed drug interaction alerts on the DOAC Dashboard. Concomitant antiplatelet use was defined as the prescribing of a single antiplatelet agent at any point while receiving DOAC therapy. Data on concomitant antiplatelets were collected regardless of NSAID status.
Data on coprescribed PPI therapy were obtained through manual EHR review of identified NSAID users. Coprescribed PPI therapy was defined as the prescribing of a PPI at any point during NSAID therapy. Data regarding PPI use among non-NSAID users were not collected because the secondary endpoint was designed to assess PPI use only among patients coprescribed a DOAC and NSAID.
Outcomes
Bleeding events were identified through an outcomes report generated by the DOAC Dashboard based on International Classification of Diseases, Tenth Revision diagnosis codes associated with a bleeding event. The outcomes report captures diagnoses from the outpatient and inpatient care settings. Reported bleeding events were limited to patients who received a DOAC at any point in the 6 months prior to the event and excluded patients with recent DOAC initiation within 7 days of the event, as these patients are not captured on the DOAC Dashboard.
All reported bleeding events were manually reviewed in the EHR and categorized as a major or clinically relevant nonmajor bleed, according to International Society of Thrombosis and Haemostasis criteria. Validated bleeding events were then crossreferenced with the interacting drug alerts report to identify events with potentially overlapping NSAID therapy at the time of the event. Overlapping NSAID therapy was defined as the prescribing of an NSAID at any point in the 6 months prior to the event. All events with potential overlapping NSAID therapies were manually reviewed for confirmation of NSAID status at the time of the event.
The primary endpoint was a composite of any bleeding event per International Society of Thrombosis and Haemostasis criteria. The secondary endpoint evaluated the potential impact of NSAID selectivity or PPI coprescribing on the bleeding rate among the NSAID user groups.
Statistical Analysis
Analyses were performed consistent with the methods used in the ARISTOTLE and RE-LY trials. It was determined that a sample size of 504 patients, with ≥ 168 patients in each group, would provide 80% power using a 2-sided a of 0.05. HRs with 95% CIs and respective P values were calculated using a SPSS-adapted online calculator.
RESULTS
The DOAC Dashboard identified 681 patients on rivaroxaban and 3225 patients on apixaban; 72 patients on rivaroxaban (10.6%) and 300 patients on apixaban (9.3%) were NSAID users. The mean age of NSAID users was 66.9 years in the rivaroxaban group and 72.4 years in the apixaban group. The mean age of non-NSAID users was 71.5 years in the rivaroxaban group and 75.6 years in the apixaban group. No appreciable differences were observed among subgroups in body mass index, renal function, hepatic function, hemoglobin, or platelet counts, and no statistically significant differences were identified (Table 1). Antiplatelet agents identified included aspirin, clopidogrel, prasugrel, and ticagrelor. Fifteen patients (20.3%) in the rivaroxaban group and 87 patients (28.7%) in the apixaban group had concomitant antiplatelet and NSAID use. Forty-five patients on rivaroxaban (60.8%) and 170 (55.9%) on apixaban were prescribed concomitant PPI and NSAID at baseline. Among non-NSAID users, there was concomitant antiplatelet use for 265 patients (43.6%) in the rivaroxaban group and 1401 patients (47.9%) in the apixaban group. Concomitant PPI use was identified among 63 patients (60.0%) taking selective NSAIDs and 182 (57.2%) taking nonselective NSAIDs.

A total of 423 courses of NSAIDs were identified: 85 NSAID courses in the rivaroxaban group and 338 NSAID courses in the apixaban group. Most NSAID courses involved a nonselective NSAID in the rivaroxaban and apixaban NSAID user groups: 75.2% (n = 318) aggregately compared to 71.8% (n = 61) and 76.0% (n = 257) in the rivaroxaban and apixaban groups, respectively. The most frequent NSAID courses identified were meloxicam (26.7%; n = 113), celecoxib (24.8%; n = 105), ibuprofen (19.1%; n = 81), and naproxen (13.5%; n = 57). Data regarding NSAID therapy initiation and discontinuation dates were not readily available. As a result, the duration of NSAID courses was not captured.
There was no statistically significant difference in bleeding rates between rivaroxaban and apixaban among NSAID users (HR 1.04; 95% CI, 0.98-1.12) or non-NSAID users (HR 1.15; 95% CI, 0.80-1.66) (Table 2). Apixaban non-NSAID users had a higher rate of major bleeds (HR 0.32; 95% CI, 0.17-0.61) while rivaroxaban non-NSAID users had a higher rate of clinically relevant nonmajor bleeds (HR 1.63; 95% CI, 1.10-2.54).

The sample size for the secondary endpoint consisted of bleeding events that were confirmed to have had an overlapping NSAID prescribed at the time of the event. For this secondary assessment, there was 1 rivaroxaban NSAID user bleeding event and 4 apixaban NSAID user bleeding events. For the rivaroxaban NSAID user bleeding event, the NSAID was nonselective and a PPI was not coprescribed. For the apixaban NSAID user bleeding events, 2 NSAIDs were nonselective and 2 were selective. All patients with apixaban and NSAID bleeding events had a coprescribed PPI. There was no clinically significant difference in the bleeding rates observed for NSAID selectivity or PPI coprescribing among the NSAID user subgroups.
DISCUSSION
This study found that there was no statistically significant difference for bleeding rates of major and nonmajor bleeding events between rivaroxaban and apixaban among NSAID users and non-NSAID users. This study did not identify a clinically significant impact on bleeding rates from NSAID selectivity or PPI coprescribing among the NSAID users.
There were notable but not statistically significant differences in baseline characteristics observed between the NSAID and non-NSAID user groups. On average, the rivaroxaban and apixaban NSAID users were younger compared with those not taking NSAIDs. NSAIDs, specifically nonselective NSAIDs, are recognized as potentially inappropriate medications for older adults given that this population is at an increased risk for GI ulcer development and/or GI bleeding.17 The non-NSAID user group likely consisted of older patients compared to the NSAID user group as clinicians may avoid prescribing NSAIDs to older adults regardless of concomitant DOAC therapy.
In addition to having an older patient population, non-NSAID users were more frequently prescribed a concomitant antiplatelet when compared with NSAID users. This prescribing pattern may be due to clinicians avoiding the use of NSAIDs in patients receiving DOAC therapy in combination with antiplatelet therapy, as these patients have been found to have an increased bleeding rate compared to DOAC therapy alone.18
Non-NSAID users had an overall higher bleeding rate for both major and nonmajor bleeding events. Based on this observation, it could be hypothesized that antiplatelet agents have a higher risk of bleeding in comparison to NSAIDs. In a subanalysis of the EXPAND study evaluating risk factors of major bleeding in patients receiving rivaroxaban, concomitant use of antiplatelet agents demonstrated a statistically significant increased risk of bleeding (HR 1.6; 95% CI, 1.2-2.3; P = .003) while concomitant use of NSAIDs did not (HR 0.8; 95% CI, 0.3-2.2; P = .67).19
In assessing PPI status at baseline, a majority of both rivaroxaban and apixaban NSAID users were coprescribed a PPI. This trend aligns with current clinical guideline recommendations for the prescribing of PPI therapy for GI protection in high-risk patients, such as those on DOAC therapy and concomitant NSAID therapy.10 Given the high proportion of NSAID users coprescribed a PPI at baseline, it may be possible that the true incidence of NSAID-associated bleeding events was higher than what this study found. This observation may reflect the impact from timely implementation of risk mitigation strategies by CPPs in the anticoagulation clinic. However, this study was not constructed to assess the efficacy of PPI use in this manner.
It is important to note the patients included in this study were followed by a pharmacist in an anticoagulation clinic using the DOAC Dashboard.15 This population management tool allows CPPs to make proactive interventions when a patient taking a DOAC receives an NSAID prescription, such as recommending the coprescribing of a PPI or use of a selective NSAID.10,16 These standards of care may have contributed to an overall reduced bleeding rate among the NSAID user group and may not be reflective of private practice.
The planned analysis of this study was modeled after the post hoc analyses of the RE-LY and ARISTOTLE trials. Both trials demonstrated an increased risk of bleeding when oral anticoagulation, whether with a DOAC or warfarin, was combined with NSAID use. However, both trials found that NSAID use in patients treated with a DOAC was not independently associated with increased bleeding events compared with warfarin.13,14 The results of this study are comparable to the RE-LY and ARISTOTLE findings in that NSAID use among patients treated with rivaroxaban or apixaban did not demonstrate a statistically significant increased bleeding risk.
Studies of NSAID use in combination with DOAC therapy have been limited to populations that include both DOAC and warfarin users. Evidence from these trials outlines the increased bleeding risk associated with combining NSAIDs and oral anticoagulation, but it cannot isolate the risk with DOACs alone.13,14,19,20 Given this limited evidence, NSAID use in combination with a DOAC is assumed to carry a bleeding risk similar to that with warfarin. This assumption may cause clinicians to automatically exclude NSAID therapy as a treatment option for patients on a DOAC who are otherwise clinically appropriate candidates, such as those with underlying inflammatory conditions. Avoiding NSAID therapy in this population may lead to suboptimal pain management and increase the risk of patient harm from alternatives such as inappropriate opioid prescribing.
DOAC therapy should not be a universal limitation to the use of NSAIDs. Although the risk of bleeding with NSAID therapy is always present, deliberate NSAID prescribing in addition to the timely implementation of risk mitigation strategies may provide an avenue for safe NSAID prescribing in patients receiving a DOAC. A population health-based approach to DOAC management, such as the DOAC Dashboard, appears to be effective at preventing patient harm when NSAIDs are prescribed in conjunction with DOACs.
Limitations
The DOAC Dashboard has been shown to be effective and efficient at monitoring DOAC therapy from a population-based approach.16 Reports generated through the DOAC Dashboard provide convenient access to patient data, which allows for timely interventions; however, there are limits to its use for data collection. Not all of the data elements necessary to properly assess bleeding risk with validated tools, such as HAS-BLED (hypertension, abnormal renal/liver function, stroke, bleeding history or predisposition, labile international normalized ratio, elderly, drugs/alcohol concomitantly), are available on DOAC Dashboard reports. Due to this constraint, bleeding risk assessments were not conducted at baseline, and this study was unable to include risk modeling. Additionally, data elements such as initiation and discontinuation dates and duration of therapy were not readily available. As a result, this study was unable to incorporate time as a data point.
This was a retrospective study that relied on manual review of chart documentation to verify bleeding events, although data obtained through the DOAC Dashboard were transferred directly from the EHR.15 Bleeding events available for evaluation were restricted to those that occurred at a VA facility. Additionally, the rivaroxaban NSAID user group did not reach the predefined sample size required for statistical power and may have been too small to detect a difference if one existed. The secondary assessment included few NSAID user bleeding events, making it difficult to fully assess the impact of NSAID selectivity and PPI coprescribing on bleeding rates. All courses of NSAIDs were weighted equally regardless of dose or duration of therapy; however, this is consistent with how NSAID use was defined in the RE-LY and ARISTOTLE trials.
CONCLUSIONS
This retrospective cohort review found no statistically significant difference in composite bleeding rates between NSAID users and non-NSAID users treated with rivaroxaban or apixaban. Moreover, no clinically significant impact of NSAID selectivity or PPI coprescribing on bleeding rates was observed among NSAID users. However, coprescribing PPI therapy to patients on a DOAC who are clinically indicated for an NSAID may reduce the risk of bleeding. Population health management tools, such as the DOAC Dashboard, may also allow clinicians to safely prescribe NSAIDs to patients on a DOAC. Further large-scale observational studies are needed to quantify the real-world risk of bleeding with concomitant NSAID use with DOACs alone and to evaluate the impact of NSAID selectivity and PPI coprescribing.
- Ruff CT, Giugliano RP, Braunwald E, et al. Comparison of the efficacy and safety of new oral anticoagulants with warfarin in patients with atrial fibrillation: a meta-analysis of randomised trials. Lancet. 2014;383(9921):955-962. doi:10.1016/S0140-6736(13)62343-0
- Ageno W, Gallus AS, Wittkowsky A, Crowther M, Hylek EM, Palareti G. Oral anticoagulant therapy: antithrombotic therapy and prevention of thrombosis, 9th ed: American College of Chest Physicians Evidence-Based Clinical Practice Guidelines. Chest. 2012;141(2 Suppl):e44S-e88S. doi:10.1378/chest.11-2292
- Eikelboom J, Merli G. Bleeding with direct oral anticoagulants vs warfarin: clinical experience. Am J Med. 2016;129(11S):S33-S40. doi:10.1016/j.amjmed.2016.06.003
- Vranckx P, Valgimigli M, Heidbuchel H. The significance of drug-drug and drug-food interactions of oral anticoagulation. Arrhythm Electrophysiol Rev. 2018;7(1):55-61. doi:10.15420/aer.2017.50.1
- Davis JS, Lee HY, Kim J, et al. Use of non-steroidal antiinflammatory drugs in US adults: changes over time and by demographic. Open Heart. 2017;4(1):e000550. doi:10.1136/openhrt-2016-000550
- Schafer AI. Effects of nonsteroidal antiinflammatory drugs on platelet function and systemic hemostasis. J Clin Pharmacol. 1995;35(3):209-219. doi:10.1002/j.1552-4604.1995.tb04050.x
- Al-Saeed A. Gastrointestinal and cardiovascular risk of nonsteroidal anti-inflammatory drugs. Oman Med J. 2011;26(6):385-391. doi:10.5001/omj.2011.101
- Gabriel SE, Jaakkimainen L, Bombardier C. Risk for serious gastrointestinal complications related to use of nonsteroidal anti-inflammatory drugs. Ann Intern Med. 1991;115(10):787-796. doi:10.7326/0003-4819-115-10-787
- Scheiman JM, Yeomans ND, Talley NJ, et al. Prevention of ulcers by esomeprazole in at-risk patients using non-selective NSAIDs and COX-2 inhibitors. Am J Gastroenterol. 2006;101(4):701-710. doi:10.1111/j.1572-0241.2006.00499.x
- Freedberg DE, Kim LS, Yang YX. The risks and benefits of long-term use of proton pump inhibitors: expert review and best practice advice from the American Gastroenterological Association. Gastroenterology. 2017;152(4):706-715. doi:10.1053/j.gastro.2017.01.031
- Lamberts M, Lip GYH, Hansen ML, et al. Relation of nonsteroidal anti-inflammatory drugs to serious bleeding and thromboembolism risk in patients with atrial fibrillation receiving antithrombotic therapy: a nationwide cohort study. Ann Intern Med. 2014;161(10):690-698. doi:10.7326/M13-1581
- Villa Zapata L, Hansten PD, Panic J, et al. Risk of bleeding with exposure to warfarin and nonsteroidal anti-inflammatory drugs: a systematic review and metaanalysis. Thromb Haemost. 2020;120(7):1066-1074. doi:10.1055/s-0040-1710592
- Kent AP, Brueckmann M, Fraessdorf M, et al. Concomitant oral anticoagulant and nonsteroidal anti-inflammatory drug therapy in patients with atrial fibrillation. J Am Coll Cardiol. 2018;72(3):255-267. doi:10.1016/j.jacc.2018.04.063
- Dalgaard F, Mulder H, Wojdyla DM, et al. Patients with atrial fibrillation taking nonsteroidal antiinflammatory drugs and oral anticoagulants in the ARISTOTLE Trial. Circulation. 2020;141(1):10-20. doi:10.1161/CIRCULATIONAHA.119.041296
- Allen AL, Lucas J, Parra D, et al. Shifting the paradigm: a population health approach to the management of direct oral anticoagulants. J Am Heart Assoc. 2021;10(24):e022758. doi:10.1161/JAHA.121.022758
- Valencia D, Spoutz P, Stoppi J, et al. Impact of a direct oral anticoagulant population management tool on anticoagulation therapy monitoring in clinical practice. Ann Pharmacother. 2019;53(8):806-811. doi:10.1177/1060028019835843
- By the 2023 American Geriatrics Society Beers Criteria® Update Expert Panel. American Geriatrics Society 2023 Updated AGS Beers Criteria® for potentially inappropriate medication use in older adults. J Am Geriatr Soc. 2023;71(7):2052-2081. doi:10.1111/jgs.18372
- Kumar S, Danik SB, Altman RK, et al. Non-vitamin K antagonist oral anticoagulants and antiplatelet therapy for stroke prevention in patients with atrial fibrillation. Cardiol Rev. 2016;24(5):218-223. doi:10.1097/CRD.0000000000000088
- Sakuma I, Uchiyama S, Atarashi H, et al. Clinical risk factors of stroke and major bleeding in patients with nonvalvular atrial fibrillation under rivaroxaban: the EXPAND study sub-analysis. Heart Vessels. 2019;34(11):1839-1851. doi:10.1007/s00380-019-01425-x
- Davidson BL, Verheijen S, Lensing AWA, et al. Bleeding risk of patients with acute venous thromboembolism taking nonsteroidal anti-inflammatory drugs or aspirin. JAMA Intern Med. 2014;174(6):947-953. doi:10.1001/jamainternmed.2014.946
Impact of NSAID Use on Bleeding Rates for Patients Taking Rivaroxaban or Apixaban
Skin Cancer Risk Elevated Among Blood, Marrow Transplant Survivors
TOPLINE:
Survivors of blood or marrow transplant (BMT) face an elevated risk for skin cancer, with a cumulative incidence of 27.4% over 30 years, according to the results of a cohort study.
METHODOLOGY:
- The retrospective cohort study included 3880 BMT survivors (median age, 44 years; 55.8% men; 4.9% Black, 12.1% Hispanic, and 74.7% non-Hispanic White individuals) who underwent transplant between 1974 and 2014.
- Participants completed the BMT Survivor Study survey and were followed up for a median of 9.5 years.
- The primary outcomes were the development of subsequent cutaneous malignant neoplasms (BCC, SCC, or melanoma).
TAKEAWAY:
- The 30-year cumulative incidence of any cutaneous malignant neoplasm was 27.4% — 18% for BCC, 9.8% for SCC, and 3.7% for melanoma.
- A higher risk for skin cancer was reported for patients aged 50 years or older (subdistribution hazard ratio [SHR], 2.23; 95% CI, 1.83-2.71) and for men (SHR, 1.40; 95% CI, 1.18-1.65).
- Allogeneic BMT with chronic graft-vs-host disease (cGVHD) increased the risk for skin cancer (SHR, 1.84; 95% CI, 1.37-2.47), compared with autologous BMT, while post-BMT immunosuppression increased risk for all types (overall SHR, 1.53; 95% CI, 1.26-1.86).
- The risk for any skin cancer was significantly lower in Black individuals (SHR, 0.14; 95% CI, 0.05-0.37), Hispanic individuals (SHR, 0.29; 95% CI, 0.20-0.62), and patients of other races or who were multiracial (SHR, 0.22; 95% CI, 0.13-0.37) than in non-Hispanic White patients.
IN PRACTICE:
In the study, “risk factors for post-BMT cutaneous malignant neoplasms included pretransplant treatment with a monoclonal antibody, cGVHD, and posttransplant immunosuppression,” the authors wrote, adding that the findings “could inform targeted surveillance of BMT survivors.” Most BMT survivors “do not undergo routine dermatologic surveillance, highlighting the need to understand risk factors and incorporate risk-informed dermatologic surveillance into survivorship care plans.”
SOURCE:
The study was led by Kristy K. Broman, MD, MPH, University of Alabama at Birmingham, and was published online on December 18 in JAMA Dermatology.
LIMITATIONS:
Limitations included self-reported data and possible underreporting of melanoma cases in the SEER database. Additionally, the study did not capture other risk factors for cutaneous malignant neoplasms such as skin phototype, ultraviolet light exposure, or family history. The duration of posttransplant immunosuppression was not collected, and surveys were administered at variable intervals, though all were completed more than 2 years post BMT.
DISCLOSURES:
The study was supported by the National Cancer Institute (NCI) and the Leukemia and Lymphoma Society. Broman received grants from NCI, the National Center for Advancing Translational Sciences, the American Society of Clinical Oncology, and the American College of Surgeons. Another author reported receiving grants outside this work.
This article was created using several editorial tools, including artificial intelligence, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
How Are Patients Managing Intermediate-Risk Prostate Cancer?
TOPLINE:
The use of active surveillance and watchful waiting among patients with intermediate-risk prostate cancer more than doubled, from 5% to 12.3%, between 2010 and 2020, according to the results of a cohort study.
METHODOLOGY:
- Current guidelines support active surveillance or watchful waiting for select patients with intermediate-risk prostate cancer. These observation strategies may help reduce the adverse effects associated with immediate radical treatment.
- To understand the trends over time in the use of active surveillance and watchful waiting, researchers looked at data of 147,205 individuals with intermediate-risk prostate cancer from the Surveillance, Epidemiology, and End Results prostate cancer database between 2010 and 2020 in the United States.
- Criteria for intermediate-risk included Gleason grade group 2 or 3, prostate-specific antigen (PSA) levels of 10-20 ng/mL, or stage cT2b of the disease. Researchers also included trends for patients with Gleason grade group 1, as a reference group.
- Researchers assessed the temporal trends and factors associated with the selection of active surveillance and watchful waiting in this population.
TAKEAWAY:
- Overall, the rate of active surveillance and watchful waiting more than doubled among intermediate-risk patients from 5% to 12.3% between 2010 and 2020.
- Between 2010 and 2020, the use of active surveillance and watchful waiting increased significantly among patients in Gleason grade group 1 (13.2% to 53.8%) and Gleason grade group 2 (4.0% to 11.6%) but remained stable for those in Gleason grade group 3 (2.5% to 2.8%; P = .85). For those with PSA levels < 10 ng/mL, adoption increased from 3.4% in 2010 to 9.2% in 2020 and more than doubled (9.3% to 20.7%) for those with PSA levels of 10-20 ng/mL.
- Higher Gleason grade groups had a significantly lower likelihood of adopting active surveillance or watchful waiting (Gleason grade group 2 vs 1: odds ratio [OR], 0.83; Gleason grade group 3 vs 1: OR, 0.79).
- Hispanic or Latino individuals (OR, 0.98) and non-Hispanic Black individuals (OR, 0.99) were slightly less likely to adopt these strategies than non-Hispanic White individuals.
IN PRACTICE:
“This study found a significant increase in initial active surveillance and watchful waiting for intermediate-risk prostate cancer between 2010 and 2020,” the authors wrote. “Research priorities should include reducing upfront overdiagnosis and better defining criteria for starting and stopping active surveillance and watchful waiting beyond conventional clinical measures such as GGs [Gleason grade groups] or PSA levels alone.”
SOURCE:
This study, led by Ismail Ajjawi, Yale School of Medicine, New Haven, Connecticut, was published online in JAMA.
LIMITATIONS:
This study relied on observational data and therefore could not capture various factors influencing clinical decision-making processes. Additionally, the absence of information on patient outcomes restricted the ability to assess the long-term implications of different management strategies.
DISCLOSURES:
This study received financial support from the Urological Research Foundation. Several authors reported having ties with various sources.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
New Cancer Drugs: Do Patients Prefer Faster Access or Clinical Benefit?
When the Food and Drug Administration (FDA) grants cancer drugs accelerated approval, a key aim is to provide patients faster access to therapies that can benefit them.
The downside of a speedier approval timeline, however, is that it’s often not yet clear whether the new drugs will actually allow a patient to live longer or better. Information on overall survival and quality of life typically comes years later, after drugs undergo confirmatory trials, or sometimes not at all, if companies fail to conduct these trials.
During this waiting period, patients may be receiving a cancer drug that provides no real clinical benefit but comes with a host of toxicities.
In fact, the odds are about as good as a coin flip. For cancer drugs that have confirmatory trial data, more than half don’t ultimately provide an overall survival or quality of life benefit.
Inherent to the accelerated approval process is the assumption that patients are willing to accept this uncertainty in exchange for faster access.
But is that really the case?
The researchers asked about 870 adults with personal experience of cancer — either their own diagnosis or that of a family member or close friend — whether they valued faster access or certainty that a drug really works.
In the study, participants imagined they had been diagnosed with cancer and could choose between two cancer drugs under investigation in clinical trials but with uncertain effectiveness, and a current standard treatment. Participants had to make a series of choices based on five scenarios.
The first two scenarios were based on the impact of the current standard treatment: A patient’s life expectancy on the standard treatment (6 months up to 3 years), and a patient’s physical health on the standard treatment (functional status restricted only during strenuous activities up to completely disabled).
The remaining three scenarios dealt with the two new drugs: The effect of the new drugs on a surrogate endpoint, progression-free survival (whether the drugs slowed tumor growth for an extra month or 5 additional months compared with the standard treatment), certainty that slowing tumor growth will improve survival (very low to high), and the wait time to access the drugs (immediately to as long as 2 years).
The researchers assessed the relative importance of survival benefit certainty vs wait time and how that balance shifted depending on the different scenarios.
Overall, the researchers found that, if there was no evidence linking the surrogate endpoint (progression-free survival) to overall survival, patients were willing to wait about 8 months for weak evidence of an overall survival benefit (ie, low certainty the drug will extend survival by 1-5 months), about 16 months for moderate certainty, and almost 22 months for high certainty.
Despite a willingness to wait for greater certainty, participants did value speed as well. Overall, respondents showed a strong preference against a 1-year delay in FDA approval time. People aged 55 years or older, non-White individuals, those making less than $40,000 a year, and those with the lowest life expectancy on a current standard treatment were most sensitive to wait times, while those with better functional status and longer life expectancies on a current treatment were less sensitive to longer wait times.
“Our results indicate that some patients (except those with the poorest prognoses) would find the additional time required to generate evidence on the survival benefit of new cancer drugs an acceptable tradeoff,” the study authors concluded.
Although people do place high value on timely access to new cancer drugs, especially if there are limited treatment options, many are willing to wait for greater certainty that a new drug provides an overall survival benefit, lead author Robin Forrest, MSc, with the Department of Health Policy, London School of Economics in England, said in an interview.
In the study, respondents also did not place significant value on whether the drug substantially slowed cancer growth. “In other words, substantial progression-free survival benefit of a drug did not compensate for lack of certainty about a drug’s benefit on survival in respondents’ drug choices,” the authors explained.
“In an effort to move quickly, we have accepted progression-free survival [as a surrogate endpoint],” said Jyoti D. Patel, MD, an oncologist with Northwestern Memorial Hospital, Chicago, Illinois, who wasn’t involved in the study. But a growing body of evidence indicates that progression-free survival is often a poor surrogate for overall survival. And what this study suggests is that “patients uniformly care about improvements in overall survival and the quality of that survival,” Patel said.
Bishal Gyawali, MD, PhD, was not surprised by the findings.
“I always thought this was the real-world scenario, but the problem is the voices of ordinary patients are not heard,” Gyawali, with Queen’s University, Kingston, Ontario, Canada, who also wasn’t involved in the study, said in an interview.
“What is heard is the loud noise of ‘we need access now, today, yesterday’ — ‘we don’t care if the drug doesn’t improve overall survival, we just need a drug, any drug’ — ‘we don’t care how much it costs, we need access today,’ ” Gyawali said. “Not saying this is wrong, but this is not the representation of all patients.”
However, the voices of patients who are more cautious and want evidence of benefit before accepting toxicities don’t make headlines, he added.
What this survey means from a policy perspective, said Gyawali, is that accelerated approvals that do not mandate a survival endpoint in confirmatory trials are ignoring the needs of the many patients who prioritize certainty of benefit over speed of access.
The study was funded by the London School of Economics and Political Science Phelan United States Centre. Forrest had no relevant disclosures. Gyawali has received consulting fees from Vivio Health. Patel has various relationships with AbbVie, Anheart, AstraZeneca, Bristol-Myers Squibb, Guardant, Tempus, Sanofi, BluePrint, Takeda, and Gilead.
A version of this article first appeared on Medscape.com.
FROM THE LANCET ONCOLOGY
High-Fiber Diet Linked to Improved Stem Cell Transplant, GvHD Outcomes
Importantly, the findings suggest that the standard recommendation of a low-fiber diet for patients following allo-HCT may work against the potential benefits of dietary fiber.
“Significant decrease of fiber intake during transplantation is detrimental. It’s a lost opportunity to promote a healthy gut microbiome, recover from treatment-related microbiota injury, and protect against GVHD,” first author Jenny Paredes, PhD, a staff scientist at City of Hope National Medical Center in Duarte, California, said in a press statement for the study presented at the American Society of Hematology (ASH) 2024 Annual Meeting.
Although the health benefits of dietary fiber on the gut microbiome are well-documented, the effects have recently been shown to extend to outcomes after allo-HCT in general, with researchers finding increased overall survival when there is higher diversity in the gut microbiome, including a higher abundance of butyrate producers and lower abundance of enterococcus, explained Paredes when presenting the findings.
Acute GvHD, a common and potentially life-threatening complication of allo-HCT, can have symptoms that mimic inflammatory bowel disease (IBD), including abdominal pain or cramps, nausea, vomiting, and diarrhea. The low-fiber diet recommendations, including avoidance of raw vegetables and fruits before and after the allo-HCT procedure, are designed to counter those effects, as well as reduce exposure to bacteria.
However, with data suggesting the potential benefits of dietary fiber could extend to the prevention of GvHD, Paredes and colleagues further investigated.
For the observational study, they evaluated all dietary data on 173 allo-HCT recipients at Memorial Sloan Kettering Cancer Center (MSKCC) from 10 days prior to transplantation to 30 days post-transplantation, representing 3837 patient-days in total.
Data collected from the patients also included rRNA sequencing of fecal samples and fecal short-chain fatty acid concentration.
Participants had a median age of 60, and 45% were female. The most common diseases being treated were leukemia (50%), myelodysplastic syndrome (25%), and non-Hodgkin’s lymphoma (8.7%).
After stratifying patients based on high- or low-fiber intake, those with high-fiber intake were found to have significantly higher rates of microbial α-diversity (P = .009), a higher abundance of butyrate producers (P = .03), and a higher concentration of butyrate (P = .02), a short-chain fatty acid that plays a key role in gut health.
Furthermore, the high-fiber group had significantly higher overall survival in an analysis extending to 24 months relative to day 12 of the study (P = .04).
Focusing on GvHD outcomes, the authors further evaluated data on 101 non-T-cell–depleted patients, and identified 29 patients without GvHD and 24 who developed lower gastrointestinal (GI) GvHD.
Patients with lower GI GvHD had significantly lower fecal concentrations of butyrate (P = .03) and acetate (P = .02).
However, patients in the high-fiber intake group had a significantly lower cumulative incidence of developing GvHD at day 100 (P = .034) and a lower incidence of lower GI GvHD (P = .04).
A separate preclinical analysis of a mouse model with GvHD further showed that a fiber-rich diet (12% cellulose) significantly increased the expression of genes associated with reduced GvHD, including IDO1 and CEACAM1, and those associated with enrichment of the bile acid pathway.
The findings suggest an opportunity to improve outcomes with relatively small dietary changes, Paredes said.
“Strategies to increase the fiber concentration in these diets paired with the safety that these patients need is what makes this study exciting,” she said in an interview.
“Increasing the fiber intake by 10 to 20 grams/day could potentially increase the microbiome diversity and abundance of butyrate producers, which have been correlated with higher overall survival rates post allo-HCT,” she continued.
“[For instance], that could be an avocado per day, or it could be a small salad per day, or a small vegetable soup per day,” she added. “I would encourage institutions to re-evaluate their menu planning and see how to include more fiber into the meals in a safe way.”
Ultimately, “I think that a dietary intervention outweighs the risks of a pharmacological intervention,” Paredes added.
The necessary duration of a high-fiber diet to produce the beneficial effects on allo-HCT outcomes would likely be over the course of the pre- and post-transplant periods, Paredes added.
“With the survival analysis extending from 5 days before transplantation to 12 days post, we are looking at an intervention that potentially could be around 20 days,” she said.
“We would love to take advantage of the pretransplantation window, in particular, and we can see that just increasing the fiber intake by about 20 grams during this window was shown to improve overall survival after 24 months,” Paredes added.
Importantly, however, some patients may not be appropriate for high-fiber dietary changes, Paredes cautioned.
“Patients that have developed IBD-like symptoms and severe GvHD patients, for example, or with lower GI-GvHD grades 3 and 4 would be not appropriate candidates for a high-fiber diet,” she said.
High-Fiber Diet Slows MM Disease Progression?
The potentially important benefits of a high-fiber diet in blood diseases were further demonstrated in a separate study, also by MSKCC researchers and presented at the meeting, which showed encouraging signs that a plant-based diet rich in fiber could slow disease progression in multiple myeloma (MM).
NUTRIVENTION included 20 patients with the two precancerous MM conditions, monoclonal gammopathy of undetermined significance (MGUS) and smoldering multiple myeloma (SMM), which can last for years without progressing to MM and which researchers have speculated could be a potential opportunity to intervene to prevent progression to cancer.
Patients were provided with a 12-week controlled diet plus health coaching for another 3 months; no meals or coaching were provided for the rest of the 1-year study period. Participants had a median age of 62 and, because being overweight or obese is a risk factor for MM, had a body mass index (BMI) of 25 kg/m2 or higher.
The trial met its endpoint of feasibility, with 91% adherence in the first 3 months. The rate of consumption of unprocessed plant foods increased from 20% at baseline to 92% on the intervention. Overall adherence was 58%. Insulin and anti-inflammatory markers also improved and, despite no calorie restriction, there was a 7% sustained reduction in BMI.
Notably, two patients in the study had stabilization of disease progression.
“We saw improvements in all spheres, including metabolism, microbiome, and immune system markers, and we also saw that two patients with progressive disease had the progression stabilize and slow down on the intervention,” principal investigator Urvi A. Shah, MD, said in a press statement.
“Even though it’s just two cases, to our knowledge, it has not been shown before in an intervention setting that you can improve diet and lifestyle and actually slow or change the trajectory of the disease,” she noted.
The researchers caution that findings in mice do not necessarily translate to humans but note another experiment in mice with SMM that showed animals fed a normal diet had progression to MM after a median of 12 weeks, compared with a median of 30 weeks among those fed a high-fiber diet.
Notably, all mice in the normal-diet group progressed to MM, whereas 40% of mice in the high-fiber group did not.
“We found that a high-fiber plant-based diet can improve BMI, improve insulin resistance [and] the microbiome through diversity and butyrate producers, and with the production of short-chain fatty acids, can have effects on inflammation, immunity, innate and adaptive antitumor immunity, and tumor cells or plasma cells,” Shah said during her presentation.
The study was supported by funding from the National Cancer Institute and private foundations. Paredes has reported no relevant financial relationships. Shah has reported relationships with Sanofi, Bristol Myers Squibb, and Janssen.
A version of this article first appeared on Medscape.com.
NUTRIVENTION included 20 patients with the two precancerous MM conditions, monoclonal gammopathy of undetermined significance (MGUS) and smoldering multiple myeloma (SMM), which can last for years without progressing to MM and which researchers have speculated could be a potential opportunity to intervene to prevent progression to cancer.
Patients were provided with a 12-week controlled diet plus health coaching for another 3 months; no meals or coaching were provided for the rest of the 1-year study period. Participants had a median age of 62 and, with being overweight/obesity a risk factor for MM, had a body mass index (BMI) of 25 kg/m2 or higher.
The trial met its endpoint of feasibility, with 91% adherence in the first 3 months. The rate of consumption of unprocessed plant foods increased from 20% at baseline to 92% on the intervention. Overall adherence was 58%. Insulin and anti-inflammatory markers also improved and, despite no calorie restriction, there was a 7% sustained reduction in BMI.
Notably, two patients in the study had stabilization of disease progression.
“We saw improvements in all spheres, including metabolism, microbiome, and immune system markers, and we also saw that two patients with progressive disease had the progression stabilize and slow down on the intervention,” principal investigator Urvi A. Shah, MD, said in a press statement.
“Even though it’s just two cases, to our knowledge, it has not been shown before in an intervention setting that you can improve diet and lifestyle and actually slow or change the trajectory of the disease,” she noted.
The researchers caution that findings in mice do not necessarily translate to humans but note another experiment in mice with SMM that showed animals fed a normal diet had progression to MM after a median of 12 weeks, compared with a median of 30 weeks among those fed a high-fiber diet.
Notably, all mice in the normal-diet group progressed to MM, whereas 40% of mice in the high-fiber group did not.
“We found that a high-fiber plant-based diet can improve BMI, improve insulin resistance [and] the microbiome through diversity and butyrate producers, and with the production of short-chain fatty acids, can have effects on inflammation, immunity, innate and adaptive antitumor immunity, and tumor cells or plasma cells,” Shah said during her presentation.
The study was supported by funding from the National Cancer Institute and private foundations. Paredes has reported no relevant financial relationships. Shah has reported relationships with Sanofi, Bristol Myers Squibb, and Janssen.
A version of this article first appeared on Medscape.com.
Importantly, the findings suggest standard recommendations for patients of a low-fiber diet following allo-HCT may run counter to the potential benefits.
“Significant decrease of fiber intake during transplantation is detrimental. It’s a lost opportunity to promote a healthy gut microbiome, recover from treatment-related microbiota injury, and protect against GVHD,” first author Jenny Paredes, PhD, a staff scientist at City of Hope National Medical Center in Duarte, California, said in a press statement for the study presented at the American Society of Hematology (ASH) 2024 Annual Meeting.
Although the health benefits of dietary fiber on the gut microbiome are well-documented, the effects have recently been shown to extend to outcomes after allo-HCT in general, with researchers finding increased overall survival when there is higher diversity in the gut microbiome, including a higher abundance of butyrate producers and lower abundance of enterococcus, explained Paredes when presenting the findings.
Acute GvHD, a common and potentially life-threatening complication of allo-HCT, can have symptoms that mimic inflammatory bowel disease (IBD), including abdominal pain or cramps, nausea, vomiting, and diarrhea. The low-fiber diet recommendations, including avoidance of raw vegetables and fruits before and after the allo-HCT procedure, are designed to counter those effects, as well as to reduce exposure to bacteria.
However, with data suggesting the potential benefits of dietary fiber could extend to the prevention of GvHD, Paredes and colleagues further investigated.
For the observational study, they evaluated all dietary data on 173 allo-HCT recipients at Memorial Sloan Kettering Cancer Center (MSKCC) from 10 days prior to transplantation to 30 days post-transplantation, representing 3837 patient-days in total.
Data collected from the patients also included rRNA sequencing of fecal samples and fecal short-chain fatty acid concentration.
Participants had a median age of 60, and 45% were female. The most common diseases being treated were leukemia (50%), myelodysplastic syndrome (25%), and non-Hodgkin’s lymphoma (8.7%).
After stratifying patients based on high- or low-fiber intake, those with high fiber intake were found to have significantly greater microbial α-diversity (P = .009), a higher abundance of butyrate producers (P = .03), and a higher concentration of butyrate (P = .02), a short-chain fatty acid that plays a key role in gut health.
Furthermore, the high-fiber group had significantly higher overall survival in an analysis extending from day 12 of the study to 24 months (P = .04).
Focusing on GvHD outcomes, the authors further evaluated data on 101 non-T-cell–depleted patients, and identified 29 patients without GvHD and 24 who developed lower gastrointestinal (GI) GvHD.
Patients with lower GI GvHD had significantly lower fecal concentrations of butyrate (P = .03) and acetate (P = .02).
Patients in the high-fiber intake group also had a significantly lower cumulative incidence of GvHD at day 100 (P = .034) and a lower incidence of lower GI GvHD (P = .04).
A separate preclinical analysis of a mouse model of GvHD further showed that a fiber-rich diet (12% cellulose) significantly increased the expression of genes associated with reduced GvHD, including IDO1 and CEACAM1, as well as genes associated with enrichment of the bile acid pathway.
The findings suggest an opportunity to improve outcomes with relatively small dietary changes, Paredes said.
“Strategies to increase the fiber concentration in these diets paired with the safety that these patients need is what makes this study exciting,” she said in an interview.
“Increasing the fiber intake by 10 to 20 grams/day could potentially increase the microbiome diversity and abundance of butyrate producers, which have been correlated with higher overall survival rates post allo-HCT,” she continued.
“[For instance], that could be an avocado per day, or it could be a small salad per day, or a small vegetable soup per day,” she added. “I would encourage institutions to re-evaluate their menu planning and see how to include more fiber into the meals in a safe way.”
Ultimately, “I think that a dietary intervention outweighs the risks of a pharmacological intervention,” Paredes added.
The necessary duration of a high-fiber diet to produce the beneficial effects on allo-HCT outcomes would likely be over the course of the pre- and post-transplant periods, Paredes added.
“With the survival analysis extending from 5 days before transplantation to 12 days post, we are looking at an intervention that potentially could be around 20 days,” she said.
“We would love to take advantage of the pretransplantation window, in particular, and we can see that just increasing the fiber intake by about 20 grams during this window was shown to improve overall survival after 24 months,” Paredes added.
Importantly, however, some patients may not be appropriate for high-fiber dietary changes, Paredes cautioned.
“Patients that have developed IBD-like symptoms and severe GvHD patients, for example, or with lower GI-GvHD grades 3 and 4 would be not appropriate candidates for a high-fiber diet,” she said.
High-Fiber Diet Slows MM Disease Progression?
The potentially important benefits of a high-fiber diet in blood diseases were further demonstrated in a separate study, also by MSKCC researchers and presented at the meeting, which showed encouraging signs that a plant-based diet rich in fiber could slow disease progression in multiple myeloma (MM).
NUTRIVENTION included 20 patients with one of two precancerous MM conditions, monoclonal gammopathy of undetermined significance (MGUS) or smoldering multiple myeloma (SMM). These conditions can persist for years without progressing to MM, a window that researchers have speculated could offer an opportunity to intervene and prevent progression to cancer.
Patients were provided with a 12-week controlled diet, plus health coaching for another 3 months; no meals or coaching were provided for the rest of the 1-year study period. Participants had a median age of 62 and, because overweight/obesity is a risk factor for MM, all had a body mass index (BMI) of 25 kg/m2 or higher.
The trial met its endpoint of feasibility, with 91% adherence in the first 3 months. The rate of consumption of unprocessed plant foods increased from 20% at baseline to 92% on the intervention. Overall adherence was 58%. Insulin and anti-inflammatory markers also improved and, despite no calorie restriction, there was a 7% sustained reduction in BMI.
Notably, two patients in the study had stabilization of disease progression.
“We saw improvements in all spheres, including metabolism, microbiome, and immune system markers, and we also saw that two patients with progressive disease had the progression stabilize and slow down on the intervention,” principal investigator Urvi A. Shah, MD, said in a press statement.
“Even though it’s just two cases, to our knowledge, it has not been shown before in an intervention setting that you can improve diet and lifestyle and actually slow or change the trajectory of the disease,” she noted.
The researchers caution that findings in mice do not necessarily translate to humans but note another experiment in mice with SMM that showed animals fed a normal diet had progression to MM after a median of 12 weeks, compared with a median of 30 weeks among those fed a high-fiber diet.
Notably, all mice in the normal-diet group progressed to MM, whereas 40% of mice in the high-fiber group did not.
“We found that a high-fiber plant-based diet can improve BMI, improve insulin resistance [and] the microbiome through diversity and butyrate producers, and with the production of short-chain fatty acids, can have effects on inflammation, immunity, innate and adaptive antitumor immunity, and tumor cells or plasma cells,” Shah said during her presentation.
The study was supported by funding from the National Cancer Institute and private foundations. Paredes has reported no relevant financial relationships. Shah has reported relationships with Sanofi, Bristol Myers Squibb, and Janssen.
A version of this article first appeared on Medscape.com.
FROM ASH 2024
Intratumoral Dendritic Cell Therapy Shows Promise in Early-Stage ERBB2-Positive Breast Cancer
TOPLINE:
Intratumoral injection of conventional type 1 dendritic cells (cDC1) combined with anti-ERBB2 antibodies was safe in early-stage ERBB2-positive breast cancer, and the higher dose (100 million cells) showed enhanced immune effector recruitment and significant tumor regression before chemotherapy initiation.
METHODOLOGY:
- Survival in ERBB2-positive breast cancer has improved with the anti-ERBB2 antibodies trastuzumab and pertuzumab, but achieving a pathologic complete response still requires chemotherapy, which carries significant toxic effects.
- A phase 1, nonrandomized clinical trial enrolled 12 patients with early-stage ERBB2-positive breast cancer in Tampa, Florida, from October 2021 to October 2022.
- Participants received intratumoral (IT) cDC1 injections weekly for 6 weeks at two dose levels (50 million cells for dose level 1 and 100 million cells for dose level 2), with six patients in each group.
- Starting from day 1 of the cDC1 injections, treatment included trastuzumab (8-mg/kg loading dose, then 6 mg/kg) and pertuzumab (840-mg loading dose, then 420 mg) administered intravenously every 3 weeks for six cycles, followed by paclitaxel (80 mg/m2) weekly for 12 weeks and surgery with lumpectomy or mastectomy.
- Primary outcomes measured safety and immune response of increasing doses of cDC1 combined with anti-ERBB2 antibodies before neoadjuvant chemotherapy; secondary outcomes assessed antitumor efficacy through breast MRI and residual cancer burden at surgery.
TAKEAWAY:
- IT delivery of ERBB2 cDC1 was safe and not associated with any dose-limiting toxic effects. The most frequent adverse events attributed to cDC1 were grade 1-2 chills (50%), fatigue (41.7%), headache (33%), and injection-site reactions (33%).
- Dose level 2 showed enhanced recruitment of adaptive CD3, CD4, and CD8 T cells and B cells within the tumor microenvironment (TME), along with increased innate gamma delta T cells and natural killer T cells.
- Breast MRI revealed nine objective responses, including six partial responses and three complete responses, with three cases of stable disease.
- Following surgery, 7 of 12 patients (58%) achieved a pathologic complete response, including all 3 hormone receptor–negative patients and 4 of the 9 hormone receptor–positive patients.
IN PRACTICE:
“Overall, the clinical data shown here demonstrate the effects of combining ERBB2 antibodies with IT [intratumoral] delivery of targeted cDC1 to enhance immune cell infiltration within the TME [tumor microenvironment] and subsequently induce tumor regression before chemotherapy,” wrote the authors, who noted they will be testing the higher dose for an ongoing phase 2 trial with an additional 41 patients.
SOURCE:
The study was led by Hyo S. Han, MD, of H. Lee Moffitt Cancer Center and Research Institute in Tampa, Florida. It was published online on December 5, 2024, in JAMA Oncology.
LIMITATIONS:
Because only two dose levels of cDC1 were tested, it remains unclear whether higher doses or different administration schedules could further enhance immune response. Additionally, the nonrandomized design prevents definitive conclusions about whether the clinical benefits were solely from the anti-ERBB2 antibodies. The small sample size also makes it difficult to determine if the pathologic complete responses were primarily due to the 12 weeks of trastuzumab/pertuzumab/paclitaxel treatment.
DISCLOSURES:
This study was funded by the Moffitt Breast Cancer Research Fund, Shula Fund, and Pennies in Action. Several authors reported research support and personal and consulting fees from US funding agencies and multiple pharmaceutical companies outside of the submitted work, as well as related intellectual property and patents.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
New Cancer Vaccines on the Horizon: Renewed Hope or Hype?
Vaccines for treating and preventing cancer have long been considered a holy grail in oncology.
But aside from a few notable exceptions — including the human papillomavirus (HPV) vaccine, which has dramatically reduced the incidence of HPV-related cancers, and a Bacillus Calmette-Guerin vaccine, which helps prevent early-stage bladder cancer recurrence — most have failed to deliver.
Following a string of disappointments over the past decade, recent advances in the immunotherapy space are bringing renewed hope for progress.
In an American Association for Cancer Research (AACR) series earlier in 2024, Catherine J. Wu, MD, the Lavine Family Chair of Preventative Cancer Therapies at Dana-Farber Cancer Institute and a professor of medicine at Harvard Medical School, both in Boston, Massachusetts, predicted big strides for cancer vaccines, especially personalized vaccines that target patient-specific neoantigens (the proteins that form on cancer cells) as well as vaccines that can treat diverse tumor types.
A prime example is a personalized, messenger RNA (mRNA)–based vaccine designed to prevent melanoma recurrence. The mRNA-4157 vaccine encodes up to 34 different patient-specific neoantigens.
“This is one of the most exciting developments in modern cancer therapy,” said Lawrence Young, a virologist and professor of molecular oncology at the University of Warwick, Coventry, England, who commented on the investigational vaccine via the UK-based Science Media Centre.
Other promising options are on the horizon as well. In August, BioNTech announced a phase 1 global trial to study BNT116 — a vaccine to treat non–small cell lung cancer (NSCLC). BNT116, like mRNA-4157, targets specific antigens in the lung cancer cells.
“This technology is the next big phase of cancer treatment,” Siow Ming Lee, MD, a consultant medical oncologist at University College London Hospitals in England, which is leading the UK trial for the lung cancer and melanoma vaccines, told The Guardian. “We are now entering this very exciting new era of mRNA-based immunotherapy clinical trials to investigate the treatment of lung cancer.”
Still, these predictions have a familiar ring. While the prospects are exciting, delivering on them is another story. There are simply no guarantees these strategies will work as hoped.
Then: Where We Were
Cancer vaccine research began to ramp up in the 2000s, and in 2006, the first-generation HPV vaccine, Gardasil, was approved. Gardasil prevents infection with four strains of HPV that cause about 80% of cervical cancer cases.
In 2010, the Food and Drug Administration approved sipuleucel-T, the first therapeutic cancer vaccine, which improved overall survival in patients with hormone-refractory prostate cancer.
Researchers predicted this approval would “pave the way for developing innovative, next generation of vaccines with enhanced antitumor potency.”
In a 2015 AACR research forecast report, Drew Pardoll, MD, PhD, co-director of the Cancer Immunology and Hematopoiesis Program at Johns Hopkins University, Baltimore, Maryland, said that “we can expect to see encouraging results from studies using cancer vaccines.”
Despite the excitement surrounding cancer vaccines, and a few successes, the next decade brought a longer string of late-phase disappointments.
In 2016, the phase 3 ACT IV trial of a therapeutic vaccine to treat glioblastoma multiforme (CDX-110) was terminated after it failed to demonstrate improved survival.
In 2017, a phase 3 trial of the therapeutic pancreatic cancer vaccine, GVAX, was stopped early for lack of efficacy.
That year, an attenuated Listeria monocytogenes vaccine to treat pancreatic cancer and mesothelioma also failed to come to fruition. In late 2017, concerns over listeria infections prompted Aduro Biotech to cancel its listeria-based cancer treatment program.
In 2018, a phase 3 trial of belagenpumatucel-L, a therapeutic NSCLC vaccine, failed to demonstrate a significant improvement in survival and further study was discontinued.
And in 2019, a vaccine targeting MAGE-A3, a cancer-testis antigen present in multiple tumor types, failed to meet endpoints for improved survival in a phase 3 trial, leading to discontinuation of the vaccine program.
But these disappointments and failures are normal parts of medical research and drug development and have allowed for incremental advances that helped fuel renewed interest and hope for cancer vaccines, when the timing was right, explained vaccine pioneer Larry W. Kwak, MD, PhD, deputy director of the Comprehensive Cancer Center at City of Hope, Duarte, California.
When it comes to vaccine progress, timing makes a difference. In 2011, Kwak and colleagues published promising phase 3 trial results on a personalized vaccine, built from a patient-specific tumor-derived antigen, for patients with follicular lymphoma in their first remission following chemotherapy. Patients who received the vaccine demonstrated significantly longer disease-free survival.
But, at the time, personalized vaccines faced strong headwinds due, largely, to high costs, and commercial interest failed to materialize. “That’s been the major hurdle for a long time,” said Kwak.
Now, however, interest has returned alongside advances in technology and research. The big shift has been the emergence of lower-cost rapid-production mRNA and DNA platforms and a better understanding of how vaccines and potent immune stimulants, like checkpoint inhibitors, can work together to improve outcomes, he explained.
“The timing wasn’t right” back then, Kwak noted. “Now, it’s a different environment and a different time.”
A Turning Point?
Indeed, a decade later, cancer vaccine development appears to be headed in a more promising direction.
Among key cancer vaccines to watch is the mRNA-4157 vaccine, developed by Merck and Moderna, designed to prevent melanoma recurrence. In a recent phase 2 study, patients receiving the mRNA-4157 vaccine alongside pembrolizumab had nearly half the risk for melanoma recurrence or death at 3 years compared with those receiving pembrolizumab alone. Investigators are now evaluating the vaccine in a global phase 3 study in patients with high-risk, stage IIB to IV melanoma following surgery.
Another one to watch is the BNT116 NSCLC vaccine from BioNTech. This vaccine presents the immune system with NSCLC tumor markers to encourage the body to fight cancer cells expressing those markers while ignoring healthy cells. BioNTech also launched a global clinical trial for its vaccine this year.
Other notables include a pancreatic cancer mRNA vaccine, which has shown promising early results in a small trial. Of 16 patients who received the vaccine alongside chemotherapy and after surgery and immunotherapy, 8 responded; of these, 6 remained recurrence free at 3 years. Investigators noted that the vaccine appeared to stimulate a durable T-cell response in patients who responded.
Kwak has also continued his work on lymphoma vaccines. In August, his team published promising first-in-human data on the use of personalized neoantigen vaccines as an early intervention in untreated patients with lymphoplasmacytic lymphoma. Among nine asymptomatic patients who received the vaccine, all achieved stable disease or better, with no dose-limiting toxicities. One patient had a minor response, and the median time to progression was greater than 72 months.
“The current setting is more for advanced disease,” Kwak explained. “It’s a tougher task, but combined with checkpoint blockade, it may be potent enough to work.”
Still, caution is important. Despite early promise, it’s too soon to tell which, if any, of these investigational vaccines will pan out in the long run. Like investigational drugs, cancer vaccines may show big promise initially but then fail in larger trials.
One key to success, according to Kwak, is to design trials so that even negative results will inform next steps.
But, he noted, failures in large clinical trials will “put a chilling effect on cancer vaccine research again.”
“That’s what keeps me up at night,” he said. “We know the science is fundamentally sound and we have seen glimpses over decades of research that cancer vaccines can work, so it’s really just a matter of tweaking things to optimize trial design.”
Companies tend to design trials to test if a vaccine works or not, without trying to understand why, he said.
“What we need to do is design those so that we can learn from negative results,” he said. That’s what he and his colleagues attempted to do in their recent trial. “We didn’t just look at clinical results; we’re interrogating the actual tumor environment to understand what worked and didn’t and how to tweak that for the next trial.”
Kwak and his colleagues found, for instance, that the vaccine had a greater effect on B cell–derived tumor cells than on cells of plasma origin, so “the most rational design for the next iteration is to combine the vaccine with agents that work directly against plasma cells,” he explained.
As for what’s next, Kwak said: “We’re just focused on trying to do good science and understand. We’ve seen glimpses of success. That’s where we are.”
A version of this article first appeared on Medscape.com.