Study: CBD provides symptom relief and improvements in gastroparesis
Cannabidiol (CBD) provided symptom relief and other improvements in patients with gastroparesis in a phase 2 randomized, double-blinded, placebo-controlled study recently published in Clinical Gastroenterology and Hepatology.
There is “significant unmet medical need in gastroparesis,” and compared with cannabis, which has been used to relieve nausea and pain in patients with the condition, CBD has limited psychic effects with the added potential to reduce gut sensation and inflammation, wrote Ting Zheng, MD, and colleagues at Mayo Clinic in Rochester, Minn.
The researchers assessed the symptoms of 44 patients (21 randomized to receive CBD and 23 to receive placebo) – each of whom had nonsurgical gastroparesis with documented delayed gastric emptying of solids (GES) by scintigraphy for at least 3 months – with the American Neurogastroenterology and Motility Society’s Gastroparesis Cardinal Symptom Index (GCSI) Daily Diary.
They measured GES at baseline and again at 4 weeks, when they also measured fasting and postprandial gastric volumes and assessed satiation with a validated Ensure drink test. (Patients ingested Ensure [Abbott Laboratories] at a rate of 30 mL/min and recorded their sensations every 5 minutes.) The two treatment arms were compared via 2-way analysis of covariance that included body mass index and, when applicable, baseline measurements.
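The article does not reproduce the authors' analysis code; the sketch below only illustrates how a covariance analysis of this kind can be set up in Python. The column names (gcsi, arm, bmi, baseline) are hypothetical stand-ins, and because the second factor of the 2-way design is not specified in the article, the sketch adjusts for treatment arm, BMI, and baseline values only.

```python
# Minimal sketch of an ANCOVA comparing treatment arms while adjusting for BMI
# and baseline values; column names are hypothetical, not from the study.
import pandas as pd
import statsmodels.formula.api as smf

def compare_arms(df: pd.DataFrame):
    """Fit outcome ~ arm + BMI + baseline and return the coefficient table."""
    model = smf.ols("gcsi ~ C(arm) + bmi + baseline", data=df).fit()
    # The row for the arm term gives the adjusted between-arm difference
    # in the outcome and its P value.
    return model.summary2().tables[1]
```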
Patients in the CBD group received twice-daily oral Epidiolex (Jazz Pharmaceuticals, Dublin), which is Food and Drug Administration–approved for the treatment of seizures associated with two rare forms of epilepsy and with another rare genetic disease in patients 1 year of age and older.
The researchers documented significant improvements in the CBD group in total GCSI score (P = .0008) and in scores measuring the inability to finish a normal-sized meal (P = .029), number of vomiting episodes/24 hours (P = .006), and overall perceived severity of symptoms (P = .034).
CBD treatment was also associated with greater tolerated volume of Ensure – “without increases in scores for nausea, fullness, bloating, and pain” – and, in another component of the GCSI, there was “a borderline reduction in upper abdominal pain,” Dr. Zheng and coauthors wrote.
There was a significant slowing of GES in the CBD group, however, and no significant differences were seen at 4 weeks in the fasting or accommodation gastric volumes between the two treatment groups. That beneficial effects of CBD were seen despite slowing of GES “raises the question of the contribution of the delayed GE of solids to development of symptoms in patients with gastroparesis, which is supported by some but not all meta-analyses on this topic,” they noted.
Patients had a mean age of 44 and most were female. Of the 44 patients, 32 had idiopathic gastroparesis, 6 had type 1 diabetes, and 6 had type 2 diabetes. Four patients in the study did not tolerate the FDA-recommended full-dose escalation of CBD to 20 mg/kg per day, but completed the study on the highest tolerated dose.
Adverse effects (fatigue, headache, nausea) were distributed equally between the two groups, but diarrhea was more common in the CBD group. Diarrhea was the most common adverse event in a recently published analysis of 892 pediatric patients receiving Epidiolex over an estimated 1,755.7 patient-years of CBD exposure, the researchers noted.
CBD is a cannabinoid receptor 2 inverse agonist with central nervous system effects, but it also affects visceral or somatic sensation peripherally, the authors noted. The beneficial effects of CBD in gastroparesis are “presumed to reflect effects on sensory mechanisms or anti-inflammatory effects mediated via CBR2 (cannabinoid receptor type 2) reversing the hypersensitivity and intrinsic inflammatory pathogenesis recorded in idiopathic and diabetic gastroparesis,” Dr. Zheng and colleagues wrote. CBD may also, in a mechanism unrelated to CB receptors, inhibit smooth muscle contractile activity, they said.
Larger randomized controlled trials of longer-term administration of CBD in both idiopathic and diabetic gastroparesis are warranted, the investigators said.
The researchers disclosed no conflicts. The study was supported by a grant from the National Institutes of Health.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Meta-analysis of postcancer use of immunosuppressive therapies shows no increase in cancer recurrence risk
Use of immunosuppressive therapy after a prior cancer was not associated with an increased risk of new or recurrent cancer in patients with immune-mediated diseases, according to a meta-analysis that covered approximately 24,000 patients and 86,000 person-years of follow-up.
The findings could “help guide clinical decision making,” providing “reassurance that it remains safe to use conventional immunomodulators, anti-TNF [tumor necrosis factor] agents, or newer biologics in individuals with [immune-mediated diseases] with a prior malignancy consistent with recent guidelines,” Akshita Gupta, MD, of Massachusetts General Hospital, Boston, and coinvestigators wrote in Clinical Gastroenterology and Hepatology.
And because a stratification of studies by the timing of immunosuppressive therapy initiation found no increased risk when treatment was started within 5 years of a cancer diagnosis, compared with later initiation, the meta-analysis could “potentially reduce the time to initiation of immunosuppressive treatment,” the authors wrote, noting a continued need for individualized decision-making.
Ustekinumab, a monoclonal antibody targeting interleukin-12 and IL-23, and vedolizumab, a monoclonal antibody that binds to alpha4beta7 integrin, were covered in the meta-analysis, but investigators found no studies on the use of upadacitinib or other Janus kinase (JAK) inhibitors, or the use of S1P modulators, in patients with prior malignancies.
The analysis included 31 observational studies, 17 of which involved patients with inflammatory bowel disease (IBD). (Of the other studies, 14 involved patients with rheumatoid arthritis [RA], 2 covered psoriasis, and 1 covered ankylosing spondylitis.)
Similar levels of risk
The incidence rate of new or recurrent cancers among individuals not receiving any immunosuppressive therapy for IBD or other immune-mediated diseases after an index cancer was 35 per 1,000 patient-years (95% confidence interval, 27-43 per 1,000 patient-years; 1,627 incident cancers among 12,238 patients, 43,765 patient-years), and the rate among anti-TNF users was similar at 32 per 1,000 patient-years (95% CI, 25-38 per 1,000 patient-years; 571 cancers among 3,939 patients, 17,772 patient-years).
Among patients on conventional immunomodulator therapy (thiopurines, methotrexate), the incidence rate was numerically higher at 46 per 1,000 patient-years (95% CI, 31-61; 1,104 incident cancers among 5,930 patients; 17,018 patient-years), but was not statistically different from anti-TNF (P = .92) or no immunosuppression (P = .98).
Patients on combination immunosuppression also had numerically higher rates of new or recurrent cancers at 56 per 1,000 patient-years (95% CI, 31-81; 179 incident cancers, 2,659 patient-years), but these rates were not statistically different from immunomodulator use alone (P = .19), anti-TNF alone (P = .06) or no immunosuppressive therapy (P = .14).
In contrast, patients on ustekinumab and vedolizumab had numerically lower rates of new or recurrent cancers, compared with the other treatment groups: 21 per 1,000 patient-years (95% CI, 0-44; 5 cancers among 41 patients, 213 patient-years) and 16 per 1,000 patient-years (95% CI, 5-26; 37 cancers among 281 patients, 1,951 patient-years), respectively. However, the difference was statistically significant only for vedolizumab (P = .03 vs. immunomodulators and P = .04 vs. anti-TNF agents).
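For readers who want to see how figures on this scale are derived, the sketch below turns event counts and follow-up into a crude rate per 1,000 patient-years with a normal-approximation confidence interval. The published estimates above come from random-effects meta-analytic pooling, so a crude rate computed this way will be close to, but not identical with, the reported values.

```python
# Crude incidence rate per 1,000 patient-years with a Poisson-based 95% CI.
# Illustrative calculation only, not the meta-analytic pooling used in the paper.
import math

def rate_per_1000(events: int, patient_years: float) -> tuple[float, float, float]:
    rate = events / patient_years * 1000
    se = math.sqrt(events) / patient_years * 1000  # SE of a Poisson count, rescaled
    return rate, rate - 1.96 * se, rate + 1.96 * se

# No-immunosuppression group reported above: 1,627 cancers over 43,765 patient-years.
print(rate_per_1000(1627, 43765))  # roughly (37.2, 35.4, 39.0); the pooled estimate was 35
```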
Subgroup analyses for new primary cancers, recurrence of a prior cancer, and type of index cancer (skin cancer vs. other cancers) similarly found no statistically significant differences between treatment arms. Results were similar in patients with IBD and RA.
Timing of therapy
The new meta-analysis confirms and extends a previous meta-analysis published in Gastroenterology in 2016 that showed no impact of treatment – primarily immunomodulator (IMM) or anti-TNF treatment – on cancer recurrence in patients with immune-mediated diseases, Dr. Gupta and coauthors wrote.
The 2016 meta-analysis reported similar cancer recurrence rates with IMMs and anti-TNF agents whether immunosuppression was introduced within 6 years of the cancer diagnosis or later. In the new meta-analysis – with twice the number of patients, a longer duration of follow-up, and the inclusion of other biologic therapies – a stratification of results at the median interval to therapy initiation similarly found no increased risk with initiation before 5 years, compared with after 5 years.
“Although several existing guidelines recommend avoiding immunosuppression for 5 years after the index cancer, our results indicate that it may be safe to initiate these agents earlier than 5 years, at least in some patients,” Dr. Gupta and coauthors wrote, mentioning the possible impact of selection bias and surveillance bias in the study. Ongoing registries “may help answer this question more definitively with prospectively collected data, but inherently may suffer from this selection bias as well.”
Assessment of the newer biologics ustekinumab and vedolizumab is limited by the low number of studies (four and five, respectively) and by limited duration of follow-up. “Longer-term evaluation after these treatments is essential but it is reassuring that in the early analysis we did not observe an increase and in fact noted numerically lower rates of cancers,” they wrote.
It is also “critically important” to generate more data on JAK inhibitors, and to further study the safety of combining systemic chemotherapy and the continuation of IBD therapy in the setting of a new cancer diagnosis, they wrote.
The study was funded in part by grants from the Crohn’s and Colitis Foundation, and the Chleck Family Foundation. Dr. Gupta disclosed no conflicts. One coauthor disclosed consulting for Abbvie, Amgen, Biogen, and other companies, and receiving grants from several companies. Another coauthor disclosed serving on the scientific advisory boards for AbbVie and other companies, and receiving research support from Pfizer.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
AGA publishes CPU for AI in colon polyp diagnosis and management
The American Gastroenterological Association has published a Clinical Practice Update (CPU) on artificial intelligence (AI) for diagnosing and managing colorectal polyps.
The CPU, authored by Jason Samarasena, MD, of UCI Health, Orange, Calif., and colleagues, draws on recent studies and clinical experience to discuss ways that AI is already reshaping colonoscopy, and what opportunities may lie ahead.
“As with any emerging technology, there are important questions and challenges that need to be addressed to ensure that AI tools are introduced safely and effectively into clinical endoscopic practice,” they wrote in Gastroenterology.
With advances in processing speed and deep-learning technology, AI “computer vision” can now analyze live video of a colonoscopy in progress, enabling computer-aided detection (CADe) and computer-aided diagnosis (CADx), which the panelists described as the two most important developments in the area.
CADe
“In the last several years, numerous prospective, multicenter studies have found that real-time use of AI CADe tools during colonoscopy leads to improvements in adenoma detection and other related performance metrics,” Dr. Samarasena and colleagues wrote.
CADe has yielded mixed success in real-world practice, however, with some studies reporting worse detection metrics after implementing the new technology. Dr. Samarasena and colleagues offered a variety of possible explanations for these findings, including a “ceiling effect” among highly adept endoscopists, reduced operator vigilance caused by false confidence in the technology, and potential confounding inherent to unblinded trials.
CADe may also increase health care costs and burden, they suggested, as the technology tends to catch small benign polyps, prompting unnecessary resections and shortened colonoscopy surveillance intervals.
CADx
These unintended consequences of CADe may be counteracted by CADx, which uses computer vision to predict which lesions have benign histology, enabling “resect-and-discard” or “diagnose-and-leave” strategies.
Such approaches could significantly reduce rates of polypectomy and/or histopathology, saving an estimated $33 million–150 million per year, according to the update.
Results of real-time CADx clinical trials have been “encouraging,” Dr. Samarasena and colleagues wrote, noting that emerging CADx tools compatible with standard white-light endoscopy can achieve a negative predictive value of almost 98% for lesions less than 5 mm in diameter, potentially reducing the polypectomy rate by almost half.
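As a rough illustration of what a ~98% negative predictive value means for a diagnose-and-leave decision, the small example below computes NPV among lesions a CADx tool labels benign. The counts are hypothetical and chosen only to match the figure cited above; they are not from the update.

```python
# NPV among lesions the CADx tool labels benign: TN / (TN + FN).
# Hypothetical counts, used only to illustrate the ~98% figure quoted above.
def negative_predictive_value(true_negatives: int, false_negatives: int) -> float:
    return true_negatives / (true_negatives + false_negatives)

# If 1,000 diminutive lesions are called benign and 20 turn out to be neoplastic:
print(negative_predictive_value(980, 20))  # 0.98
```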
“Increasing endoscopist confidence in optical diagnosis may be an important step toward broader implementation of leave in situ and resect-and-discard strategies, but successful implementation will also require CADx tools that seamlessly integrate the endoscopic work flow, without the need for image enhancement or magnification,” the panelists wrote.
Reimbursement models may also need to be reworked, they suggested, as many GI practices depend on a steady stream of revenue from pathology services.
Computer-aided quality assessment systems
Beyond optical detection and diagnosis, AI tools are also being developed to improve colonoscopy technique.
Investigators are studying quality assessment systems that use AI to offer feedback on a range of endoscopist skills, including colonic-fold evaluation, level of mucosal exposure, and withdrawal time, the latter of which is visualized by a “speedometer” that “paints” the mucosa with “a graphical representation of the colon.”
“In the future, these types of AI-based systems may support trainees and lower-performing endoscopists to reduce exposure errors and, more broadly, may empower physician practices and hospital systems with more nuanced and actionable data on an array of factors that contribute to colonoscopy quality,” the panelists wrote.
Looking ahead
Dr. Samarasena and colleagues concluded by suggesting that the AI tools now in use and in development are just the beginning of a wave of technology that will revolutionize how colonoscopies are performed.
“Eventually, we predict an AI suite of tools for colonoscopy will seem indispensable, as a powerful adjunct to support safe and efficient clinical practice,” they wrote. “As technological innovation progresses, we can expect that the future for AI in endoscopy will be a hybrid model, where the unique capabilities of physicians and our AI tools will be seamlessly intertwined to optimize patient care.”
This CPU was commissioned and approved by the AGA Institute Clinical Practice Updates Committee and the AGA Governing Board. The investigators disclosed relationships with Olympus, Neptune Medical, Conmed, and others.
FROM GASTROENTEROLOGY
Even a short course of opioids could jeopardize IBD patient health
Even a short course of opioids prescribed in the outpatient setting may increase the risks of corticosteroid use and emergency department utilization in patients with inflammatory bowel disease (IBD). These findings amplify the safety signal from previous inpatient studies and should prompt caution among prescribers, reported Laura Telfer, MS, of Penn State College of Medicine, Hershey, Pa., and colleagues.
“Opioids are frequently prescribed to treat pain associated with IBD,” the investigators wrote in Gastro Hep Advances. “Unfortunately, they are associated with many problems in IBD, including increased risk of emergency room visits, hospitalization, surgery, and mortality. Chronic opioid use may also exacerbate symptoms and induce IBD flares, prompting discontinuation, thus increasing the risk of opioid withdrawal syndrome. Ironically, there is no published evidence that opioids even help to improve abdominal pain in IBD, particularly in the long term. Notably, most studies investigating opioid use in IBD have been limited to hospitalized patients, and few have directly evaluated the impact of opioid prescription length.”
To address this knowledge gap, Ms. Telfer and colleagues conducted a retrospective, population-based cohort study involving patients with IBD who were classified as long-term opioid users, short-term opioid users, or nonusers. Drawing data from more than 80,000 patients in the TriNetX Diamond Network, the investigators evaluated relative, intergroup risks for corticosteroid use, emergency department utilization, mortality, and IBD-related surgery.
Comparing short-term opioid users and nonusers revealed that short-term use more than doubled the risk of corticosteroid prescription (relative risk [RR], 2.517; P < .001), and increased the risk of an emergency department visit by approximately 32% (RR, 1.315; P < .001). Long-term use was associated with a similar doubling in risk of corticosteroid prescription (RR, 2.383; P < .001), and an even greater risk of emergency department utilization (RR, 2.083; P < .001). Risks of death or IBD-related surgery did not differ for either of these comparisons.
Next, the investigators compared long-term opioid use versus short-term opioid use. This suggested a duration-related effect, as long-term users were 57% more likely than were short-term users to utilize emergency department services (RR, 1.572; P < .001). No significant differences for the other outcomes were detected in this comparison.
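The relative risks above come from the TriNetX analysis itself; as a generic illustration of how a relative risk and its confidence interval are derived from 2×2 counts, the sketch below uses placeholder numbers, not study data.

```python
# Relative risk with a log-scale 95% CI from 2x2 counts.
# The example counts are placeholders, not values from the study.
import math

def relative_risk(a: int, n1: int, b: int, n2: int) -> tuple[float, float, float]:
    """RR of outcome in exposed (a/n1) vs. unexposed (b/n2), with a 95% CI."""
    rr = (a / n1) / (b / n2)
    se_log = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)  # SE of log(RR)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

# Hypothetical example: 120/400 exposed vs. 60/500 unexposed -> RR = 2.5
print(relative_risk(120, 400, 60, 500))
```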
“Unlike previous studies, we did not find an association between opioid use and IBD-related surgery or death,” the investigators wrote. “Notably, these [previously reported] associations utilized opioid dosage (e.g., morphine equivalent or number of prescriptions), rather than length of opioid prescription (as we did). We also focused on IBD outpatients, while prior studies evaluated (in part or completely) inpatient populations, who typically present with more severe illness.”
Still, they added, the present findings should serve as a warning to prescribers considering even a short course of opioids for patients with IBD.
“This study demonstrates that prescribing opioids to IBD outpatients carries significant, specific risks, regardless of prescription length,” Ms. Telfer and colleagues wrote. “Healthcare professionals should exercise caution before prescribing these agents.”
The study was supported by the Peter and Marshia Carlino Early Career Professorship in Inflammatory Bowel Disease, the Margot E. Walrath Career Development Professorship in Gastroenterology, and the National Institutes of Health. The investigators disclosed no conflicts of interest.
Given that objective control of inflammation does not always correlate with improvement in abdominal pain scores, the use of opioids in patients with inflammatory bowel diseases (IBD) remains a difficult area of clinical practice and research. In this study, Telfer and colleagues performed a retrospective analysis using the TriNetX Diamond Network to assess the impact of opioid use on health-associated outcomes and evaluate for a differential impact on outcomes depending on the length of opioid prescription. When compared to non–opioid users, both short- and long-term opioid users were more likely to utilize corticosteroids and emergency department services. However, in contrast to prior studies, there was no increased risk for mortality demonstrated among those patients with short- or long-term opioid use.
Edward L. Barnes, MD, MPH, is assistant professor of medicine at the University of North Carolina at Chapel Hill. He disclosed having served as a consultant for Target RWE (not relevant to this commentary).
FROM GASTRO HEP ADVANCES
Microsimulation model identifies 4-year window for pancreatic cancer screening
The window of opportunity for pancreatic cancer screening may span only about 4 years, based on a microsimulation model.
To seize this opportunity, however, a greater understanding of natural disease course is needed, along with more sensitive screening tools, reported Brechtje D. M. Koopmann, MD, of Erasmus Medical Center, Rotterdam, the Netherlands, and colleagues.
Previous studies have suggested that the window of opportunity for pancreatic cancer screening may span decades, with estimates ranging from 12 to 50 years, the investigators wrote. Their report was published in Gastroenterology.
“Unfortunately, the poor results of pancreatic cancer screening do not align with this assumption, leaving unanswered whether this large window of opportunity truly exists,” they noted. “Microsimulation modeling, combined with available, if limited data, can provide new information on the natural disease course.”
For the present study, the investigators used the Microsimulation Screening Analysis (MISCAN) model, which has guided development of screening programs around the world for cervical, breast, and colorectal cancer. The model incorporates natural disease course, screening, and demographic data, then uses observable inputs such as precursor lesion prevalence and cancer incidence to estimate unobservable outcomes like stage durations and precursor lesion onset.
Dr. Koopmann and colleagues programmed this model with Dutch pancreatic cancer incidence data and findings from Japanese autopsy cases without pancreatic cancer.
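As a rough illustration of how microsimulation turns observable inputs into estimates of unobservable quantities (this is a toy sketch, not the MISCAN model itself), the Python snippet below simulates individual lesion histories under assumed annual onset and progression probabilities and then summarizes the dwell time from lesion onset to cancer. All parameter values are invented for illustration.

```python
import random

random.seed(0)

# Toy natural-history parameters -- purely illustrative, not MISCAN's calibrated values.
ANNUAL_LESION_ONSET_PROB = 0.004   # yearly chance of developing a precursor lesion
ANNUAL_PROGRESSION_PROB = 0.01     # yearly chance an existing lesion becomes cancer
MAX_AGE = 90

def simulate_person():
    """Simulate one life history; return years from lesion onset to cancer, or None."""
    onset_age = None
    for age in range(20, MAX_AGE):
        if onset_age is None and random.random() < ANNUAL_LESION_ONSET_PROB:
            onset_age = age
        elif onset_age is not None and random.random() < ANNUAL_PROGRESSION_PROB:
            return age - onset_age  # the "unobservable" dwell time the model estimates
    return None  # no cancer by age 90

dwell_times = [d for d in (simulate_person() for _ in range(50_000)) if d is not None]
print(f"cancers: {len(dwell_times)}, mean dwell time: {sum(dwell_times) / len(dwell_times):.1f} years")
```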
First, the model offered insights into precursor lesion prevalence.
The estimated prevalence of any cystic lesion in the pancreas was 6.1% for individuals 50 years of age and 29.6% for those 80 years of age. Solid precursor lesions (PanINs) were estimated to be mainly multifocal (three or more lesions) in individuals older than 80 years. By this age, almost 12% had at least two PanINs. For those lesions that eventually became cancerous, the mean time since cyst onset was estimated to be 8.8 years, and mean time since PanIN onset was 9.0 years.
However, less than 10% of cystic and PanIN lesions progress to become cancers. PanIN lesions are not visible on imaging, and therefore current screening focuses on finding cystic precursor lesions, although these represent only about 10% of pancreatic cancers.
“Given the low pancreatic cancer progression risk of cysts, evaluation of the efficiency of current surveillance guidelines is necessary,” the investigators noted.
Screening should instead focus on identifying high-grade dysplastic lesions, they suggested. While these lesions may have a very low estimated prevalence, at just 0.3% among individuals 90 years of age, they present the greatest risk of pancreatic cancer.
For precursor cysts exhibiting high-grade dysplasia (HGD) that progressed to pancreatic cancer, the mean interval between dysplasia and cancer was just 4 years. In 13.7% of individuals, the interval was less than 1 year, suggesting an even shorter window of opportunity for screening.
Beyond this brief timeframe, low test sensitivity explains why screening efforts to date have fallen short, the investigators wrote.
Better tests are “urgently needed,” they added, while acknowledging the challenges inherent to this endeavor. Previous research has shown that precursor lesions in the pancreas are often less than 5 mm in diameter, making them extremely challenging to detect. An effective tool would need to identify solid precursor lesions (PanINs) and simultaneously determine the grade of dysplasia.
“Biomarkers could be the future in this matter,” the investigators suggested.
Dr. Koopmann and colleagues concluded by noting that more research is needed to characterize the pathophysiology of pancreatic cancer. For their part, “the current model will be validated, adjusted, and improved whenever new data from autopsy or prospective surveillance studies become available.”
The study was funded in part by Maag Lever Darm Stichting. The investigators disclosed no conflicts of interest.
We continue to search for a way to effectively screen for and prevent pancreatic cancer. Most pancreatic cancers come from pancreatic intraepithelial neoplasms (PanINs), which are essentially invisible on imaging. Pancreatic cysts are relatively common, and only a small number will progress to cancer. Screening via MRI or EUS can look for high-risk features of visible cysts or find early-stage cancers, but whom to screen, how often, and what to do with the results remains unclear. Many of the steps from development of the initial cyst or PanIN to the transformation to cancer cannot be observed, and as such this is a perfect application for disease modeling that allows us to fill in the gaps of what can be observed and estimate what we cannot see.
Mary Linton B. Peters, MD, MS, is a medical oncologist specializing in hepatic and pancreatobiliary cancers at Beth Israel Deaconess Medical Center, Boston, an assistant professor at Harvard Medical School, and a senior scientist at the Institute for Technology Assessment of Massachusetts General Hospital. She reports unrelated institutional research funding from NuCana and Helsinn.
FROM GASTROENTEROLOGY
Many preoperative EAC biopsies fail to predict true tumor grade, leading to unnecessary esophagectomy
Inaccurate preoperative biopsy findings could mean that patients who are candidates for endoscopic resection (ER) are unnecessarily undergoing esophagectomy, a procedure with greater risks of morbidity and mortality, reported Ravi S. Shah, MD, of Cleveland Clinic, and colleagues.
“It is unclear how accurate tumor differentiation on endoscopic biopsies is and if it can be used for clinical decision-making,” the investigators wrote in Techniques and Innovations in Gastrointestinal Endoscopy. “Given that tumors may be considerably heterogeneous in gland formation, the limited amount of tissue obtained from endoscopic forceps biopsies may not be representative of the entire tumor for pathologic grading, which may result in discrepant tumor grading between biopsy and resection specimens.”
While previous studies have compared esophagogastroduodenoscopy-guided biopsy results with histological findings after surgical resection, scant evidence is available to compare biopsy findings with both surgically and endoscopically resected tissue.
Despite this potential knowledge gap, “many patients with poorly differentiated EAC on preresection biopsy do not undergo ER, with the belief that the final resection pathology would be noncurative,” the investigators noted.
To help clarify how congruent pre- and postoperative biopsies are for both resection methods, Dr. Shah and colleagues conducted a retrospective study involving 346 EAC lesions. Samples were drawn from 121 ERs and 225 esophagectomies performed at two tertiary referral centers. Preoperative and postoperative findings were compared for accuracy and for level of agreement via Gwet’s AC2 interrater analysis.
For all evaluable lesions, preoperative biopsy had an accuracy of 68%, with a “substantial” agreement coefficient (Gwet’s AC2, 0.70; P less than .001). Accuracy in the esophagectomy group was similar, at 72%, again with “substantial” agreement (Gwet’s AC2, 0.74; P less than .001). For the ER group, however, accuracy was just 56%, with a “moderate” level of agreement (Gwet’s AC2, 0.60; P less than .001).
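Gwet's AC2 is a chance-corrected agreement statistic (a weighted variant suited to ordered categories such as tumor grade). As a rough illustration of the chance-correction idea only, the sketch below computes the simpler unweighted variant, Gwet's AC1, on hypothetical paired biopsy and resection grades; the data and the choice of the unweighted statistic are assumptions for illustration, not the authors' analysis.

```python
from collections import Counter

def gwet_ac1(pairs, categories):
    """Unweighted Gwet's AC1 for paired ratings (e.g., biopsy grade vs. resection grade)."""
    n = len(pairs)
    k = len(categories)
    observed = sum(1 for a, b in pairs if a == b) / n               # raw agreement
    first = Counter(a for a, _ in pairs)
    second = Counter(b for _, b in pairs)
    pi = {c: (first[c] + second[c]) / (2 * n) for c in categories}  # average marginals
    chance = sum(p * (1 - p) for p in pi.values()) / (k - 1)        # chance agreement
    return (observed - chance) / (1 - chance)

# Hypothetical paired grades (biopsy, resection) for illustration only.
grades = ["well", "moderate", "poor"]
pairs = (
    [("moderate", "moderate")] * 40 + [("poor", "poor")] * 20 + [("well", "well")] * 5
    + [("moderate", "poor")] * 10 + [("poor", "moderate")] * 8 + [("well", "moderate")] * 3
)
print(f"Gwet's AC1 = {gwet_ac1(pairs, grades):.2f}")
```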
“We speculate that the discrepancy of tumor differentiation on endoscopic forceps biopsies and resection specimens is due to nonrepresentative sampling of tumors to accurately determine the percentage of gland formation and thus tumor grade,” the investigators noted.
Further analysis showed that 22.7% of moderately differentiated tumors were upgraded to poorly differentiated upon final histology. Conversely, 19.6% of poorly differentiated tumors were downgraded to moderately differentiated. Downgrading was even more common among T1a tumors, 40% of which were changed from poorly to moderately differentiated between pre- and postprocedural histology.
These latter findings concerning downgrading are particularly relevant for clinical decision-making, the investigators noted, as patients with poorly differentiated EAC on preoperative biopsy are typically sent for esophagectomy – a more invasive and riskier procedure – out of concern that ER will be noncurative.
“If poor differentiation was the only high-risk feature, these patients may have unnecessarily undergone esophagectomy,” Dr. Shah and colleagues wrote. “Especially in marginal surgical candidates, staging ER should be considered in patients with early esophageal cancer with preoperative biopsies showing poorly differentiated cancer.”
The investigators disclosed relationships with Medtronic, Lucid Diagnostics, Lumendi, and others.
FROM TECHNIQUES AND INNOVATIONS IN GASTROINTESTINAL ENDOSCOPY
Report cards, additional observer improve adenoma detection rate
Although multimodal interventions like extra training with periodic feedback showed some signs of improving the adenoma detection rate (ADR), withdrawal time monitoring was not significantly associated with a better detection rate, reported Anshul Arora, MD, of Western University, London, Ont., and colleagues.
“Given the increased risk of postcolonoscopy colorectal cancer associated with low ADR, improving [this performance metric] has become a major focus for quality improvement,” the investigators wrote in Clinical Gastroenterology and Hepatology.
They noted that “numerous strategies” have been evaluated for this purpose, which may be sorted into three groups: endoscopy unit–level interventions (i.e., system changes), procedure-targeted interventions (i.e., technique changes), and technology-based interventions.
“Of these categories, endoscopy unit–level interventions are perhaps the easiest to implement widely because they generally require fewer changes in the technical aspect of how a colonoscopy is performed,” the investigators wrote. “Thus, the objective of this study was to conduct a systematic review and meta-analysis to identify endoscopy unit–level interventions aimed at improving ADRs and their effectiveness.”
To this end, Dr. Arora and colleagues analyzed data from 34 randomized controlled trials and observational studies involving 1,501 endoscopists and 371,041 procedures. They evaluated the relationship between ADR and implementation of four interventions: a performance report card, a multimodal intervention (e.g., training sessions with periodic feedback), presence of an additional observer, and withdrawal time monitoring.
Provision of report cards was associated with the greatest improvement in ADR, at 28% (odds ratio, 1.28; 95% confidence interval, 1.13-1.45; P less than .001), followed by presence of an additional observer, which bumped ADR by 25% (OR, 1.25; 95% CI, 1.09-1.43; P = .002). The impact of multimodal interventions was “borderline significant,” the investigators wrote, with an 18% improvement in ADR (OR, 1.18; 95% CI, 1.00-1.40; P = .05). In contrast, withdrawal time monitoring showed no significant benefit (OR, 1.35; 95% CI, 0.93-1.96; P = .11).
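As a rough illustration of how study-level odds ratios like these can be combined (the authors' actual pooling model, e.g., random effects, may differ), the sketch below performs fixed-effect, inverse-variance pooling on the log-odds scale, backing each standard error out of its 95% confidence interval. The study-level numbers are invented for illustration.

```python
import math

def pooled_or(study_results):
    """Fixed-effect (inverse-variance) pooling of odds ratios given their 95% CIs."""
    weights, weighted_log_ors = [], []
    for odds_ratio, ci_lower, ci_upper in study_results:
        se = (math.log(ci_upper) - math.log(ci_lower)) / (2 * 1.96)  # SE from CI width
        weight = 1 / se ** 2
        weights.append(weight)
        weighted_log_ors.append(weight * math.log(odds_ratio))
    log_pooled = sum(weighted_log_ors) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    ci = (math.exp(log_pooled - 1.96 * se_pooled), math.exp(log_pooled + 1.96 * se_pooled))
    return math.exp(log_pooled), ci

# Hypothetical study-level results (OR, 95% CI lower, upper) for illustration only.
studies = [(1.35, 1.05, 1.74), (1.20, 0.95, 1.52), (1.31, 1.10, 1.56)]
pooled, ci = pooled_or(studies)
print(f"pooled OR = {pooled:.2f}, 95% CI = {ci[0]:.2f}-{ci[1]:.2f}")
```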
In their discussion, Dr. Arora and colleagues offered guidance on the use of report cards, which were associated with the greatest improvement in ADR.
“We found that benchmarking individual endoscopists against their peers was important for improving ADR performance because this was the common thread among all report card–based interventions,” they wrote. “In terms of the method of delivery for feedback, only one study used public reporting of colonoscopy quality indicators, whereas the rest delivered report cards privately to physicians. This suggests that confidential feedback did not impede self-improvement, which is desirable to avoid stigmatization of low ADR performers.”
The findings also suggest that additional observers can boost ADR without specialized training.
“[The benefit of an additional observer] may be explained by the presence of a second set of eyes to identify polyps or, more pragmatically, by the Hawthorne effect, whereby endoscopists may be more careful because they know someone else is watching the screen,” the investigators wrote. “Regardless, extra training for the observer does not seem to be necessary because the three RCTs [evaluating this intervention] all used endoscopy nurses who did not receive any additional polyp detection training. Thus, endoscopy unit nurses should be encouraged to speak up should they see a polyp the endoscopist missed.”
The investigators disclosed no conflicts of interest.
The effectiveness of colonoscopy to prevent colorectal cancer depends on the quality of the exam. Adenoma detection rate (ADR) is a validated quality indicator, associated with lower risk of postcolonoscopy colorectal cancer. There are multiple interventions that can improve endoscopists’ ADR, but it is unclear which ones are higher yield than others. This study summarizes the existing studies on various interventions and finds the largest increase in ADR with the use of physician report cards. This is not surprising, as report cards both provide measurement and are an intervention for improvement.
Interestingly, the included studies mostly used individual, confidential report cards and still demonstrated an improvement in ADR. Having a second set of eyes on the monitor was also associated with an increase in ADR; whether the gain comes from the observer picking up missed polyps or from the endoscopist performing a more thorough exam because someone else is watching the screen is unclear. This is the same principle underlying current computer-assisted detection (CADe) devices. While a second observer may not be practical or cost-effective, and CADe is expensive, the takeaway is that there are multiple ways to improve ADR; at the very least, every physician should receive report cards or feedback on their quality indicators and work toward achieving and exceeding the minimum benchmarks.
Aasma Shaukat, MD, MPH, is the Robert M. and Mary H. Glickman Professor of Medicine at New York University Grossman School of Medicine, where she also holds a professorship in population health. She serves as director of outcomes research in the division of gastroenterology and hepatology and codirector of Translational Research Education and Careers (TREC). She disclosed serving as an adviser for Motus-GI and Iterative Health.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Experts offer guidance on GLP-1 receptor agonists prior to endoscopy
Clinicians should take an individualized approach to managing patients on glucagon-like peptide 1 receptor agonists to support the success of endoscopic procedures, according to a new Clinical Practice Update from the American Gastroenterological Association.
Use of glucagon-like peptide 1 (GLP-1) receptor agonists (GLP-1 RAs) has been associated with delayed gastric emptying, which raises a clinical concern about performing endoscopic procedures, especially upper endoscopies in patients using these medications, wrote Jana G. Al Hashash, MD, MSc, of the Mayo Clinic, Jacksonville, Fla., and colleagues.
The Clinical Practice Update (CPU), published in Clinical Gastroenterology and Hepatology, reviews the evidence and provides expert advice for clinicians on the evolving landscape of patients taking GLP-1 receptor agonists prior to endoscopic procedures. The CPU reflects on the most recent literature and the experience of the authors, all experts in bariatric medicine and/or endoscopy.
The American Society of Anesthesiologists (ASA) issued guidance that reflects concerns for the risk of aspiration in sedated patients because of delayed gastric motility from the use of GLP-1 RAs. The ASA advises patients on daily doses of GLP-1 RAs to refrain from taking the medications on the day of a procedure; those on weekly dosing should hold the drugs for a week prior to surgery.
However, the ASA suggestions do not differentiate based on the indication for the drug or for the type of procedure, and questions remain as to whether these changes are necessary and/or effective, the CPU authors said. The ASA’s guidance is based mainly on expert opinion, as not enough published evidence on this topic exists for a robust review and formal guideline, they added.
Recently, a multisociety statement from the AGA, AASLD, ACG, ASGE, and NASPGHAN noted that widespread implementation of the ASA guidance could be associated with unintended harms to patients.
Therefore, the AGA CPU suggests an individualized approach to managing patients on GLP-1 RAs in a pre-endoscopic setting.
For patients taking GLP-1 RAs for diabetes management, discontinuing the medication prior to endoscopy may not be worth the potential risk. Clinicians should consider not only the dose and frequency of the GLP-1 RA but also the patient's other comorbidities, medications, and potential gastrointestinal side effects.
“If patients taking GLP-1 RAs solely for weight loss can be identified beforehand, a dose of the medication could be withheld prior to endoscopy with likely little harm, though this should not be considered mandatory or evidence-based,” the CPU authors wrote.
However, withholding a single dose of medication may not be enough for an individual’s gastric motility to return to normal, the authors emphasized.
Additionally, the ASA’s suggestions for holding GLP-1 RAs add complexity to periprocedural medication management, which may strain resources and delay care.
The AGA CPU offers the following guidance for patients on GLP-1 RAs prior to endoscopy (see the sketch after this list):
In general, patients using GLP-1 RAs who have followed the standard perioperative procedures, usually an 8-hour solid-food fast and 2-hour liquid fast, and who do not have symptoms such as ongoing nausea, vomiting, or abdominal distension should proceed with upper and/or lower endoscopy.
For symptomatic patients who may experience negative clinical consequences of endoscopy if delayed, consider rapid-sequence intubation, but the authors acknowledge that this option may not be possible in most ambulatory or office-based endoscopy settings.
Finally, consider placing patients on a liquid diet the day before a sedated procedure instead of stopping GLP-1 RAs; this strategy is “more consistent with the holistic approach to preprocedural management of other similar conditions,” the authors said.
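As a non-clinical illustration only, the sketch below restates the three points above as simple branching logic in Python; the function and parameter names are invented, and this is not a substitute for the CPU's wording or for clinical judgment.

```python
def preprocedure_plan(fasted_per_protocol: bool, gi_symptoms: bool, delay_harmful: bool) -> str:
    """Illustrative paraphrase of the CPU guidance -- not a clinical decision tool."""
    if fasted_per_protocol and not gi_symptoms:
        return "Proceed with upper and/or lower endoscopy"
    if gi_symptoms and delay_harmful:
        return "Consider rapid-sequence intubation (may not be feasible in ambulatory settings)"
    return "Consider a liquid diet the day before the sedated procedure rather than stopping the GLP-1 RA"

# Example: asymptomatic patient who completed the standard preprocedural fast.
print(preprocedure_plan(fasted_per_protocol=True, gi_symptoms=False, delay_harmful=False))
```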
In the absence of actionable data, the current CPU endorses the multisociety statement, which puts patient safety first, and encourages AGA members to follow best practices when performing endoscopies on patients using GLP-1 RAs, the authors concluded.
The Clinical Practice Update received no outside funding. Lead author Dr. Al Hashash had no financial conflicts to disclose.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Nano drug delivery could overcome toxicity in HCC to enable safer, more effective therapy
Nanoparticle-based drug delivery could help overcome the off-target toxicity that has limited systemic therapies for hepatocellular carcinoma (HCC), leading to safer treatment and better outcomes, according to a recent review.
Nanomedicines homing in on the Wnt/beta-catenin signaling pathway could be particularly impactful, Mamatha Bhat, MD, PhD, a hepatologist and clinician-scientist at Toronto General Hospital Research Institute, and colleagues reported, as this is one of the most up-regulated pathways in HCC.
To date, however, agents addressing this pathway have been hindered by off-target toxicity, suggesting that more work is needed to develop the right payload for nanoparticle delivery, the investigators wrote in Gastro Hep Advances.
“Although nanotherapeutics offers an unmatched improvement in drug delivery, due to the limited impact and treatment-resistance demonstrated by the current systemic therapies, there is currently no approved nanomedicine for the treatment of HCC,” the investigators wrote. “Therefore, it is of utmost importance to dig deeper into understanding the signaling pathways that govern hepatocarcinogenesis and identify novel targets that can be used to develop more specific and targeted nanotherapies.”
Their review focused on the Wnt/beta-catenin signaling pathway, but first, Dr. Bhat and colleagues discussed the characteristics of inorganic versus lipid nanoparticles, as these differences can determine liver uptake.
Inorganic nanoparticles have a high surface-to-volume ratio, leading to increased surface charges that enhance cellular uptake. However, they are prone to oxidation, requiring surface modifications or short circulation times to prevent degradation. These nanoparticles are limited in delivering chemotherapeutic drugs and peptides, and are not suitable for encapsulating nucleic acids.
In contrast, lipid nanoparticles are preferred for targeted delivery in HCC, according to the investigators. They have a natural affinity for apolipoprotein E (apo E) and resemble lipoproteins, which aids in specific liver cell targeting. When lipid nanoparticles enter the bloodstream, they interact with apo E–rich lipoproteins such as HDL and LDL, leading to the formation of complexes recognized by LDL receptors on liver cells. This triggers receptor-mediated endocytosis, internalizing the apo E–lipid nanoparticle complexes into HCC cells.
The other major variable is the selected treatment target. Dr. Bhat and colleagues made the case for the Wnt/beta-catenin signaling pathway based on alterations found in approximately two-thirds of patients with HCC.
“Aberrant activation of this pathway and mutations in genes encoding key components are characteristic to hepatocarcinogenesis and promote tumor growth and dedifferentiation,” they wrote.
Although beta-catenin itself makes for an obvious molecular target, especially considering known associations with drug resistance, its flat structure lacks deep binding pockets that would be suitable for small-molecule inhibitors, and any available pockets may be altered by numerous posttranslational modifications. Instead, beta-catenin could be indirectly modulated by nanoparticle-mediated siRNA therapy, as this would allow for precise delivery of siRNA to cancer cells, minimizing off-target toxicity.
Alternative approaches could involve targeting proteasomal degradation of beta-catenin, transcriptional coactivators of beta-catenin, or different oncogenes in HCC, all of which are described in further detail in the review, along with promising preclinical findings.
“With ongoing advancements in nanotechnology, there is optimism that it will continue to play a vital role in overcoming the challenges associated with HCC management and contribute to further advancements in therapeutic outcomes for patients,” the authors concluded.
One coauthor disclosed external funding by a Mitacs Elevate postdoctoral fellowship in collaboration with Highland Therapeutics. The remaining authors disclosed no conflicts of interest.
Hepatocellular carcinoma (HCC) remains a major health problem associated with increasing prevalence and mortality rates worldwide. Around 50%-60% of HCC patients are exposed to systemic therapies during their natural history. Atezolizumab plus bevacizumab (median overall survival, 19.2 months; objective response rate, 30%) and durvalumab plus tremelimumab (median overall survival, 16.4 months; objective response rate, 20%) are considered first-line treatment options for advanced HCC, and sorafenib or lenvatinib is recommended for patients with any contraindication to immune checkpoint inhibitors. These therapies are indicated for ‘all comers’; no molecular markers or personalized-medicine approach is currently available for this cancer. The lack of precision oncology relates to the fact that the most common mutations (i.e., TERT, TP53, CTNNB1) are not actionable targets. In this scenario, advances in precision oncology are an unmet medical need.
The Wnt/beta-catenin signaling pathway is a master regulator of oncogenesis in HCC and defines one of the molecular subclasses, characterized by CTNNB1 mutations (~25%-30%) or AXIN1 mutations (~5%-10%). Most of these tumors have an immune-excluded/desert phenotype. Thus, targeting this pathway is expected to provide a primary antitumoral effect along with an immune-modulatory effect, rescuing cases with an immune-excluded phenotype.
In this review, the authors discuss the applicability of precision oncology in HCC targeting the Wnt/beta-catenin pathway, either by inhibiting the interaction of beta-catenin with transcriptional coactivators such as CBP and TCF, or by enhancing the proteasomal degradation of beta-catenin to reduce pathway activation with drugs such as tankyrase inhibitors and casein kinase 1a activators. These approaches are challenging because of their associated off-target toxicity and complexity. To overcome these caveats, the authors propose the utilization of nanotechnology to deliver Wnt inhibitors, an approach that currently requires further research to refine the most promising strategies and drugs suitable for clinical implementation.
Josep M. Llovet, MD, PhD, FAASLD, is director of the Mount Sinai Liver Cancer Program in New York and head of translational research in the Liver Cancer Group, Liver Unit, IDIBAPS, Hospital Clínic Barcelona. Dr. Llovet receives research support from Bayer Pharmaceuticals, Eisai Inc, Bristol-Myers Squibb, and Ipsen.
FROM GASTRO HEP ADVANCES
Neutrophils may offer therapeutic target for Wilson’s disease
Inhibiting neutrophil function via transforming growth factor beta 1 (TGF-beta 1) inhibition or methylation inhibition reduced parenchymal liver fibrosis and injury while improving liver function in a mouse model of Wilson’s disease, according to new research published in Cellular and Molecular Gastroenterology and Hepatology.
Also called progressive hepatolenticular degeneration, Wilson’s disease is an inherited disorder of copper metabolism caused by variants in the ATP7B gene; the resulting accumulation of the heavy metal in the liver and brain damages both organs and can produce severe liver disease and nervous system dysfunction. Approximately 60% of patients with Wilson’s disease present with hepatic syndromes, and of those, 50%-60% go on to develop liver cirrhosis.
Current treatments aim to address metal deposition, but this approach is poorly tolerated by many patients, wrote investigators who were led by Junping Shi, MD, PhD, of the Institute of Hepatology and Metabolic Diseases, The Affiliated Hospital of Hangzhou Normal University, China.
“Drug interventions (such as copper chelators and zinc salts) reduce pathologic copper deposition, but side effects can be observed in up to 40% of patients during treatment and even after years of treatment, particularly nephropathy, autoimmune conditions, and skin changes,” the investigators wrote. “Liver transplantation is an effective treatment for Wilson’s disease, particularly for patients with end-stage liver disease, but donor shortages and lifelong immunosuppression limit its use. Therefore, alternative treatments with higher specificity in Wilson’s disease patients are urgently needed.”
The present study explored the underlying metabolic abnormalities in Wilson’s disease that result in liver injury and fibrosis, and related therapeutic approaches. Based on previous studies that have shown a relationship between persistent neutrophil infiltration and chronic tissue inflammation and damage, the investigators sought to explore the role of neutrophils in Wilson’s disease, with a focus on the N2 subtype.
First, they analyzed neutrophil populations in the livers of Atp7b–/– mice and atp7b–/– zebrafish, both of which are established animal models of Wilson’s disease. Compared with the wild-type comparison animals, the livers of disease model animals showed increased neutrophil infiltration, in terms of both count and density.
In one of several related experiments, administering a neutrophil agonist in the presence of copper led to significantly greater neutrophil infiltration in mutant versus wild-type fish, as well as greater increases in lipid droplets and disorganized tissue structure, which serve as markers of disease activity.
“Collectively, these data suggested that neutrophils infiltrated the liver and accelerated liver defects in Wilson’s disease,” the investigators wrote.
Additional experiments with the mouse model showed that pharmacologic ablation of N2 neutrophils via two approaches led to reduced liver fibrosis, offering a glimpse at therapeutic potential.
These findings were further supported by experiments involving a cellular model of Wilson’s disease with isolated bone marrow neutrophils. These analyses revealed the role of the TGF-beta 1–DNMT3A/STAT3 signaling axis in neutrophil polarization, and the resultant liver disease progression, in Wilson’s disease.
“Neutrophil heterogeneity shows therapeutic potential, and pharmacologic modulation of N2-neutrophil activity should be explored as an alternative therapeutic to improve liver function in Wilson’s disease,” the investigators concluded, noting that TGF-beta 1, DNMT3A, or STAT3 could all serve as rational therapeutic targets.
Beyond Wilson’s disease, the findings may offer broader value for understanding the mechanisms driving other neutrophil-related diseases, as well as possible therapeutic approaches for those conditions, the authors added.
The authors disclosed no conflicts of interest.
The treatment of Wilson disease relies on the use of chelators (D-penicillamine; trientine), which promote urinary copper excretion, and zinc, which blocks intestinal copper absorption.
These drugs, which must be taken continuously, are effective but are associated with significant side effects. Another chelator, bis-choline-tetrathiomolybdate (TTM), promotes biliary, rather than urinary copper excretion.
TTM improved neurological function in clinical trials; however, dose-dependent transaminase elevations were noted.
Thus, there is a need to identify new therapeutic approaches to reduce the impact of copper toxicity in hepatocytes.
In the current issue of CMGH, Mi and colleagues utilize zebrafish and mouse models of Wilson disease to generate novel insights into the pathogenesis and molecular basis of liver injury and fibrosis caused by ATP7B mutations. In the zebrafish model, they first showed that fluorescently labeled neutrophils accumulate in the livers of live, mutant animals, which are transparent and thus uniquely suited to these studies. Gene expression analyses showed that the liver neutrophils are metabolically active and sensitize hepatocytes to copper-induced injury, thus providing a therapeutic rationale for neutrophil inhibition. Next, the authors confirmed these findings in the mouse model, showing specifically that the N2-neutrophil subtype predominated and correlated with the degree of liver injury. Subsequent gene expression studies in the mouse, combined with in vitro analysis of bone marrow–derived neutrophils, identified a molecular signaling pathway originating in hepatocytes that triggered N2 differentiation. This pathway, which was previously shown to drive N2 differentiation in cancer models, involves TGF-beta–induced methylation (and hence repression) of a gene (SOCS3) that itself blocks expression of STAT3, a gene that drives N2 differentiation. Importantly, liver injury and fibrosis were reduced in the mouse model by drugs that inhibit TGF-beta or DNA methylation, and hence N2 differentiation, or by directly blocking the activity of N2 neutrophils.
In summary, this new study not only provides novel insights into the pathogenesis and potential treatment of Wilson disease, but also demonstrates how signaling pathways, such as the one involving TGF-beta–SOCS3–STAT3, are reiteratively used in a variety of pathologic contexts. Going forward, it will be important to determine whether this pharmacologically modifiable signaling pathway is activated in patients with Wilson disease, and whether it impacts the pathogenesis of more common liver disorders.
Michael Pack, M.D., is professor of medicine at Perelman School of Medicine, University of Pennsylvania. He has no conflicts.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY